Apr 23 09:30:34.424563 ip-10-0-136-17 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 09:30:35.076621 ip-10-0-136-17 kubenswrapper[2566]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 09:30:35.076621 ip-10-0-136-17 kubenswrapper[2566]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 09:30:35.076621 ip-10-0-136-17 kubenswrapper[2566]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 09:30:35.076621 ip-10-0-136-17 kubenswrapper[2566]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 09:30:35.076621 ip-10-0-136-17 kubenswrapper[2566]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 09:30:35.076621 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.933313 2566 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 09:30:35.076621 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.937920 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 09:30:35.076621 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.937931 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 09:30:35.076621 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.937935 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 09:30:35.076621 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.937939 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 09:30:35.076621 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.937942 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 09:30:35.076621 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.937945 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 09:30:35.076621 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.937948 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 09:30:35.076621 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.937951 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 09:30:35.079528 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.937955 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 09:30:35.079528 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.937957 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 09:30:35.079528 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.937960 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 09:30:35.079528 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.937963 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 09:30:35.079528 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.937973 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 09:30:35.079528 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.937976 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 09:30:35.079528 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.937978 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 09:30:35.079528 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.937981 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 09:30:35.079528 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.937984 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 09:30:35.079528 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.937986 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 09:30:35.079528 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.937989 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 09:30:35.079528 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.937991 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 09:30:35.079528 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.937994 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 09:30:35.079528 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.937996 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 09:30:35.079528 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.937999 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 09:30:35.079528 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938002 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 09:30:35.079528 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938005 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 09:30:35.079528 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938007 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 09:30:35.079528 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938010 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 09:30:35.079528 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938012 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 09:30:35.052427 ip-10-0-136-17 systemd[1]: Started Kubernetes Kubelet.
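The five Flag deprecation notices above all point at the same fix: move the values into the KubeletConfiguration file named by --config, which the FLAG dump further down shows is /etc/kubernetes/kubelet.conf on this node. A minimal sketch of that migration, reusing the values logged below for --container-runtime-endpoint, --volume-plugin-dir, and --system-reserved; the field names are the standard kubelet.config.k8s.io/v1beta1 ones rather than anything taken from this log, and the unix:// scheme on the endpoint and the eviction threshold are illustrative assumptions:

  apiVersion: kubelet.config.k8s.io/v1beta1
  kind: KubeletConfiguration
  # Replaces --container-runtime-endpoint; the config field takes a URL, so a
  # unix:// scheme is assumed in front of the socket path logged below.
  containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
  # Replaces --volume-plugin-dir.
  volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
  # Replaces --system-reserved (values from the FLAG dump below).
  systemReserved:
    cpu: 500m
    ephemeral-storage: 1Gi
    memory: 1Gi
  # --minimum-container-ttl-duration has no direct config-file equivalent; the
  # notice suggests eviction thresholds instead. This value is illustrative only.
  evictionHard:
    memory.available: 100Mi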
Apr 23 09:30:35.080621 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938015 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 09:30:35.080621 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938018 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 09:30:35.080621 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938021 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 09:30:35.080621 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938024 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 09:30:35.080621 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938027 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 09:30:35.080621 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938029 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 09:30:35.080621 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938032 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 09:30:35.080621 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938034 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 09:30:35.080621 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938037 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 09:30:35.080621 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938039 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 09:30:35.080621 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938042 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 09:30:35.080621 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938045 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 09:30:35.080621 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938047 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 09:30:35.080621 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938050 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 09:30:35.080621 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938052 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 09:30:35.080621 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938055 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 09:30:35.080621 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938057 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 09:30:35.080621 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938060 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 09:30:35.080621 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938063 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 09:30:35.080621 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938065 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 09:30:35.323451 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938067 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 09:30:35.323451 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938070 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 09:30:35.323451 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938073 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 09:30:35.323451 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938075 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 09:30:35.323451 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938080 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 09:30:35.323451 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938084 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 09:30:35.323451 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938087 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 09:30:35.323451 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938090 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 09:30:35.323451 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938092 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 09:30:35.323451 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938094 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 09:30:35.323451 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938098 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 09:30:35.323451 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938100 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 09:30:35.323451 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938103 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 09:30:35.323451 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938106 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 09:30:35.323451 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938108 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 09:30:35.323451 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938111 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 09:30:35.323451 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938114 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 09:30:35.323451 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938116 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 09:30:35.323451 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938124 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 09:30:35.323451 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938127 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 09:30:35.334841 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938130 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 09:30:35.334841 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938133 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 09:30:35.334841 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938135 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 09:30:35.334841 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938138 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 09:30:35.334841 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938141 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 09:30:35.334841 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938143 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 09:30:35.334841 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938146 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 09:30:35.334841 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938148 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 09:30:35.334841 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938151 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 09:30:35.334841 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938153 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 09:30:35.334841 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938156 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 09:30:35.334841 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938158 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 09:30:35.334841 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938162 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 09:30:35.334841 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938164 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 09:30:35.334841 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938167 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 09:30:35.334841 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938171 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 09:30:35.334841 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938174 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 09:30:35.334841 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938177 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 09:30:35.334841 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938596 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 09:30:35.334841 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938603 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 09:30:35.335453 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938606 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 09:30:35.335453 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938609 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 09:30:35.335453 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938612 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 09:30:35.335453 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938615 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 09:30:35.335453 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938617 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 09:30:35.335453 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938620 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 09:30:35.335453 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938623 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 09:30:35.335453 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938626 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 09:30:35.335453 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938628 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 09:30:35.335453 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938631 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 09:30:35.335453 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938634 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 09:30:35.335453 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938637 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 09:30:35.335453 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938639 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 09:30:35.335453 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938642 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 09:30:35.335453 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938644 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 09:30:35.335453 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938647 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 09:30:35.335453 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938650 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 09:30:35.335453 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938652 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 09:30:35.335453 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938655 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 09:30:35.335453 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938658 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 09:30:35.336162 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938660 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 09:30:35.336162 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938663 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 09:30:35.336162 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938665 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 09:30:35.336162 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938667 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 09:30:35.336162 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938670 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 09:30:35.336162 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938673 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 09:30:35.336162 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938675 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 09:30:35.336162 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938677 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 09:30:35.336162 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938680 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 09:30:35.336162 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938682 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 09:30:35.336162 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938685 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 09:30:35.336162 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938687 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 09:30:35.336162 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938690 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 09:30:35.336162 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938693 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 09:30:35.336162 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938697 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 09:30:35.336162 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938700 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 09:30:35.336162 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938703 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 09:30:35.336162 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938706 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 09:30:35.336162 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938709 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 09:30:35.336162 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938712 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 09:30:35.336782 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938714 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 09:30:35.336782 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938716 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 09:30:35.336782 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938719 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 09:30:35.336782 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938722 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 09:30:35.336782 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938724 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 09:30:35.336782 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938727 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 09:30:35.336782 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938730 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 09:30:35.336782 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938732 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 09:30:35.336782 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938734 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 09:30:35.336782 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938737 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 09:30:35.336782 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938739 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 09:30:35.336782 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938742 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 09:30:35.336782 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938744 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 09:30:35.336782 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938748 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 09:30:35.336782 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938752 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 09:30:35.336782 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938754 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 09:30:35.336782 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938757 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 09:30:35.336782 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938760 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 09:30:35.336782 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938763 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 09:30:35.337431 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938765 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 09:30:35.337431 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938768 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 09:30:35.337431 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938771 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 09:30:35.337431 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938774 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 09:30:35.337431 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938777 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 09:30:35.337431 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938779 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 09:30:35.337431 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938782 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 09:30:35.337431 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938784 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 09:30:35.337431 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938787 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 09:30:35.337431 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938790 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 09:30:35.337431 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938793 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 09:30:35.337431 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938795 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 09:30:35.337431 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938799 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 09:30:35.337431 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938802 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 09:30:35.337431 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938805 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 09:30:35.337431 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938807 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 09:30:35.337431 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938810 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 09:30:35.337431 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938813 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 09:30:35.337431 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938815 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 09:30:35.338021 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938818 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 09:30:35.338021 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938820 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 09:30:35.338021 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938823 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 09:30:35.338021 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938825 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 09:30:35.338021 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938828 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 09:30:35.338021 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.938830 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 09:30:35.338021 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940364 2566 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 09:30:35.338021 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940374 2566 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 09:30:35.338021 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940382 2566 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 09:30:35.338021 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940387 2566 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 09:30:35.338021 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940392 2566 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 09:30:35.338021 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940395 2566 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 09:30:35.338021 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940399 2566 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 09:30:35.338021 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940404 2566 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 09:30:35.338021 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940408 2566 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 09:30:35.338021 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940411 2566 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 09:30:35.338021 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940414 2566 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 09:30:35.338021 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940418 2566 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 09:30:35.338021 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940421 2566 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 09:30:35.338021 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940424 2566 flags.go:64] FLAG: --cgroup-root=""
Apr 23 09:30:35.338021 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940427 2566 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 09:30:35.338021 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940430 2566 flags.go:64] FLAG: --client-ca-file=""
Apr 23 09:30:35.338021 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940433 2566 flags.go:64] FLAG: --cloud-config=""
Apr 23 09:30:35.338745 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940436 2566 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 09:30:35.338745 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940439 2566 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 09:30:35.338745 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940443 2566 flags.go:64] FLAG: --cluster-domain=""
Apr 23 09:30:35.338745 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940445 2566 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 09:30:35.338745 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940449 2566 flags.go:64] FLAG: --config-dir=""
Apr 23 09:30:35.338745 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940451 2566 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 09:30:35.338745 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940454 2566 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 09:30:35.338745 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940458 2566 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 09:30:35.338745 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940462 2566 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 09:30:35.338745 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940465 2566 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 09:30:35.338745 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940469 2566 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 09:30:35.338745 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940471 2566 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 09:30:35.338745 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940474 2566 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 09:30:35.338745 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940478 2566 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 09:30:35.338745 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940481 2566 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 09:30:35.338745 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940484 2566 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 09:30:35.338745 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940488 2566 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 09:30:35.338745 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940491 2566 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 09:30:35.338745 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940494 2566 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 09:30:35.338745 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940497 2566 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 09:30:35.338745 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940500 2566 flags.go:64] FLAG: --enable-server="true"
Apr 23 09:30:35.338745 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940503 2566 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 09:30:35.338745 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940508 2566 flags.go:64] FLAG: --event-burst="100"
Apr 23 09:30:35.338745 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940511 2566 flags.go:64] FLAG: --event-qps="50"
Apr 23 09:30:35.338745 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940514 2566 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 09:30:35.339593 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940518 2566 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 09:30:35.339593 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940521 2566 flags.go:64] FLAG: --eviction-hard=""
Apr 23 09:30:35.339593 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940524 2566 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 09:30:35.339593 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940528 2566 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 09:30:35.339593 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940531 2566 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 09:30:35.339593 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940533 2566 flags.go:64] FLAG: --eviction-soft=""
Apr 23 09:30:35.339593 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940536 2566 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 09:30:35.339593 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940539 2566 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 09:30:35.339593 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940542 2566 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 09:30:35.339593 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940545 2566 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 09:30:35.339593 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940548 2566 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 09:30:35.339593 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940550 2566 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 09:30:35.339593 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940556 2566 flags.go:64] FLAG: --feature-gates=""
Apr 23 09:30:35.339593 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940559 2566 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 23 09:30:35.339593 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940562 2566 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 23 09:30:35.339593 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940565 2566 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 23 09:30:35.339593 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940569 2566 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 23 09:30:35.339593 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940572 2566 flags.go:64] FLAG: --healthz-port="10248"
Apr 23 09:30:35.339593 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940575 2566 flags.go:64] FLAG: --help="false"
Apr 23 09:30:35.339593 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940578 2566 flags.go:64] FLAG: --hostname-override="ip-10-0-136-17.ec2.internal"
Apr 23 09:30:35.339593 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940582 2566 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 23 09:30:35.339593 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940585 2566 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 23 09:30:35.339593 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940587 2566 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 23 09:30:35.340231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940591 2566 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 23 09:30:35.340231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940595 2566 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 23 09:30:35.340231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940598 2566 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 23 09:30:35.340231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940601 2566 flags.go:64] FLAG: --image-service-endpoint=""
Apr 23 09:30:35.340231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940603 2566 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 23 09:30:35.340231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940606 2566 flags.go:64] FLAG: --kube-api-burst="100"
Apr 23 09:30:35.340231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940609 2566 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 23 09:30:35.340231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940612 2566 flags.go:64] FLAG: --kube-api-qps="50"
Apr 23 09:30:35.340231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940615 2566 flags.go:64] FLAG: --kube-reserved=""
Apr 23 09:30:35.340231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940619 2566 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 23 09:30:35.340231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940622 2566 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 23 09:30:35.340231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940625 2566 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 23 09:30:35.340231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940628 2566 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 23 09:30:35.340231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940631 2566 flags.go:64] FLAG: --lock-file=""
Apr 23 09:30:35.340231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940633 2566 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 23 09:30:35.340231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940636 2566 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 23 09:30:35.340231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940639 2566 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 23 09:30:35.340231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940644 2566 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 23 09:30:35.340231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940647 2566 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 23 09:30:35.340231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940650 2566 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 23 09:30:35.340231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940653 2566 flags.go:64] FLAG: --logging-format="text"
Apr 23 09:30:35.340231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940657 2566 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 23 09:30:35.340231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940660 2566 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 23 09:30:35.340231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940663 2566 flags.go:64] FLAG: --manifest-url=""
Apr 23 09:30:35.340917 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940666 2566 flags.go:64] FLAG: --manifest-url-header=""
Apr 23 09:30:35.340917 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940670 2566 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 23 09:30:35.340917 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940674 2566 flags.go:64] FLAG: --max-open-files="1000000"
Apr 23 09:30:35.340917 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940678 2566 flags.go:64] FLAG: --max-pods="110"
Apr 23 09:30:35.340917 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940681 2566 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 23 09:30:35.340917 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940684 2566 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 23 09:30:35.340917 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940687 2566 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 23 09:30:35.340917 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940690 2566 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 23 09:30:35.340917 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940693 2566 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 23 09:30:35.340917 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940696 2566 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 23 09:30:35.340917 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940698 2566 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 23 09:30:35.340917 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940706 2566 flags.go:64] FLAG: --node-status-max-images="50"
Apr 23 09:30:35.340917 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940709 2566 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 23 09:30:35.340917 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940712 2566 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 23 09:30:35.340917 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940715 2566 flags.go:64] FLAG: --pod-cidr=""
Apr 23 09:30:35.340917 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940717 2566 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 23 09:30:35.340917 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940723 2566 flags.go:64] FLAG: --pod-manifest-path=""
Apr 23 09:30:35.340917 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940726 2566 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 23 09:30:35.340917 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940730 2566 flags.go:64] FLAG: --pods-per-core="0"
Apr 23 09:30:35.340917 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940734 2566 flags.go:64] FLAG: --port="10250"
Apr 23 09:30:35.340917 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940737 2566 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 23 09:30:35.340917 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940739 2566 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0992854df13ea05bf"
Apr 23 09:30:35.340917 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940743 2566 flags.go:64] FLAG: --qos-reserved=""
Apr 23 09:30:35.340917 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940746 2566 flags.go:64] FLAG: --read-only-port="10255"
Apr 23 09:30:35.341567 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940749 2566 flags.go:64] FLAG: --register-node="true"
Apr 23 09:30:35.341567 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940751 2566 flags.go:64] FLAG: --register-schedulable="true"
Apr 23 09:30:35.341567 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940754 2566 flags.go:64] FLAG: --register-with-taints=""
Apr 23 09:30:35.341567 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940758 2566 flags.go:64] FLAG: --registry-burst="10"
Apr 23 09:30:35.341567 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940761 2566 flags.go:64] FLAG: --registry-qps="5"
Apr 23 09:30:35.341567 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940765 2566 flags.go:64] FLAG: --reserved-cpus=""
Apr 23 09:30:35.341567 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940768 2566 flags.go:64] FLAG: --reserved-memory=""
Apr 23 09:30:35.341567 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940772 2566 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 23 09:30:35.341567 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940775 2566 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 23 09:30:35.341567 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940778 2566 flags.go:64] FLAG: --rotate-certificates="false"
Apr 23 09:30:35.341567 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940780 2566 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 23 09:30:35.341567 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940788 2566 flags.go:64] FLAG: --runonce="false"
Apr 23 09:30:35.341567 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940792 2566 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 23 09:30:35.341567 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940795 2566 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 23 09:30:35.341567 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940798 2566 flags.go:64] FLAG: --seccomp-default="false"
Apr 23 09:30:35.341567 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940801 2566 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 23 09:30:35.341567 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940804 2566 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 23 09:30:35.341567 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940807 2566 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 23 09:30:35.341567 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940810 2566 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 23 09:30:35.341567 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940812 2566 flags.go:64] FLAG: --storage-driver-password="root"
Apr 23 09:30:35.341567 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940819 2566 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 23 09:30:35.341567 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940822 2566 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 23 09:30:35.341567 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940825 2566 flags.go:64] FLAG: --storage-driver-user="root"
Apr 23 09:30:35.341567 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940828 2566 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 23 09:30:35.341567 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940830 2566 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 23 09:30:35.341567 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940834 2566 flags.go:64] FLAG: --system-cgroups=""
Apr 23 09:30:35.342262 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940838 2566 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 23 09:30:35.342262 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940843 2566 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 23 09:30:35.342262 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940846 2566 flags.go:64] FLAG: --tls-cert-file=""
Apr 23 09:30:35.342262 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940849 2566 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 23 09:30:35.342262 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940852 2566 flags.go:64] FLAG: --tls-min-version=""
Apr 23 09:30:35.342262 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940856 2566 flags.go:64] FLAG: --tls-private-key-file=""
Apr 23 09:30:35.342262 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940858 2566 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 23 09:30:35.342262 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940861 2566 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 23 09:30:35.342262 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940864 2566 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 23 09:30:35.342262 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940867 2566 flags.go:64] FLAG: --v="2"
Apr 23 09:30:35.342262 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940872 2566 flags.go:64] FLAG: --version="false"
Apr 23 09:30:35.342262 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940877 2566 flags.go:64] FLAG: --vmodule=""
Apr 23 09:30:35.342262 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940881 2566 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 23 09:30:35.342262 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.940884 2566 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 23 09:30:35.342262 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.940981 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 09:30:35.342262 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.940985 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 09:30:35.342262 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.940989 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 09:30:35.342262 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.940992 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 09:30:35.342262 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.940995 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 09:30:35.342262 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.940998 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 09:30:35.342262 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941002 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 09:30:35.342262 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941005 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 09:30:35.342806 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941007 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 09:30:35.342806 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941010 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 09:30:35.342806 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941012 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 09:30:35.342806 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941015 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 09:30:35.342806 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941018 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 09:30:35.342806 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941020 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 09:30:35.342806 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941023 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 09:30:35.342806 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941026 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 09:30:35.342806 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941028 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 09:30:35.342806 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941031 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 09:30:35.342806 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941035 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 09:30:35.342806 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941038 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 09:30:35.342806 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941041 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 09:30:35.342806 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941044 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 09:30:35.342806 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941046 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 09:30:35.342806 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941049 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 09:30:35.342806 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941051 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 09:30:35.342806 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941054 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 09:30:35.342806 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941056 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 09:30:35.342806 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941059 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 09:30:35.343317 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941061 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 09:30:35.343317 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941065 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 09:30:35.343317 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941068 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 09:30:35.343317 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941071 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 09:30:35.343317 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941073 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 09:30:35.343317 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941076 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 09:30:35.343317 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941078 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 09:30:35.343317 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941081 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 09:30:35.343317 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941083 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 09:30:35.343317 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941086 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 09:30:35.343317 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941089 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 09:30:35.343317 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941091 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 09:30:35.343317 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941094 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 09:30:35.343317 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941096 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 09:30:35.343317 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941099 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 09:30:35.343317 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941101 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 09:30:35.343317 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941104 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 09:30:35.343317 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941107 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 09:30:35.343317 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941109 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 09:30:35.343317 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941111 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 09:30:35.343800 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941114 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 09:30:35.343800 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941116 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 09:30:35.343800 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941122 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 09:30:35.343800 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941126 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 09:30:35.343800 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941128 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 09:30:35.343800 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941131 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 09:30:35.343800 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941134 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 09:30:35.343800 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941136 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 09:30:35.343800 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941139 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 09:30:35.343800 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941141 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 09:30:35.343800 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941144 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 09:30:35.343800 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941146 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 09:30:35.343800 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941149 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 09:30:35.343800 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941153 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 09:30:35.343800 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941156 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 09:30:35.343800 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941158 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 09:30:35.343800 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941161 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 09:30:35.343800 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941163 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 09:30:35.343800 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941166 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 09:30:35.344474 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941169 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 09:30:35.344474 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941171 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 09:30:35.344474 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941174 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 09:30:35.344474 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941178 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 09:30:35.344474 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941182 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 09:30:35.344474 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941185 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 09:30:35.344474 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941187 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 09:30:35.344474 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941190 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 09:30:35.344474 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941193 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 09:30:35.344474 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941196 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 09:30:35.344474 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941199 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 09:30:35.344474 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941201 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 09:30:35.344474 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941204 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 09:30:35.344474 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941207 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 09:30:35.344474 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941210 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 09:30:35.344474 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941214 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 09:30:35.344474 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941216 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 09:30:35.344474 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941219 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 09:30:35.344474 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.941221 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 09:30:35.345150 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.941227 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 09:30:35.345150 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.947878 2566 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 23 09:30:35.345150 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.947902 2566 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 23 09:30:35.345150 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.947961 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 09:30:35.345150 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.947967 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 09:30:35.345150 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.947971 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 09:30:35.345150 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.947975 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 09:30:35.345150 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.947978 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 09:30:35.345150 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.947981 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 09:30:35.345150 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.947984 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 09:30:35.345150 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.947987 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 09:30:35.345150 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.947990 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 09:30:35.345150 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.947993 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 09:30:35.345150 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.947995 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 09:30:35.345150 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.947998 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 09:30:35.345717 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948000 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 09:30:35.345717 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948004 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 09:30:35.345717 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948007 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 09:30:35.345717 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948009 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 09:30:35.345717 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948012 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 09:30:35.345717 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948014 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 09:30:35.345717 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948017 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 09:30:35.345717 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948019 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 09:30:35.345717 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948022 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 09:30:35.345717 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948024 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 09:30:35.345717 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948027 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 09:30:35.345717 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948030 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 09:30:35.345717 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948032 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 09:30:35.345717 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948035 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 09:30:35.345717 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948038 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 09:30:35.345717 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948041 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 09:30:35.345717 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948044 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 09:30:35.345717 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948046 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 09:30:35.345717 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948049 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 09:30:35.345717 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948052 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 09:30:35.346328 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948054 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 09:30:35.346328 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948057 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 09:30:35.346328 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948060 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 09:30:35.346328 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948063 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 09:30:35.346328 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948065 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 09:30:35.346328 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948068 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 09:30:35.346328 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948070 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 09:30:35.346328 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948073 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 09:30:35.346328 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948075 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 09:30:35.346328 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948078 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 09:30:35.346328 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948081 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 09:30:35.346328 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948083 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 09:30:35.346328 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948085 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 09:30:35.346328 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948088 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 09:30:35.346328 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948091 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 09:30:35.346328 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948093 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 09:30:35.346328 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948096 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 09:30:35.346328 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948098 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 09:30:35.346328 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948101 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 09:30:35.346328 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948103 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 09:30:35.346825 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948105 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 09:30:35.346825 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948108 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 09:30:35.346825 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948110 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 09:30:35.346825 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948113 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 09:30:35.346825 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948116 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 09:30:35.346825 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948119 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 09:30:35.346825 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948122 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 09:30:35.346825 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948124 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 09:30:35.346825 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948127 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 09:30:35.346825 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948129 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 09:30:35.346825 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948132 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 09:30:35.346825 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948134 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 09:30:35.346825 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948137 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 09:30:35.346825 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948140 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 09:30:35.346825 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948142 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 09:30:35.346825 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948145 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 09:30:35.346825 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948147 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 09:30:35.346825 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948150 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 09:30:35.346825 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948154 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 09:30:35.346825 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948158 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 09:30:35.347313 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948161 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 09:30:35.347313 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948164 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 09:30:35.347313 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948166 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 09:30:35.347313 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948169 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 09:30:35.347313 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948171 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 09:30:35.347313 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948174 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 09:30:35.347313 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948176 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 09:30:35.347313 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948179 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 09:30:35.347313 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948182 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 09:30:35.347313 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948184 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 09:30:35.347313 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948187 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 09:30:35.347313 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948189 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 09:30:35.347313 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948192 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 09:30:35.347313 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948194 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 09:30:35.347313 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.948200 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 09:30:35.347679 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948310 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 09:30:35.347679 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948319 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 09:30:35.347679 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948325 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 09:30:35.347679 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948331 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 09:30:35.347679 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948334 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 09:30:35.347679 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948337 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 09:30:35.347679 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948339 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 09:30:35.347679 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948342 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 09:30:35.347679 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948345 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 09:30:35.347679 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948348 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 09:30:35.347679 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948350 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 09:30:35.347679 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948353 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 09:30:35.347679 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948355 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 09:30:35.347679 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948357 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 09:30:35.347679 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948360 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 09:30:35.347679 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948363 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 09:30:35.347679 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948365 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 09:30:35.347679 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948368 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 09:30:35.347679 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948371 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 09:30:35.348169 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948373 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 09:30:35.348169 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948376 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 09:30:35.348169 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948378 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 09:30:35.348169 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948381 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 09:30:35.348169 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948383 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 09:30:35.348169 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948386 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 09:30:35.348169 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948389 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 09:30:35.348169 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948392 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 09:30:35.348169 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948394 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 09:30:35.348169 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948397 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 09:30:35.348169 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948399 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 09:30:35.348169 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948402 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 09:30:35.348169 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948404 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 09:30:35.348169 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948407 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 09:30:35.348169 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948410 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 09:30:35.348169 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948413 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 09:30:35.348169 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948420 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 09:30:35.348169 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948423 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 09:30:35.348169 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948426 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 09:30:35.348169 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948428 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 09:30:35.348702 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948431 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 09:30:35.348702 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948434 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 09:30:35.348702 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948436 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 09:30:35.348702 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948439 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 09:30:35.348702 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948443 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 09:30:35.348702 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948446 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 09:30:35.348702 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948448 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 09:30:35.348702 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948451 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 09:30:35.348702 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948453 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 09:30:35.348702 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948456 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 09:30:35.348702 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948458 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 09:30:35.348702 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948461 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 09:30:35.348702 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948463 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 09:30:35.348702 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948466 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 09:30:35.348702 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948468 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 09:30:35.348702 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948471 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 09:30:35.348702 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948474 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 09:30:35.348702 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948476 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 09:30:35.348702 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948478 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 09:30:35.349158 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948481 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 09:30:35.349158 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948484 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 09:30:35.349158 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948486 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 09:30:35.349158 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948489 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 09:30:35.349158 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948491 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 09:30:35.349158 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948493 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 09:30:35.349158 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948496 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 09:30:35.349158 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948502 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 09:30:35.349158 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948505 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 09:30:35.349158 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948509 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 09:30:35.349158 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948511 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 09:30:35.349158 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948514 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 09:30:35.349158 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948517 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 09:30:35.349158 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948520 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 09:30:35.349158 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948522 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 09:30:35.349158 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948524 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 09:30:35.349158 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948527 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 09:30:35.349158 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948529 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 09:30:35.349158 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948532 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 09:30:35.349158 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948534 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 09:30:35.349653 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948537 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 09:30:35.349653 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948539 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 09:30:35.349653 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948542 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 09:30:35.349653 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948544 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 09:30:35.349653 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948547 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 09:30:35.349653 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948549 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 09:30:35.349653 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948552 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 09:30:35.349653 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:34.948554 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 09:30:35.349653 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.948560 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 09:30:35.349653 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.949418 2566 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 09:30:35.349653 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.951683 2566 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 09:30:35.349653 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.953038 2566 server.go:1019] "Starting client certificate rotation"
Apr 23 09:30:35.349653 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.953129 2566 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 09:30:35.349653 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.954019 2566 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 09:30:35.349997 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.982841 2566 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 09:30:35.349997 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:34.989268 2566 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 09:30:35.349997 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.007661 2566 log.go:25] "Validated CRI v1 runtime API"
Apr 23 09:30:35.349997 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.014525 2566 log.go:25] "Validated CRI v1 image API"
Apr 23 09:30:35.349997 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.015840 2566 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 09:30:35.349997 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.017467 2566 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 09:30:35.349997 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.019397 2566 fs.go:135] Filesystem UUIDs: map[5a23add2-1c3a-4479-844f-e4f3dfd5e5c0:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 e521cd5c-f685-470b-8983-c94b4af388fe:/dev/nvme0n1p3]
Apr 23 09:30:35.349997 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.019412 2566 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 09:30:35.350200 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.024822 2566 manager.go:217] Machine: {Timestamp:2026-04-23 09:30:35.023505378 +0000 UTC m=+0.468567697 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099658 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2eff6d141a5eb47e61e306c14fb41d SystemUUID:ec2eff6d-141a-5eb4-7e61-e306c14fb41d BootID:b2748ec1-a853-4005-a517-a75459b4f99c Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:67:69:3f:bf:9f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:67:69:3f:bf:9f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:12:6f:0c:00:8b:d6 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 09:30:35.350200 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.024926 2566 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 09:30:35.350200 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.024997 2566 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 09:30:35.350200 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.027131 2566 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 09:30:35.350200 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.027154 2566 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-136-17.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 09:30:35.350200 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.027328 2566 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 09:30:35.350200 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.027336 2566 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 09:30:35.350200 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.027348 2566 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 09:30:35.350200 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.028361 2566 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 09:30:35.350200 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.030360 2566 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 09:30:35.350200 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.030475 2566 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 23 09:30:35.350200 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.033315 2566 kubelet.go:491] "Attempting to sync node with API server"
Apr 23 09:30:35.350200 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.033333 2566 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 23 09:30:35.350200 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.033345 2566 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 23 09:30:35.350200 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.033354 2566 kubelet.go:397] "Adding apiserver pod source"
Apr 23 09:30:35.350200 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.033362 2566 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 23 09:30:35.350200 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.034539 2566 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 09:30:35.350200 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.034553 2566 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.038163 2566 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.044336 2566 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.045914 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.045927 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.045934 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.045939 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.045945 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.045951 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.045957 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.045962 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.045969 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.045975 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.045983 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.045992 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.047926 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.047944 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:35.048951 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:35.048966 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-136-17.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.051610 2566 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.051645 2566 server.go:1295] "Started kubelet"
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.051730 2566 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.051732 2566 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.051783 2566 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.054660 2566 server.go:317] "Adding debug handlers to kubelet server"
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.056062 2566 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.058387 2566 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-136-17.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:35.058460 2566 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-17.ec2.internal.18a8f27478a1396c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-17.ec2.internal,UID:ip-10-0-136-17.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-136-17.ec2.internal,},FirstTimestamp:2026-04-23 09:30:35.051620716 +0000 UTC m=+0.496683030,LastTimestamp:2026-04-23 09:30:35.051620716 +0000 UTC m=+0.496683030,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-17.ec2.internal,}"
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.062950 2566 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.062981 2566 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-fxfz2"
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.063779 2566 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:35.065509 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-17.ec2.internal\" not found"
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.066132 2566 factory.go:55] Registering systemd factory
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.066913 2566 factory.go:223] Registration of the systemd container factory successfully
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.066145 2566 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:35.066741 2566 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-136-17.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:35.066825 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.066617 2566 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 23 09:30:35.350681 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.066997 2566 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.067149 2566 factory.go:153] Registering CRI-O factory
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.067162 2566 factory.go:223] Registration of the crio container factory successfully
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.067166 2566 reconstruct.go:97] "Volume reconstruction finished"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.067175 2566 reconciler.go:26] "Reconciler: start to sync state"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.067216 2566 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.067236 2566 factory.go:103] Registering Raw factory
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.067251 2566 manager.go:1196] Started watching for new ooms in manager
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.067788 2566 manager.go:319] Starting recovery of all containers
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:35.068764 2566 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.070809 2566 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-fxfz2"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.079353 2566 manager.go:324] Recovery completed
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.084543 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.086994 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-17.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.087018 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-17.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.087028 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-17.ec2.internal" event="NodeHasSufficientPID"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.087573 2566 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.087584 2566 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.087600 2566 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.090663 2566 policy_none.go:49] "None policy: Start"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.090674 2566 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.090683 2566 state_mem.go:35] "Initializing new in-memory state store"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.127075 2566 manager.go:341] "Starting Device Plugin manager"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:35.127115 2566 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.127129 2566 server.go:85] "Starting device plugin registration server"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.127374 2566 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.127401 2566 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.127509 2566 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.127613 2566 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.127624 2566 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:35.129047 2566 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:35.129078 2566 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-136-17.ec2.internal\" not found"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.172905 2566 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.174123 2566 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.174146 2566 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.174165 2566 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.174173 2566 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:35.174209 2566 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.179481 2566 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.228290 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.230976 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-17.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.231005 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-17.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.231017 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-17.ec2.internal" event="NodeHasSufficientPID"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.231043 2566 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-136-17.ec2.internal"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.241821 2566 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-136-17.ec2.internal"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:35.241837 2566 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-136-17.ec2.internal\": node \"ip-10-0-136-17.ec2.internal\" not found"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:35.266896 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-17.ec2.internal\" not found"
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.274556 2566 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-17.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-136-17.ec2.internal"]
Apr 23 09:30:35.351682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.274623 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 09:30:35.354816 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.276702 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-17.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 09:30:35.354816 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.276728 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-17.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 09:30:35.354816 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.276742 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-17.ec2.internal" event="NodeHasSufficientPID"
Apr 23 09:30:35.354816 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.277972 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 09:30:35.354816 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.278123 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-17.ec2.internal"
Apr 23 09:30:35.354816 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.278147 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 09:30:35.354816 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.278589 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-17.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 09:30:35.354816 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.278607 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-17.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 09:30:35.354816 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.278617 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-17.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 09:30:35.354816 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.278628 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-17.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 09:30:35.354816 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.278639 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-17.ec2.internal" event="NodeHasSufficientPID"
Apr 23 09:30:35.354816 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.278629 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-17.ec2.internal" event="NodeHasSufficientPID"
Apr 23 09:30:35.354816 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.279645 2566 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-17.ec2.internal" Apr 23 09:30:35.354816 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.279669 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 09:30:35.354816 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.280254 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-17.ec2.internal" event="NodeHasSufficientMemory" Apr 23 09:30:35.354816 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.280277 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-17.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 09:30:35.354816 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.280305 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-17.ec2.internal" event="NodeHasSufficientPID" Apr 23 09:30:35.354816 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:35.298795 2566 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-17.ec2.internal\" not found" node="ip-10-0-136-17.ec2.internal" Apr 23 09:30:35.354816 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:35.302934 2566 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-17.ec2.internal\" not found" node="ip-10-0-136-17.ec2.internal" Apr 23 09:30:35.367574 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:35.367557 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-17.ec2.internal\" not found" Apr 23 09:30:35.467984 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:35.467962 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-17.ec2.internal\" not found" Apr 23 09:30:35.469106 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.469091 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3940101a0fb12832b381adacfd404c80-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-17.ec2.internal\" (UID: \"3940101a0fb12832b381adacfd404c80\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-17.ec2.internal" Apr 23 09:30:35.469152 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.469116 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3940101a0fb12832b381adacfd404c80-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-17.ec2.internal\" (UID: \"3940101a0fb12832b381adacfd404c80\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-17.ec2.internal" Apr 23 09:30:35.469197 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.469150 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/dde61d294f6e26e3e31c5e2576f80003-config\") pod \"kube-apiserver-proxy-ip-10-0-136-17.ec2.internal\" (UID: \"dde61d294f6e26e3e31c5e2576f80003\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-17.ec2.internal" Apr 23 09:30:35.568718 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:35.568686 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-17.ec2.internal\" not found" Apr 23 09:30:35.569807 ip-10-0-136-17 kubenswrapper[2566]: 
I0423 09:30:35.569792 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3940101a0fb12832b381adacfd404c80-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-17.ec2.internal\" (UID: \"3940101a0fb12832b381adacfd404c80\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-17.ec2.internal" Apr 23 09:30:35.569850 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.569816 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3940101a0fb12832b381adacfd404c80-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-17.ec2.internal\" (UID: \"3940101a0fb12832b381adacfd404c80\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-17.ec2.internal" Apr 23 09:30:35.569850 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.569833 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/dde61d294f6e26e3e31c5e2576f80003-config\") pod \"kube-apiserver-proxy-ip-10-0-136-17.ec2.internal\" (UID: \"dde61d294f6e26e3e31c5e2576f80003\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-17.ec2.internal" Apr 23 09:30:35.569913 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.569857 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/dde61d294f6e26e3e31c5e2576f80003-config\") pod \"kube-apiserver-proxy-ip-10-0-136-17.ec2.internal\" (UID: \"dde61d294f6e26e3e31c5e2576f80003\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-17.ec2.internal" Apr 23 09:30:35.569913 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.569892 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3940101a0fb12832b381adacfd404c80-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-17.ec2.internal\" (UID: \"3940101a0fb12832b381adacfd404c80\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-17.ec2.internal" Apr 23 09:30:35.569967 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.569890 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3940101a0fb12832b381adacfd404c80-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-17.ec2.internal\" (UID: \"3940101a0fb12832b381adacfd404c80\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-17.ec2.internal" Apr 23 09:30:35.600983 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.600927 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-17.ec2.internal" Apr 23 09:30:35.605424 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.605409 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-17.ec2.internal" Apr 23 09:30:35.669365 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:35.669335 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-17.ec2.internal\" not found" Apr 23 09:30:35.769923 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:35.769893 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-17.ec2.internal\" not found" Apr 23 09:30:35.870407 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:35.870360 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-17.ec2.internal\" not found" Apr 23 09:30:35.940712 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.940694 2566 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 09:30:35.952413 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.952391 2566 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 23 09:30:35.952525 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.952509 2566 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 23 09:30:35.952576 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.952558 2566 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 23 09:30:35.965566 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.965543 2566 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-17.ec2.internal" Apr 23 09:30:35.984955 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.984937 2566 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 09:30:35.986397 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.986385 2566 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-17.ec2.internal" Apr 23 09:30:35.992569 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:35.992557 2566 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 09:30:36.033707 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.033684 2566 apiserver.go:52] "Watching apiserver" Apr 23 09:30:36.040489 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.040471 2566 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 23 09:30:36.042158 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.042139 2566 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-multus/network-metrics-daemon-6528f","openshift-network-operator/iptables-alerter-tdc8p","openshift-ovn-kubernetes/ovnkube-node-pf5hs","kube-system/kube-apiserver-proxy-ip-10-0-136-17.ec2.internal","openshift-image-registry/node-ca-nd7df","openshift-multus/multus-nblz4","openshift-network-diagnostics/network-check-target-sb8zp","kube-system/konnectivity-agent-57rnm","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mjhnv","openshift-cluster-node-tuning-operator/tuned-rn5xp","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-17.ec2.internal","openshift-multus/multus-additional-cni-plugins-wkpk8"] Apr 23 09:30:36.044025 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.043993 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6528f" Apr 23 09:30:36.044441 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:36.044415 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6528f" podUID="c171e7cd-9c69-4ef7-9012-fad9d2b17a46" Apr 23 09:30:36.045425 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.045410 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-tdc8p" Apr 23 09:30:36.047648 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.047626 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 23 09:30:36.047732 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.047645 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-fqbwj\"" Apr 23 09:30:36.047732 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.047630 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 23 09:30:36.047878 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.047769 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.048832 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.048819 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-nd7df" Apr 23 09:30:36.050018 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.050000 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.050090 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.050050 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 23 09:30:36.050141 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.050088 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 23 09:30:36.050213 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.050196 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 23 09:30:36.050325 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.050309 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 23 09:30:36.050385 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.050338 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-dpzbh\"" Apr 23 09:30:36.050440 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.050405 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 23 09:30:36.050576 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.050558 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 23 09:30:36.050637 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.050578 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 23 09:30:36.051163 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.051148 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-57kc9\"" Apr 23 09:30:36.051214 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.051177 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb8zp" Apr 23 09:30:36.051264 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.051216 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 23 09:30:36.051264 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:36.051239 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sb8zp" podUID="11f39ddc-72f6-4699-8329-bbb34ab9a9f0" Apr 23 09:30:36.051372 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.051356 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 23 09:30:36.051540 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.051526 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 23 09:30:36.052242 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.052223 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-57rnm" Apr 23 09:30:36.052338 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.052243 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-6tbpg\"" Apr 23 09:30:36.052338 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.052285 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 23 09:30:36.052449 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.052227 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 23 09:30:36.052581 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.052567 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 23 09:30:36.052700 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.052687 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 23 09:30:36.053651 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.053635 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mjhnv" Apr 23 09:30:36.054416 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.054400 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 23 09:30:36.054506 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.054491 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 23 09:30:36.054619 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.054606 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-g65s2\"" Apr 23 09:30:36.054825 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.054813 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.055672 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.055656 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 23 09:30:36.055822 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.055808 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-kwc2k\"" Apr 23 09:30:36.055883 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.055840 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 23 09:30:36.055883 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.055850 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 23 09:30:36.056289 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.056273 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wkpk8" Apr 23 09:30:36.056661 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.056646 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 23 09:30:36.056823 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.056809 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 23 09:30:36.056953 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.056935 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-qxzzp\"" Apr 23 09:30:36.058342 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.058324 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-fcwn7\"" Apr 23 09:30:36.058406 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.058370 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 09:30:36.058676 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.058663 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 09:30:36.063170 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.063152 2566 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 23 09:30:36.068178 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.068165 2566 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 23 09:30:36.072444 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.072425 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/50de6365-8798-4807-b392-6cc780c49635-host\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.072550 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.072448 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1793a43e-b740-4967-92f1-e3ecda36b452-host\") pod \"node-ca-nd7df\" (UID: \"1793a43e-b740-4967-92f1-e3ecda36b452\") " pod="openshift-image-registry/node-ca-nd7df" Apr 23 09:30:36.072550 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.072463 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/495c169c-a1fe-4740-b5e4-88f23ef7e5d0-cnibin\") pod \"multus-additional-cni-plugins-wkpk8\" (UID: \"495c169c-a1fe-4740-b5e4-88f23ef7e5d0\") " pod="openshift-multus/multus-additional-cni-plugins-wkpk8" Apr 23 09:30:36.072550 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.072482 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-host-cni-netd\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.072550 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.072500 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9461f0b4-eff2-4028-be01-d417107cb9a8-agent-certs\") pod \"konnectivity-agent-57rnm\" (UID: \"9461f0b4-eff2-4028-be01-d417107cb9a8\") " pod="kube-system/konnectivity-agent-57rnm" Apr 23 09:30:36.072550 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.072535 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-run-systemd\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.072735 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.072580 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-var-lib-openvswitch\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.072735 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.072610 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-host-var-lib-cni-bin\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.072735 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.072636 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/23461d04-1ac2-4f06-bec9-42875ddaa8aa-multus-daemon-config\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.072735 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.072657 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/50de6365-8798-4807-b392-6cc780c49635-etc-kubernetes\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.072735 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.072679 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/50de6365-8798-4807-b392-6cc780c49635-etc-tuned\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.072735 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.072702 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/495c169c-a1fe-4740-b5e4-88f23ef7e5d0-cni-binary-copy\") pod \"multus-additional-cni-plugins-wkpk8\" (UID: \"495c169c-a1fe-4740-b5e4-88f23ef7e5d0\") " pod="openshift-multus/multus-additional-cni-plugins-wkpk8" Apr 23 09:30:36.072735 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.072725 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/495c169c-a1fe-4740-b5e4-88f23ef7e5d0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wkpk8\" (UID: \"495c169c-a1fe-4740-b5e4-88f23ef7e5d0\") " pod="openshift-multus/multus-additional-cni-plugins-wkpk8" Apr 23 09:30:36.072999 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.072765 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-systemd-units\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.072999 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.072788 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-host-var-lib-kubelet\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.072999 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.072820 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-hostroot\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.072999 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.072842 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b967352d-3f1f-4e67-b468-5d9326f772ea-registration-dir\") pod \"aws-ebs-csi-driver-node-mjhnv\" (UID: \"b967352d-3f1f-4e67-b468-5d9326f772ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mjhnv" Apr 23 09:30:36.072999 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.072872 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/50de6365-8798-4807-b392-6cc780c49635-etc-sysconfig\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.072999 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.072900 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/50de6365-8798-4807-b392-6cc780c49635-run\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.072999 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.072951 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/50de6365-8798-4807-b392-6cc780c49635-var-lib-kubelet\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.072999 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.072987 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/495c169c-a1fe-4740-b5e4-88f23ef7e5d0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wkpk8\" (UID: 
\"495c169c-a1fe-4740-b5e4-88f23ef7e5d0\") " pod="openshift-multus/multus-additional-cni-plugins-wkpk8" Apr 23 09:30:36.073405 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.073016 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-cnibin\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.073728 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.073708 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-multus-conf-dir\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.073835 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.073757 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/495c169c-a1fe-4740-b5e4-88f23ef7e5d0-os-release\") pod \"multus-additional-cni-plugins-wkpk8\" (UID: \"495c169c-a1fe-4740-b5e4-88f23ef7e5d0\") " pod="openshift-multus/multus-additional-cni-plugins-wkpk8" Apr 23 09:30:36.073835 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.073788 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxf75\" (UniqueName: \"kubernetes.io/projected/1793a43e-b740-4967-92f1-e3ecda36b452-kube-api-access-dxf75\") pod \"node-ca-nd7df\" (UID: \"1793a43e-b740-4967-92f1-e3ecda36b452\") " pod="openshift-image-registry/node-ca-nd7df" Apr 23 09:30:36.073835 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.073821 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-host-run-netns\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.073981 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.073851 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-run-ovn\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.073981 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.073882 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-host-run-k8s-cni-cncf-io\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.073981 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.073915 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-etc-kubernetes\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.073981 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.073940 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/50de6365-8798-4807-b392-6cc780c49635-lib-modules\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.074327 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.074184 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/495c169c-a1fe-4740-b5e4-88f23ef7e5d0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wkpk8\" (UID: \"495c169c-a1fe-4740-b5e4-88f23ef7e5d0\") " pod="openshift-multus/multus-additional-cni-plugins-wkpk8" Apr 23 09:30:36.074327 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.074230 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-env-overrides\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.074327 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.074267 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbw5r\" (UniqueName: \"kubernetes.io/projected/23461d04-1ac2-4f06-bec9-42875ddaa8aa-kube-api-access-kbw5r\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.074327 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.074314 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/50de6365-8798-4807-b392-6cc780c49635-etc-sysctl-conf\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.074522 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.074348 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-node-log\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.074522 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.074407 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-host-cni-bin\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.074522 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.074447 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b967352d-3f1f-4e67-b468-5d9326f772ea-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mjhnv\" (UID: \"b967352d-3f1f-4e67-b468-5d9326f772ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mjhnv" Apr 23 09:30:36.074522 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.074487 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/50de6365-8798-4807-b392-6cc780c49635-etc-modprobe-d\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.074522 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.074516 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/50de6365-8798-4807-b392-6cc780c49635-sys\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.074723 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.074541 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/495c169c-a1fe-4740-b5e4-88f23ef7e5d0-system-cni-dir\") pod \"multus-additional-cni-plugins-wkpk8\" (UID: \"495c169c-a1fe-4740-b5e4-88f23ef7e5d0\") " pod="openshift-multus/multus-additional-cni-plugins-wkpk8" Apr 23 09:30:36.074723 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.074573 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-os-release\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.074723 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.074631 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b967352d-3f1f-4e67-b468-5d9326f772ea-socket-dir\") pod \"aws-ebs-csi-driver-node-mjhnv\" (UID: \"b967352d-3f1f-4e67-b468-5d9326f772ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mjhnv" Apr 23 09:30:36.074723 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.074663 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b967352d-3f1f-4e67-b468-5d9326f772ea-device-dir\") pod \"aws-ebs-csi-driver-node-mjhnv\" (UID: \"b967352d-3f1f-4e67-b468-5d9326f772ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mjhnv" Apr 23 09:30:36.074723 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.074692 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/50de6365-8798-4807-b392-6cc780c49635-etc-systemd\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.074929 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.074721 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-host-run-ovn-kubernetes\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.074929 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.074752 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-etc-openvswitch\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.074929 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.074776 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-ovnkube-config\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.074929 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.074846 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-ovn-node-metrics-cert\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.074929 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.074883 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/23461d04-1ac2-4f06-bec9-42875ddaa8aa-cni-binary-copy\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.074929 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.074914 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-host-run-multus-certs\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.075181 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.074944 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9461f0b4-eff2-4028-be01-d417107cb9a8-konnectivity-ca\") pod \"konnectivity-agent-57rnm\" (UID: \"9461f0b4-eff2-4028-be01-d417107cb9a8\") " pod="kube-system/konnectivity-agent-57rnm" Apr 23 09:30:36.075181 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.074974 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/50de6365-8798-4807-b392-6cc780c49635-tmp\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.075181 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.075003 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhpfp\" (UniqueName: \"kubernetes.io/projected/c171e7cd-9c69-4ef7-9012-fad9d2b17a46-kube-api-access-lhpfp\") pod \"network-metrics-daemon-6528f\" (UID: \"c171e7cd-9c69-4ef7-9012-fad9d2b17a46\") " pod="openshift-multus/network-metrics-daemon-6528f" Apr 23 09:30:36.075181 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.075028 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.075181 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.075058 2566 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-multus-socket-dir-parent\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.075181 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.075088 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxtxx\" (UniqueName: \"kubernetes.io/projected/11f39ddc-72f6-4699-8329-bbb34ab9a9f0-kube-api-access-mxtxx\") pod \"network-check-target-sb8zp\" (UID: \"11f39ddc-72f6-4699-8329-bbb34ab9a9f0\") " pod="openshift-network-diagnostics/network-check-target-sb8zp" Apr 23 09:30:36.075181 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.075117 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7bz9\" (UniqueName: \"kubernetes.io/projected/495c169c-a1fe-4740-b5e4-88f23ef7e5d0-kube-api-access-l7bz9\") pod \"multus-additional-cni-plugins-wkpk8\" (UID: \"495c169c-a1fe-4740-b5e4-88f23ef7e5d0\") " pod="openshift-multus/multus-additional-cni-plugins-wkpk8" Apr 23 09:30:36.075181 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.075148 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4wvz\" (UniqueName: \"kubernetes.io/projected/70691076-656b-493e-863d-2a367fb8eccf-kube-api-access-x4wvz\") pod \"iptables-alerter-tdc8p\" (UID: \"70691076-656b-493e-863d-2a367fb8eccf\") " pod="openshift-network-operator/iptables-alerter-tdc8p" Apr 23 09:30:36.075181 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.075172 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-host-kubelet\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.075572 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.075202 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-host-slash\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.075572 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.075326 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-log-socket\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.075572 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.075353 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b967352d-3f1f-4e67-b468-5d9326f772ea-sys-fs\") pod \"aws-ebs-csi-driver-node-mjhnv\" (UID: \"b967352d-3f1f-4e67-b468-5d9326f772ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mjhnv" Apr 23 09:30:36.075572 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.075382 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-mjjjt\" (UniqueName: \"kubernetes.io/projected/50de6365-8798-4807-b392-6cc780c49635-kube-api-access-mjjjt\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.075572 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.075412 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1793a43e-b740-4967-92f1-e3ecda36b452-serviceca\") pod \"node-ca-nd7df\" (UID: \"1793a43e-b740-4967-92f1-e3ecda36b452\") " pod="openshift-image-registry/node-ca-nd7df" Apr 23 09:30:36.075572 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.075447 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-run-openvswitch\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.075572 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.075478 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-ovnkube-script-lib\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.075572 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.075506 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-multus-cni-dir\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.075572 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.075535 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-host-run-netns\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.075572 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.075567 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-host-var-lib-cni-multus\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.076033 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.075599 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgq2w\" (UniqueName: \"kubernetes.io/projected/b967352d-3f1f-4e67-b468-5d9326f772ea-kube-api-access-qgq2w\") pod \"aws-ebs-csi-driver-node-mjhnv\" (UID: \"b967352d-3f1f-4e67-b468-5d9326f772ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mjhnv" Apr 23 09:30:36.076033 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.075628 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/50de6365-8798-4807-b392-6cc780c49635-etc-sysctl-d\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " 
pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.076033 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.075656 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c171e7cd-9c69-4ef7-9012-fad9d2b17a46-metrics-certs\") pod \"network-metrics-daemon-6528f\" (UID: \"c171e7cd-9c69-4ef7-9012-fad9d2b17a46\") " pod="openshift-multus/network-metrics-daemon-6528f" Apr 23 09:30:36.076033 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.075683 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/70691076-656b-493e-863d-2a367fb8eccf-host-slash\") pod \"iptables-alerter-tdc8p\" (UID: \"70691076-656b-493e-863d-2a367fb8eccf\") " pod="openshift-network-operator/iptables-alerter-tdc8p" Apr 23 09:30:36.076033 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.075706 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw25m\" (UniqueName: \"kubernetes.io/projected/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-kube-api-access-qw25m\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.076033 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.075735 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b967352d-3f1f-4e67-b468-5d9326f772ea-etc-selinux\") pod \"aws-ebs-csi-driver-node-mjhnv\" (UID: \"b967352d-3f1f-4e67-b468-5d9326f772ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mjhnv" Apr 23 09:30:36.076033 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.075763 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/70691076-656b-493e-863d-2a367fb8eccf-iptables-alerter-script\") pod \"iptables-alerter-tdc8p\" (UID: \"70691076-656b-493e-863d-2a367fb8eccf\") " pod="openshift-network-operator/iptables-alerter-tdc8p" Apr 23 09:30:36.076033 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.075794 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-system-cni-dir\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.076033 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.075937 2566 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 09:30:36.076374 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.076052 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 09:25:35 +0000 UTC" deadline="2027-10-21 04:33:00.583324236 +0000 UTC" Apr 23 09:30:36.076374 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.076074 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13099h2m24.507252485s" Apr 23 09:30:36.096859 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.096843 2566 csr.go:274] "Certificate 
signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-hbf27" Apr 23 09:30:36.101662 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.101643 2566 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 09:30:36.105093 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.105079 2566 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-hbf27" Apr 23 09:30:36.175965 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.175944 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/50de6365-8798-4807-b392-6cc780c49635-host\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.176145 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.175970 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1793a43e-b740-4967-92f1-e3ecda36b452-host\") pod \"node-ca-nd7df\" (UID: \"1793a43e-b740-4967-92f1-e3ecda36b452\") " pod="openshift-image-registry/node-ca-nd7df" Apr 23 09:30:36.176145 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.175995 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/495c169c-a1fe-4740-b5e4-88f23ef7e5d0-cnibin\") pod \"multus-additional-cni-plugins-wkpk8\" (UID: \"495c169c-a1fe-4740-b5e4-88f23ef7e5d0\") " pod="openshift-multus/multus-additional-cni-plugins-wkpk8" Apr 23 09:30:36.176145 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176020 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-host-cni-netd\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.176145 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176046 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9461f0b4-eff2-4028-be01-d417107cb9a8-agent-certs\") pod \"konnectivity-agent-57rnm\" (UID: \"9461f0b4-eff2-4028-be01-d417107cb9a8\") " pod="kube-system/konnectivity-agent-57rnm" Apr 23 09:30:36.176145 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176063 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-run-systemd\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.176145 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176078 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-var-lib-openvswitch\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.176145 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176095 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-host-var-lib-cni-bin\") pod 
\"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.176145 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176116 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/23461d04-1ac2-4f06-bec9-42875ddaa8aa-multus-daemon-config\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.176145 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176120 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/50de6365-8798-4807-b392-6cc780c49635-host\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.176602 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176159 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-run-systemd\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.176602 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176167 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-var-lib-openvswitch\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.176602 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176188 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-host-cni-netd\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.176602 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176191 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/50de6365-8798-4807-b392-6cc780c49635-etc-kubernetes\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.176602 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176213 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-host-var-lib-cni-bin\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.176602 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176223 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1793a43e-b740-4967-92f1-e3ecda36b452-host\") pod \"node-ca-nd7df\" (UID: \"1793a43e-b740-4967-92f1-e3ecda36b452\") " pod="openshift-image-registry/node-ca-nd7df" Apr 23 09:30:36.176602 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176240 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/495c169c-a1fe-4740-b5e4-88f23ef7e5d0-cnibin\") pod \"multus-additional-cni-plugins-wkpk8\" (UID: \"495c169c-a1fe-4740-b5e4-88f23ef7e5d0\") " 
pod="openshift-multus/multus-additional-cni-plugins-wkpk8" Apr 23 09:30:36.176602 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176274 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/50de6365-8798-4807-b392-6cc780c49635-etc-tuned\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.176602 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176308 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/50de6365-8798-4807-b392-6cc780c49635-etc-kubernetes\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.176602 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176326 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/495c169c-a1fe-4740-b5e4-88f23ef7e5d0-cni-binary-copy\") pod \"multus-additional-cni-plugins-wkpk8\" (UID: \"495c169c-a1fe-4740-b5e4-88f23ef7e5d0\") " pod="openshift-multus/multus-additional-cni-plugins-wkpk8" Apr 23 09:30:36.176602 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176355 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/495c169c-a1fe-4740-b5e4-88f23ef7e5d0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wkpk8\" (UID: \"495c169c-a1fe-4740-b5e4-88f23ef7e5d0\") " pod="openshift-multus/multus-additional-cni-plugins-wkpk8" Apr 23 09:30:36.176602 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176383 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-systemd-units\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.176602 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176409 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-host-var-lib-kubelet\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.176602 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176435 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-hostroot\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.176602 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176461 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b967352d-3f1f-4e67-b468-5d9326f772ea-registration-dir\") pod \"aws-ebs-csi-driver-node-mjhnv\" (UID: \"b967352d-3f1f-4e67-b468-5d9326f772ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mjhnv" Apr 23 09:30:36.176602 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176469 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-systemd-units\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.176602 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176485 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/50de6365-8798-4807-b392-6cc780c49635-etc-sysconfig\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.176602 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176481 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-host-var-lib-kubelet\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.177450 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176531 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-hostroot\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.177450 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176541 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/50de6365-8798-4807-b392-6cc780c49635-etc-sysconfig\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.177450 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176540 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b967352d-3f1f-4e67-b468-5d9326f772ea-registration-dir\") pod \"aws-ebs-csi-driver-node-mjhnv\" (UID: \"b967352d-3f1f-4e67-b468-5d9326f772ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mjhnv" Apr 23 09:30:36.177450 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176612 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/50de6365-8798-4807-b392-6cc780c49635-run\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.177450 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176645 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/50de6365-8798-4807-b392-6cc780c49635-var-lib-kubelet\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.177450 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176647 2566 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 09:30:36.177450 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176671 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/495c169c-a1fe-4740-b5e4-88f23ef7e5d0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wkpk8\" (UID: \"495c169c-a1fe-4740-b5e4-88f23ef7e5d0\") " pod="openshift-multus/multus-additional-cni-plugins-wkpk8" Apr 23 09:30:36.177450 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176696 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-cnibin\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.177450 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176713 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/50de6365-8798-4807-b392-6cc780c49635-run\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.177450 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176759 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-multus-conf-dir\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.177450 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176762 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/50de6365-8798-4807-b392-6cc780c49635-var-lib-kubelet\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.177450 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176718 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-multus-conf-dir\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.177450 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176808 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/495c169c-a1fe-4740-b5e4-88f23ef7e5d0-os-release\") pod \"multus-additional-cni-plugins-wkpk8\" (UID: \"495c169c-a1fe-4740-b5e4-88f23ef7e5d0\") " pod="openshift-multus/multus-additional-cni-plugins-wkpk8" Apr 23 09:30:36.177450 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176820 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/23461d04-1ac2-4f06-bec9-42875ddaa8aa-multus-daemon-config\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.177450 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176839 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dxf75\" (UniqueName: 
\"kubernetes.io/projected/1793a43e-b740-4967-92f1-e3ecda36b452-kube-api-access-dxf75\") pod \"node-ca-nd7df\" (UID: \"1793a43e-b740-4967-92f1-e3ecda36b452\") " pod="openshift-image-registry/node-ca-nd7df" Apr 23 09:30:36.177450 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176866 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-host-run-netns\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.177450 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176892 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-run-ovn\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.177450 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176914 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/495c169c-a1fe-4740-b5e4-88f23ef7e5d0-os-release\") pod \"multus-additional-cni-plugins-wkpk8\" (UID: \"495c169c-a1fe-4740-b5e4-88f23ef7e5d0\") " pod="openshift-multus/multus-additional-cni-plugins-wkpk8" Apr 23 09:30:36.178231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176918 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-host-run-k8s-cni-cncf-io\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.178231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176964 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-cnibin\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.178231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176999 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-host-run-netns\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.178231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177004 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-run-ovn\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.178231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.176966 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-host-run-k8s-cni-cncf-io\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.178231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177026 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-etc-kubernetes\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.178231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177041 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/495c169c-a1fe-4740-b5e4-88f23ef7e5d0-cni-binary-copy\") pod \"multus-additional-cni-plugins-wkpk8\" (UID: \"495c169c-a1fe-4740-b5e4-88f23ef7e5d0\") " pod="openshift-multus/multus-additional-cni-plugins-wkpk8" Apr 23 09:30:36.178231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177051 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/50de6365-8798-4807-b392-6cc780c49635-lib-modules\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.178231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177062 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/495c169c-a1fe-4740-b5e4-88f23ef7e5d0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wkpk8\" (UID: \"495c169c-a1fe-4740-b5e4-88f23ef7e5d0\") " pod="openshift-multus/multus-additional-cni-plugins-wkpk8" Apr 23 09:30:36.178231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177077 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/495c169c-a1fe-4740-b5e4-88f23ef7e5d0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wkpk8\" (UID: \"495c169c-a1fe-4740-b5e4-88f23ef7e5d0\") " pod="openshift-multus/multus-additional-cni-plugins-wkpk8" Apr 23 09:30:36.178231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177103 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-env-overrides\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.178231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177106 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-etc-kubernetes\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.178231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177127 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbw5r\" (UniqueName: \"kubernetes.io/projected/23461d04-1ac2-4f06-bec9-42875ddaa8aa-kube-api-access-kbw5r\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.178231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177164 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/50de6365-8798-4807-b392-6cc780c49635-etc-sysctl-conf\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.178231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177164 2566 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/495c169c-a1fe-4740-b5e4-88f23ef7e5d0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wkpk8\" (UID: \"495c169c-a1fe-4740-b5e4-88f23ef7e5d0\") " pod="openshift-multus/multus-additional-cni-plugins-wkpk8" Apr 23 09:30:36.178231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177175 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/50de6365-8798-4807-b392-6cc780c49635-lib-modules\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.178231 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177179 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-node-log\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.179034 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177208 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-host-cni-bin\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.179034 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177222 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b967352d-3f1f-4e67-b468-5d9326f772ea-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mjhnv\" (UID: \"b967352d-3f1f-4e67-b468-5d9326f772ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mjhnv" Apr 23 09:30:36.179034 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177226 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/495c169c-a1fe-4740-b5e4-88f23ef7e5d0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wkpk8\" (UID: \"495c169c-a1fe-4740-b5e4-88f23ef7e5d0\") " pod="openshift-multus/multus-additional-cni-plugins-wkpk8" Apr 23 09:30:36.179034 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177236 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/50de6365-8798-4807-b392-6cc780c49635-etc-modprobe-d\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.179034 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177267 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/50de6365-8798-4807-b392-6cc780c49635-sys\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.179034 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177291 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/50de6365-8798-4807-b392-6cc780c49635-etc-modprobe-d\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " 
pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.179034 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177316 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/495c169c-a1fe-4740-b5e4-88f23ef7e5d0-system-cni-dir\") pod \"multus-additional-cni-plugins-wkpk8\" (UID: \"495c169c-a1fe-4740-b5e4-88f23ef7e5d0\") " pod="openshift-multus/multus-additional-cni-plugins-wkpk8" Apr 23 09:30:36.179034 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177339 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-host-cni-bin\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.179034 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177343 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-os-release\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.179034 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177366 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b967352d-3f1f-4e67-b468-5d9326f772ea-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mjhnv\" (UID: \"b967352d-3f1f-4e67-b468-5d9326f772ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mjhnv" Apr 23 09:30:36.179034 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177372 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b967352d-3f1f-4e67-b468-5d9326f772ea-socket-dir\") pod \"aws-ebs-csi-driver-node-mjhnv\" (UID: \"b967352d-3f1f-4e67-b468-5d9326f772ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mjhnv" Apr 23 09:30:36.179034 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177385 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/50de6365-8798-4807-b392-6cc780c49635-etc-sysctl-conf\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.179034 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177392 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/50de6365-8798-4807-b392-6cc780c49635-sys\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.179034 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177397 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-node-log\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.179034 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177403 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b967352d-3f1f-4e67-b468-5d9326f772ea-device-dir\") pod \"aws-ebs-csi-driver-node-mjhnv\" (UID: 
\"b967352d-3f1f-4e67-b468-5d9326f772ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mjhnv" Apr 23 09:30:36.179034 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177421 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-os-release\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.179034 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177432 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/50de6365-8798-4807-b392-6cc780c49635-etc-systemd\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.179852 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177443 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/495c169c-a1fe-4740-b5e4-88f23ef7e5d0-system-cni-dir\") pod \"multus-additional-cni-plugins-wkpk8\" (UID: \"495c169c-a1fe-4740-b5e4-88f23ef7e5d0\") " pod="openshift-multus/multus-additional-cni-plugins-wkpk8" Apr 23 09:30:36.179852 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177454 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b967352d-3f1f-4e67-b468-5d9326f772ea-device-dir\") pod \"aws-ebs-csi-driver-node-mjhnv\" (UID: \"b967352d-3f1f-4e67-b468-5d9326f772ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mjhnv" Apr 23 09:30:36.179852 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177458 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-host-run-ovn-kubernetes\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.179852 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177489 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/50de6365-8798-4807-b392-6cc780c49635-etc-systemd\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.179852 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177567 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-host-run-ovn-kubernetes\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.179852 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177914 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-env-overrides\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.179852 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177832 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-etc-openvswitch\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.179852 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177886 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-etc-openvswitch\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.179852 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177968 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-ovnkube-config\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.179852 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.177995 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-ovn-node-metrics-cert\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.179852 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.178025 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/23461d04-1ac2-4f06-bec9-42875ddaa8aa-cni-binary-copy\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.179852 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.178052 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-host-run-multus-certs\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.179852 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.178080 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9461f0b4-eff2-4028-be01-d417107cb9a8-konnectivity-ca\") pod \"konnectivity-agent-57rnm\" (UID: \"9461f0b4-eff2-4028-be01-d417107cb9a8\") " pod="kube-system/konnectivity-agent-57rnm" Apr 23 09:30:36.179852 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.178111 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/50de6365-8798-4807-b392-6cc780c49635-tmp\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.179852 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.178137 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhpfp\" (UniqueName: \"kubernetes.io/projected/c171e7cd-9c69-4ef7-9012-fad9d2b17a46-kube-api-access-lhpfp\") pod \"network-metrics-daemon-6528f\" (UID: \"c171e7cd-9c69-4ef7-9012-fad9d2b17a46\") " pod="openshift-multus/network-metrics-daemon-6528f" Apr 23 09:30:36.179852 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.178181 2566 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.179852 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.178334 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-host-run-multus-certs\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.180680 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.178417 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.180680 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.178697 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/23461d04-1ac2-4f06-bec9-42875ddaa8aa-cni-binary-copy\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.180680 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.178758 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-multus-socket-dir-parent\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.180680 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.178772 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9461f0b4-eff2-4028-be01-d417107cb9a8-konnectivity-ca\") pod \"konnectivity-agent-57rnm\" (UID: \"9461f0b4-eff2-4028-be01-d417107cb9a8\") " pod="kube-system/konnectivity-agent-57rnm" Apr 23 09:30:36.180680 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.178803 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxtxx\" (UniqueName: \"kubernetes.io/projected/11f39ddc-72f6-4699-8329-bbb34ab9a9f0-kube-api-access-mxtxx\") pod \"network-check-target-sb8zp\" (UID: \"11f39ddc-72f6-4699-8329-bbb34ab9a9f0\") " pod="openshift-network-diagnostics/network-check-target-sb8zp" Apr 23 09:30:36.180680 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.178843 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7bz9\" (UniqueName: \"kubernetes.io/projected/495c169c-a1fe-4740-b5e4-88f23ef7e5d0-kube-api-access-l7bz9\") pod \"multus-additional-cni-plugins-wkpk8\" (UID: \"495c169c-a1fe-4740-b5e4-88f23ef7e5d0\") " pod="openshift-multus/multus-additional-cni-plugins-wkpk8" Apr 23 09:30:36.180680 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.178886 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x4wvz\" (UniqueName: \"kubernetes.io/projected/70691076-656b-493e-863d-2a367fb8eccf-kube-api-access-x4wvz\") pod \"iptables-alerter-tdc8p\" (UID: 
\"70691076-656b-493e-863d-2a367fb8eccf\") " pod="openshift-network-operator/iptables-alerter-tdc8p" Apr 23 09:30:36.180680 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.178921 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-host-kubelet\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.180680 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.178955 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-host-slash\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.180680 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.178990 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-log-socket\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.180680 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.179026 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b967352d-3f1f-4e67-b468-5d9326f772ea-sys-fs\") pod \"aws-ebs-csi-driver-node-mjhnv\" (UID: \"b967352d-3f1f-4e67-b468-5d9326f772ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mjhnv" Apr 23 09:30:36.180680 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.179059 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mjjjt\" (UniqueName: \"kubernetes.io/projected/50de6365-8798-4807-b392-6cc780c49635-kube-api-access-mjjjt\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.180680 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.179094 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1793a43e-b740-4967-92f1-e3ecda36b452-serviceca\") pod \"node-ca-nd7df\" (UID: \"1793a43e-b740-4967-92f1-e3ecda36b452\") " pod="openshift-image-registry/node-ca-nd7df" Apr 23 09:30:36.180680 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.179110 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-ovnkube-config\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.180680 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.179130 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-run-openvswitch\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.180680 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.179187 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-run-openvswitch\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.180680 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.179197 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-host-kubelet\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.181471 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.179228 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-ovnkube-script-lib\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.181471 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.179265 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-multus-cni-dir\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.181471 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.179321 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-host-run-netns\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.181471 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.179352 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-host-var-lib-cni-multus\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.181471 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.179388 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qgq2w\" (UniqueName: \"kubernetes.io/projected/b967352d-3f1f-4e67-b468-5d9326f772ea-kube-api-access-qgq2w\") pod \"aws-ebs-csi-driver-node-mjhnv\" (UID: \"b967352d-3f1f-4e67-b468-5d9326f772ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mjhnv" Apr 23 09:30:36.181471 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.179426 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/50de6365-8798-4807-b392-6cc780c49635-etc-sysctl-d\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.181471 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.179460 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c171e7cd-9c69-4ef7-9012-fad9d2b17a46-metrics-certs\") pod \"network-metrics-daemon-6528f\" (UID: \"c171e7cd-9c69-4ef7-9012-fad9d2b17a46\") " pod="openshift-multus/network-metrics-daemon-6528f" Apr 23 09:30:36.181471 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.179495 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/70691076-656b-493e-863d-2a367fb8eccf-host-slash\") pod \"iptables-alerter-tdc8p\" (UID: \"70691076-656b-493e-863d-2a367fb8eccf\") " pod="openshift-network-operator/iptables-alerter-tdc8p" Apr 23 09:30:36.181471 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.179507 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-multus-socket-dir-parent\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.181471 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.179531 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qw25m\" (UniqueName: \"kubernetes.io/projected/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-kube-api-access-qw25m\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.181471 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.179567 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b967352d-3f1f-4e67-b468-5d9326f772ea-etc-selinux\") pod \"aws-ebs-csi-driver-node-mjhnv\" (UID: \"b967352d-3f1f-4e67-b468-5d9326f772ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mjhnv" Apr 23 09:30:36.181471 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.179603 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/70691076-656b-493e-863d-2a367fb8eccf-iptables-alerter-script\") pod \"iptables-alerter-tdc8p\" (UID: \"70691076-656b-493e-863d-2a367fb8eccf\") " pod="openshift-network-operator/iptables-alerter-tdc8p" Apr 23 09:30:36.181471 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.179638 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-system-cni-dir\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.181471 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.179737 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-system-cni-dir\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.181471 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.179844 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-host-slash\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.181471 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.179914 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-log-socket\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.181471 ip-10-0-136-17 kubenswrapper[2566]: I0423 
09:30:36.179976 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b967352d-3f1f-4e67-b468-5d9326f772ea-sys-fs\") pod \"aws-ebs-csi-driver-node-mjhnv\" (UID: \"b967352d-3f1f-4e67-b468-5d9326f772ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mjhnv" Apr 23 09:30:36.182307 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.180271 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-ovnkube-script-lib\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.182307 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.180389 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-multus-cni-dir\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.182307 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.180457 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-host-run-netns\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.182307 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.180512 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/23461d04-1ac2-4f06-bec9-42875ddaa8aa-host-var-lib-cni-multus\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.182307 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.180622 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1793a43e-b740-4967-92f1-e3ecda36b452-serviceca\") pod \"node-ca-nd7df\" (UID: \"1793a43e-b740-4967-92f1-e3ecda36b452\") " pod="openshift-image-registry/node-ca-nd7df" Apr 23 09:30:36.182307 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.180848 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/50de6365-8798-4807-b392-6cc780c49635-etc-sysctl-d\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.182307 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:36.180871 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 09:30:36.182307 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:36.180953 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c171e7cd-9c69-4ef7-9012-fad9d2b17a46-metrics-certs podName:c171e7cd-9c69-4ef7-9012-fad9d2b17a46 nodeName:}" failed. No retries permitted until 2026-04-23 09:30:36.680933706 +0000 UTC m=+2.125996030 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c171e7cd-9c69-4ef7-9012-fad9d2b17a46-metrics-certs") pod "network-metrics-daemon-6528f" (UID: "c171e7cd-9c69-4ef7-9012-fad9d2b17a46") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 09:30:36.182307 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.181152 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b967352d-3f1f-4e67-b468-5d9326f772ea-etc-selinux\") pod \"aws-ebs-csi-driver-node-mjhnv\" (UID: \"b967352d-3f1f-4e67-b468-5d9326f772ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mjhnv" Apr 23 09:30:36.182307 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.181330 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/70691076-656b-493e-863d-2a367fb8eccf-host-slash\") pod \"iptables-alerter-tdc8p\" (UID: \"70691076-656b-493e-863d-2a367fb8eccf\") " pod="openshift-network-operator/iptables-alerter-tdc8p" Apr 23 09:30:36.182307 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.181543 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-ovn-node-metrics-cert\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.182307 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.181735 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b967352d-3f1f-4e67-b468-5d9326f772ea-socket-dir\") pod \"aws-ebs-csi-driver-node-mjhnv\" (UID: \"b967352d-3f1f-4e67-b468-5d9326f772ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mjhnv" Apr 23 09:30:36.182307 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.181741 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9461f0b4-eff2-4028-be01-d417107cb9a8-agent-certs\") pod \"konnectivity-agent-57rnm\" (UID: \"9461f0b4-eff2-4028-be01-d417107cb9a8\") " pod="kube-system/konnectivity-agent-57rnm" Apr 23 09:30:36.182307 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.181839 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/70691076-656b-493e-863d-2a367fb8eccf-iptables-alerter-script\") pod \"iptables-alerter-tdc8p\" (UID: \"70691076-656b-493e-863d-2a367fb8eccf\") " pod="openshift-network-operator/iptables-alerter-tdc8p" Apr 23 09:30:36.182307 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.182255 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/50de6365-8798-4807-b392-6cc780c49635-tmp\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.185038 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.184894 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/50de6365-8798-4807-b392-6cc780c49635-etc-tuned\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.187729 ip-10-0-136-17 kubenswrapper[2566]: I0423 
09:30:36.187694 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbw5r\" (UniqueName: \"kubernetes.io/projected/23461d04-1ac2-4f06-bec9-42875ddaa8aa-kube-api-access-kbw5r\") pod \"multus-nblz4\" (UID: \"23461d04-1ac2-4f06-bec9-42875ddaa8aa\") " pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.190635 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:36.190092 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 09:30:36.190635 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:36.190115 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 09:30:36.190635 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:36.190127 2566 projected.go:194] Error preparing data for projected volume kube-api-access-mxtxx for pod openshift-network-diagnostics/network-check-target-sb8zp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 09:30:36.190635 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:36.190186 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/11f39ddc-72f6-4699-8329-bbb34ab9a9f0-kube-api-access-mxtxx podName:11f39ddc-72f6-4699-8329-bbb34ab9a9f0 nodeName:}" failed. No retries permitted until 2026-04-23 09:30:36.690169912 +0000 UTC m=+2.135232226 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-mxtxx" (UniqueName: "kubernetes.io/projected/11f39ddc-72f6-4699-8329-bbb34ab9a9f0-kube-api-access-mxtxx") pod "network-check-target-sb8zp" (UID: "11f39ddc-72f6-4699-8329-bbb34ab9a9f0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 09:30:36.192378 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.192336 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7bz9\" (UniqueName: \"kubernetes.io/projected/495c169c-a1fe-4740-b5e4-88f23ef7e5d0-kube-api-access-l7bz9\") pod \"multus-additional-cni-plugins-wkpk8\" (UID: \"495c169c-a1fe-4740-b5e4-88f23ef7e5d0\") " pod="openshift-multus/multus-additional-cni-plugins-wkpk8" Apr 23 09:30:36.192690 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.192661 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhpfp\" (UniqueName: \"kubernetes.io/projected/c171e7cd-9c69-4ef7-9012-fad9d2b17a46-kube-api-access-lhpfp\") pod \"network-metrics-daemon-6528f\" (UID: \"c171e7cd-9c69-4ef7-9012-fad9d2b17a46\") " pod="openshift-multus/network-metrics-daemon-6528f" Apr 23 09:30:36.193476 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.193436 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxf75\" (UniqueName: \"kubernetes.io/projected/1793a43e-b740-4967-92f1-e3ecda36b452-kube-api-access-dxf75\") pod \"node-ca-nd7df\" (UID: \"1793a43e-b740-4967-92f1-e3ecda36b452\") " pod="openshift-image-registry/node-ca-nd7df" Apr 23 09:30:36.193556 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.193497 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4wvz\" (UniqueName: 
\"kubernetes.io/projected/70691076-656b-493e-863d-2a367fb8eccf-kube-api-access-x4wvz\") pod \"iptables-alerter-tdc8p\" (UID: \"70691076-656b-493e-863d-2a367fb8eccf\") " pod="openshift-network-operator/iptables-alerter-tdc8p" Apr 23 09:30:36.193917 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.193896 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgq2w\" (UniqueName: \"kubernetes.io/projected/b967352d-3f1f-4e67-b468-5d9326f772ea-kube-api-access-qgq2w\") pod \"aws-ebs-csi-driver-node-mjhnv\" (UID: \"b967352d-3f1f-4e67-b468-5d9326f772ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mjhnv" Apr 23 09:30:36.194850 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.194825 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw25m\" (UniqueName: \"kubernetes.io/projected/cbbe5fc8-e02f-40b7-8e74-ca04c14c0260-kube-api-access-qw25m\") pod \"ovnkube-node-pf5hs\" (UID: \"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260\") " pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.194850 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.194840 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjjjt\" (UniqueName: \"kubernetes.io/projected/50de6365-8798-4807-b392-6cc780c49635-kube-api-access-mjjjt\") pod \"tuned-rn5xp\" (UID: \"50de6365-8798-4807-b392-6cc780c49635\") " pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.196474 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.196456 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wkpk8" Apr 23 09:30:36.260899 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:36.260873 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddde61d294f6e26e3e31c5e2576f80003.slice/crio-ca44824e04e84b24806b11e6576f3ba5762e35ed6a9a6f65021107e33787c571 WatchSource:0}: Error finding container ca44824e04e84b24806b11e6576f3ba5762e35ed6a9a6f65021107e33787c571: Status 404 returned error can't find the container with id ca44824e04e84b24806b11e6576f3ba5762e35ed6a9a6f65021107e33787c571 Apr 23 09:30:36.264841 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.264826 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 09:30:36.384331 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.384259 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-tdc8p" Apr 23 09:30:36.391726 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:36.391699 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70691076_656b_493e_863d_2a367fb8eccf.slice/crio-0cdf271faec1a50f0bdf136719d870878344a6624dc3a0257bafc1c4e6c8af3e WatchSource:0}: Error finding container 0cdf271faec1a50f0bdf136719d870878344a6624dc3a0257bafc1c4e6c8af3e: Status 404 returned error can't find the container with id 0cdf271faec1a50f0bdf136719d870878344a6624dc3a0257bafc1c4e6c8af3e Apr 23 09:30:36.398635 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.398618 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:30:36.404695 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:36.404668 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbbe5fc8_e02f_40b7_8e74_ca04c14c0260.slice/crio-0d310aedb95a2ad3a6d49a01c0019f1b83fbff05331692eb7ed21465d37edb99 WatchSource:0}: Error finding container 0d310aedb95a2ad3a6d49a01c0019f1b83fbff05331692eb7ed21465d37edb99: Status 404 returned error can't find the container with id 0d310aedb95a2ad3a6d49a01c0019f1b83fbff05331692eb7ed21465d37edb99 Apr 23 09:30:36.412371 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.412354 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-nd7df" Apr 23 09:30:36.417770 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:36.417752 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1793a43e_b740_4967_92f1_e3ecda36b452.slice/crio-d8dbafbe647e0a6cc9858d6a48231e612f57bd252cd893b394c2082de11f2fa7 WatchSource:0}: Error finding container d8dbafbe647e0a6cc9858d6a48231e612f57bd252cd893b394c2082de11f2fa7: Status 404 returned error can't find the container with id d8dbafbe647e0a6cc9858d6a48231e612f57bd252cd893b394c2082de11f2fa7 Apr 23 09:30:36.429027 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.429011 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-nblz4" Apr 23 09:30:36.434131 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:36.434110 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23461d04_1ac2_4f06_bec9_42875ddaa8aa.slice/crio-a03dd05a429385f77196aba92c8fb4fa176a01bc07f34b48c7883e4f3c47868f WatchSource:0}: Error finding container a03dd05a429385f77196aba92c8fb4fa176a01bc07f34b48c7883e4f3c47868f: Status 404 returned error can't find the container with id a03dd05a429385f77196aba92c8fb4fa176a01bc07f34b48c7883e4f3c47868f Apr 23 09:30:36.449775 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.449757 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-57rnm" Apr 23 09:30:36.454772 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:36.454752 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9461f0b4_eff2_4028_be01_d417107cb9a8.slice/crio-a9719934e31c9bece5743a4f9e69dbd5e172d2df99e18ba4a45b63b5197e14f8 WatchSource:0}: Error finding container a9719934e31c9bece5743a4f9e69dbd5e172d2df99e18ba4a45b63b5197e14f8: Status 404 returned error can't find the container with id a9719934e31c9bece5743a4f9e69dbd5e172d2df99e18ba4a45b63b5197e14f8 Apr 23 09:30:36.458395 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.458377 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mjhnv" Apr 23 09:30:36.463783 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:36.463763 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb967352d_3f1f_4e67_b468_5d9326f772ea.slice/crio-eb8b8afe0dcdf99e5c14928331ca383cc55d66a4218e5b9e91ad6e4beddb3e6a WatchSource:0}: Error finding container eb8b8afe0dcdf99e5c14928331ca383cc55d66a4218e5b9e91ad6e4beddb3e6a: Status 404 returned error can't find the container with id eb8b8afe0dcdf99e5c14928331ca383cc55d66a4218e5b9e91ad6e4beddb3e6a Apr 23 09:30:36.475438 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.475420 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" Apr 23 09:30:36.480318 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:36.480280 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50de6365_8798_4807_b392_6cc780c49635.slice/crio-bcb4d5e19f7c7fc3ef711dbda00986ea58ab9a93933f19c4ca7d4fb020b7230a WatchSource:0}: Error finding container bcb4d5e19f7c7fc3ef711dbda00986ea58ab9a93933f19c4ca7d4fb020b7230a: Status 404 returned error can't find the container with id bcb4d5e19f7c7fc3ef711dbda00986ea58ab9a93933f19c4ca7d4fb020b7230a Apr 23 09:30:36.542709 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.542680 2566 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 09:30:36.681994 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.681920 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c171e7cd-9c69-4ef7-9012-fad9d2b17a46-metrics-certs\") pod \"network-metrics-daemon-6528f\" (UID: \"c171e7cd-9c69-4ef7-9012-fad9d2b17a46\") " pod="openshift-multus/network-metrics-daemon-6528f" Apr 23 09:30:36.682111 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:36.682020 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 09:30:36.682111 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:36.682067 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c171e7cd-9c69-4ef7-9012-fad9d2b17a46-metrics-certs podName:c171e7cd-9c69-4ef7-9012-fad9d2b17a46 nodeName:}" failed. No retries permitted until 2026-04-23 09:30:37.682053638 +0000 UTC m=+3.127115939 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c171e7cd-9c69-4ef7-9012-fad9d2b17a46-metrics-certs") pod "network-metrics-daemon-6528f" (UID: "c171e7cd-9c69-4ef7-9012-fad9d2b17a46") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 09:30:36.782927 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:36.782896 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxtxx\" (UniqueName: \"kubernetes.io/projected/11f39ddc-72f6-4699-8329-bbb34ab9a9f0-kube-api-access-mxtxx\") pod \"network-check-target-sb8zp\" (UID: \"11f39ddc-72f6-4699-8329-bbb34ab9a9f0\") " pod="openshift-network-diagnostics/network-check-target-sb8zp" Apr 23 09:30:36.783060 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:36.783042 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 09:30:36.783111 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:36.783062 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 09:30:36.783111 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:36.783077 2566 projected.go:194] Error preparing data for projected volume kube-api-access-mxtxx for pod openshift-network-diagnostics/network-check-target-sb8zp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 09:30:36.783190 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:36.783130 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/11f39ddc-72f6-4699-8329-bbb34ab9a9f0-kube-api-access-mxtxx podName:11f39ddc-72f6-4699-8329-bbb34ab9a9f0 nodeName:}" failed. No retries permitted until 2026-04-23 09:30:37.783113878 +0000 UTC m=+3.228176185 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-mxtxx" (UniqueName: "kubernetes.io/projected/11f39ddc-72f6-4699-8329-bbb34ab9a9f0-kube-api-access-mxtxx") pod "network-check-target-sb8zp" (UID: "11f39ddc-72f6-4699-8329-bbb34ab9a9f0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 09:30:37.066741 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:37.066714 2566 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 09:30:37.107262 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:37.107228 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 09:25:36 +0000 UTC" deadline="2028-01-06 22:02:52.524351952 +0000 UTC" Apr 23 09:30:37.107898 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:37.107270 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14964h32m15.417093573s" Apr 23 09:30:37.182372 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:37.182318 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-tdc8p" event={"ID":"70691076-656b-493e-863d-2a367fb8eccf","Type":"ContainerStarted","Data":"0cdf271faec1a50f0bdf136719d870878344a6624dc3a0257bafc1c4e6c8af3e"} Apr 23 09:30:37.184022 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:37.183993 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wkpk8" event={"ID":"495c169c-a1fe-4740-b5e4-88f23ef7e5d0","Type":"ContainerStarted","Data":"8ac2512c94d4fce9135086c68288b74f15444097c4ec7591e67aa539341dde65"} Apr 23 09:30:37.186579 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:37.186553 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" event={"ID":"50de6365-8798-4807-b392-6cc780c49635","Type":"ContainerStarted","Data":"bcb4d5e19f7c7fc3ef711dbda00986ea58ab9a93933f19c4ca7d4fb020b7230a"} Apr 23 09:30:37.188613 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:37.188591 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mjhnv" event={"ID":"b967352d-3f1f-4e67-b468-5d9326f772ea","Type":"ContainerStarted","Data":"eb8b8afe0dcdf99e5c14928331ca383cc55d66a4218e5b9e91ad6e4beddb3e6a"} Apr 23 09:30:37.191783 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:37.191738 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nblz4" event={"ID":"23461d04-1ac2-4f06-bec9-42875ddaa8aa","Type":"ContainerStarted","Data":"a03dd05a429385f77196aba92c8fb4fa176a01bc07f34b48c7883e4f3c47868f"} Apr 23 09:30:37.194092 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:37.194070 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nd7df" event={"ID":"1793a43e-b740-4967-92f1-e3ecda36b452","Type":"ContainerStarted","Data":"d8dbafbe647e0a6cc9858d6a48231e612f57bd252cd893b394c2082de11f2fa7"} Apr 23 09:30:37.196126 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:37.196105 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" event={"ID":"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260","Type":"ContainerStarted","Data":"0d310aedb95a2ad3a6d49a01c0019f1b83fbff05331692eb7ed21465d37edb99"} Apr 23 09:30:37.198559 ip-10-0-136-17 
kubenswrapper[2566]: I0423 09:30:37.198531 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-17.ec2.internal" event={"ID":"dde61d294f6e26e3e31c5e2576f80003","Type":"ContainerStarted","Data":"ca44824e04e84b24806b11e6576f3ba5762e35ed6a9a6f65021107e33787c571"} Apr 23 09:30:37.201220 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:37.201200 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-57rnm" event={"ID":"9461f0b4-eff2-4028-be01-d417107cb9a8","Type":"ContainerStarted","Data":"a9719934e31c9bece5743a4f9e69dbd5e172d2df99e18ba4a45b63b5197e14f8"} Apr 23 09:30:37.306262 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:30:37.306232 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3940101a0fb12832b381adacfd404c80.slice/crio-c0a9d4a20ced3a018cf0d0b244074006085e37352734621095ee67397d8018a9 WatchSource:0}: Error finding container c0a9d4a20ced3a018cf0d0b244074006085e37352734621095ee67397d8018a9: Status 404 returned error can't find the container with id c0a9d4a20ced3a018cf0d0b244074006085e37352734621095ee67397d8018a9 Apr 23 09:30:37.690852 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:37.690772 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c171e7cd-9c69-4ef7-9012-fad9d2b17a46-metrics-certs\") pod \"network-metrics-daemon-6528f\" (UID: \"c171e7cd-9c69-4ef7-9012-fad9d2b17a46\") " pod="openshift-multus/network-metrics-daemon-6528f" Apr 23 09:30:37.691013 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:37.690925 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 09:30:37.691013 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:37.690988 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c171e7cd-9c69-4ef7-9012-fad9d2b17a46-metrics-certs podName:c171e7cd-9c69-4ef7-9012-fad9d2b17a46 nodeName:}" failed. No retries permitted until 2026-04-23 09:30:39.6909694 +0000 UTC m=+5.136031706 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c171e7cd-9c69-4ef7-9012-fad9d2b17a46-metrics-certs") pod "network-metrics-daemon-6528f" (UID: "c171e7cd-9c69-4ef7-9012-fad9d2b17a46") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 09:30:37.791597 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:37.791561 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxtxx\" (UniqueName: \"kubernetes.io/projected/11f39ddc-72f6-4699-8329-bbb34ab9a9f0-kube-api-access-mxtxx\") pod \"network-check-target-sb8zp\" (UID: \"11f39ddc-72f6-4699-8329-bbb34ab9a9f0\") " pod="openshift-network-diagnostics/network-check-target-sb8zp" Apr 23 09:30:37.791773 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:37.791708 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 09:30:37.791773 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:37.791729 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 09:30:37.791773 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:37.791741 2566 projected.go:194] Error preparing data for projected volume kube-api-access-mxtxx for pod openshift-network-diagnostics/network-check-target-sb8zp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 09:30:37.791951 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:37.791802 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/11f39ddc-72f6-4699-8329-bbb34ab9a9f0-kube-api-access-mxtxx podName:11f39ddc-72f6-4699-8329-bbb34ab9a9f0 nodeName:}" failed. No retries permitted until 2026-04-23 09:30:39.791783473 +0000 UTC m=+5.236845775 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-mxtxx" (UniqueName: "kubernetes.io/projected/11f39ddc-72f6-4699-8329-bbb34ab9a9f0-kube-api-access-mxtxx") pod "network-check-target-sb8zp" (UID: "11f39ddc-72f6-4699-8329-bbb34ab9a9f0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 09:30:38.108194 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:38.108105 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 09:25:36 +0000 UTC" deadline="2027-09-17 17:19:28.081940479 +0000 UTC" Apr 23 09:30:38.108194 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:38.108143 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12295h48m49.973801154s" Apr 23 09:30:38.175387 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:38.174555 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6528f" Apr 23 09:30:38.175387 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:38.174701 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6528f" podUID="c171e7cd-9c69-4ef7-9012-fad9d2b17a46" Apr 23 09:30:38.175387 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:38.175203 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb8zp" Apr 23 09:30:38.175387 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:38.175330 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sb8zp" podUID="11f39ddc-72f6-4699-8329-bbb34ab9a9f0" Apr 23 09:30:38.220193 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:38.220096 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-17.ec2.internal" event={"ID":"3940101a0fb12832b381adacfd404c80","Type":"ContainerStarted","Data":"c0a9d4a20ced3a018cf0d0b244074006085e37352734621095ee67397d8018a9"} Apr 23 09:30:38.843801 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:38.843769 2566 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 09:30:39.705943 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:39.705906 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c171e7cd-9c69-4ef7-9012-fad9d2b17a46-metrics-certs\") pod \"network-metrics-daemon-6528f\" (UID: \"c171e7cd-9c69-4ef7-9012-fad9d2b17a46\") " pod="openshift-multus/network-metrics-daemon-6528f" Apr 23 09:30:39.706436 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:39.706039 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 09:30:39.706436 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:39.706097 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c171e7cd-9c69-4ef7-9012-fad9d2b17a46-metrics-certs podName:c171e7cd-9c69-4ef7-9012-fad9d2b17a46 nodeName:}" failed. No retries permitted until 2026-04-23 09:30:43.706079085 +0000 UTC m=+9.151141389 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c171e7cd-9c69-4ef7-9012-fad9d2b17a46-metrics-certs") pod "network-metrics-daemon-6528f" (UID: "c171e7cd-9c69-4ef7-9012-fad9d2b17a46") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 09:30:39.806355 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:39.806315 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxtxx\" (UniqueName: \"kubernetes.io/projected/11f39ddc-72f6-4699-8329-bbb34ab9a9f0-kube-api-access-mxtxx\") pod \"network-check-target-sb8zp\" (UID: \"11f39ddc-72f6-4699-8329-bbb34ab9a9f0\") " pod="openshift-network-diagnostics/network-check-target-sb8zp" Apr 23 09:30:39.806557 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:39.806532 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 09:30:39.806557 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:39.806556 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 09:30:39.806682 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:39.806569 2566 projected.go:194] Error preparing data for projected volume kube-api-access-mxtxx for pod openshift-network-diagnostics/network-check-target-sb8zp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 09:30:39.806682 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:39.806620 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/11f39ddc-72f6-4699-8329-bbb34ab9a9f0-kube-api-access-mxtxx podName:11f39ddc-72f6-4699-8329-bbb34ab9a9f0 nodeName:}" failed. No retries permitted until 2026-04-23 09:30:43.806602793 +0000 UTC m=+9.251665094 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-mxtxx" (UniqueName: "kubernetes.io/projected/11f39ddc-72f6-4699-8329-bbb34ab9a9f0-kube-api-access-mxtxx") pod "network-check-target-sb8zp" (UID: "11f39ddc-72f6-4699-8329-bbb34ab9a9f0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 09:30:40.176339 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:40.175628 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb8zp" Apr 23 09:30:40.176339 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:40.175737 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sb8zp" podUID="11f39ddc-72f6-4699-8329-bbb34ab9a9f0" Apr 23 09:30:40.176339 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:40.176143 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6528f" Apr 23 09:30:40.176339 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:40.176248 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6528f" podUID="c171e7cd-9c69-4ef7-9012-fad9d2b17a46" Apr 23 09:30:42.174968 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:42.174921 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6528f" Apr 23 09:30:42.174968 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:42.174970 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb8zp" Apr 23 09:30:42.175511 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:42.175090 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6528f" podUID="c171e7cd-9c69-4ef7-9012-fad9d2b17a46" Apr 23 09:30:42.175511 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:42.175214 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sb8zp" podUID="11f39ddc-72f6-4699-8329-bbb34ab9a9f0" Apr 23 09:30:42.602259 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:42.601735 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-459kk"] Apr 23 09:30:42.604513 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:42.604484 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-459kk" Apr 23 09:30:42.604639 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:42.604586 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-459kk" podUID="05ab6e35-f200-483a-b4f6-fee2629df7f2" Apr 23 09:30:42.629462 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:42.629432 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/05ab6e35-f200-483a-b4f6-fee2629df7f2-dbus\") pod \"global-pull-secret-syncer-459kk\" (UID: \"05ab6e35-f200-483a-b4f6-fee2629df7f2\") " pod="kube-system/global-pull-secret-syncer-459kk" Apr 23 09:30:42.629593 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:42.629471 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/05ab6e35-f200-483a-b4f6-fee2629df7f2-original-pull-secret\") pod \"global-pull-secret-syncer-459kk\" (UID: \"05ab6e35-f200-483a-b4f6-fee2629df7f2\") " pod="kube-system/global-pull-secret-syncer-459kk" Apr 23 09:30:42.629593 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:42.629562 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/05ab6e35-f200-483a-b4f6-fee2629df7f2-kubelet-config\") pod \"global-pull-secret-syncer-459kk\" (UID: \"05ab6e35-f200-483a-b4f6-fee2629df7f2\") " pod="kube-system/global-pull-secret-syncer-459kk" Apr 23 09:30:42.730045 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:42.730011 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/05ab6e35-f200-483a-b4f6-fee2629df7f2-kubelet-config\") pod \"global-pull-secret-syncer-459kk\" (UID: \"05ab6e35-f200-483a-b4f6-fee2629df7f2\") " pod="kube-system/global-pull-secret-syncer-459kk" Apr 23 09:30:42.730201 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:42.730091 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/05ab6e35-f200-483a-b4f6-fee2629df7f2-dbus\") pod \"global-pull-secret-syncer-459kk\" (UID: \"05ab6e35-f200-483a-b4f6-fee2629df7f2\") " pod="kube-system/global-pull-secret-syncer-459kk" Apr 23 09:30:42.730201 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:42.730121 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/05ab6e35-f200-483a-b4f6-fee2629df7f2-original-pull-secret\") pod \"global-pull-secret-syncer-459kk\" (UID: \"05ab6e35-f200-483a-b4f6-fee2629df7f2\") " pod="kube-system/global-pull-secret-syncer-459kk" Apr 23 09:30:42.730201 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:42.730129 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/05ab6e35-f200-483a-b4f6-fee2629df7f2-kubelet-config\") pod \"global-pull-secret-syncer-459kk\" (UID: \"05ab6e35-f200-483a-b4f6-fee2629df7f2\") " pod="kube-system/global-pull-secret-syncer-459kk" Apr 23 09:30:42.730389 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:42.730243 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 09:30:42.730389 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:42.730319 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05ab6e35-f200-483a-b4f6-fee2629df7f2-original-pull-secret podName:05ab6e35-f200-483a-b4f6-fee2629df7f2 nodeName:}" failed. 
No retries permitted until 2026-04-23 09:30:43.230285243 +0000 UTC m=+8.675347549 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/05ab6e35-f200-483a-b4f6-fee2629df7f2-original-pull-secret") pod "global-pull-secret-syncer-459kk" (UID: "05ab6e35-f200-483a-b4f6-fee2629df7f2") : object "kube-system"/"original-pull-secret" not registered Apr 23 09:30:42.730389 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:42.730317 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/05ab6e35-f200-483a-b4f6-fee2629df7f2-dbus\") pod \"global-pull-secret-syncer-459kk\" (UID: \"05ab6e35-f200-483a-b4f6-fee2629df7f2\") " pod="kube-system/global-pull-secret-syncer-459kk" Apr 23 09:30:43.233559 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:43.233495 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/05ab6e35-f200-483a-b4f6-fee2629df7f2-original-pull-secret\") pod \"global-pull-secret-syncer-459kk\" (UID: \"05ab6e35-f200-483a-b4f6-fee2629df7f2\") " pod="kube-system/global-pull-secret-syncer-459kk" Apr 23 09:30:43.233990 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:43.233717 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 09:30:43.233990 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:43.233777 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05ab6e35-f200-483a-b4f6-fee2629df7f2-original-pull-secret podName:05ab6e35-f200-483a-b4f6-fee2629df7f2 nodeName:}" failed. No retries permitted until 2026-04-23 09:30:44.233759307 +0000 UTC m=+9.678821613 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/05ab6e35-f200-483a-b4f6-fee2629df7f2-original-pull-secret") pod "global-pull-secret-syncer-459kk" (UID: "05ab6e35-f200-483a-b4f6-fee2629df7f2") : object "kube-system"/"original-pull-secret" not registered Apr 23 09:30:43.737684 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:43.737650 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c171e7cd-9c69-4ef7-9012-fad9d2b17a46-metrics-certs\") pod \"network-metrics-daemon-6528f\" (UID: \"c171e7cd-9c69-4ef7-9012-fad9d2b17a46\") " pod="openshift-multus/network-metrics-daemon-6528f" Apr 23 09:30:43.737873 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:43.737813 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 09:30:43.737938 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:43.737884 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c171e7cd-9c69-4ef7-9012-fad9d2b17a46-metrics-certs podName:c171e7cd-9c69-4ef7-9012-fad9d2b17a46 nodeName:}" failed. No retries permitted until 2026-04-23 09:30:51.737863808 +0000 UTC m=+17.182926125 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c171e7cd-9c69-4ef7-9012-fad9d2b17a46-metrics-certs") pod "network-metrics-daemon-6528f" (UID: "c171e7cd-9c69-4ef7-9012-fad9d2b17a46") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 09:30:43.838686 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:43.838655 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxtxx\" (UniqueName: \"kubernetes.io/projected/11f39ddc-72f6-4699-8329-bbb34ab9a9f0-kube-api-access-mxtxx\") pod \"network-check-target-sb8zp\" (UID: \"11f39ddc-72f6-4699-8329-bbb34ab9a9f0\") " pod="openshift-network-diagnostics/network-check-target-sb8zp" Apr 23 09:30:43.838862 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:43.838792 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 09:30:43.838862 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:43.838806 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 09:30:43.838862 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:43.838814 2566 projected.go:194] Error preparing data for projected volume kube-api-access-mxtxx for pod openshift-network-diagnostics/network-check-target-sb8zp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 09:30:43.838862 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:43.838853 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/11f39ddc-72f6-4699-8329-bbb34ab9a9f0-kube-api-access-mxtxx podName:11f39ddc-72f6-4699-8329-bbb34ab9a9f0 nodeName:}" failed. No retries permitted until 2026-04-23 09:30:51.83884206 +0000 UTC m=+17.283904361 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-mxtxx" (UniqueName: "kubernetes.io/projected/11f39ddc-72f6-4699-8329-bbb34ab9a9f0-kube-api-access-mxtxx") pod "network-check-target-sb8zp" (UID: "11f39ddc-72f6-4699-8329-bbb34ab9a9f0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 09:30:44.174849 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:44.174773 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-459kk" Apr 23 09:30:44.174849 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:44.174831 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb8zp" Apr 23 09:30:44.175047 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:44.174923 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6528f" Apr 23 09:30:44.175047 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:44.174929 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-sb8zp" podUID="11f39ddc-72f6-4699-8329-bbb34ab9a9f0" Apr 23 09:30:44.175165 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:44.175042 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6528f" podUID="c171e7cd-9c69-4ef7-9012-fad9d2b17a46" Apr 23 09:30:44.175165 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:44.175134 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-459kk" podUID="05ab6e35-f200-483a-b4f6-fee2629df7f2" Apr 23 09:30:44.240887 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:44.240854 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/05ab6e35-f200-483a-b4f6-fee2629df7f2-original-pull-secret\") pod \"global-pull-secret-syncer-459kk\" (UID: \"05ab6e35-f200-483a-b4f6-fee2629df7f2\") " pod="kube-system/global-pull-secret-syncer-459kk" Apr 23 09:30:44.241343 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:44.240975 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 09:30:44.241343 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:44.241039 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05ab6e35-f200-483a-b4f6-fee2629df7f2-original-pull-secret podName:05ab6e35-f200-483a-b4f6-fee2629df7f2 nodeName:}" failed. No retries permitted until 2026-04-23 09:30:46.241021877 +0000 UTC m=+11.686084198 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/05ab6e35-f200-483a-b4f6-fee2629df7f2-original-pull-secret") pod "global-pull-secret-syncer-459kk" (UID: "05ab6e35-f200-483a-b4f6-fee2629df7f2") : object "kube-system"/"original-pull-secret" not registered Apr 23 09:30:46.175435 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:46.175190 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6528f" Apr 23 09:30:46.175870 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:46.175224 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-459kk" Apr 23 09:30:46.175870 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:46.175541 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6528f" podUID="c171e7cd-9c69-4ef7-9012-fad9d2b17a46" Apr 23 09:30:46.175870 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:46.175599 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-459kk" podUID="05ab6e35-f200-483a-b4f6-fee2629df7f2" Apr 23 09:30:46.175870 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:46.175282 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb8zp" Apr 23 09:30:46.175870 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:46.175692 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sb8zp" podUID="11f39ddc-72f6-4699-8329-bbb34ab9a9f0" Apr 23 09:30:46.255925 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:46.255705 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/05ab6e35-f200-483a-b4f6-fee2629df7f2-original-pull-secret\") pod \"global-pull-secret-syncer-459kk\" (UID: \"05ab6e35-f200-483a-b4f6-fee2629df7f2\") " pod="kube-system/global-pull-secret-syncer-459kk" Apr 23 09:30:46.255925 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:46.255843 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 09:30:46.255925 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:46.255907 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05ab6e35-f200-483a-b4f6-fee2629df7f2-original-pull-secret podName:05ab6e35-f200-483a-b4f6-fee2629df7f2 nodeName:}" failed. No retries permitted until 2026-04-23 09:30:50.25588009 +0000 UTC m=+15.700942391 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/05ab6e35-f200-483a-b4f6-fee2629df7f2-original-pull-secret") pod "global-pull-secret-syncer-459kk" (UID: "05ab6e35-f200-483a-b4f6-fee2629df7f2") : object "kube-system"/"original-pull-secret" not registered Apr 23 09:30:48.175079 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:48.175041 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-459kk" Apr 23 09:30:48.175079 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:48.175088 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6528f" Apr 23 09:30:48.175626 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:48.175085 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb8zp" Apr 23 09:30:48.175626 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:48.175181 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-459kk" podUID="05ab6e35-f200-483a-b4f6-fee2629df7f2" Apr 23 09:30:48.175626 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:48.175587 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6528f" podUID="c171e7cd-9c69-4ef7-9012-fad9d2b17a46" Apr 23 09:30:48.175773 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:48.175672 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sb8zp" podUID="11f39ddc-72f6-4699-8329-bbb34ab9a9f0" Apr 23 09:30:50.174414 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:50.174336 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-459kk" Apr 23 09:30:50.174414 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:50.174364 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6528f" Apr 23 09:30:50.174868 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:50.174341 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb8zp" Apr 23 09:30:50.174868 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:50.174455 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-459kk" podUID="05ab6e35-f200-483a-b4f6-fee2629df7f2" Apr 23 09:30:50.174868 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:50.174530 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sb8zp" podUID="11f39ddc-72f6-4699-8329-bbb34ab9a9f0" Apr 23 09:30:50.174868 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:50.174613 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6528f" podUID="c171e7cd-9c69-4ef7-9012-fad9d2b17a46" Apr 23 09:30:50.289428 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:50.289395 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/05ab6e35-f200-483a-b4f6-fee2629df7f2-original-pull-secret\") pod \"global-pull-secret-syncer-459kk\" (UID: \"05ab6e35-f200-483a-b4f6-fee2629df7f2\") " pod="kube-system/global-pull-secret-syncer-459kk" Apr 23 09:30:50.289583 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:50.289531 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 09:30:50.289646 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:50.289594 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05ab6e35-f200-483a-b4f6-fee2629df7f2-original-pull-secret podName:05ab6e35-f200-483a-b4f6-fee2629df7f2 nodeName:}" failed. No retries permitted until 2026-04-23 09:30:58.289576552 +0000 UTC m=+23.734638858 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/05ab6e35-f200-483a-b4f6-fee2629df7f2-original-pull-secret") pod "global-pull-secret-syncer-459kk" (UID: "05ab6e35-f200-483a-b4f6-fee2629df7f2") : object "kube-system"/"original-pull-secret" not registered Apr 23 09:30:51.799760 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:51.799721 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c171e7cd-9c69-4ef7-9012-fad9d2b17a46-metrics-certs\") pod \"network-metrics-daemon-6528f\" (UID: \"c171e7cd-9c69-4ef7-9012-fad9d2b17a46\") " pod="openshift-multus/network-metrics-daemon-6528f" Apr 23 09:30:51.800168 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:51.799851 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 09:30:51.800168 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:51.799920 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c171e7cd-9c69-4ef7-9012-fad9d2b17a46-metrics-certs podName:c171e7cd-9c69-4ef7-9012-fad9d2b17a46 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:07.799901266 +0000 UTC m=+33.244963586 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c171e7cd-9c69-4ef7-9012-fad9d2b17a46-metrics-certs") pod "network-metrics-daemon-6528f" (UID: "c171e7cd-9c69-4ef7-9012-fad9d2b17a46") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 09:30:51.900307 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:51.900259 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxtxx\" (UniqueName: \"kubernetes.io/projected/11f39ddc-72f6-4699-8329-bbb34ab9a9f0-kube-api-access-mxtxx\") pod \"network-check-target-sb8zp\" (UID: \"11f39ddc-72f6-4699-8329-bbb34ab9a9f0\") " pod="openshift-network-diagnostics/network-check-target-sb8zp" Apr 23 09:30:51.900470 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:51.900412 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 09:30:51.900470 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:51.900434 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 09:30:51.900470 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:51.900448 2566 projected.go:194] Error preparing data for projected volume kube-api-access-mxtxx for pod openshift-network-diagnostics/network-check-target-sb8zp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 09:30:51.900575 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:51.900504 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/11f39ddc-72f6-4699-8329-bbb34ab9a9f0-kube-api-access-mxtxx podName:11f39ddc-72f6-4699-8329-bbb34ab9a9f0 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:07.900488241 +0000 UTC m=+33.345550542 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-mxtxx" (UniqueName: "kubernetes.io/projected/11f39ddc-72f6-4699-8329-bbb34ab9a9f0-kube-api-access-mxtxx") pod "network-check-target-sb8zp" (UID: "11f39ddc-72f6-4699-8329-bbb34ab9a9f0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 09:30:52.174618 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:52.174504 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6528f" Apr 23 09:30:52.174618 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:52.174503 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-459kk" Apr 23 09:30:52.174821 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:52.174633 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6528f" podUID="c171e7cd-9c69-4ef7-9012-fad9d2b17a46" Apr 23 09:30:52.174821 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:52.174503 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb8zp" Apr 23 09:30:52.174821 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:52.174714 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-459kk" podUID="05ab6e35-f200-483a-b4f6-fee2629df7f2" Apr 23 09:30:52.174821 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:52.174775 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sb8zp" podUID="11f39ddc-72f6-4699-8329-bbb34ab9a9f0" Apr 23 09:30:52.954836 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:52.954806 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-k5h5x"] Apr 23 09:30:52.978053 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:52.978026 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-k5h5x" Apr 23 09:30:52.980582 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:52.980554 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-c65vd\"" Apr 23 09:30:52.980708 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:52.980567 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 09:30:52.980708 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:52.980585 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 09:30:53.111425 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:53.111390 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/aac600fc-205b-40e3-aa53-d90a5274d7e4-tmp-dir\") pod \"node-resolver-k5h5x\" (UID: \"aac600fc-205b-40e3-aa53-d90a5274d7e4\") " pod="openshift-dns/node-resolver-k5h5x" Apr 23 09:30:53.111601 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:53.111455 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/aac600fc-205b-40e3-aa53-d90a5274d7e4-hosts-file\") pod \"node-resolver-k5h5x\" (UID: \"aac600fc-205b-40e3-aa53-d90a5274d7e4\") " pod="openshift-dns/node-resolver-k5h5x" Apr 23 09:30:53.111601 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:53.111528 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8djk\" (UniqueName: \"kubernetes.io/projected/aac600fc-205b-40e3-aa53-d90a5274d7e4-kube-api-access-w8djk\") pod \"node-resolver-k5h5x\" (UID: \"aac600fc-205b-40e3-aa53-d90a5274d7e4\") " pod="openshift-dns/node-resolver-k5h5x" Apr 23 09:30:53.212427 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:53.212354 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/aac600fc-205b-40e3-aa53-d90a5274d7e4-hosts-file\") pod 
\"node-resolver-k5h5x\" (UID: \"aac600fc-205b-40e3-aa53-d90a5274d7e4\") " pod="openshift-dns/node-resolver-k5h5x" Apr 23 09:30:53.212427 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:53.212402 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w8djk\" (UniqueName: \"kubernetes.io/projected/aac600fc-205b-40e3-aa53-d90a5274d7e4-kube-api-access-w8djk\") pod \"node-resolver-k5h5x\" (UID: \"aac600fc-205b-40e3-aa53-d90a5274d7e4\") " pod="openshift-dns/node-resolver-k5h5x" Apr 23 09:30:53.212617 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:53.212478 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/aac600fc-205b-40e3-aa53-d90a5274d7e4-hosts-file\") pod \"node-resolver-k5h5x\" (UID: \"aac600fc-205b-40e3-aa53-d90a5274d7e4\") " pod="openshift-dns/node-resolver-k5h5x" Apr 23 09:30:53.212617 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:53.212485 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/aac600fc-205b-40e3-aa53-d90a5274d7e4-tmp-dir\") pod \"node-resolver-k5h5x\" (UID: \"aac600fc-205b-40e3-aa53-d90a5274d7e4\") " pod="openshift-dns/node-resolver-k5h5x" Apr 23 09:30:53.212848 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:53.212830 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/aac600fc-205b-40e3-aa53-d90a5274d7e4-tmp-dir\") pod \"node-resolver-k5h5x\" (UID: \"aac600fc-205b-40e3-aa53-d90a5274d7e4\") " pod="openshift-dns/node-resolver-k5h5x" Apr 23 09:30:53.222026 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:53.222000 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8djk\" (UniqueName: \"kubernetes.io/projected/aac600fc-205b-40e3-aa53-d90a5274d7e4-kube-api-access-w8djk\") pod \"node-resolver-k5h5x\" (UID: \"aac600fc-205b-40e3-aa53-d90a5274d7e4\") " pod="openshift-dns/node-resolver-k5h5x" Apr 23 09:30:53.287003 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:53.286975 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-k5h5x" Apr 23 09:30:54.174582 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:54.174561 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb8zp" Apr 23 09:30:54.174582 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:54.174571 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-459kk" Apr 23 09:30:54.174902 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:54.174571 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6528f" Apr 23 09:30:54.174902 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:54.174668 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-459kk" podUID="05ab6e35-f200-483a-b4f6-fee2629df7f2" Apr 23 09:30:54.174902 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:54.174733 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sb8zp" podUID="11f39ddc-72f6-4699-8329-bbb34ab9a9f0" Apr 23 09:30:54.174902 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:54.174800 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6528f" podUID="c171e7cd-9c69-4ef7-9012-fad9d2b17a46" Apr 23 09:30:54.244688 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:54.244668 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-k5h5x" event={"ID":"aac600fc-205b-40e3-aa53-d90a5274d7e4","Type":"ContainerStarted","Data":"56179b48fe731635b65d2ebd63cf8e808190cc4e86c55d4c5e5348c5801a2c63"} Apr 23 09:30:55.251901 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:55.251623 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" event={"ID":"50de6365-8798-4807-b392-6cc780c49635","Type":"ContainerStarted","Data":"3c43f9f776e69823353a83cbe41c2709da195a2fdbc9a6332d686499981c27ae"} Apr 23 09:30:55.253632 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:55.253602 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-nblz4" event={"ID":"23461d04-1ac2-4f06-bec9-42875ddaa8aa","Type":"ContainerStarted","Data":"49988ec1ba21f268a58b10de8e0977e82ceb65f0d9d8345910c153560bf36c71"} Apr 23 09:30:55.258115 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:55.258003 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" event={"ID":"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260","Type":"ContainerStarted","Data":"7902d2bdb5894817a876947765cfd2bea89445a2967fea1484b972a487866cc9"} Apr 23 09:30:55.258115 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:55.258042 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" event={"ID":"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260","Type":"ContainerStarted","Data":"13bd5a17363edadc728751c3a28b9091a12ddc09434244d16e77ad046276d7da"} Apr 23 09:30:55.258115 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:55.258057 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" event={"ID":"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260","Type":"ContainerStarted","Data":"dc1cb414bf48125772ad96ff633db85255724ec57095011eb527baa1292fddb4"} Apr 23 09:30:55.258115 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:55.258069 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" event={"ID":"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260","Type":"ContainerStarted","Data":"d12ddedf1f930515b9323d2083cfbd8bdf10c1c4287503b4ca7307344a52150c"} Apr 23 09:30:55.258115 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:55.258083 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" event={"ID":"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260","Type":"ContainerStarted","Data":"827bd876549a995405f9b2e11a2b314be31ad9446390d6a808b70bd867d1829d"} Apr 23 09:30:55.258115 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:55.258096 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" event={"ID":"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260","Type":"ContainerStarted","Data":"2729a12581de597ffee98fdeaddd802bbe36acfd5f1af354421c27770a67033c"} Apr 23 09:30:55.259616 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:55.259591 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-17.ec2.internal" event={"ID":"dde61d294f6e26e3e31c5e2576f80003","Type":"ContainerStarted","Data":"5b54ff1197c2ed2e37138d8a34602f956005cdc84aba40c02bce4c4d6851175a"} Apr 23 09:30:55.266134 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:55.266098 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-rn5xp" podStartSLOduration=2.592565591 podStartE2EDuration="20.266086682s" podCreationTimestamp="2026-04-23 09:30:35 +0000 UTC" firstStartedPulling="2026-04-23 09:30:36.481817452 +0000 UTC m=+1.926879752" lastFinishedPulling="2026-04-23 09:30:54.155338537 +0000 UTC m=+19.600400843" observedRunningTime="2026-04-23 09:30:55.265119024 +0000 UTC m=+20.710181350" watchObservedRunningTime="2026-04-23 09:30:55.266086682 +0000 UTC m=+20.711149004" Apr 23 09:30:55.276415 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:55.276383 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-17.ec2.internal" podStartSLOduration=20.276373707 podStartE2EDuration="20.276373707s" podCreationTimestamp="2026-04-23 09:30:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:30:55.27619228 +0000 UTC m=+20.721254602" watchObservedRunningTime="2026-04-23 09:30:55.276373707 +0000 UTC m=+20.721436030" Apr 23 09:30:55.289911 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:55.289880 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-nblz4" podStartSLOduration=2.5317171309999997 podStartE2EDuration="20.289870129s" podCreationTimestamp="2026-04-23 09:30:35 +0000 UTC" firstStartedPulling="2026-04-23 09:30:36.435600337 +0000 UTC m=+1.880662640" lastFinishedPulling="2026-04-23 09:30:54.193753336 +0000 UTC m=+19.638815638" observedRunningTime="2026-04-23 09:30:55.289829517 +0000 UTC m=+20.734891840" watchObservedRunningTime="2026-04-23 09:30:55.289870129 +0000 UTC m=+20.734932451" Apr 23 09:30:56.175184 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:56.174999 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-459kk" Apr 23 09:30:56.175342 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:56.175058 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb8zp" Apr 23 09:30:56.175342 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:56.175241 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-459kk" podUID="05ab6e35-f200-483a-b4f6-fee2629df7f2" Apr 23 09:30:56.175342 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:56.175058 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6528f" Apr 23 09:30:56.175342 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:56.175330 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sb8zp" podUID="11f39ddc-72f6-4699-8329-bbb34ab9a9f0" Apr 23 09:30:56.175498 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:56.175398 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6528f" podUID="c171e7cd-9c69-4ef7-9012-fad9d2b17a46" Apr 23 09:30:56.261737 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:56.261710 2566 generic.go:358] "Generic (PLEG): container finished" podID="495c169c-a1fe-4740-b5e4-88f23ef7e5d0" containerID="a3f54e0f636860def0620ed38d079f68f07ad23f7e79a998cad2c16dd26dabff" exitCode=0 Apr 23 09:30:56.262043 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:56.261773 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wkpk8" event={"ID":"495c169c-a1fe-4740-b5e4-88f23ef7e5d0","Type":"ContainerDied","Data":"a3f54e0f636860def0620ed38d079f68f07ad23f7e79a998cad2c16dd26dabff"} Apr 23 09:30:56.263140 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:56.263042 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mjhnv" event={"ID":"b967352d-3f1f-4e67-b468-5d9326f772ea","Type":"ContainerStarted","Data":"a38424bb9fc8306c3480db6866fb319c8a8288f82658d337f051ef9a0252da61"} Apr 23 09:30:56.264226 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:56.264206 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nd7df" event={"ID":"1793a43e-b740-4967-92f1-e3ecda36b452","Type":"ContainerStarted","Data":"0608d234b76b054330caf628cbd461b9a84a48640090b277b24b6ab6c86f0180"} Apr 23 09:30:56.265673 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:56.265426 2566 generic.go:358] "Generic (PLEG): container finished" podID="3940101a0fb12832b381adacfd404c80" containerID="c421fe14e956f8cc5e228ef4454152573fc3dc2a3bc8288c842815fe56b109ba" exitCode=0 Apr 23 09:30:56.265673 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:56.265523 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-17.ec2.internal" event={"ID":"3940101a0fb12832b381adacfd404c80","Type":"ContainerDied","Data":"c421fe14e956f8cc5e228ef4454152573fc3dc2a3bc8288c842815fe56b109ba"} Apr 23 09:30:56.266917 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:56.266893 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-57rnm" 
event={"ID":"9461f0b4-eff2-4028-be01-d417107cb9a8","Type":"ContainerStarted","Data":"55c653e266b9c9a332c992b7c54c54d03c21b614468d2954a0f7a3587cb9ea37"} Apr 23 09:30:56.268312 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:56.268276 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-tdc8p" event={"ID":"70691076-656b-493e-863d-2a367fb8eccf","Type":"ContainerStarted","Data":"b97f4b7efb12957614f40ef7e69d19b18f7351866f85adc89ff20ac68885e2d3"} Apr 23 09:30:56.269452 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:56.269429 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-k5h5x" event={"ID":"aac600fc-205b-40e3-aa53-d90a5274d7e4","Type":"ContainerStarted","Data":"07adff6104835d2a9aa2478ae2d9932a90aa7a20d9ee88576fe2d6cd7185b8bb"} Apr 23 09:30:56.290599 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:56.290561 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-tdc8p" podStartSLOduration=7.966054061 podStartE2EDuration="21.290548465s" podCreationTimestamp="2026-04-23 09:30:35 +0000 UTC" firstStartedPulling="2026-04-23 09:30:36.393168888 +0000 UTC m=+1.838231190" lastFinishedPulling="2026-04-23 09:30:49.717663284 +0000 UTC m=+15.162725594" observedRunningTime="2026-04-23 09:30:56.290347292 +0000 UTC m=+21.735409626" watchObservedRunningTime="2026-04-23 09:30:56.290548465 +0000 UTC m=+21.735610789" Apr 23 09:30:56.302067 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:56.302024 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-nd7df" podStartSLOduration=3.564920204 podStartE2EDuration="21.30201082s" podCreationTimestamp="2026-04-23 09:30:35 +0000 UTC" firstStartedPulling="2026-04-23 09:30:36.419513545 +0000 UTC m=+1.864575846" lastFinishedPulling="2026-04-23 09:30:54.156604161 +0000 UTC m=+19.601666462" observedRunningTime="2026-04-23 09:30:56.301550394 +0000 UTC m=+21.746612718" watchObservedRunningTime="2026-04-23 09:30:56.30201082 +0000 UTC m=+21.747073144" Apr 23 09:30:56.314803 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:56.314468 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-k5h5x" podStartSLOduration=4.314455009 podStartE2EDuration="4.314455009s" podCreationTimestamp="2026-04-23 09:30:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:30:56.314128796 +0000 UTC m=+21.759191120" watchObservedRunningTime="2026-04-23 09:30:56.314455009 +0000 UTC m=+21.759517334" Apr 23 09:30:56.317361 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:56.317062 2566 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 09:30:56.335693 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:56.335660 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-57rnm" podStartSLOduration=3.650893651 podStartE2EDuration="21.335649192s" podCreationTimestamp="2026-04-23 09:30:35 +0000 UTC" firstStartedPulling="2026-04-23 09:30:36.456034548 +0000 UTC m=+1.901096852" lastFinishedPulling="2026-04-23 09:30:54.140790091 +0000 UTC m=+19.585852393" observedRunningTime="2026-04-23 09:30:56.335364087 +0000 UTC m=+21.780426415" watchObservedRunningTime="2026-04-23 09:30:56.335649192 +0000 UTC 
m=+21.780711514" Apr 23 09:30:56.612398 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:56.612326 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-57rnm" Apr 23 09:30:56.612921 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:56.612906 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-57rnm" Apr 23 09:30:57.138289 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:57.138207 2566 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T09:30:56.317085397Z","UUID":"b0769adc-a0e0-4a37-a5a7-bce91d840c19","Handler":null,"Name":"","Endpoint":""} Apr 23 09:30:57.139689 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:57.139666 2566 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 09:30:57.139689 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:57.139690 2566 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 09:30:57.272807 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:57.272781 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mjhnv" event={"ID":"b967352d-3f1f-4e67-b468-5d9326f772ea","Type":"ContainerStarted","Data":"eb2afeadf04ca01b90ff6a93ad1c73a56ef9f4fd330f865fd7187f2cecf3f7bb"} Apr 23 09:30:57.275389 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:57.275356 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" event={"ID":"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260","Type":"ContainerStarted","Data":"320f76e35274ef80feb64fe05415c70a00bd929aa076219b5e4d594e6635754d"} Apr 23 09:30:57.276827 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:57.276807 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-17.ec2.internal" event={"ID":"3940101a0fb12832b381adacfd404c80","Type":"ContainerStarted","Data":"f1814291d4cb4bec42bd2a7c1c6dadf75f6b80f783583dc11e11ddc058e401cd"} Apr 23 09:30:57.277153 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:57.277136 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-57rnm" Apr 23 09:30:57.277626 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:57.277610 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-57rnm" Apr 23 09:30:57.359805 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:57.359764 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-17.ec2.internal" podStartSLOduration=22.359754155 podStartE2EDuration="22.359754155s" podCreationTimestamp="2026-04-23 09:30:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:30:57.335999946 +0000 UTC m=+22.781062268" watchObservedRunningTime="2026-04-23 09:30:57.359754155 +0000 UTC m=+22.804816477" Apr 23 09:30:58.175423 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:58.175393 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-459kk" Apr 23 09:30:58.175601 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:58.175431 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb8zp" Apr 23 09:30:58.175601 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:58.175500 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-459kk" podUID="05ab6e35-f200-483a-b4f6-fee2629df7f2" Apr 23 09:30:58.175601 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:58.175558 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6528f" Apr 23 09:30:58.175754 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:58.175662 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6528f" podUID="c171e7cd-9c69-4ef7-9012-fad9d2b17a46" Apr 23 09:30:58.175754 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:58.175737 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sb8zp" podUID="11f39ddc-72f6-4699-8329-bbb34ab9a9f0" Apr 23 09:30:58.354673 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:58.354644 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/05ab6e35-f200-483a-b4f6-fee2629df7f2-original-pull-secret\") pod \"global-pull-secret-syncer-459kk\" (UID: \"05ab6e35-f200-483a-b4f6-fee2629df7f2\") " pod="kube-system/global-pull-secret-syncer-459kk" Apr 23 09:30:58.355041 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:58.354771 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 09:30:58.355041 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:30:58.354839 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05ab6e35-f200-483a-b4f6-fee2629df7f2-original-pull-secret podName:05ab6e35-f200-483a-b4f6-fee2629df7f2 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:14.354820952 +0000 UTC m=+39.799883261 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/05ab6e35-f200-483a-b4f6-fee2629df7f2-original-pull-secret") pod "global-pull-secret-syncer-459kk" (UID: "05ab6e35-f200-483a-b4f6-fee2629df7f2") : object "kube-system"/"original-pull-secret" not registered Apr 23 09:30:59.290451 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:59.290238 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mjhnv" event={"ID":"b967352d-3f1f-4e67-b468-5d9326f772ea","Type":"ContainerStarted","Data":"8ddaca0914550e9d530f999a8f4846227e4728fa1393d35902682ab9817dd1ab"} Apr 23 09:30:59.307950 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:30:59.307863 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mjhnv" podStartSLOduration=2.001934577 podStartE2EDuration="24.307845115s" podCreationTimestamp="2026-04-23 09:30:35 +0000 UTC" firstStartedPulling="2026-04-23 09:30:36.465132093 +0000 UTC m=+1.910194394" lastFinishedPulling="2026-04-23 09:30:58.771042622 +0000 UTC m=+24.216104932" observedRunningTime="2026-04-23 09:30:59.306912727 +0000 UTC m=+24.751975085" watchObservedRunningTime="2026-04-23 09:30:59.307845115 +0000 UTC m=+24.752907534" Apr 23 09:31:00.175410 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:00.175381 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb8zp" Apr 23 09:31:00.175410 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:00.175405 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-459kk" Apr 23 09:31:00.176162 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:00.175387 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6528f" Apr 23 09:31:00.176162 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:00.175482 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sb8zp" podUID="11f39ddc-72f6-4699-8329-bbb34ab9a9f0" Apr 23 09:31:00.176162 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:00.175561 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6528f" podUID="c171e7cd-9c69-4ef7-9012-fad9d2b17a46" Apr 23 09:31:00.176162 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:00.175663 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-459kk" podUID="05ab6e35-f200-483a-b4f6-fee2629df7f2" Apr 23 09:31:00.295400 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:00.295356 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" event={"ID":"cbbe5fc8-e02f-40b7-8e74-ca04c14c0260","Type":"ContainerStarted","Data":"a840e2bd82a7c283aeefc1a2645424a9daf5b35916609555c0f761c24903923c"} Apr 23 09:31:00.327552 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:00.327502 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" podStartSLOduration=7.048440285 podStartE2EDuration="25.327488625s" podCreationTimestamp="2026-04-23 09:30:35 +0000 UTC" firstStartedPulling="2026-04-23 09:30:36.407931045 +0000 UTC m=+1.852993346" lastFinishedPulling="2026-04-23 09:30:54.686979381 +0000 UTC m=+20.132041686" observedRunningTime="2026-04-23 09:31:00.327021057 +0000 UTC m=+25.772083381" watchObservedRunningTime="2026-04-23 09:31:00.327488625 +0000 UTC m=+25.772550947" Apr 23 09:31:01.217914 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:01.217657 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6528f"] Apr 23 09:31:01.218393 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:01.217974 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6528f" Apr 23 09:31:01.218393 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:01.218076 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6528f" podUID="c171e7cd-9c69-4ef7-9012-fad9d2b17a46" Apr 23 09:31:01.220513 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:01.220485 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-sb8zp"] Apr 23 09:31:01.220645 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:01.220573 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb8zp" Apr 23 09:31:01.220713 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:01.220665 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sb8zp" podUID="11f39ddc-72f6-4699-8329-bbb34ab9a9f0" Apr 23 09:31:01.221311 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:01.221270 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-459kk"] Apr 23 09:31:01.221400 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:01.221384 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-459kk" Apr 23 09:31:01.221525 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:01.221503 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-459kk" podUID="05ab6e35-f200-483a-b4f6-fee2629df7f2" Apr 23 09:31:01.297272 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:01.297233 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:31:01.297272 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:01.297265 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:31:01.297272 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:01.297276 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:31:01.313193 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:01.313174 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:31:01.313374 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:01.313353 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs" Apr 23 09:31:03.174857 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:03.174781 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6528f" Apr 23 09:31:03.174857 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:03.174821 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb8zp" Apr 23 09:31:03.175330 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:03.174912 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-459kk" Apr 23 09:31:03.175330 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:03.174919 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sb8zp" podUID="11f39ddc-72f6-4699-8329-bbb34ab9a9f0" Apr 23 09:31:03.175330 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:03.175000 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-459kk" podUID="05ab6e35-f200-483a-b4f6-fee2629df7f2" Apr 23 09:31:03.175330 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:03.175074 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6528f" podUID="c171e7cd-9c69-4ef7-9012-fad9d2b17a46" Apr 23 09:31:03.301515 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:03.301490 2566 generic.go:358] "Generic (PLEG): container finished" podID="495c169c-a1fe-4740-b5e4-88f23ef7e5d0" containerID="6b94d311ffc8c6270b5dec63e8d32bf84b164280b522cded4c01c292fb998447" exitCode=0 Apr 23 09:31:03.301634 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:03.301572 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wkpk8" event={"ID":"495c169c-a1fe-4740-b5e4-88f23ef7e5d0","Type":"ContainerDied","Data":"6b94d311ffc8c6270b5dec63e8d32bf84b164280b522cded4c01c292fb998447"} Apr 23 09:31:05.175045 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:05.175018 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb8zp" Apr 23 09:31:05.175412 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:05.175101 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sb8zp" podUID="11f39ddc-72f6-4699-8329-bbb34ab9a9f0" Apr 23 09:31:05.175412 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:05.175176 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6528f" Apr 23 09:31:05.175412 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:05.175258 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6528f" podUID="c171e7cd-9c69-4ef7-9012-fad9d2b17a46" Apr 23 09:31:05.175412 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:05.175306 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-459kk" Apr 23 09:31:05.175412 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:05.175358 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-459kk" podUID="05ab6e35-f200-483a-b4f6-fee2629df7f2" Apr 23 09:31:05.307385 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:05.307353 2566 generic.go:358] "Generic (PLEG): container finished" podID="495c169c-a1fe-4740-b5e4-88f23ef7e5d0" containerID="e53d0592d2d4415a43a7367d088b329129c9b90383dab9ff362ba2594dd6adfa" exitCode=0 Apr 23 09:31:05.307537 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:05.307412 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wkpk8" event={"ID":"495c169c-a1fe-4740-b5e4-88f23ef7e5d0","Type":"ContainerDied","Data":"e53d0592d2d4415a43a7367d088b329129c9b90383dab9ff362ba2594dd6adfa"} Apr 23 09:31:06.311580 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:06.311364 2566 generic.go:358] "Generic (PLEG): container finished" podID="495c169c-a1fe-4740-b5e4-88f23ef7e5d0" containerID="112f634f13e3184ce4ec29f3247b1a3b7342f27bb0256737b6367c1c423ca097" exitCode=0 Apr 23 09:31:06.311580 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:06.311449 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wkpk8" event={"ID":"495c169c-a1fe-4740-b5e4-88f23ef7e5d0","Type":"ContainerDied","Data":"112f634f13e3184ce4ec29f3247b1a3b7342f27bb0256737b6367c1c423ca097"} Apr 23 09:31:07.175068 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.175035 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6528f" Apr 23 09:31:07.175240 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.175035 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-459kk" Apr 23 09:31:07.175240 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.175164 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb8zp" Apr 23 09:31:07.175240 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:07.175163 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6528f" podUID="c171e7cd-9c69-4ef7-9012-fad9d2b17a46" Apr 23 09:31:07.175428 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:07.175239 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-459kk" podUID="05ab6e35-f200-483a-b4f6-fee2629df7f2" Apr 23 09:31:07.175428 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:07.175327 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-sb8zp" podUID="11f39ddc-72f6-4699-8329-bbb34ab9a9f0" Apr 23 09:31:07.370861 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.370833 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-17.ec2.internal" event="NodeReady" Apr 23 09:31:07.371286 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.370963 2566 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 09:31:07.420492 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.415938 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-bpp9r"] Apr 23 09:31:07.422003 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.421977 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ftn6j"] Apr 23 09:31:07.422133 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.422091 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-bpp9r" Apr 23 09:31:07.425259 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.425196 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-86d78b875b-rbksj"] Apr 23 09:31:07.427217 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.426851 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 23 09:31:07.427217 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.426879 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 23 09:31:07.429045 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.427819 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 23 09:31:07.429045 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.428039 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-jfc6g\"" Apr 23 09:31:07.429045 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.428128 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-86d78b875b-rbksj" Apr 23 09:31:07.429045 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.428262 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 23 09:31:07.429045 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.428039 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-zllgx"] Apr 23 09:31:07.429045 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.428565 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ftn6j" Apr 23 09:31:07.432155 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.431508 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 23 09:31:07.432155 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.431537 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 23 09:31:07.432155 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.431630 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-554d8f4b86-p7vqw"] Apr 23 09:31:07.432155 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.431647 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 23 09:31:07.432155 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.431719 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-zllgx" Apr 23 09:31:07.434404 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.432622 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 23 09:31:07.434404 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.433044 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 23 09:31:07.434404 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.433551 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 23 09:31:07.434404 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.433797 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 23 09:31:07.434624 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.434471 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-fhpfg\"" Apr 23 09:31:07.434852 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.434833 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 23 09:31:07.435308 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.435269 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw" Apr 23 09:31:07.435398 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.435381 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-9swqv\"" Apr 23 09:31:07.438567 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.434935 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 23 09:31:07.438567 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.435180 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nxsf6"] Apr 23 09:31:07.441421 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.441236 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 23 09:31:07.442052 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.441505 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 23 09:31:07.442052 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.441870 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-2db27\"" Apr 23 09:31:07.442052 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.441894 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 09:31:07.443749 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.443725 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-bpp9r"] Apr 23 09:31:07.444346 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.443768 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-2mjdq"] Apr 23 09:31:07.444527 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.444501 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nxsf6" Apr 23 09:31:07.445171 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.445150 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 23 09:31:07.445744 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.445169 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 09:31:07.447223 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.447202 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-shblk\"" Apr 23 09:31:07.447381 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.447359 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 23 09:31:07.447452 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.447422 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 23 09:31:07.448191 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.448168 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 23 09:31:07.451835 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.451685 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 23 09:31:07.454686 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.454665 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-npwpf\"" Apr 23 09:31:07.455007 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.454991 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 23 09:31:07.455846 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.455670 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 23 09:31:07.456117 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.456097 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-zxwg9"] Apr 23 09:31:07.460370 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.459152 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-rj967"] Apr 23 09:31:07.460370 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.459784 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zxwg9" Apr 23 09:31:07.460370 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.459786 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 23 09:31:07.462118 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.462015 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 23 09:31:07.462216 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.462143 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 23 09:31:07.462461 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.462431 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qbzv8"] Apr 23 09:31:07.462573 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.462557 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2mjdq" Apr 23 09:31:07.463385 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.463367 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 23 09:31:07.463498 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.463399 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 23 09:31:07.463785 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.463752 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-9gsjj\"" Apr 23 09:31:07.465238 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.465218 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 09:31:07.465580 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.465560 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-mvkwp\"" Apr 23 09:31:07.465682 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.465667 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 09:31:07.470755 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.470739 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f4b5f9c8c-mfnsg"] Apr 23 09:31:07.470925 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.470906 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rj967" Apr 23 09:31:07.471311 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.471281 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qbzv8" Apr 23 09:31:07.474158 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.474131 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 23 09:31:07.474593 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.474178 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 23 09:31:07.474986 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.474187 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mw7t9\"" Apr 23 09:31:07.475767 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.474235 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 23 09:31:07.475947 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.474328 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-g4cmd\"" Apr 23 09:31:07.478947 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.478927 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-zvsn4"] Apr 23 09:31:07.479642 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.479603 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f4b5f9c8c-mfnsg" Apr 23 09:31:07.480543 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.479927 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 23 09:31:07.481089 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.481070 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 23 09:31:07.481489 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.479286 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 23 09:31:07.482805 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.482768 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 23 09:31:07.482941 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.482926 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 23 09:31:07.483282 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.483267 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 23 09:31:07.483465 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.483451 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 23 09:31:07.484154 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.484135 2566 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-2t2tv\"" Apr 23 09:31:07.484334 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.484138 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-vqvls"] Apr 23 09:31:07.484611 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.484229 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-zvsn4" Apr 23 09:31:07.484870 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.484853 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 23 09:31:07.486930 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.486914 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 09:31:07.487183 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.487164 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b5955c777-bgbzm"] Apr 23 09:31:07.487409 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.487389 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vqvls" Apr 23 09:31:07.487481 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.487465 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 09:31:07.487898 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.487880 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-45vcn\"" Apr 23 09:31:07.489816 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.489792 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-vvx66\"" Apr 23 09:31:07.490124 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.490108 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 23 09:31:07.490451 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.490434 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n72bx"] Apr 23 09:31:07.490553 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.490524 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 23 09:31:07.490706 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.490692 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b5955c777-bgbzm" Apr 23 09:31:07.493412 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.493398 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9"] Apr 23 09:31:07.493534 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.493515 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n72bx" Apr 23 09:31:07.493632 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.493593 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 23 09:31:07.496235 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.496216 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 23 09:31:07.496235 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.496216 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 23 09:31:07.496401 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.496373 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 23 09:31:07.496466 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.496446 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 23 09:31:07.496623 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.496605 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-zllgx"] Apr 23 09:31:07.496753 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.496735 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nxsf6"] Apr 23 09:31:07.496841 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.496761 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-554d8f4b86-p7vqw"] Apr 23 09:31:07.496841 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.496775 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-86d78b875b-rbksj"] Apr 23 09:31:07.496841 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.496645 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-sr9k2\"" Apr 23 09:31:07.496841 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.496788 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qbzv8"] Apr 23 09:31:07.496841 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.496801 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ftn6j"] Apr 23 09:31:07.496841 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.496814 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f4b5f9c8c-mfnsg"] Apr 23 09:31:07.496841 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.496827 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-vqvls"] Apr 23 09:31:07.496841 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.496841 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-zxwg9"] Apr 23 09:31:07.497192 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.496857 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n72bx"] Apr 23 09:31:07.497192 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.496871 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2mjdq"] Apr 23 09:31:07.497192 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.496686 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9" Apr 23 09:31:07.497192 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.496884 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rj967"] Apr 23 09:31:07.497192 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.496983 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9"] Apr 23 09:31:07.497192 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.496997 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b5955c777-bgbzm"] Apr 23 09:31:07.497192 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.497010 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-zvsn4"] Apr 23 09:31:07.499246 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.499221 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 23 09:31:07.499642 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.499619 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 23 09:31:07.499873 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.499847 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 23 09:31:07.500013 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.499994 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 23 09:31:07.527748 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.527730 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91f0276c-d1f7-4977-a25e-107ac0756380-service-ca-bundle\") pod \"router-default-86d78b875b-rbksj\" (UID: \"91f0276c-d1f7-4977-a25e-107ac0756380\") " pod="openshift-ingress/router-default-86d78b875b-rbksj" Apr 23 09:31:07.527850 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.527757 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/25c6abb3-0bf5-4a51-9450-a29341379573-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ftn6j\" (UID: \"25c6abb3-0bf5-4a51-9450-a29341379573\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ftn6j" Apr 23 09:31:07.527850 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.527783 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5696\" (UniqueName: 
\"kubernetes.io/projected/25c6abb3-0bf5-4a51-9450-a29341379573-kube-api-access-z5696\") pod \"cluster-samples-operator-6dc5bdb6b4-ftn6j\" (UID: \"25c6abb3-0bf5-4a51-9450-a29341379573\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ftn6j" Apr 23 09:31:07.527850 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.527822 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/91f0276c-d1f7-4977-a25e-107ac0756380-default-certificate\") pod \"router-default-86d78b875b-rbksj\" (UID: \"91f0276c-d1f7-4977-a25e-107ac0756380\") " pod="openshift-ingress/router-default-86d78b875b-rbksj" Apr 23 09:31:07.527850 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.527848 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91f0276c-d1f7-4977-a25e-107ac0756380-metrics-certs\") pod \"router-default-86d78b875b-rbksj\" (UID: \"91f0276c-d1f7-4977-a25e-107ac0756380\") " pod="openshift-ingress/router-default-86d78b875b-rbksj" Apr 23 09:31:07.528027 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.527898 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/164f4228-f916-4e95-9a43-29cf40bacf4a-ca-trust-extracted\") pod \"image-registry-554d8f4b86-p7vqw\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw" Apr 23 09:31:07.528027 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.527944 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3b73d76-bb0a-4807-ba45-02945da31336-serving-cert\") pod \"console-operator-9d4b6777b-bpp9r\" (UID: \"e3b73d76-bb0a-4807-ba45-02945da31336\") " pod="openshift-console-operator/console-operator-9d4b6777b-bpp9r" Apr 23 09:31:07.528027 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.527986 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/164f4228-f916-4e95-9a43-29cf40bacf4a-registry-certificates\") pod \"image-registry-554d8f4b86-p7vqw\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw" Apr 23 09:31:07.528027 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.528018 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f9mk\" (UniqueName: \"kubernetes.io/projected/e3b73d76-bb0a-4807-ba45-02945da31336-kube-api-access-7f9mk\") pod \"console-operator-9d4b6777b-bpp9r\" (UID: \"e3b73d76-bb0a-4807-ba45-02945da31336\") " pod="openshift-console-operator/console-operator-9d4b6777b-bpp9r" Apr 23 09:31:07.528176 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.528057 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/bfdda918-82b7-430c-9e8e-f2555930fa85-snapshots\") pod \"insights-operator-585dfdc468-zllgx\" (UID: \"bfdda918-82b7-430c-9e8e-f2555930fa85\") " pod="openshift-insights/insights-operator-585dfdc468-zllgx" Apr 23 09:31:07.528176 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.528121 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfdda918-82b7-430c-9e8e-f2555930fa85-service-ca-bundle\") pod \"insights-operator-585dfdc468-zllgx\" (UID: \"bfdda918-82b7-430c-9e8e-f2555930fa85\") " pod="openshift-insights/insights-operator-585dfdc468-zllgx" Apr 23 09:31:07.528176 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.528153 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfdda918-82b7-430c-9e8e-f2555930fa85-serving-cert\") pod \"insights-operator-585dfdc468-zllgx\" (UID: \"bfdda918-82b7-430c-9e8e-f2555930fa85\") " pod="openshift-insights/insights-operator-585dfdc468-zllgx" Apr 23 09:31:07.528339 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.528192 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4vjk\" (UniqueName: \"kubernetes.io/projected/91f0276c-d1f7-4977-a25e-107ac0756380-kube-api-access-x4vjk\") pod \"router-default-86d78b875b-rbksj\" (UID: \"91f0276c-d1f7-4977-a25e-107ac0756380\") " pod="openshift-ingress/router-default-86d78b875b-rbksj" Apr 23 09:31:07.528339 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.528217 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfdda918-82b7-430c-9e8e-f2555930fa85-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-zllgx\" (UID: \"bfdda918-82b7-430c-9e8e-f2555930fa85\") " pod="openshift-insights/insights-operator-585dfdc468-zllgx" Apr 23 09:31:07.528339 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.528242 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h855\" (UniqueName: \"kubernetes.io/projected/d01a5541-6b84-4253-a5a5-e0d82b86f84d-kube-api-access-9h855\") pod \"volume-data-source-validator-7c6cbb6c87-nxsf6\" (UID: \"d01a5541-6b84-4253-a5a5-e0d82b86f84d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nxsf6" Apr 23 09:31:07.528339 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.528271 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c82kr\" (UniqueName: \"kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-kube-api-access-c82kr\") pod \"image-registry-554d8f4b86-p7vqw\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw" Apr 23 09:31:07.528339 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.528337 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e3b73d76-bb0a-4807-ba45-02945da31336-trusted-ca\") pod \"console-operator-9d4b6777b-bpp9r\" (UID: \"e3b73d76-bb0a-4807-ba45-02945da31336\") " pod="openshift-console-operator/console-operator-9d4b6777b-bpp9r" Apr 23 09:31:07.528553 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.528380 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/164f4228-f916-4e95-9a43-29cf40bacf4a-trusted-ca\") pod \"image-registry-554d8f4b86-p7vqw\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw" Apr 23 09:31:07.528553 
ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.528403 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bfdda918-82b7-430c-9e8e-f2555930fa85-tmp\") pod \"insights-operator-585dfdc468-zllgx\" (UID: \"bfdda918-82b7-430c-9e8e-f2555930fa85\") " pod="openshift-insights/insights-operator-585dfdc468-zllgx" Apr 23 09:31:07.528553 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.528466 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/164f4228-f916-4e95-9a43-29cf40bacf4a-installation-pull-secrets\") pod \"image-registry-554d8f4b86-p7vqw\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw" Apr 23 09:31:07.528553 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.528540 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-registry-tls\") pod \"image-registry-554d8f4b86-p7vqw\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw" Apr 23 09:31:07.528738 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.528565 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3b73d76-bb0a-4807-ba45-02945da31336-config\") pod \"console-operator-9d4b6777b-bpp9r\" (UID: \"e3b73d76-bb0a-4807-ba45-02945da31336\") " pod="openshift-console-operator/console-operator-9d4b6777b-bpp9r" Apr 23 09:31:07.528738 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.528588 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/91f0276c-d1f7-4977-a25e-107ac0756380-stats-auth\") pod \"router-default-86d78b875b-rbksj\" (UID: \"91f0276c-d1f7-4977-a25e-107ac0756380\") " pod="openshift-ingress/router-default-86d78b875b-rbksj" Apr 23 09:31:07.528738 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.528609 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/164f4228-f916-4e95-9a43-29cf40bacf4a-image-registry-private-configuration\") pod \"image-registry-554d8f4b86-p7vqw\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw" Apr 23 09:31:07.528738 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.528650 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-bound-sa-token\") pod \"image-registry-554d8f4b86-p7vqw\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw" Apr 23 09:31:07.528738 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.528718 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxsdg\" (UniqueName: \"kubernetes.io/projected/bfdda918-82b7-430c-9e8e-f2555930fa85-kube-api-access-lxsdg\") pod \"insights-operator-585dfdc468-zllgx\" (UID: \"bfdda918-82b7-430c-9e8e-f2555930fa85\") " 
pod="openshift-insights/insights-operator-585dfdc468-zllgx" Apr 23 09:31:07.629823 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.629795 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-bound-sa-token\") pod \"image-registry-554d8f4b86-p7vqw\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw" Apr 23 09:31:07.629978 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.629837 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5f78147d-0dbc-471a-a1c9-05e2a3cb5333-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-vqvls\" (UID: \"5f78147d-0dbc-471a-a1c9-05e2a3cb5333\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vqvls" Apr 23 09:31:07.629978 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.629866 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/fe118deb-6994-428b-9927-b07aafd254b6-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-zxwg9\" (UID: \"fe118deb-6994-428b-9927-b07aafd254b6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zxwg9" Apr 23 09:31:07.630094 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.629975 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3b73d76-bb0a-4807-ba45-02945da31336-config\") pod \"console-operator-9d4b6777b-bpp9r\" (UID: \"e3b73d76-bb0a-4807-ba45-02945da31336\") " pod="openshift-console-operator/console-operator-9d4b6777b-bpp9r" Apr 23 09:31:07.630094 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.630045 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/eaa17ea4-444a-410e-b077-79a2168b8f71-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6b858989cb-mf2j9\" (UID: \"eaa17ea4-444a-410e-b077-79a2168b8f71\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9" Apr 23 09:31:07.630094 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.630068 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/164f4228-f916-4e95-9a43-29cf40bacf4a-image-registry-private-configuration\") pod \"image-registry-554d8f4b86-p7vqw\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw" Apr 23 09:31:07.630225 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.630093 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnxrm\" (UniqueName: \"kubernetes.io/projected/fe118deb-6994-428b-9927-b07aafd254b6-kube-api-access-bnxrm\") pod \"cluster-monitoring-operator-75587bd455-zxwg9\" (UID: \"fe118deb-6994-428b-9927-b07aafd254b6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zxwg9" Apr 23 09:31:07.630225 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.630119 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/eaa17ea4-444a-410e-b077-79a2168b8f71-ca\") 
pod \"cluster-proxy-proxy-agent-6b858989cb-mf2j9\" (UID: \"eaa17ea4-444a-410e-b077-79a2168b8f71\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9" Apr 23 09:31:07.630225 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.630159 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/eaa17ea4-444a-410e-b077-79a2168b8f71-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6b858989cb-mf2j9\" (UID: \"eaa17ea4-444a-410e-b077-79a2168b8f71\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9" Apr 23 09:31:07.630225 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.630193 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe118deb-6994-428b-9927-b07aafd254b6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zxwg9\" (UID: \"fe118deb-6994-428b-9927-b07aafd254b6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zxwg9" Apr 23 09:31:07.630437 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.630230 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91f0276c-d1f7-4977-a25e-107ac0756380-service-ca-bundle\") pod \"router-default-86d78b875b-rbksj\" (UID: \"91f0276c-d1f7-4977-a25e-107ac0756380\") " pod="openshift-ingress/router-default-86d78b875b-rbksj" Apr 23 09:31:07.630437 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.630258 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c46256a3-d81a-45a4-b658-3a9db962e17a-serving-cert\") pod \"service-ca-operator-d6fc45fc5-n72bx\" (UID: \"c46256a3-d81a-45a4-b658-3a9db962e17a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n72bx" Apr 23 09:31:07.630437 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.630281 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c46256a3-d81a-45a4-b658-3a9db962e17a-config\") pod \"service-ca-operator-d6fc45fc5-n72bx\" (UID: \"c46256a3-d81a-45a4-b658-3a9db962e17a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n72bx" Apr 23 09:31:07.630437 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.630325 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/eaa17ea4-444a-410e-b077-79a2168b8f71-hub\") pod \"cluster-proxy-proxy-agent-6b858989cb-mf2j9\" (UID: \"eaa17ea4-444a-410e-b077-79a2168b8f71\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9" Apr 23 09:31:07.630437 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.630377 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/91f0276c-d1f7-4977-a25e-107ac0756380-default-certificate\") pod \"router-default-86d78b875b-rbksj\" (UID: \"91f0276c-d1f7-4977-a25e-107ac0756380\") " pod="openshift-ingress/router-default-86d78b875b-rbksj" Apr 23 09:31:07.630437 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:07.630388 2566 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/91f0276c-d1f7-4977-a25e-107ac0756380-service-ca-bundle podName:91f0276c-d1f7-4977-a25e-107ac0756380 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:08.130369836 +0000 UTC m=+33.575432136 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/91f0276c-d1f7-4977-a25e-107ac0756380-service-ca-bundle") pod "router-default-86d78b875b-rbksj" (UID: "91f0276c-d1f7-4977-a25e-107ac0756380") : configmap references non-existent config key: service-ca.crt Apr 23 09:31:07.630729 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.630639 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91f0276c-d1f7-4977-a25e-107ac0756380-metrics-certs\") pod \"router-default-86d78b875b-rbksj\" (UID: \"91f0276c-d1f7-4977-a25e-107ac0756380\") " pod="openshift-ingress/router-default-86d78b875b-rbksj" Apr 23 09:31:07.630729 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.630669 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfdda918-82b7-430c-9e8e-f2555930fa85-service-ca-bundle\") pod \"insights-operator-585dfdc468-zllgx\" (UID: \"bfdda918-82b7-430c-9e8e-f2555930fa85\") " pod="openshift-insights/insights-operator-585dfdc468-zllgx" Apr 23 09:31:07.630729 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.630696 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfdda918-82b7-430c-9e8e-f2555930fa85-serving-cert\") pod \"insights-operator-585dfdc468-zllgx\" (UID: \"bfdda918-82b7-430c-9e8e-f2555930fa85\") " pod="openshift-insights/insights-operator-585dfdc468-zllgx" Apr 23 09:31:07.630729 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.630725 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/bfdda918-82b7-430c-9e8e-f2555930fa85-snapshots\") pod \"insights-operator-585dfdc468-zllgx\" (UID: \"bfdda918-82b7-430c-9e8e-f2555930fa85\") " pod="openshift-insights/insights-operator-585dfdc468-zllgx" Apr 23 09:31:07.630928 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.630752 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfftj\" (UniqueName: \"kubernetes.io/projected/541d1fdf-75b3-4058-97be-3140f4c7fdb2-kube-api-access-vfftj\") pod \"kube-storage-version-migrator-operator-6769c5d45-qbzv8\" (UID: \"541d1fdf-75b3-4058-97be-3140f4c7fdb2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qbzv8" Apr 23 09:31:07.630928 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.630781 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/541d1fdf-75b3-4058-97be-3140f4c7fdb2-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-qbzv8\" (UID: \"541d1fdf-75b3-4058-97be-3140f4c7fdb2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qbzv8" Apr 23 09:31:07.630928 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.630808 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5f78147d-0dbc-471a-a1c9-05e2a3cb5333-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vqvls\" (UID: \"5f78147d-0dbc-471a-a1c9-05e2a3cb5333\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vqvls" Apr 23 09:31:07.630928 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.630835 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c82kr\" (UniqueName: \"kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-kube-api-access-c82kr\") pod \"image-registry-554d8f4b86-p7vqw\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw" Apr 23 09:31:07.630928 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.630862 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e3b73d76-bb0a-4807-ba45-02945da31336-trusted-ca\") pod \"console-operator-9d4b6777b-bpp9r\" (UID: \"e3b73d76-bb0a-4807-ba45-02945da31336\") " pod="openshift-console-operator/console-operator-9d4b6777b-bpp9r" Apr 23 09:31:07.630928 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.630891 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/164f4228-f916-4e95-9a43-29cf40bacf4a-trusted-ca\") pod \"image-registry-554d8f4b86-p7vqw\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw" Apr 23 09:31:07.630928 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.630921 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/164f4228-f916-4e95-9a43-29cf40bacf4a-installation-pull-secrets\") pod \"image-registry-554d8f4b86-p7vqw\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw" Apr 23 09:31:07.631258 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.630950 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6qt7\" (UniqueName: \"kubernetes.io/projected/6b8bf9f1-73a1-4060-b09a-55a78aed4470-kube-api-access-w6qt7\") pod \"dns-default-2mjdq\" (UID: \"6b8bf9f1-73a1-4060-b09a-55a78aed4470\") " pod="openshift-dns/dns-default-2mjdq" Apr 23 09:31:07.631258 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.630984 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lxsdg\" (UniqueName: \"kubernetes.io/projected/bfdda918-82b7-430c-9e8e-f2555930fa85-kube-api-access-lxsdg\") pod \"insights-operator-585dfdc468-zllgx\" (UID: \"bfdda918-82b7-430c-9e8e-f2555930fa85\") " pod="openshift-insights/insights-operator-585dfdc468-zllgx" Apr 23 09:31:07.631258 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.631013 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-registry-tls\") pod \"image-registry-554d8f4b86-p7vqw\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw" Apr 23 09:31:07.631258 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.631062 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/91f0276c-d1f7-4977-a25e-107ac0756380-stats-auth\") 
pod \"router-default-86d78b875b-rbksj\" (UID: \"91f0276c-d1f7-4977-a25e-107ac0756380\") " pod="openshift-ingress/router-default-86d78b875b-rbksj" Apr 23 09:31:07.631258 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.631092 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/30c9a344-56d8-4716-84da-8665f9ee4946-cert\") pod \"ingress-canary-rj967\" (UID: \"30c9a344-56d8-4716-84da-8665f9ee4946\") " pod="openshift-ingress-canary/ingress-canary-rj967" Apr 23 09:31:07.631258 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.631118 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-969t5\" (UniqueName: \"kubernetes.io/projected/c46256a3-d81a-45a4-b658-3a9db962e17a-kube-api-access-969t5\") pod \"service-ca-operator-d6fc45fc5-n72bx\" (UID: \"c46256a3-d81a-45a4-b658-3a9db962e17a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n72bx" Apr 23 09:31:07.631258 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.631146 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqt2r\" (UniqueName: \"kubernetes.io/projected/eaa17ea4-444a-410e-b077-79a2168b8f71-kube-api-access-fqt2r\") pod \"cluster-proxy-proxy-agent-6b858989cb-mf2j9\" (UID: \"eaa17ea4-444a-410e-b077-79a2168b8f71\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9" Apr 23 09:31:07.631258 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.631180 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b8bf9f1-73a1-4060-b09a-55a78aed4470-config-volume\") pod \"dns-default-2mjdq\" (UID: \"6b8bf9f1-73a1-4060-b09a-55a78aed4470\") " pod="openshift-dns/dns-default-2mjdq" Apr 23 09:31:07.631258 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.631210 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/541d1fdf-75b3-4058-97be-3140f4c7fdb2-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-qbzv8\" (UID: \"541d1fdf-75b3-4058-97be-3140f4c7fdb2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qbzv8" Apr 23 09:31:07.631258 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.631234 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3b73d76-bb0a-4807-ba45-02945da31336-config\") pod \"console-operator-9d4b6777b-bpp9r\" (UID: \"e3b73d76-bb0a-4807-ba45-02945da31336\") " pod="openshift-console-operator/console-operator-9d4b6777b-bpp9r" Apr 23 09:31:07.631258 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.631236 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn4x4\" (UniqueName: \"kubernetes.io/projected/d24244c3-869e-47a6-92c7-a26c11be6b66-kube-api-access-qn4x4\") pod \"managed-serviceaccount-addon-agent-5f4b5f9c8c-mfnsg\" (UID: \"d24244c3-869e-47a6-92c7-a26c11be6b66\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f4b5f9c8c-mfnsg" Apr 23 09:31:07.631752 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.631310 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-4rrx5\" (UniqueName: \"kubernetes.io/projected/30c9a344-56d8-4716-84da-8665f9ee4946-kube-api-access-4rrx5\") pod \"ingress-canary-rj967\" (UID: \"30c9a344-56d8-4716-84da-8665f9ee4946\") " pod="openshift-ingress-canary/ingress-canary-rj967" Apr 23 09:31:07.631752 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:07.631342 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 09:31:07.631752 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.631344 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/25c6abb3-0bf5-4a51-9450-a29341379573-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ftn6j\" (UID: \"25c6abb3-0bf5-4a51-9450-a29341379573\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ftn6j" Apr 23 09:31:07.631752 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.631373 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z5696\" (UniqueName: \"kubernetes.io/projected/25c6abb3-0bf5-4a51-9450-a29341379573-kube-api-access-z5696\") pod \"cluster-samples-operator-6dc5bdb6b4-ftn6j\" (UID: \"25c6abb3-0bf5-4a51-9450-a29341379573\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ftn6j" Apr 23 09:31:07.631752 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:07.631388 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91f0276c-d1f7-4977-a25e-107ac0756380-metrics-certs podName:91f0276c-d1f7-4977-a25e-107ac0756380 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:08.131372484 +0000 UTC m=+33.576434788 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/91f0276c-d1f7-4977-a25e-107ac0756380-metrics-certs") pod "router-default-86d78b875b-rbksj" (UID: "91f0276c-d1f7-4977-a25e-107ac0756380") : secret "router-metrics-certs-default" not found Apr 23 09:31:07.636994 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.636180 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfdda918-82b7-430c-9e8e-f2555930fa85-service-ca-bundle\") pod \"insights-operator-585dfdc468-zllgx\" (UID: \"bfdda918-82b7-430c-9e8e-f2555930fa85\") " pod="openshift-insights/insights-operator-585dfdc468-zllgx" Apr 23 09:31:07.636994 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.636240 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/bfdda918-82b7-430c-9e8e-f2555930fa85-snapshots\") pod \"insights-operator-585dfdc468-zllgx\" (UID: \"bfdda918-82b7-430c-9e8e-f2555930fa85\") " pod="openshift-insights/insights-operator-585dfdc468-zllgx" Apr 23 09:31:07.636994 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.636579 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/164f4228-f916-4e95-9a43-29cf40bacf4a-image-registry-private-configuration\") pod \"image-registry-554d8f4b86-p7vqw\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw" Apr 23 09:31:07.636994 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.636675 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/91f0276c-d1f7-4977-a25e-107ac0756380-default-certificate\") pod \"router-default-86d78b875b-rbksj\" (UID: \"91f0276c-d1f7-4977-a25e-107ac0756380\") " pod="openshift-ingress/router-default-86d78b875b-rbksj" Apr 23 09:31:07.636994 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:07.636792 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 09:31:07.636994 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:07.636807 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-554d8f4b86-p7vqw: secret "image-registry-tls" not found Apr 23 09:31:07.636994 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:07.636867 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-registry-tls podName:164f4228-f916-4e95-9a43-29cf40bacf4a nodeName:}" failed. No retries permitted until 2026-04-23 09:31:08.136843606 +0000 UTC m=+33.581905907 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-registry-tls") pod "image-registry-554d8f4b86-p7vqw" (UID: "164f4228-f916-4e95-9a43-29cf40bacf4a") : secret "image-registry-tls" not found
Apr 23 09:31:07.637405 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.631417 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6b8bf9f1-73a1-4060-b09a-55a78aed4470-tmp-dir\") pod \"dns-default-2mjdq\" (UID: \"6b8bf9f1-73a1-4060-b09a-55a78aed4470\") " pod="openshift-dns/dns-default-2mjdq"
Apr 23 09:31:07.637457 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.637435 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/164f4228-f916-4e95-9a43-29cf40bacf4a-ca-trust-extracted\") pod \"image-registry-554d8f4b86-p7vqw\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw"
Apr 23 09:31:07.637509 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.637474 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3b73d76-bb0a-4807-ba45-02945da31336-serving-cert\") pod \"console-operator-9d4b6777b-bpp9r\" (UID: \"e3b73d76-bb0a-4807-ba45-02945da31336\") " pod="openshift-console-operator/console-operator-9d4b6777b-bpp9r"
Apr 23 09:31:07.637562 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.637514 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6b8bf9f1-73a1-4060-b09a-55a78aed4470-metrics-tls\") pod \"dns-default-2mjdq\" (UID: \"6b8bf9f1-73a1-4060-b09a-55a78aed4470\") " pod="openshift-dns/dns-default-2mjdq"
Apr 23 09:31:07.637612 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.637557 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7f9mk\" (UniqueName: \"kubernetes.io/projected/e3b73d76-bb0a-4807-ba45-02945da31336-kube-api-access-7f9mk\") pod \"console-operator-9d4b6777b-bpp9r\" (UID: \"e3b73d76-bb0a-4807-ba45-02945da31336\") " pod="openshift-console-operator/console-operator-9d4b6777b-bpp9r"
Apr 23 09:31:07.637612 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.637596 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/c455ca72-f57f-4a17-a701-1d5db923cb29-klusterlet-config\") pod \"klusterlet-addon-workmgr-6b5955c777-bgbzm\" (UID: \"c455ca72-f57f-4a17-a701-1d5db923cb29\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b5955c777-bgbzm"
Apr 23 09:31:07.637700 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.637627 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/eaa17ea4-444a-410e-b077-79a2168b8f71-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6b858989cb-mf2j9\" (UID: \"eaa17ea4-444a-410e-b077-79a2168b8f71\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9"
Apr 23 09:31:07.637700 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.637664 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/164f4228-f916-4e95-9a43-29cf40bacf4a-registry-certificates\") pod \"image-registry-554d8f4b86-p7vqw\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw"
Apr 23 09:31:07.637795 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.637702 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c455ca72-f57f-4a17-a701-1d5db923cb29-tmp\") pod \"klusterlet-addon-workmgr-6b5955c777-bgbzm\" (UID: \"c455ca72-f57f-4a17-a701-1d5db923cb29\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b5955c777-bgbzm"
Apr 23 09:31:07.637795 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.637736 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d24244c3-869e-47a6-92c7-a26c11be6b66-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5f4b5f9c8c-mfnsg\" (UID: \"d24244c3-869e-47a6-92c7-a26c11be6b66\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f4b5f9c8c-mfnsg"
Apr 23 09:31:07.637795 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.637773 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x4vjk\" (UniqueName: \"kubernetes.io/projected/91f0276c-d1f7-4977-a25e-107ac0756380-kube-api-access-x4vjk\") pod \"router-default-86d78b875b-rbksj\" (UID: \"91f0276c-d1f7-4977-a25e-107ac0756380\") " pod="openshift-ingress/router-default-86d78b875b-rbksj"
Apr 23 09:31:07.637929 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.637803 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfdda918-82b7-430c-9e8e-f2555930fa85-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-zllgx\" (UID: \"bfdda918-82b7-430c-9e8e-f2555930fa85\") " pod="openshift-insights/insights-operator-585dfdc468-zllgx"
Apr 23 09:31:07.637929 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.637822 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e3b73d76-bb0a-4807-ba45-02945da31336-trusted-ca\") pod \"console-operator-9d4b6777b-bpp9r\" (UID: \"e3b73d76-bb0a-4807-ba45-02945da31336\") " pod="openshift-console-operator/console-operator-9d4b6777b-bpp9r"
Apr 23 09:31:07.637929 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.637838 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9h855\" (UniqueName: \"kubernetes.io/projected/d01a5541-6b84-4253-a5a5-e0d82b86f84d-kube-api-access-9h855\") pod \"volume-data-source-validator-7c6cbb6c87-nxsf6\" (UID: \"d01a5541-6b84-4253-a5a5-e0d82b86f84d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nxsf6"
Apr 23 09:31:07.637929 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.637874 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc8vs\" (UniqueName: \"kubernetes.io/projected/b2f33469-6027-43ec-be9b-4bebb5500631-kube-api-access-dc8vs\") pod \"network-check-source-8894fc9bd-zvsn4\" (UID: \"b2f33469-6027-43ec-be9b-4bebb5500631\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-zvsn4"
Apr 23 09:31:07.637929 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.637917 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s65wg\" (UniqueName: \"kubernetes.io/projected/c455ca72-f57f-4a17-a701-1d5db923cb29-kube-api-access-s65wg\") pod \"klusterlet-addon-workmgr-6b5955c777-bgbzm\" (UID: \"c455ca72-f57f-4a17-a701-1d5db923cb29\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b5955c777-bgbzm"
Apr 23 09:31:07.638151 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.637967 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bfdda918-82b7-430c-9e8e-f2555930fa85-tmp\") pod \"insights-operator-585dfdc468-zllgx\" (UID: \"bfdda918-82b7-430c-9e8e-f2555930fa85\") " pod="openshift-insights/insights-operator-585dfdc468-zllgx"
Apr 23 09:31:07.638151 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.637989 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/164f4228-f916-4e95-9a43-29cf40bacf4a-trusted-ca\") pod \"image-registry-554d8f4b86-p7vqw\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw"
Apr 23 09:31:07.638244 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.638169 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfdda918-82b7-430c-9e8e-f2555930fa85-serving-cert\") pod \"insights-operator-585dfdc468-zllgx\" (UID: \"bfdda918-82b7-430c-9e8e-f2555930fa85\") " pod="openshift-insights/insights-operator-585dfdc468-zllgx"
Apr 23 09:31:07.638771 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.638411 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/164f4228-f916-4e95-9a43-29cf40bacf4a-ca-trust-extracted\") pod \"image-registry-554d8f4b86-p7vqw\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw"
Apr 23 09:31:07.638771 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.638536 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/164f4228-f916-4e95-9a43-29cf40bacf4a-registry-certificates\") pod \"image-registry-554d8f4b86-p7vqw\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw"
Apr 23 09:31:07.638971 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.638810 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bfdda918-82b7-430c-9e8e-f2555930fa85-tmp\") pod \"insights-operator-585dfdc468-zllgx\" (UID: \"bfdda918-82b7-430c-9e8e-f2555930fa85\") " pod="openshift-insights/insights-operator-585dfdc468-zllgx"
Apr 23 09:31:07.639109 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.639032 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfdda918-82b7-430c-9e8e-f2555930fa85-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-zllgx\" (UID: \"bfdda918-82b7-430c-9e8e-f2555930fa85\") " pod="openshift-insights/insights-operator-585dfdc468-zllgx"
Apr 23 09:31:07.639109 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:07.639037 2566 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 09:31:07.639257 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:07.639143 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25c6abb3-0bf5-4a51-9450-a29341379573-samples-operator-tls podName:25c6abb3-0bf5-4a51-9450-a29341379573 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:08.139121214 +0000 UTC m=+33.584183518 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/25c6abb3-0bf5-4a51-9450-a29341379573-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-ftn6j" (UID: "25c6abb3-0bf5-4a51-9450-a29341379573") : secret "samples-operator-tls" not found
Apr 23 09:31:07.641205 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.641183 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/91f0276c-d1f7-4977-a25e-107ac0756380-stats-auth\") pod \"router-default-86d78b875b-rbksj\" (UID: \"91f0276c-d1f7-4977-a25e-107ac0756380\") " pod="openshift-ingress/router-default-86d78b875b-rbksj"
Apr 23 09:31:07.642127 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.642086 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3b73d76-bb0a-4807-ba45-02945da31336-serving-cert\") pod \"console-operator-9d4b6777b-bpp9r\" (UID: \"e3b73d76-bb0a-4807-ba45-02945da31336\") " pod="openshift-console-operator/console-operator-9d4b6777b-bpp9r"
Apr 23 09:31:07.642378 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.642361 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/164f4228-f916-4e95-9a43-29cf40bacf4a-installation-pull-secrets\") pod \"image-registry-554d8f4b86-p7vqw\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw"
Apr 23 09:31:07.644518 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.644496 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-bound-sa-token\") pod \"image-registry-554d8f4b86-p7vqw\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw"
Apr 23 09:31:07.648201 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.648177 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4vjk\" (UniqueName: \"kubernetes.io/projected/91f0276c-d1f7-4977-a25e-107ac0756380-kube-api-access-x4vjk\") pod \"router-default-86d78b875b-rbksj\" (UID: \"91f0276c-d1f7-4977-a25e-107ac0756380\") " pod="openshift-ingress/router-default-86d78b875b-rbksj"
Apr 23 09:31:07.648922 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.648876 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxsdg\" (UniqueName: \"kubernetes.io/projected/bfdda918-82b7-430c-9e8e-f2555930fa85-kube-api-access-lxsdg\") pod \"insights-operator-585dfdc468-zllgx\" (UID: \"bfdda918-82b7-430c-9e8e-f2555930fa85\") " pod="openshift-insights/insights-operator-585dfdc468-zllgx"
Apr 23 09:31:07.654185 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.654160 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h855\" (UniqueName: \"kubernetes.io/projected/d01a5541-6b84-4253-a5a5-e0d82b86f84d-kube-api-access-9h855\") pod \"volume-data-source-validator-7c6cbb6c87-nxsf6\" (UID: \"d01a5541-6b84-4253-a5a5-e0d82b86f84d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nxsf6"
Apr 23 09:31:07.654373 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.654353 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c82kr\" (UniqueName: \"kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-kube-api-access-c82kr\") pod \"image-registry-554d8f4b86-p7vqw\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw"
Apr 23 09:31:07.655391 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.655368 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5696\" (UniqueName: \"kubernetes.io/projected/25c6abb3-0bf5-4a51-9450-a29341379573-kube-api-access-z5696\") pod \"cluster-samples-operator-6dc5bdb6b4-ftn6j\" (UID: \"25c6abb3-0bf5-4a51-9450-a29341379573\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ftn6j"
Apr 23 09:31:07.655476 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.655414 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f9mk\" (UniqueName: \"kubernetes.io/projected/e3b73d76-bb0a-4807-ba45-02945da31336-kube-api-access-7f9mk\") pod \"console-operator-9d4b6777b-bpp9r\" (UID: \"e3b73d76-bb0a-4807-ba45-02945da31336\") " pod="openshift-console-operator/console-operator-9d4b6777b-bpp9r"
Apr 23 09:31:07.738348 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.738274 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/541d1fdf-75b3-4058-97be-3140f4c7fdb2-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-qbzv8\" (UID: \"541d1fdf-75b3-4058-97be-3140f4c7fdb2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qbzv8"
Apr 23 09:31:07.738472 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.738394 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5f78147d-0dbc-471a-a1c9-05e2a3cb5333-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vqvls\" (UID: \"5f78147d-0dbc-471a-a1c9-05e2a3cb5333\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vqvls"
Apr 23 09:31:07.738472 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.738420 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w6qt7\" (UniqueName: \"kubernetes.io/projected/6b8bf9f1-73a1-4060-b09a-55a78aed4470-kube-api-access-w6qt7\") pod \"dns-default-2mjdq\" (UID: \"6b8bf9f1-73a1-4060-b09a-55a78aed4470\") " pod="openshift-dns/dns-default-2mjdq"
Apr 23 09:31:07.738472 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.738452 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/30c9a344-56d8-4716-84da-8665f9ee4946-cert\") pod \"ingress-canary-rj967\" (UID: \"30c9a344-56d8-4716-84da-8665f9ee4946\") " pod="openshift-ingress-canary/ingress-canary-rj967"
Apr 23 09:31:07.738619 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.738488 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-969t5\" (UniqueName: \"kubernetes.io/projected/c46256a3-d81a-45a4-b658-3a9db962e17a-kube-api-access-969t5\") pod \"service-ca-operator-d6fc45fc5-n72bx\" (UID: \"c46256a3-d81a-45a4-b658-3a9db962e17a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n72bx"
Apr 23 09:31:07.738619 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.738545 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fqt2r\" (UniqueName: \"kubernetes.io/projected/eaa17ea4-444a-410e-b077-79a2168b8f71-kube-api-access-fqt2r\") pod \"cluster-proxy-proxy-agent-6b858989cb-mf2j9\" (UID: \"eaa17ea4-444a-410e-b077-79a2168b8f71\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9"
Apr 23 09:31:07.738619 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.738566 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b8bf9f1-73a1-4060-b09a-55a78aed4470-config-volume\") pod \"dns-default-2mjdq\" (UID: \"6b8bf9f1-73a1-4060-b09a-55a78aed4470\") " pod="openshift-dns/dns-default-2mjdq"
Apr 23 09:31:07.738619 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.738584 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/541d1fdf-75b3-4058-97be-3140f4c7fdb2-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-qbzv8\" (UID: \"541d1fdf-75b3-4058-97be-3140f4c7fdb2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qbzv8"
Apr 23 09:31:07.738619 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.738601 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qn4x4\" (UniqueName: \"kubernetes.io/projected/d24244c3-869e-47a6-92c7-a26c11be6b66-kube-api-access-qn4x4\") pod \"managed-serviceaccount-addon-agent-5f4b5f9c8c-mfnsg\" (UID: \"d24244c3-869e-47a6-92c7-a26c11be6b66\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f4b5f9c8c-mfnsg"
Apr 23 09:31:07.738619 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:07.738606 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 09:31:07.738906 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.738623 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rrx5\" (UniqueName: \"kubernetes.io/projected/30c9a344-56d8-4716-84da-8665f9ee4946-kube-api-access-4rrx5\") pod \"ingress-canary-rj967\" (UID: \"30c9a344-56d8-4716-84da-8665f9ee4946\") " pod="openshift-ingress-canary/ingress-canary-rj967"
Apr 23 09:31:07.738906 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:07.738673 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30c9a344-56d8-4716-84da-8665f9ee4946-cert podName:30c9a344-56d8-4716-84da-8665f9ee4946 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:08.238648373 +0000 UTC m=+33.683710682 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/30c9a344-56d8-4716-84da-8665f9ee4946-cert") pod "ingress-canary-rj967" (UID: "30c9a344-56d8-4716-84da-8665f9ee4946") : secret "canary-serving-cert" not found
Apr 23 09:31:07.738906 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.738699 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6b8bf9f1-73a1-4060-b09a-55a78aed4470-tmp-dir\") pod \"dns-default-2mjdq\" (UID: \"6b8bf9f1-73a1-4060-b09a-55a78aed4470\") " pod="openshift-dns/dns-default-2mjdq"
Apr 23 09:31:07.738906 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:07.738718 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 23 09:31:07.738906 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.738740 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6b8bf9f1-73a1-4060-b09a-55a78aed4470-metrics-tls\") pod \"dns-default-2mjdq\" (UID: \"6b8bf9f1-73a1-4060-b09a-55a78aed4470\") " pod="openshift-dns/dns-default-2mjdq"
Apr 23 09:31:07.738906 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:07.738770 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f78147d-0dbc-471a-a1c9-05e2a3cb5333-networking-console-plugin-cert podName:5f78147d-0dbc-471a-a1c9-05e2a3cb5333 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:08.23875849 +0000 UTC m=+33.683820792 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5f78147d-0dbc-471a-a1c9-05e2a3cb5333-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-vqvls" (UID: "5f78147d-0dbc-471a-a1c9-05e2a3cb5333") : secret "networking-console-plugin-cert" not found
Apr 23 09:31:07.738906 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.738789 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/c455ca72-f57f-4a17-a701-1d5db923cb29-klusterlet-config\") pod \"klusterlet-addon-workmgr-6b5955c777-bgbzm\" (UID: \"c455ca72-f57f-4a17-a701-1d5db923cb29\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b5955c777-bgbzm"
Apr 23 09:31:07.738906 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:07.738813 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 09:31:07.738906 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.738818 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/eaa17ea4-444a-410e-b077-79a2168b8f71-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6b858989cb-mf2j9\" (UID: \"eaa17ea4-444a-410e-b077-79a2168b8f71\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9"
Apr 23 09:31:07.738906 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.738820 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/541d1fdf-75b3-4058-97be-3140f4c7fdb2-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-qbzv8\" (UID: \"541d1fdf-75b3-4058-97be-3140f4c7fdb2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qbzv8"
Apr 23 09:31:07.738906 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:07.738849 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b8bf9f1-73a1-4060-b09a-55a78aed4470-metrics-tls podName:6b8bf9f1-73a1-4060-b09a-55a78aed4470 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:08.238839037 +0000 UTC m=+33.683901339 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6b8bf9f1-73a1-4060-b09a-55a78aed4470-metrics-tls") pod "dns-default-2mjdq" (UID: "6b8bf9f1-73a1-4060-b09a-55a78aed4470") : secret "dns-default-metrics-tls" not found
Apr 23 09:31:07.738906 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.738869 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c455ca72-f57f-4a17-a701-1d5db923cb29-tmp\") pod \"klusterlet-addon-workmgr-6b5955c777-bgbzm\" (UID: \"c455ca72-f57f-4a17-a701-1d5db923cb29\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b5955c777-bgbzm"
Apr 23 09:31:07.738906 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.738900 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d24244c3-869e-47a6-92c7-a26c11be6b66-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5f4b5f9c8c-mfnsg\" (UID: \"d24244c3-869e-47a6-92c7-a26c11be6b66\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f4b5f9c8c-mfnsg"
Apr 23 09:31:07.739539 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.738933 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dc8vs\" (UniqueName: \"kubernetes.io/projected/b2f33469-6027-43ec-be9b-4bebb5500631-kube-api-access-dc8vs\") pod \"network-check-source-8894fc9bd-zvsn4\" (UID: \"b2f33469-6027-43ec-be9b-4bebb5500631\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-zvsn4"
Apr 23 09:31:07.739539 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.738966 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s65wg\" (UniqueName: \"kubernetes.io/projected/c455ca72-f57f-4a17-a701-1d5db923cb29-kube-api-access-s65wg\") pod \"klusterlet-addon-workmgr-6b5955c777-bgbzm\" (UID: \"c455ca72-f57f-4a17-a701-1d5db923cb29\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b5955c777-bgbzm"
Apr 23 09:31:07.739539 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.739023 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5f78147d-0dbc-471a-a1c9-05e2a3cb5333-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-vqvls\" (UID: \"5f78147d-0dbc-471a-a1c9-05e2a3cb5333\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vqvls"
Apr 23 09:31:07.739539 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.739050 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/fe118deb-6994-428b-9927-b07aafd254b6-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-zxwg9\" (UID: \"fe118deb-6994-428b-9927-b07aafd254b6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zxwg9"
Apr 23 09:31:07.739539 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.739401 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6b8bf9f1-73a1-4060-b09a-55a78aed4470-tmp-dir\") pod \"dns-default-2mjdq\" (UID: \"6b8bf9f1-73a1-4060-b09a-55a78aed4470\") " pod="openshift-dns/dns-default-2mjdq"
Apr 23 09:31:07.739539 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.739513 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c455ca72-f57f-4a17-a701-1d5db923cb29-tmp\") pod \"klusterlet-addon-workmgr-6b5955c777-bgbzm\" (UID: \"c455ca72-f57f-4a17-a701-1d5db923cb29\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b5955c777-bgbzm"
Apr 23 09:31:07.739891 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.739661 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/eaa17ea4-444a-410e-b077-79a2168b8f71-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6b858989cb-mf2j9\" (UID: \"eaa17ea4-444a-410e-b077-79a2168b8f71\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9"
Apr 23 09:31:07.739891 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.739703 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bnxrm\" (UniqueName: \"kubernetes.io/projected/fe118deb-6994-428b-9927-b07aafd254b6-kube-api-access-bnxrm\") pod \"cluster-monitoring-operator-75587bd455-zxwg9\" (UID: \"fe118deb-6994-428b-9927-b07aafd254b6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zxwg9"
Apr 23 09:31:07.739891 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.739734 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/eaa17ea4-444a-410e-b077-79a2168b8f71-ca\") pod \"cluster-proxy-proxy-agent-6b858989cb-mf2j9\" (UID: \"eaa17ea4-444a-410e-b077-79a2168b8f71\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9"
Apr 23 09:31:07.739891 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.739763 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/eaa17ea4-444a-410e-b077-79a2168b8f71-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6b858989cb-mf2j9\" (UID: \"eaa17ea4-444a-410e-b077-79a2168b8f71\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9"
Apr 23 09:31:07.739891 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.739770 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5f78147d-0dbc-471a-a1c9-05e2a3cb5333-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-vqvls\" (UID: \"5f78147d-0dbc-471a-a1c9-05e2a3cb5333\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vqvls"
Apr 23 09:31:07.739891 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.739791 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe118deb-6994-428b-9927-b07aafd254b6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zxwg9\" (UID: \"fe118deb-6994-428b-9927-b07aafd254b6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zxwg9"
Apr 23 09:31:07.739891 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.739841 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c46256a3-d81a-45a4-b658-3a9db962e17a-serving-cert\") pod \"service-ca-operator-d6fc45fc5-n72bx\" (UID: \"c46256a3-d81a-45a4-b658-3a9db962e17a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n72bx"
Apr 23 09:31:07.739891 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.739876 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c46256a3-d81a-45a4-b658-3a9db962e17a-config\") pod \"service-ca-operator-d6fc45fc5-n72bx\" (UID: \"c46256a3-d81a-45a4-b658-3a9db962e17a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n72bx"
Apr 23 09:31:07.740325 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.739901 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/eaa17ea4-444a-410e-b077-79a2168b8f71-hub\") pod \"cluster-proxy-proxy-agent-6b858989cb-mf2j9\" (UID: \"eaa17ea4-444a-410e-b077-79a2168b8f71\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9"
Apr 23 09:31:07.740325 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.739952 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vfftj\" (UniqueName: \"kubernetes.io/projected/541d1fdf-75b3-4058-97be-3140f4c7fdb2-kube-api-access-vfftj\") pod \"kube-storage-version-migrator-operator-6769c5d45-qbzv8\" (UID: \"541d1fdf-75b3-4058-97be-3140f4c7fdb2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qbzv8"
Apr 23 09:31:07.740325 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.739972 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/fe118deb-6994-428b-9927-b07aafd254b6-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-zxwg9\" (UID: \"fe118deb-6994-428b-9927-b07aafd254b6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zxwg9"
Apr 23 09:31:07.740491 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:07.740376 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 09:31:07.740491 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:07.740456 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe118deb-6994-428b-9927-b07aafd254b6-cluster-monitoring-operator-tls podName:fe118deb-6994-428b-9927-b07aafd254b6 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:08.240436401 +0000 UTC m=+33.685498709 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fe118deb-6994-428b-9927-b07aafd254b6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zxwg9" (UID: "fe118deb-6994-428b-9927-b07aafd254b6") : secret "cluster-monitoring-operator-tls" not found
Apr 23 09:31:07.740491 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.740383 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b8bf9f1-73a1-4060-b09a-55a78aed4470-config-volume\") pod \"dns-default-2mjdq\" (UID: \"6b8bf9f1-73a1-4060-b09a-55a78aed4470\") " pod="openshift-dns/dns-default-2mjdq"
Apr 23 09:31:07.740754 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.740734 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-bpp9r"
Apr 23 09:31:07.741133 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.741107 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c46256a3-d81a-45a4-b658-3a9db962e17a-config\") pod \"service-ca-operator-d6fc45fc5-n72bx\" (UID: \"c46256a3-d81a-45a4-b658-3a9db962e17a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n72bx"
Apr 23 09:31:07.741262 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.741242 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/eaa17ea4-444a-410e-b077-79a2168b8f71-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6b858989cb-mf2j9\" (UID: \"eaa17ea4-444a-410e-b077-79a2168b8f71\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9"
Apr 23 09:31:07.744157 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.742359 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/eaa17ea4-444a-410e-b077-79a2168b8f71-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6b858989cb-mf2j9\" (UID: \"eaa17ea4-444a-410e-b077-79a2168b8f71\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9"
Apr 23 09:31:07.744157 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.742554 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d24244c3-869e-47a6-92c7-a26c11be6b66-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5f4b5f9c8c-mfnsg\" (UID: \"d24244c3-869e-47a6-92c7-a26c11be6b66\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f4b5f9c8c-mfnsg"
Apr 23 09:31:07.744157 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.743716 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/eaa17ea4-444a-410e-b077-79a2168b8f71-ca\") pod \"cluster-proxy-proxy-agent-6b858989cb-mf2j9\" (UID: \"eaa17ea4-444a-410e-b077-79a2168b8f71\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9"
Apr 23 09:31:07.744350 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.744166 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/eaa17ea4-444a-410e-b077-79a2168b8f71-hub\") pod \"cluster-proxy-proxy-agent-6b858989cb-mf2j9\" (UID: \"eaa17ea4-444a-410e-b077-79a2168b8f71\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9"
Apr 23 09:31:07.744350 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.744204 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/c455ca72-f57f-4a17-a701-1d5db923cb29-klusterlet-config\") pod \"klusterlet-addon-workmgr-6b5955c777-bgbzm\" (UID: \"c455ca72-f57f-4a17-a701-1d5db923cb29\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b5955c777-bgbzm"
Apr 23 09:31:07.744782 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.744764 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c46256a3-d81a-45a4-b658-3a9db962e17a-serving-cert\") pod \"service-ca-operator-d6fc45fc5-n72bx\" (UID: \"c46256a3-d81a-45a4-b658-3a9db962e17a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n72bx"
Apr 23 09:31:07.744987 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.744965 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/541d1fdf-75b3-4058-97be-3140f4c7fdb2-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-qbzv8\" (UID: \"541d1fdf-75b3-4058-97be-3140f4c7fdb2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qbzv8"
Apr 23 09:31:07.745565 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.745539 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/eaa17ea4-444a-410e-b077-79a2168b8f71-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6b858989cb-mf2j9\" (UID: \"eaa17ea4-444a-410e-b077-79a2168b8f71\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9"
Apr 23 09:31:07.747745 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.747701 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-969t5\" (UniqueName: \"kubernetes.io/projected/c46256a3-d81a-45a4-b658-3a9db962e17a-kube-api-access-969t5\") pod \"service-ca-operator-d6fc45fc5-n72bx\" (UID: \"c46256a3-d81a-45a4-b658-3a9db962e17a\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n72bx"
Apr 23 09:31:07.749771 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.749729 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6qt7\" (UniqueName: \"kubernetes.io/projected/6b8bf9f1-73a1-4060-b09a-55a78aed4470-kube-api-access-w6qt7\") pod \"dns-default-2mjdq\" (UID: \"6b8bf9f1-73a1-4060-b09a-55a78aed4470\") " pod="openshift-dns/dns-default-2mjdq"
Apr 23 09:31:07.750149 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.750124 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn4x4\" (UniqueName: \"kubernetes.io/projected/d24244c3-869e-47a6-92c7-a26c11be6b66-kube-api-access-qn4x4\") pod \"managed-serviceaccount-addon-agent-5f4b5f9c8c-mfnsg\" (UID: \"d24244c3-869e-47a6-92c7-a26c11be6b66\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f4b5f9c8c-mfnsg"
Apr 23 09:31:07.750149 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.750137 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s65wg\" (UniqueName: \"kubernetes.io/projected/c455ca72-f57f-4a17-a701-1d5db923cb29-kube-api-access-s65wg\") pod \"klusterlet-addon-workmgr-6b5955c777-bgbzm\" (UID: \"c455ca72-f57f-4a17-a701-1d5db923cb29\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b5955c777-bgbzm"
Apr 23 09:31:07.750324 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.750283 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc8vs\" (UniqueName: \"kubernetes.io/projected/b2f33469-6027-43ec-be9b-4bebb5500631-kube-api-access-dc8vs\") pod \"network-check-source-8894fc9bd-zvsn4\" (UID: \"b2f33469-6027-43ec-be9b-4bebb5500631\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-zvsn4"
Apr 23 09:31:07.750741 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.750716 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqt2r\" (UniqueName: \"kubernetes.io/projected/eaa17ea4-444a-410e-b077-79a2168b8f71-kube-api-access-fqt2r\") pod \"cluster-proxy-proxy-agent-6b858989cb-mf2j9\" (UID: \"eaa17ea4-444a-410e-b077-79a2168b8f71\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9"
Apr 23 09:31:07.750823 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.750724 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rrx5\" (UniqueName: \"kubernetes.io/projected/30c9a344-56d8-4716-84da-8665f9ee4946-kube-api-access-4rrx5\") pod \"ingress-canary-rj967\" (UID: \"30c9a344-56d8-4716-84da-8665f9ee4946\") " pod="openshift-ingress-canary/ingress-canary-rj967"
Apr 23 09:31:07.751603 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.751582 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnxrm\" (UniqueName: \"kubernetes.io/projected/fe118deb-6994-428b-9927-b07aafd254b6-kube-api-access-bnxrm\") pod \"cluster-monitoring-operator-75587bd455-zxwg9\" (UID: \"fe118deb-6994-428b-9927-b07aafd254b6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zxwg9"
Apr 23 09:31:07.772046 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.772027 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfftj\" (UniqueName: \"kubernetes.io/projected/541d1fdf-75b3-4058-97be-3140f4c7fdb2-kube-api-access-vfftj\") pod \"kube-storage-version-migrator-operator-6769c5d45-qbzv8\" (UID: \"541d1fdf-75b3-4058-97be-3140f4c7fdb2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qbzv8"
Apr 23 09:31:07.774851 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.774816 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-zllgx"
Apr 23 09:31:07.792487 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.792462 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nxsf6"
Apr 23 09:31:07.821765 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.821070 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qbzv8"
Apr 23 09:31:07.844276 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:07.843309 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 09:31:07.844276 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:07.843576 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c171e7cd-9c69-4ef7-9012-fad9d2b17a46-metrics-certs podName:c171e7cd-9c69-4ef7-9012-fad9d2b17a46 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:39.843549803 +0000 UTC m=+65.288612107 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c171e7cd-9c69-4ef7-9012-fad9d2b17a46-metrics-certs") pod "network-metrics-daemon-6528f" (UID: "c171e7cd-9c69-4ef7-9012-fad9d2b17a46") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 09:31:07.844762 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.844547 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c171e7cd-9c69-4ef7-9012-fad9d2b17a46-metrics-certs\") pod \"network-metrics-daemon-6528f\" (UID: \"c171e7cd-9c69-4ef7-9012-fad9d2b17a46\") " pod="openshift-multus/network-metrics-daemon-6528f"
Apr 23 09:31:07.848751 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.848331 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f4b5f9c8c-mfnsg"
Apr 23 09:31:07.857801 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.857433 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-zvsn4"
Apr 23 09:31:07.869994 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.869964 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b5955c777-bgbzm"
Apr 23 09:31:07.878354 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.876242 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n72bx"
Apr 23 09:31:07.882839 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.882468 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9"
Apr 23 09:31:07.904313 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.903827 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-bpp9r"]
Apr 23 09:31:07.920459 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:31:07.920404 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3b73d76_bb0a_4807_ba45_02945da31336.slice/crio-44113bec14dc48c28b1fc73f42b2e3d6813f8df96a5a23e5d8e67c2edd9cd77c WatchSource:0}: Error finding container 44113bec14dc48c28b1fc73f42b2e3d6813f8df96a5a23e5d8e67c2edd9cd77c: Status 404 returned error can't find the container with id 44113bec14dc48c28b1fc73f42b2e3d6813f8df96a5a23e5d8e67c2edd9cd77c
Apr 23 09:31:07.948678 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.948189 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxtxx\" (UniqueName: \"kubernetes.io/projected/11f39ddc-72f6-4699-8329-bbb34ab9a9f0-kube-api-access-mxtxx\") pod \"network-check-target-sb8zp\" (UID: \"11f39ddc-72f6-4699-8329-bbb34ab9a9f0\") " pod="openshift-network-diagnostics/network-check-target-sb8zp"
Apr 23 09:31:07.950985 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.950938 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-zllgx"]
Apr 23 09:31:07.955477 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.955313 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxtxx\" (UniqueName: \"kubernetes.io/projected/11f39ddc-72f6-4699-8329-bbb34ab9a9f0-kube-api-access-mxtxx\") pod \"network-check-target-sb8zp\" (UID: \"11f39ddc-72f6-4699-8329-bbb34ab9a9f0\") " pod="openshift-network-diagnostics/network-check-target-sb8zp"
Apr 23 09:31:07.991685 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:07.991273 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nxsf6"]
Apr 23 09:31:08.028554 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:08.028514 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qbzv8"]
Apr 23 09:31:08.041901 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:31:08.041848 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod541d1fdf_75b3_4058_97be_3140f4c7fdb2.slice/crio-a2ebc8166504406bdcf5828fa6bc58275f60225f820c28797d1de186782c27ec WatchSource:0}: Error finding container a2ebc8166504406bdcf5828fa6bc58275f60225f820c28797d1de186782c27ec: Status 404 returned error can't find the container with id a2ebc8166504406bdcf5828fa6bc58275f60225f820c28797d1de186782c27ec
Apr 23 09:31:08.080924 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:08.080886 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f4b5f9c8c-mfnsg"]
Apr 23 09:31:08.088012 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:31:08.087982 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd24244c3_869e_47a6_92c7_a26c11be6b66.slice/crio-98b01f83684db857055776f93b974c49042e614695328d139478a89b9ec098ff WatchSource:0}: Error finding container 98b01f83684db857055776f93b974c49042e614695328d139478a89b9ec098ff: Status 404 returned error can't find the container with id 98b01f83684db857055776f93b974c49042e614695328d139478a89b9ec098ff
Apr 23 09:31:08.091000 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:08.090952 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-zvsn4"]
Apr 23 09:31:08.096081 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:31:08.096050 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2f33469_6027_43ec_be9b_4bebb5500631.slice/crio-c13e08485834acbd6382325a85564e10a4da864c640b242969a320f934422a61 WatchSource:0}: Error finding container c13e08485834acbd6382325a85564e10a4da864c640b242969a320f934422a61: Status 404 returned error can't find the container with id c13e08485834acbd6382325a85564e10a4da864c640b242969a320f934422a61
Apr 23 09:31:08.102346 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:08.102289 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b5955c777-bgbzm"]
Apr 23 09:31:08.105637 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:31:08.105612 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc455ca72_f57f_4a17_a701_1d5db923cb29.slice/crio-f27c5fe61c493ac368d0baac3ecf82fc3faa4012484265c26cea08cd6dc10464 WatchSource:0}: Error finding container f27c5fe61c493ac368d0baac3ecf82fc3faa4012484265c26cea08cd6dc10464: Status 404 returned error can't find the container with id f27c5fe61c493ac368d0baac3ecf82fc3faa4012484265c26cea08cd6dc10464
Apr 23 09:31:08.119077 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:08.119052 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n72bx"]
Apr 23 09:31:08.122058 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:31:08.122034 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc46256a3_d81a_45a4_b658_3a9db962e17a.slice/crio-6263732f0e6f448b7719322b06d92cd0a4758a9ebdbfbfb42b1b86f1831af31a WatchSource:0}: Error finding container 6263732f0e6f448b7719322b06d92cd0a4758a9ebdbfbfb42b1b86f1831af31a: Status 404 returned error can't find the container with id 6263732f0e6f448b7719322b06d92cd0a4758a9ebdbfbfb42b1b86f1831af31a
Apr 23 09:31:08.130962 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:08.130939 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9"]
Apr 23 09:31:08.133278 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:31:08.133257 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeaa17ea4_444a_410e_b077_79a2168b8f71.slice/crio-b255c92562812b940854549b54b321e48c265092f02341ffd69ec54e459acadc WatchSource:0}: Error finding container b255c92562812b940854549b54b321e48c265092f02341ffd69ec54e459acadc: Status 404 returned error can't find the container with id b255c92562812b940854549b54b321e48c265092f02341ffd69ec54e459acadc
Apr 23 09:31:08.150068 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:08.150047 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-registry-tls\") pod \"image-registry-554d8f4b86-p7vqw\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw"
Apr 23 09:31:08.150165 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:08.150107 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/25c6abb3-0bf5-4a51-9450-a29341379573-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ftn6j\" (UID: \"25c6abb3-0bf5-4a51-9450-a29341379573\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ftn6j"
Apr 23 09:31:08.150220 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:08.150207 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 09:31:08.150278 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:08.150227 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-554d8f4b86-p7vqw: secret "image-registry-tls" not found
Apr 23 09:31:08.150278 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:08.150231 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91f0276c-d1f7-4977-a25e-107ac0756380-service-ca-bundle\") pod \"router-default-86d78b875b-rbksj\" (UID: \"91f0276c-d1f7-4977-a25e-107ac0756380\") " pod="openshift-ingress/router-default-86d78b875b-rbksj"
Apr 23 09:31:08.150278 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:08.150266 2566 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 09:31:08.150431 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:08.150279 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-registry-tls podName:164f4228-f916-4e95-9a43-29cf40bacf4a nodeName:}" failed. No retries permitted until 2026-04-23 09:31:09.150259023 +0000 UTC m=+34.595321332 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-registry-tls") pod "image-registry-554d8f4b86-p7vqw" (UID: "164f4228-f916-4e95-9a43-29cf40bacf4a") : secret "image-registry-tls" not found
Apr 23 09:31:08.150431 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:08.150323 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91f0276c-d1f7-4977-a25e-107ac0756380-metrics-certs\") pod \"router-default-86d78b875b-rbksj\" (UID: \"91f0276c-d1f7-4977-a25e-107ac0756380\") " pod="openshift-ingress/router-default-86d78b875b-rbksj"
Apr 23 09:31:08.150431 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:08.150335 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25c6abb3-0bf5-4a51-9450-a29341379573-samples-operator-tls podName:25c6abb3-0bf5-4a51-9450-a29341379573 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:09.150318123 +0000 UTC m=+34.595380426 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/25c6abb3-0bf5-4a51-9450-a29341379573-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-ftn6j" (UID: "25c6abb3-0bf5-4a51-9450-a29341379573") : secret "samples-operator-tls" not found
Apr 23 09:31:08.150431 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:08.150363 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/91f0276c-d1f7-4977-a25e-107ac0756380-service-ca-bundle podName:91f0276c-d1f7-4977-a25e-107ac0756380 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:09.150350345 +0000 UTC m=+34.595412646 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/91f0276c-d1f7-4977-a25e-107ac0756380-service-ca-bundle") pod "router-default-86d78b875b-rbksj" (UID: "91f0276c-d1f7-4977-a25e-107ac0756380") : configmap references non-existent config key: service-ca.crt
Apr 23 09:31:08.150431 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:08.150386 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 09:31:08.150431 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:08.150420 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91f0276c-d1f7-4977-a25e-107ac0756380-metrics-certs podName:91f0276c-d1f7-4977-a25e-107ac0756380 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:09.150409152 +0000 UTC m=+34.595471454 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/91f0276c-d1f7-4977-a25e-107ac0756380-metrics-certs") pod "router-default-86d78b875b-rbksj" (UID: "91f0276c-d1f7-4977-a25e-107ac0756380") : secret "router-metrics-certs-default" not found
Apr 23 09:31:08.251576 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:08.251506 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe118deb-6994-428b-9927-b07aafd254b6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zxwg9\" (UID: \"fe118deb-6994-428b-9927-b07aafd254b6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zxwg9"
Apr 23 09:31:08.251701 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:08.251597 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5f78147d-0dbc-471a-a1c9-05e2a3cb5333-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vqvls\" (UID: \"5f78147d-0dbc-471a-a1c9-05e2a3cb5333\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vqvls"
Apr 23 09:31:08.251701 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:08.251634 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/30c9a344-56d8-4716-84da-8665f9ee4946-cert\") pod \"ingress-canary-rj967\" (UID: \"30c9a344-56d8-4716-84da-8665f9ee4946\") " pod="openshift-ingress-canary/ingress-canary-rj967"
Apr 23 09:31:08.251701 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:08.251651 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 09:31:08.251701 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:08.251680 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6b8bf9f1-73a1-4060-b09a-55a78aed4470-metrics-tls\") pod \"dns-default-2mjdq\" (UID: \"6b8bf9f1-73a1-4060-b09a-55a78aed4470\") " pod="openshift-dns/dns-default-2mjdq"
Apr 23 09:31:08.251895 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:08.251712 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe118deb-6994-428b-9927-b07aafd254b6-cluster-monitoring-operator-tls podName:fe118deb-6994-428b-9927-b07aafd254b6 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:09.251694397 +0000 UTC m=+34.696756698 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fe118deb-6994-428b-9927-b07aafd254b6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zxwg9" (UID: "fe118deb-6994-428b-9927-b07aafd254b6") : secret "cluster-monitoring-operator-tls" not found
Apr 23 09:31:08.251895 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:08.251751 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 23 09:31:08.251895 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:08.251764 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 09:31:08.251895 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:08.251770 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 09:31:08.251895 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:08.251813 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f78147d-0dbc-471a-a1c9-05e2a3cb5333-networking-console-plugin-cert podName:5f78147d-0dbc-471a-a1c9-05e2a3cb5333 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:09.251795968 +0000 UTC m=+34.696858272 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5f78147d-0dbc-471a-a1c9-05e2a3cb5333-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-vqvls" (UID: "5f78147d-0dbc-471a-a1c9-05e2a3cb5333") : secret "networking-console-plugin-cert" not found
Apr 23 09:31:08.251895 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:08.251832 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b8bf9f1-73a1-4060-b09a-55a78aed4470-metrics-tls podName:6b8bf9f1-73a1-4060-b09a-55a78aed4470 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:09.251822351 +0000 UTC m=+34.696884653 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6b8bf9f1-73a1-4060-b09a-55a78aed4470-metrics-tls") pod "dns-default-2mjdq" (UID: "6b8bf9f1-73a1-4060-b09a-55a78aed4470") : secret "dns-default-metrics-tls" not found
Apr 23 09:31:08.251895 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:08.251848 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30c9a344-56d8-4716-84da-8665f9ee4946-cert podName:30c9a344-56d8-4716-84da-8665f9ee4946 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:09.251839506 +0000 UTC m=+34.696901813 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/30c9a344-56d8-4716-84da-8665f9ee4946-cert") pod "ingress-canary-rj967" (UID: "30c9a344-56d8-4716-84da-8665f9ee4946") : secret "canary-serving-cert" not found
Apr 23 09:31:08.320766 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:08.320732 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9" event={"ID":"eaa17ea4-444a-410e-b077-79a2168b8f71","Type":"ContainerStarted","Data":"b255c92562812b940854549b54b321e48c265092f02341ffd69ec54e459acadc"}
Apr 23 09:31:08.321961 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:08.321931 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f4b5f9c8c-mfnsg" event={"ID":"d24244c3-869e-47a6-92c7-a26c11be6b66","Type":"ContainerStarted","Data":"98b01f83684db857055776f93b974c49042e614695328d139478a89b9ec098ff"}
Apr 23 09:31:08.322999 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:08.322965 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nxsf6" event={"ID":"d01a5541-6b84-4253-a5a5-e0d82b86f84d","Type":"ContainerStarted","Data":"b81e6ad3994f93f3c36a5ff170b177a40a26cae999843cb9433028489cd53629"}
Apr 23 09:31:08.324189 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:08.324161 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-zllgx" event={"ID":"bfdda918-82b7-430c-9e8e-f2555930fa85","Type":"ContainerStarted","Data":"c347a8a7412db65e2bd9c0e2f8accba3618d6e4355a8d3a15e9f0c03d1fabdbb"}
Apr 23 09:31:08.325279 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:08.325257 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qbzv8" event={"ID":"541d1fdf-75b3-4058-97be-3140f4c7fdb2","Type":"ContainerStarted","Data":"a2ebc8166504406bdcf5828fa6bc58275f60225f820c28797d1de186782c27ec"}
Apr 23 09:31:08.326367 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:08.326345 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n72bx" event={"ID":"c46256a3-d81a-45a4-b658-3a9db962e17a","Type":"ContainerStarted","Data":"6263732f0e6f448b7719322b06d92cd0a4758a9ebdbfbfb42b1b86f1831af31a"}
Apr 23 09:31:08.327486 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:08.327441 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b5955c777-bgbzm" event={"ID":"c455ca72-f57f-4a17-a701-1d5db923cb29","Type":"ContainerStarted","Data":"f27c5fe61c493ac368d0baac3ecf82fc3faa4012484265c26cea08cd6dc10464"}
Apr 23 09:31:08.328598 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:08.328579 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-zvsn4" event={"ID":"b2f33469-6027-43ec-be9b-4bebb5500631","Type":"ContainerStarted","Data":"c13e08485834acbd6382325a85564e10a4da864c640b242969a320f934422a61"}
Apr 23 09:31:08.330059 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:08.330024 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-bpp9r" event={"ID":"e3b73d76-bb0a-4807-ba45-02945da31336","Type":"ContainerStarted","Data":"44113bec14dc48c28b1fc73f42b2e3d6813f8df96a5a23e5d8e67c2edd9cd77c"}
Apr 23 09:31:09.163344 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:09.160648 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91f0276c-d1f7-4977-a25e-107ac0756380-service-ca-bundle\") pod \"router-default-86d78b875b-rbksj\" (UID: \"91f0276c-d1f7-4977-a25e-107ac0756380\") " pod="openshift-ingress/router-default-86d78b875b-rbksj"
Apr 23 09:31:09.163344 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:09.160704 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91f0276c-d1f7-4977-a25e-107ac0756380-metrics-certs\") pod \"router-default-86d78b875b-rbksj\" (UID: \"91f0276c-d1f7-4977-a25e-107ac0756380\") " pod="openshift-ingress/router-default-86d78b875b-rbksj"
Apr 23 09:31:09.163344 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:09.160764 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-registry-tls\") pod \"image-registry-554d8f4b86-p7vqw\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw"
Apr 23 09:31:09.163344 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:09.160837 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/25c6abb3-0bf5-4a51-9450-a29341379573-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ftn6j\" (UID: \"25c6abb3-0bf5-4a51-9450-a29341379573\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ftn6j"
Apr 23 09:31:09.163344 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:09.161058 2566 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 09:31:09.163344 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:09.161166 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25c6abb3-0bf5-4a51-9450-a29341379573-samples-operator-tls podName:25c6abb3-0bf5-4a51-9450-a29341379573 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:11.16114362 +0000 UTC m=+36.606205925 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/25c6abb3-0bf5-4a51-9450-a29341379573-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-ftn6j" (UID: "25c6abb3-0bf5-4a51-9450-a29341379573") : secret "samples-operator-tls" not found
Apr 23 09:31:09.163344 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:09.161281 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/91f0276c-d1f7-4977-a25e-107ac0756380-service-ca-bundle podName:91f0276c-d1f7-4977-a25e-107ac0756380 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:11.161268485 +0000 UTC m=+36.606330797 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/91f0276c-d1f7-4977-a25e-107ac0756380-service-ca-bundle") pod "router-default-86d78b875b-rbksj" (UID: "91f0276c-d1f7-4977-a25e-107ac0756380") : configmap references non-existent config key: service-ca.crt Apr 23 09:31:09.163344 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:09.162346 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 09:31:09.163344 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:09.162412 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91f0276c-d1f7-4977-a25e-107ac0756380-metrics-certs podName:91f0276c-d1f7-4977-a25e-107ac0756380 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:11.162394824 +0000 UTC m=+36.607457128 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/91f0276c-d1f7-4977-a25e-107ac0756380-metrics-certs") pod "router-default-86d78b875b-rbksj" (UID: "91f0276c-d1f7-4977-a25e-107ac0756380") : secret "router-metrics-certs-default" not found Apr 23 09:31:09.163344 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:09.162496 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 09:31:09.163344 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:09.162509 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-554d8f4b86-p7vqw: secret "image-registry-tls" not found Apr 23 09:31:09.163344 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:09.162550 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-registry-tls podName:164f4228-f916-4e95-9a43-29cf40bacf4a nodeName:}" failed. No retries permitted until 2026-04-23 09:31:11.162535393 +0000 UTC m=+36.607597700 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-registry-tls") pod "image-registry-554d8f4b86-p7vqw" (UID: "164f4228-f916-4e95-9a43-29cf40bacf4a") : secret "image-registry-tls" not found Apr 23 09:31:09.177513 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:09.177485 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb8zp" Apr 23 09:31:09.180740 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:09.177919 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-459kk" Apr 23 09:31:09.180740 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:09.180571 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 23 09:31:09.180740 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:09.178289 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6528f" Apr 23 09:31:09.186188 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:09.185647 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jwhsm\"" Apr 23 09:31:09.186188 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:09.185893 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gg4dj\"" Apr 23 09:31:09.186188 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:09.186056 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 09:31:09.199536 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:09.199080 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sb8zp" Apr 23 09:31:09.261898 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:09.261868 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe118deb-6994-428b-9927-b07aafd254b6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zxwg9\" (UID: \"fe118deb-6994-428b-9927-b07aafd254b6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zxwg9" Apr 23 09:31:09.262022 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:09.261967 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5f78147d-0dbc-471a-a1c9-05e2a3cb5333-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vqvls\" (UID: \"5f78147d-0dbc-471a-a1c9-05e2a3cb5333\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vqvls" Apr 23 09:31:09.262092 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:09.262041 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/30c9a344-56d8-4716-84da-8665f9ee4946-cert\") pod \"ingress-canary-rj967\" (UID: \"30c9a344-56d8-4716-84da-8665f9ee4946\") " pod="openshift-ingress-canary/ingress-canary-rj967" Apr 23 09:31:09.262144 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:09.262096 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6b8bf9f1-73a1-4060-b09a-55a78aed4470-metrics-tls\") pod \"dns-default-2mjdq\" (UID: \"6b8bf9f1-73a1-4060-b09a-55a78aed4470\") " pod="openshift-dns/dns-default-2mjdq" Apr 23 09:31:09.262741 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:09.262721 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 09:31:09.262830 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:09.262787 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe118deb-6994-428b-9927-b07aafd254b6-cluster-monitoring-operator-tls podName:fe118deb-6994-428b-9927-b07aafd254b6 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:11.26276847 +0000 UTC m=+36.707830775 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fe118deb-6994-428b-9927-b07aafd254b6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zxwg9" (UID: "fe118deb-6994-428b-9927-b07aafd254b6") : secret "cluster-monitoring-operator-tls" not found Apr 23 09:31:09.263183 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:09.263167 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 09:31:09.263238 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:09.263220 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f78147d-0dbc-471a-a1c9-05e2a3cb5333-networking-console-plugin-cert podName:5f78147d-0dbc-471a-a1c9-05e2a3cb5333 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:11.263205232 +0000 UTC m=+36.708267538 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5f78147d-0dbc-471a-a1c9-05e2a3cb5333-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-vqvls" (UID: "5f78147d-0dbc-471a-a1c9-05e2a3cb5333") : secret "networking-console-plugin-cert" not found Apr 23 09:31:09.263426 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:09.263412 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 09:31:09.263487 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:09.263458 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b8bf9f1-73a1-4060-b09a-55a78aed4470-metrics-tls podName:6b8bf9f1-73a1-4060-b09a-55a78aed4470 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:11.263444611 +0000 UTC m=+36.708506914 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6b8bf9f1-73a1-4060-b09a-55a78aed4470-metrics-tls") pod "dns-default-2mjdq" (UID: "6b8bf9f1-73a1-4060-b09a-55a78aed4470") : secret "dns-default-metrics-tls" not found Apr 23 09:31:09.263541 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:09.263517 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 09:31:09.263590 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:09.263547 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30c9a344-56d8-4716-84da-8665f9ee4946-cert podName:30c9a344-56d8-4716-84da-8665f9ee4946 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:11.263537353 +0000 UTC m=+36.708599656 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/30c9a344-56d8-4716-84da-8665f9ee4946-cert") pod "ingress-canary-rj967" (UID: "30c9a344-56d8-4716-84da-8665f9ee4946") : secret "canary-serving-cert" not found Apr 23 09:31:09.406395 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:09.406339 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-sb8zp"] Apr 23 09:31:09.412412 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:31:09.412379 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11f39ddc_72f6_4699_8329_bbb34ab9a9f0.slice/crio-ed6f123ff2d4c58153d7b2ad17113f28fc947acef0989654f46e57e2987c201c WatchSource:0}: Error finding container ed6f123ff2d4c58153d7b2ad17113f28fc947acef0989654f46e57e2987c201c: Status 404 returned error can't find the container with id ed6f123ff2d4c58153d7b2ad17113f28fc947acef0989654f46e57e2987c201c Apr 23 09:31:10.360388 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:10.360321 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-sb8zp" event={"ID":"11f39ddc-72f6-4699-8329-bbb34ab9a9f0","Type":"ContainerStarted","Data":"ed6f123ff2d4c58153d7b2ad17113f28fc947acef0989654f46e57e2987c201c"} Apr 23 09:31:11.185191 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:11.183904 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/25c6abb3-0bf5-4a51-9450-a29341379573-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ftn6j\" (UID: \"25c6abb3-0bf5-4a51-9450-a29341379573\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ftn6j" Apr 23 09:31:11.185191 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:11.184042 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91f0276c-d1f7-4977-a25e-107ac0756380-service-ca-bundle\") pod \"router-default-86d78b875b-rbksj\" (UID: \"91f0276c-d1f7-4977-a25e-107ac0756380\") " pod="openshift-ingress/router-default-86d78b875b-rbksj" Apr 23 09:31:11.185191 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:11.184080 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91f0276c-d1f7-4977-a25e-107ac0756380-metrics-certs\") pod \"router-default-86d78b875b-rbksj\" (UID: \"91f0276c-d1f7-4977-a25e-107ac0756380\") " pod="openshift-ingress/router-default-86d78b875b-rbksj" Apr 23 09:31:11.185191 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:11.184149 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-registry-tls\") pod \"image-registry-554d8f4b86-p7vqw\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw" Apr 23 09:31:11.185191 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:11.184278 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 09:31:11.185191 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:11.184312 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-554d8f4b86-p7vqw: secret "image-registry-tls" not found 
Apr 23 09:31:11.185191 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:11.184370 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-registry-tls podName:164f4228-f916-4e95-9a43-29cf40bacf4a nodeName:}" failed. No retries permitted until 2026-04-23 09:31:15.184351539 +0000 UTC m=+40.629413854 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-registry-tls") pod "image-registry-554d8f4b86-p7vqw" (UID: "164f4228-f916-4e95-9a43-29cf40bacf4a") : secret "image-registry-tls" not found Apr 23 09:31:11.185191 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:11.184896 2566 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 09:31:11.185191 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:11.184951 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25c6abb3-0bf5-4a51-9450-a29341379573-samples-operator-tls podName:25c6abb3-0bf5-4a51-9450-a29341379573 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:15.184934847 +0000 UTC m=+40.629997150 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/25c6abb3-0bf5-4a51-9450-a29341379573-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-ftn6j" (UID: "25c6abb3-0bf5-4a51-9450-a29341379573") : secret "samples-operator-tls" not found Apr 23 09:31:11.185191 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:11.185019 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/91f0276c-d1f7-4977-a25e-107ac0756380-service-ca-bundle podName:91f0276c-d1f7-4977-a25e-107ac0756380 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:15.185009821 +0000 UTC m=+40.630072122 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/91f0276c-d1f7-4977-a25e-107ac0756380-service-ca-bundle") pod "router-default-86d78b875b-rbksj" (UID: "91f0276c-d1f7-4977-a25e-107ac0756380") : configmap references non-existent config key: service-ca.crt Apr 23 09:31:11.185191 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:11.185077 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 09:31:11.185191 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:11.185114 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91f0276c-d1f7-4977-a25e-107ac0756380-metrics-certs podName:91f0276c-d1f7-4977-a25e-107ac0756380 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:15.185103219 +0000 UTC m=+40.630165523 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/91f0276c-d1f7-4977-a25e-107ac0756380-metrics-certs") pod "router-default-86d78b875b-rbksj" (UID: "91f0276c-d1f7-4977-a25e-107ac0756380") : secret "router-metrics-certs-default" not found Apr 23 09:31:11.286665 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:11.285528 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6b8bf9f1-73a1-4060-b09a-55a78aed4470-metrics-tls\") pod \"dns-default-2mjdq\" (UID: \"6b8bf9f1-73a1-4060-b09a-55a78aed4470\") " pod="openshift-dns/dns-default-2mjdq" Apr 23 09:31:11.286665 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:11.285632 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe118deb-6994-428b-9927-b07aafd254b6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zxwg9\" (UID: \"fe118deb-6994-428b-9927-b07aafd254b6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zxwg9" Apr 23 09:31:11.286665 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:11.285706 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5f78147d-0dbc-471a-a1c9-05e2a3cb5333-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vqvls\" (UID: \"5f78147d-0dbc-471a-a1c9-05e2a3cb5333\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vqvls" Apr 23 09:31:11.286665 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:11.285764 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/30c9a344-56d8-4716-84da-8665f9ee4946-cert\") pod \"ingress-canary-rj967\" (UID: \"30c9a344-56d8-4716-84da-8665f9ee4946\") " pod="openshift-ingress-canary/ingress-canary-rj967" Apr 23 09:31:11.286665 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:11.285887 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 09:31:11.286665 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:11.285943 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30c9a344-56d8-4716-84da-8665f9ee4946-cert podName:30c9a344-56d8-4716-84da-8665f9ee4946 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:15.285926116 +0000 UTC m=+40.730988423 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/30c9a344-56d8-4716-84da-8665f9ee4946-cert") pod "ingress-canary-rj967" (UID: "30c9a344-56d8-4716-84da-8665f9ee4946") : secret "canary-serving-cert" not found Apr 23 09:31:11.286665 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:11.286358 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 09:31:11.286665 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:11.286411 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b8bf9f1-73a1-4060-b09a-55a78aed4470-metrics-tls podName:6b8bf9f1-73a1-4060-b09a-55a78aed4470 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:15.286394238 +0000 UTC m=+40.731456541 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6b8bf9f1-73a1-4060-b09a-55a78aed4470-metrics-tls") pod "dns-default-2mjdq" (UID: "6b8bf9f1-73a1-4060-b09a-55a78aed4470") : secret "dns-default-metrics-tls" not found Apr 23 09:31:11.286665 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:11.286473 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 09:31:11.286665 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:11.286508 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe118deb-6994-428b-9927-b07aafd254b6-cluster-monitoring-operator-tls podName:fe118deb-6994-428b-9927-b07aafd254b6 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:15.286496449 +0000 UTC m=+40.731558757 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fe118deb-6994-428b-9927-b07aafd254b6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zxwg9" (UID: "fe118deb-6994-428b-9927-b07aafd254b6") : secret "cluster-monitoring-operator-tls" not found Apr 23 09:31:11.286665 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:11.286576 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 09:31:11.286665 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:11.286609 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f78147d-0dbc-471a-a1c9-05e2a3cb5333-networking-console-plugin-cert podName:5f78147d-0dbc-471a-a1c9-05e2a3cb5333 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:15.286598177 +0000 UTC m=+40.731660491 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5f78147d-0dbc-471a-a1c9-05e2a3cb5333-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-vqvls" (UID: "5f78147d-0dbc-471a-a1c9-05e2a3cb5333") : secret "networking-console-plugin-cert" not found Apr 23 09:31:14.416832 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:14.416798 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/05ab6e35-f200-483a-b4f6-fee2629df7f2-original-pull-secret\") pod \"global-pull-secret-syncer-459kk\" (UID: \"05ab6e35-f200-483a-b4f6-fee2629df7f2\") " pod="kube-system/global-pull-secret-syncer-459kk" Apr 23 09:31:14.419451 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:14.419427 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/05ab6e35-f200-483a-b4f6-fee2629df7f2-original-pull-secret\") pod \"global-pull-secret-syncer-459kk\" (UID: \"05ab6e35-f200-483a-b4f6-fee2629df7f2\") " pod="kube-system/global-pull-secret-syncer-459kk" Apr 23 09:31:14.610746 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:14.610712 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-459kk" Apr 23 09:31:15.224554 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:15.224514 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91f0276c-d1f7-4977-a25e-107ac0756380-service-ca-bundle\") pod \"router-default-86d78b875b-rbksj\" (UID: \"91f0276c-d1f7-4977-a25e-107ac0756380\") " pod="openshift-ingress/router-default-86d78b875b-rbksj" Apr 23 09:31:15.224731 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:15.224566 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91f0276c-d1f7-4977-a25e-107ac0756380-metrics-certs\") pod \"router-default-86d78b875b-rbksj\" (UID: \"91f0276c-d1f7-4977-a25e-107ac0756380\") " pod="openshift-ingress/router-default-86d78b875b-rbksj" Apr 23 09:31:15.224731 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:15.224628 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-registry-tls\") pod \"image-registry-554d8f4b86-p7vqw\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw" Apr 23 09:31:15.224731 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:15.224678 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/25c6abb3-0bf5-4a51-9450-a29341379573-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ftn6j\" (UID: \"25c6abb3-0bf5-4a51-9450-a29341379573\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ftn6j" Apr 23 09:31:15.224731 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:15.224702 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/91f0276c-d1f7-4977-a25e-107ac0756380-service-ca-bundle podName:91f0276c-d1f7-4977-a25e-107ac0756380 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:23.224681171 +0000 UTC m=+48.669743472 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/91f0276c-d1f7-4977-a25e-107ac0756380-service-ca-bundle") pod "router-default-86d78b875b-rbksj" (UID: "91f0276c-d1f7-4977-a25e-107ac0756380") : configmap references non-existent config key: service-ca.crt Apr 23 09:31:15.224956 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:15.224741 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 09:31:15.224956 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:15.224760 2566 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 09:31:15.224956 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:15.224790 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 09:31:15.224956 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:15.224805 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-554d8f4b86-p7vqw: secret "image-registry-tls" not found Apr 23 09:31:15.224956 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:15.224821 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91f0276c-d1f7-4977-a25e-107ac0756380-metrics-certs podName:91f0276c-d1f7-4977-a25e-107ac0756380 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:23.224802988 +0000 UTC m=+48.669865303 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/91f0276c-d1f7-4977-a25e-107ac0756380-metrics-certs") pod "router-default-86d78b875b-rbksj" (UID: "91f0276c-d1f7-4977-a25e-107ac0756380") : secret "router-metrics-certs-default" not found Apr 23 09:31:15.224956 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:15.224843 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25c6abb3-0bf5-4a51-9450-a29341379573-samples-operator-tls podName:25c6abb3-0bf5-4a51-9450-a29341379573 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:23.224833492 +0000 UTC m=+48.669895818 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/25c6abb3-0bf5-4a51-9450-a29341379573-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-ftn6j" (UID: "25c6abb3-0bf5-4a51-9450-a29341379573") : secret "samples-operator-tls" not found Apr 23 09:31:15.224956 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:15.224858 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-registry-tls podName:164f4228-f916-4e95-9a43-29cf40bacf4a nodeName:}" failed. No retries permitted until 2026-04-23 09:31:23.224850507 +0000 UTC m=+48.669912814 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-registry-tls") pod "image-registry-554d8f4b86-p7vqw" (UID: "164f4228-f916-4e95-9a43-29cf40bacf4a") : secret "image-registry-tls" not found Apr 23 09:31:15.325172 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:15.325140 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/30c9a344-56d8-4716-84da-8665f9ee4946-cert\") pod \"ingress-canary-rj967\" (UID: \"30c9a344-56d8-4716-84da-8665f9ee4946\") " pod="openshift-ingress-canary/ingress-canary-rj967" Apr 23 09:31:15.325351 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:15.325252 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 09:31:15.325351 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:15.325281 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6b8bf9f1-73a1-4060-b09a-55a78aed4470-metrics-tls\") pod \"dns-default-2mjdq\" (UID: \"6b8bf9f1-73a1-4060-b09a-55a78aed4470\") " pod="openshift-dns/dns-default-2mjdq" Apr 23 09:31:15.325351 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:15.325321 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30c9a344-56d8-4716-84da-8665f9ee4946-cert podName:30c9a344-56d8-4716-84da-8665f9ee4946 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:23.325305086 +0000 UTC m=+48.770367400 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/30c9a344-56d8-4716-84da-8665f9ee4946-cert") pod "ingress-canary-rj967" (UID: "30c9a344-56d8-4716-84da-8665f9ee4946") : secret "canary-serving-cert" not found Apr 23 09:31:15.325528 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:15.325379 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 09:31:15.325528 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:15.325400 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe118deb-6994-428b-9927-b07aafd254b6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zxwg9\" (UID: \"fe118deb-6994-428b-9927-b07aafd254b6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zxwg9" Apr 23 09:31:15.325528 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:15.325439 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b8bf9f1-73a1-4060-b09a-55a78aed4470-metrics-tls podName:6b8bf9f1-73a1-4060-b09a-55a78aed4470 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:23.325423715 +0000 UTC m=+48.770486020 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6b8bf9f1-73a1-4060-b09a-55a78aed4470-metrics-tls") pod "dns-default-2mjdq" (UID: "6b8bf9f1-73a1-4060-b09a-55a78aed4470") : secret "dns-default-metrics-tls" not found Apr 23 09:31:15.325528 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:15.325467 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 09:31:15.325528 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:15.325506 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe118deb-6994-428b-9927-b07aafd254b6-cluster-monitoring-operator-tls podName:fe118deb-6994-428b-9927-b07aafd254b6 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:23.32549412 +0000 UTC m=+48.770556421 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fe118deb-6994-428b-9927-b07aafd254b6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zxwg9" (UID: "fe118deb-6994-428b-9927-b07aafd254b6") : secret "cluster-monitoring-operator-tls" not found Apr 23 09:31:15.325528 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:15.325504 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5f78147d-0dbc-471a-a1c9-05e2a3cb5333-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vqvls\" (UID: \"5f78147d-0dbc-471a-a1c9-05e2a3cb5333\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vqvls" Apr 23 09:31:15.325778 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:15.325613 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 09:31:15.325778 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:15.325662 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f78147d-0dbc-471a-a1c9-05e2a3cb5333-networking-console-plugin-cert podName:5f78147d-0dbc-471a-a1c9-05e2a3cb5333 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:23.325649129 +0000 UTC m=+48.770711431 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5f78147d-0dbc-471a-a1c9-05e2a3cb5333-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-vqvls" (UID: "5f78147d-0dbc-471a-a1c9-05e2a3cb5333") : secret "networking-console-plugin-cert" not found Apr 23 09:31:23.238598 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:23.238366 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-459kk"] Apr 23 09:31:23.262203 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:31:23.262154 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05ab6e35_f200_483a_b4f6_fee2629df7f2.slice/crio-9a806320486fb87ad5526bdf2741639728cd5435af3584503441cc816139b338 WatchSource:0}: Error finding container 9a806320486fb87ad5526bdf2741639728cd5435af3584503441cc816139b338: Status 404 returned error can't find the container with id 9a806320486fb87ad5526bdf2741639728cd5435af3584503441cc816139b338 Apr 23 09:31:23.295956 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:23.295917 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-registry-tls\") pod \"image-registry-554d8f4b86-p7vqw\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw" Apr 23 09:31:23.296079 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:23.296001 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/25c6abb3-0bf5-4a51-9450-a29341379573-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ftn6j\" (UID: \"25c6abb3-0bf5-4a51-9450-a29341379573\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ftn6j" Apr 23 09:31:23.296137 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:23.296126 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91f0276c-d1f7-4977-a25e-107ac0756380-service-ca-bundle\") pod \"router-default-86d78b875b-rbksj\" (UID: \"91f0276c-d1f7-4977-a25e-107ac0756380\") " pod="openshift-ingress/router-default-86d78b875b-rbksj" Apr 23 09:31:23.296193 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:23.296163 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91f0276c-d1f7-4977-a25e-107ac0756380-metrics-certs\") pod \"router-default-86d78b875b-rbksj\" (UID: \"91f0276c-d1f7-4977-a25e-107ac0756380\") " pod="openshift-ingress/router-default-86d78b875b-rbksj" Apr 23 09:31:23.297016 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:23.296344 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 09:31:23.297016 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:23.296412 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91f0276c-d1f7-4977-a25e-107ac0756380-metrics-certs podName:91f0276c-d1f7-4977-a25e-107ac0756380 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:39.296389147 +0000 UTC m=+64.741451453 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/91f0276c-d1f7-4977-a25e-107ac0756380-metrics-certs") pod "router-default-86d78b875b-rbksj" (UID: "91f0276c-d1f7-4977-a25e-107ac0756380") : secret "router-metrics-certs-default" not found Apr 23 09:31:23.297016 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:23.296784 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 09:31:23.297016 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:23.296800 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-554d8f4b86-p7vqw: secret "image-registry-tls" not found Apr 23 09:31:23.297016 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:23.296846 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-registry-tls podName:164f4228-f916-4e95-9a43-29cf40bacf4a nodeName:}" failed. No retries permitted until 2026-04-23 09:31:39.296830884 +0000 UTC m=+64.741893201 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-registry-tls") pod "image-registry-554d8f4b86-p7vqw" (UID: "164f4228-f916-4e95-9a43-29cf40bacf4a") : secret "image-registry-tls" not found Apr 23 09:31:23.297016 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:23.296905 2566 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 09:31:23.297016 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:23.296936 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25c6abb3-0bf5-4a51-9450-a29341379573-samples-operator-tls podName:25c6abb3-0bf5-4a51-9450-a29341379573 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:39.296926149 +0000 UTC m=+64.741988465 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/25c6abb3-0bf5-4a51-9450-a29341379573-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-ftn6j" (UID: "25c6abb3-0bf5-4a51-9450-a29341379573") : secret "samples-operator-tls" not found Apr 23 09:31:23.297016 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:23.296993 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/91f0276c-d1f7-4977-a25e-107ac0756380-service-ca-bundle podName:91f0276c-d1f7-4977-a25e-107ac0756380 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:39.296983788 +0000 UTC m=+64.742046103 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/91f0276c-d1f7-4977-a25e-107ac0756380-service-ca-bundle") pod "router-default-86d78b875b-rbksj" (UID: "91f0276c-d1f7-4977-a25e-107ac0756380") : configmap references non-existent config key: service-ca.crt Apr 23 09:31:23.397963 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:23.396828 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5f78147d-0dbc-471a-a1c9-05e2a3cb5333-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vqvls\" (UID: \"5f78147d-0dbc-471a-a1c9-05e2a3cb5333\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vqvls" Apr 23 09:31:23.397963 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:23.396904 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/30c9a344-56d8-4716-84da-8665f9ee4946-cert\") pod \"ingress-canary-rj967\" (UID: \"30c9a344-56d8-4716-84da-8665f9ee4946\") " pod="openshift-ingress-canary/ingress-canary-rj967" Apr 23 09:31:23.397963 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:23.396959 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6b8bf9f1-73a1-4060-b09a-55a78aed4470-metrics-tls\") pod \"dns-default-2mjdq\" (UID: \"6b8bf9f1-73a1-4060-b09a-55a78aed4470\") " pod="openshift-dns/dns-default-2mjdq" Apr 23 09:31:23.397963 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:23.397037 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe118deb-6994-428b-9927-b07aafd254b6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zxwg9\" (UID: \"fe118deb-6994-428b-9927-b07aafd254b6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zxwg9" Apr 23 09:31:23.397963 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:23.397197 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 09:31:23.397963 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:23.397258 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe118deb-6994-428b-9927-b07aafd254b6-cluster-monitoring-operator-tls podName:fe118deb-6994-428b-9927-b07aafd254b6 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:39.397237881 +0000 UTC m=+64.842300186 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fe118deb-6994-428b-9927-b07aafd254b6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zxwg9" (UID: "fe118deb-6994-428b-9927-b07aafd254b6") : secret "cluster-monitoring-operator-tls" not found Apr 23 09:31:23.397963 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:23.397687 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 09:31:23.397963 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:23.397752 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f78147d-0dbc-471a-a1c9-05e2a3cb5333-networking-console-plugin-cert podName:5f78147d-0dbc-471a-a1c9-05e2a3cb5333 nodeName:}" failed. 
No retries permitted until 2026-04-23 09:31:39.397733867 +0000 UTC m=+64.842796169 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5f78147d-0dbc-471a-a1c9-05e2a3cb5333-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-vqvls" (UID: "5f78147d-0dbc-471a-a1c9-05e2a3cb5333") : secret "networking-console-plugin-cert" not found Apr 23 09:31:23.397963 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:23.397814 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 09:31:23.397963 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:23.397845 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/30c9a344-56d8-4716-84da-8665f9ee4946-cert podName:30c9a344-56d8-4716-84da-8665f9ee4946 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:39.39783435 +0000 UTC m=+64.842896658 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/30c9a344-56d8-4716-84da-8665f9ee4946-cert") pod "ingress-canary-rj967" (UID: "30c9a344-56d8-4716-84da-8665f9ee4946") : secret "canary-serving-cert" not found Apr 23 09:31:23.397963 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:23.397901 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 09:31:23.397963 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:23.397932 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b8bf9f1-73a1-4060-b09a-55a78aed4470-metrics-tls podName:6b8bf9f1-73a1-4060-b09a-55a78aed4470 nodeName:}" failed. No retries permitted until 2026-04-23 09:31:39.397921583 +0000 UTC m=+64.842983886 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6b8bf9f1-73a1-4060-b09a-55a78aed4470-metrics-tls") pod "dns-default-2mjdq" (UID: "6b8bf9f1-73a1-4060-b09a-55a78aed4470") : secret "dns-default-metrics-tls" not found
Apr 23 09:31:23.413982 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:23.413824 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n72bx" event={"ID":"c46256a3-d81a-45a4-b658-3a9db962e17a","Type":"ContainerStarted","Data":"6502e598f894a1f934a66f5da9adf19293a371ca157ff6b2e6bbd1ebbe1a5037"}
Apr 23 09:31:23.421680 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:23.419706 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b5955c777-bgbzm" event={"ID":"c455ca72-f57f-4a17-a701-1d5db923cb29","Type":"ContainerStarted","Data":"238701978d358f703e96e4c7cac10836b5a05e90ad4e72ff78733b2016d4b411"}
Apr 23 09:31:23.421680 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:23.420964 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b5955c777-bgbzm"
Apr 23 09:31:23.423369 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:23.423276 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b5955c777-bgbzm"
Apr 23 09:31:23.426031 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:23.426006 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bpp9r_e3b73d76-bb0a-4807-ba45-02945da31336/console-operator/0.log"
Apr 23 09:31:23.426126 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:23.426054 2566 generic.go:358] "Generic (PLEG): container finished" podID="e3b73d76-bb0a-4807-ba45-02945da31336" containerID="4f4b821f066b3add5e1da84dfad65fac0d6147c1827de6308b31ffb6c3b5f85f" exitCode=255
Apr 23 09:31:23.426187 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:23.426162 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-bpp9r" event={"ID":"e3b73d76-bb0a-4807-ba45-02945da31336","Type":"ContainerDied","Data":"4f4b821f066b3add5e1da84dfad65fac0d6147c1827de6308b31ffb6c3b5f85f"}
Apr 23 09:31:23.426350 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:23.426333 2566 scope.go:117] "RemoveContainer" containerID="4f4b821f066b3add5e1da84dfad65fac0d6147c1827de6308b31ffb6c3b5f85f"
Apr 23 09:31:23.429064 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:23.429024 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9" event={"ID":"eaa17ea4-444a-410e-b077-79a2168b8f71","Type":"ContainerStarted","Data":"da163222c00564e45d97d1db709589e82384c28840e28e6089deaa8c87766289"}
Apr 23 09:31:23.431052 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:23.430848 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f4b5f9c8c-mfnsg" event={"ID":"d24244c3-869e-47a6-92c7-a26c11be6b66","Type":"ContainerStarted","Data":"233bb79635d51053ff64b7923cfdadc292d4fe187f3c3c3e06ffbbe27ffc8c21"}
Apr 23 09:31:23.433228 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:23.432913 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nxsf6" event={"ID":"d01a5541-6b84-4253-a5a5-e0d82b86f84d","Type":"ContainerStarted","Data":"79149211f73b5ffc87ae424790c6a7fe9e13a74ff7c9f345946912216de646f1"}
Apr 23 09:31:23.436505 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:23.436482 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-zllgx" event={"ID":"bfdda918-82b7-430c-9e8e-f2555930fa85","Type":"ContainerStarted","Data":"1a470777f196876298faf2e4df7c68dd49344f92516420fc6944865dcb214258"}
Apr 23 09:31:23.436991 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:23.436636 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6b5955c777-bgbzm" podStartSLOduration=24.435571315 podStartE2EDuration="39.436620551s" podCreationTimestamp="2026-04-23 09:30:44 +0000 UTC" firstStartedPulling="2026-04-23 09:31:08.107709597 +0000 UTC m=+33.552771900" lastFinishedPulling="2026-04-23 09:31:23.108758822 +0000 UTC m=+48.553821136" observedRunningTime="2026-04-23 09:31:23.436218289 +0000 UTC m=+48.881280613" watchObservedRunningTime="2026-04-23 09:31:23.436620551 +0000 UTC m=+48.881682877"
Apr 23 09:31:23.439949 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:23.439715 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-sb8zp" event={"ID":"11f39ddc-72f6-4699-8329-bbb34ab9a9f0","Type":"ContainerStarted","Data":"9c2f51b2b051170952207e26e04c27031d301608e9aed08699ad61c37fed0b9c"}
Apr 23 09:31:23.439949 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:23.439850 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-sb8zp"
Apr 23 09:31:23.441645 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:23.441556 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-459kk" event={"ID":"05ab6e35-f200-483a-b4f6-fee2629df7f2","Type":"ContainerStarted","Data":"9a806320486fb87ad5526bdf2741639728cd5435af3584503441cc816139b338"}
Apr 23 09:31:23.444041 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:23.444018 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qbzv8" event={"ID":"541d1fdf-75b3-4058-97be-3140f4c7fdb2","Type":"ContainerStarted","Data":"f76eca4066be561ef5b59a7e07beb1bc746be9e50a3029d89e7df33619ad2332"}
Apr 23 09:31:23.453844 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:23.453801 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nxsf6" podStartSLOduration=20.411349758 podStartE2EDuration="34.453785342s" podCreationTimestamp="2026-04-23 09:30:49 +0000 UTC" firstStartedPulling="2026-04-23 09:31:08.010373562 +0000 UTC m=+33.455435877" lastFinishedPulling="2026-04-23 09:31:22.052809155 +0000 UTC m=+47.497871461" observedRunningTime="2026-04-23 09:31:23.450186862 +0000 UTC m=+48.895249186" watchObservedRunningTime="2026-04-23 09:31:23.453785342 +0000 UTC m=+48.898847666"
Apr 23 09:31:23.489579 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:23.489537 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5f4b5f9c8c-mfnsg" podStartSLOduration=24.475540264 podStartE2EDuration="39.48952111s" podCreationTimestamp="2026-04-23 09:30:44 +0000 UTC" firstStartedPulling="2026-04-23 09:31:08.092271315 +0000 UTC m=+33.537333619" lastFinishedPulling="2026-04-23 09:31:23.10625215 +0000 UTC m=+48.551314465" observedRunningTime="2026-04-23 09:31:23.469957445 +0000 UTC m=+48.915019769" watchObservedRunningTime="2026-04-23 09:31:23.48952111 +0000 UTC m=+48.934583438"
Apr 23 09:31:23.533905 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:23.533441 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-zllgx" podStartSLOduration=19.472821646 podStartE2EDuration="34.533405757s" podCreationTimestamp="2026-04-23 09:30:49 +0000 UTC" firstStartedPulling="2026-04-23 09:31:07.972881959 +0000 UTC m=+33.417944266" lastFinishedPulling="2026-04-23 09:31:23.033466071 +0000 UTC m=+48.478528377" observedRunningTime="2026-04-23 09:31:23.508410641 +0000 UTC m=+48.953473004" watchObservedRunningTime="2026-04-23 09:31:23.533405757 +0000 UTC m=+48.978468081"
Apr 23 09:31:23.555217 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:23.554068 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qbzv8" podStartSLOduration=19.493507853 podStartE2EDuration="34.5540472s" podCreationTimestamp="2026-04-23 09:30:49 +0000 UTC" firstStartedPulling="2026-04-23 09:31:08.047889056 +0000 UTC m=+33.492951357" lastFinishedPulling="2026-04-23 09:31:23.108428395 +0000 UTC m=+48.553490704" observedRunningTime="2026-04-23 09:31:23.553017269 +0000 UTC m=+48.998079593" watchObservedRunningTime="2026-04-23 09:31:23.5540472 +0000 UTC m=+48.999109539"
Apr 23 09:31:23.575242 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:23.573972 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-sb8zp" podStartSLOduration=34.881149914 podStartE2EDuration="48.573953047s" podCreationTimestamp="2026-04-23 09:30:35 +0000 UTC" firstStartedPulling="2026-04-23 09:31:09.415989012 +0000 UTC m=+34.861051312" lastFinishedPulling="2026-04-23 09:31:23.108792141 +0000 UTC m=+48.553854445" observedRunningTime="2026-04-23 09:31:23.573254161 +0000 UTC m=+49.018316485" watchObservedRunningTime="2026-04-23 09:31:23.573953047 +0000 UTC m=+49.019015374"
Apr 23 09:31:24.467206 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:24.466612 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-zvsn4" event={"ID":"b2f33469-6027-43ec-be9b-4bebb5500631","Type":"ContainerStarted","Data":"79cdad979f7237430b41e76c74e10a7c3ecd30913039e97c7effc5203dc9e55c"}
Apr 23 09:31:24.468552 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:24.468476 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bpp9r_e3b73d76-bb0a-4807-ba45-02945da31336/console-operator/1.log"
Apr 23 09:31:24.469869 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:24.468933 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bpp9r_e3b73d76-bb0a-4807-ba45-02945da31336/console-operator/0.log"
Apr 23 09:31:24.469869 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:24.468966 2566 generic.go:358] "Generic (PLEG): container finished" podID="e3b73d76-bb0a-4807-ba45-02945da31336" containerID="302317e3f3c3d547cbd3e1d39df3e0401c1f7c7f0c993ee6e42ca920712e607d" exitCode=255
Apr 23 09:31:24.469869 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:24.469344 2566 scope.go:117] "RemoveContainer" containerID="302317e3f3c3d547cbd3e1d39df3e0401c1f7c7f0c993ee6e42ca920712e607d"
Apr 23 09:31:24.469869 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:24.469532 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-bpp9r_openshift-console-operator(e3b73d76-bb0a-4807-ba45-02945da31336)\"" pod="openshift-console-operator/console-operator-9d4b6777b-bpp9r" podUID="e3b73d76-bb0a-4807-ba45-02945da31336"
Apr 23 09:31:24.469869 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:24.469736 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-bpp9r" event={"ID":"e3b73d76-bb0a-4807-ba45-02945da31336","Type":"ContainerDied","Data":"302317e3f3c3d547cbd3e1d39df3e0401c1f7c7f0c993ee6e42ca920712e607d"}
Apr 23 09:31:24.469869 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:24.469764 2566 scope.go:117] "RemoveContainer" containerID="4f4b821f066b3add5e1da84dfad65fac0d6147c1827de6308b31ffb6c3b5f85f"
Apr 23 09:31:24.479247 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:24.478690 2566 generic.go:358] "Generic (PLEG): container finished" podID="495c169c-a1fe-4740-b5e4-88f23ef7e5d0" containerID="c5eaa12ea6a29209dc5f871b5f4618a4e42e6a26a4999529194ef4421bc9c079" exitCode=0
Apr 23 09:31:24.479247 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:24.478841 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wkpk8" event={"ID":"495c169c-a1fe-4740-b5e4-88f23ef7e5d0","Type":"ContainerDied","Data":"c5eaa12ea6a29209dc5f871b5f4618a4e42e6a26a4999529194ef4421bc9c079"}
Apr 23 09:31:24.484408 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:24.484365 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-zvsn4" podStartSLOduration=20.462127173 podStartE2EDuration="35.484350786s" podCreationTimestamp="2026-04-23 09:30:49 +0000 UTC" firstStartedPulling="2026-04-23 09:31:08.098323689 +0000 UTC m=+33.543385996" lastFinishedPulling="2026-04-23 09:31:23.120547307 +0000 UTC m=+48.565609609" observedRunningTime="2026-04-23 09:31:24.481560411 +0000 UTC m=+49.926622736" watchObservedRunningTime="2026-04-23 09:31:24.484350786 +0000 UTC m=+49.929413111"
Apr 23 09:31:24.554831 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:24.554780 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n72bx" podStartSLOduration=20.57005308 podStartE2EDuration="35.554763403s" podCreationTimestamp="2026-04-23 09:30:49 +0000 UTC" firstStartedPulling="2026-04-23 09:31:08.123838107 +0000 UTC m=+33.568900408" lastFinishedPulling="2026-04-23 09:31:23.108548429 +0000 UTC m=+48.553610731" observedRunningTime="2026-04-23 09:31:24.52362038 +0000 UTC m=+49.968682708" watchObservedRunningTime="2026-04-23 09:31:24.554763403 +0000 UTC m=+49.999825728"
Apr 23 09:31:25.487018 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:25.486120 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bpp9r_e3b73d76-bb0a-4807-ba45-02945da31336/console-operator/1.log"
Apr 23 09:31:25.487018 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:25.486554 2566 scope.go:117] "RemoveContainer" containerID="302317e3f3c3d547cbd3e1d39df3e0401c1f7c7f0c993ee6e42ca920712e607d"
Apr 23 09:31:25.487018 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:25.486753 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-bpp9r_openshift-console-operator(e3b73d76-bb0a-4807-ba45-02945da31336)\"" pod="openshift-console-operator/console-operator-9d4b6777b-bpp9r" podUID="e3b73d76-bb0a-4807-ba45-02945da31336"
Apr 23 09:31:25.491989 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:25.491961 2566 generic.go:358] "Generic (PLEG): container finished" podID="495c169c-a1fe-4740-b5e4-88f23ef7e5d0" containerID="f57d4afb14d7f8b7c299723cec1af19a41478601691031f2ce49bd615f079ac5" exitCode=0
Apr 23 09:31:25.493031 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:25.492171 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wkpk8" event={"ID":"495c169c-a1fe-4740-b5e4-88f23ef7e5d0","Type":"ContainerDied","Data":"f57d4afb14d7f8b7c299723cec1af19a41478601691031f2ce49bd615f079ac5"}
Apr 23 09:31:26.499124 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:26.499080 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wkpk8" event={"ID":"495c169c-a1fe-4740-b5e4-88f23ef7e5d0","Type":"ContainerStarted","Data":"d874cd2e0a88891725acae7fb7584f5d25d2e614139fc1b5dd5f5840790504ed"}
Apr 23 09:31:26.501230 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:26.501162 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9" event={"ID":"eaa17ea4-444a-410e-b077-79a2168b8f71","Type":"ContainerStarted","Data":"fa1e2091d3d6792945c95ffe7a9328c060807d9ad870a07432269d275a8ebc43"}
Apr 23 09:31:26.501230 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:26.501199 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9" event={"ID":"eaa17ea4-444a-410e-b077-79a2168b8f71","Type":"ContainerStarted","Data":"8a488b60219f23a2fa2eb550916bd1e8049c026024d16b7254e2ba1f871cf8a4"}
Apr 23 09:31:26.522119 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:26.521861 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-wkpk8" podStartSLOduration=5.416536279 podStartE2EDuration="51.521852172s" podCreationTimestamp="2026-04-23 09:30:35 +0000 UTC" firstStartedPulling="2026-04-23 09:30:37.002722635 +0000 UTC m=+2.447784939" lastFinishedPulling="2026-04-23 09:31:23.108038512 +0000 UTC m=+48.553100832" observedRunningTime="2026-04-23 09:31:26.521280133 +0000 UTC m=+51.966342455" watchObservedRunningTime="2026-04-23 09:31:26.521852172 +0000 UTC m=+51.966914494"
Apr 23 09:31:26.539151 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:26.539111 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9" podStartSLOduration=24.357614278 podStartE2EDuration="42.539095021s" podCreationTimestamp="2026-04-23 09:30:44 +0000 UTC" firstStartedPulling="2026-04-23 09:31:08.134949142 +0000 UTC m=+33.580011451" lastFinishedPulling="2026-04-23 09:31:26.316429893 +0000 UTC m=+51.761492194" observedRunningTime="2026-04-23 09:31:26.537843809 +0000 UTC m=+51.982906132" watchObservedRunningTime="2026-04-23 09:31:26.539095021 +0000 UTC m=+51.984157344"
Apr 23 09:31:27.713538 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:27.713507 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-k5h5x_aac600fc-205b-40e3-aa53-d90a5274d7e4/dns-node-resolver/0.log"
Apr 23 09:31:27.741643 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:27.741617 2566 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-bpp9r"
Apr 23 09:31:27.741777 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:27.741659 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-bpp9r"
Apr 23 09:31:27.742083 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:27.742066 2566 scope.go:117] "RemoveContainer" containerID="302317e3f3c3d547cbd3e1d39df3e0401c1f7c7f0c993ee6e42ca920712e607d"
Apr 23 09:31:27.742309 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:27.742274 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-bpp9r_openshift-console-operator(e3b73d76-bb0a-4807-ba45-02945da31336)\"" pod="openshift-console-operator/console-operator-9d4b6777b-bpp9r" podUID="e3b73d76-bb0a-4807-ba45-02945da31336"
Apr 23 09:31:28.508091 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:28.508014 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-459kk" event={"ID":"05ab6e35-f200-483a-b4f6-fee2629df7f2","Type":"ContainerStarted","Data":"07e2bda4eeb06b135501354ecfd1132f0f4958c8a9bdfbffb1cb7b9ca6da9904"}
Apr 23 09:31:28.521441 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:28.521399 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-459kk" podStartSLOduration=41.659817855 podStartE2EDuration="46.521386482s" podCreationTimestamp="2026-04-23 09:30:42 +0000 UTC" firstStartedPulling="2026-04-23 09:31:23.267480027 +0000 UTC m=+48.712542343" lastFinishedPulling="2026-04-23 09:31:28.129048666 +0000 UTC m=+53.574110970" observedRunningTime="2026-04-23 09:31:28.520824381 +0000 UTC m=+53.965886703" watchObservedRunningTime="2026-04-23 09:31:28.521386482 +0000 UTC m=+53.966448804"
Apr 23 09:31:28.714913 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:28.714888 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-nd7df_1793a43e-b740-4967-92f1-e3ecda36b452/node-ca/0.log"
Apr 23 09:31:33.313508 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:33.313473 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-pf5hs"
Apr 23 09:31:39.343164 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:39.343128 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-registry-tls\") pod \"image-registry-554d8f4b86-p7vqw\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw"
Apr 23 09:31:39.343642 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:39.343181 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/25c6abb3-0bf5-4a51-9450-a29341379573-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ftn6j\" (UID: \"25c6abb3-0bf5-4a51-9450-a29341379573\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ftn6j"
Apr 23 09:31:39.343642 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:39.343241 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91f0276c-d1f7-4977-a25e-107ac0756380-service-ca-bundle\") pod \"router-default-86d78b875b-rbksj\" (UID: \"91f0276c-d1f7-4977-a25e-107ac0756380\") " pod="openshift-ingress/router-default-86d78b875b-rbksj"
Apr 23 09:31:39.343642 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:39.343260 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91f0276c-d1f7-4977-a25e-107ac0756380-metrics-certs\") pod \"router-default-86d78b875b-rbksj\" (UID: \"91f0276c-d1f7-4977-a25e-107ac0756380\") " pod="openshift-ingress/router-default-86d78b875b-rbksj"
Apr 23 09:31:39.343642 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:39.343423 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/91f0276c-d1f7-4977-a25e-107ac0756380-service-ca-bundle podName:91f0276c-d1f7-4977-a25e-107ac0756380 nodeName:}" failed. No retries permitted until 2026-04-23 09:32:11.343404324 +0000 UTC m=+96.788466632 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/91f0276c-d1f7-4977-a25e-107ac0756380-service-ca-bundle") pod "router-default-86d78b875b-rbksj" (UID: "91f0276c-d1f7-4977-a25e-107ac0756380") : configmap references non-existent config key: service-ca.crt
Apr 23 09:31:39.345549 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:39.345523 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-registry-tls\") pod \"image-registry-554d8f4b86-p7vqw\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw"
Apr 23 09:31:39.345664 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:39.345585 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/25c6abb3-0bf5-4a51-9450-a29341379573-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ftn6j\" (UID: \"25c6abb3-0bf5-4a51-9450-a29341379573\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ftn6j"
Apr 23 09:31:39.345722 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:39.345700 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/91f0276c-d1f7-4977-a25e-107ac0756380-metrics-certs\") pod \"router-default-86d78b875b-rbksj\" (UID: \"91f0276c-d1f7-4977-a25e-107ac0756380\") " pod="openshift-ingress/router-default-86d78b875b-rbksj"
Apr 23 09:31:39.443884 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:39.443859 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe118deb-6994-428b-9927-b07aafd254b6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zxwg9\" (UID: \"fe118deb-6994-428b-9927-b07aafd254b6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zxwg9"
Apr 23 09:31:39.444033 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:39.443913 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5f78147d-0dbc-471a-a1c9-05e2a3cb5333-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vqvls\" (UID: \"5f78147d-0dbc-471a-a1c9-05e2a3cb5333\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vqvls"
Apr 23 09:31:39.444033 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:39.443942 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/30c9a344-56d8-4716-84da-8665f9ee4946-cert\") pod \"ingress-canary-rj967\" (UID: \"30c9a344-56d8-4716-84da-8665f9ee4946\") " pod="openshift-ingress-canary/ingress-canary-rj967"
Apr 23 09:31:39.444033 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:39.443967 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6b8bf9f1-73a1-4060-b09a-55a78aed4470-metrics-tls\") pod \"dns-default-2mjdq\" (UID: \"6b8bf9f1-73a1-4060-b09a-55a78aed4470\") " pod="openshift-dns/dns-default-2mjdq"
Apr 23 09:31:39.444033 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:39.444021 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 09:31:39.444244 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:39.444052 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 23 09:31:39.444244 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:39.444095 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe118deb-6994-428b-9927-b07aafd254b6-cluster-monitoring-operator-tls podName:fe118deb-6994-428b-9927-b07aafd254b6 nodeName:}" failed. No retries permitted until 2026-04-23 09:32:11.444075155 +0000 UTC m=+96.889137458 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fe118deb-6994-428b-9927-b07aafd254b6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-zxwg9" (UID: "fe118deb-6994-428b-9927-b07aafd254b6") : secret "cluster-monitoring-operator-tls" not found
Apr 23 09:31:39.444244 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:39.444117 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f78147d-0dbc-471a-a1c9-05e2a3cb5333-networking-console-plugin-cert podName:5f78147d-0dbc-471a-a1c9-05e2a3cb5333 nodeName:}" failed. No retries permitted until 2026-04-23 09:32:11.444106886 +0000 UTC m=+96.889169186 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5f78147d-0dbc-471a-a1c9-05e2a3cb5333-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-vqvls" (UID: "5f78147d-0dbc-471a-a1c9-05e2a3cb5333") : secret "networking-console-plugin-cert" not found
Apr 23 09:31:39.446068 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:39.446050 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6b8bf9f1-73a1-4060-b09a-55a78aed4470-metrics-tls\") pod \"dns-default-2mjdq\" (UID: \"6b8bf9f1-73a1-4060-b09a-55a78aed4470\") " pod="openshift-dns/dns-default-2mjdq"
Apr 23 09:31:39.446165 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:39.446147 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/30c9a344-56d8-4716-84da-8665f9ee4946-cert\") pod \"ingress-canary-rj967\" (UID: \"30c9a344-56d8-4716-84da-8665f9ee4946\") " pod="openshift-ingress-canary/ingress-canary-rj967"
Apr 23 09:31:39.567730 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:39.567710 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-fhpfg\""
Apr 23 09:31:39.575799 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:39.575777 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ftn6j"
Apr 23 09:31:39.584368 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:39.584347 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-shblk\""
Apr 23 09:31:39.592373 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:39.592352 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw"
Apr 23 09:31:39.609631 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:39.609605 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-mvkwp\""
Apr 23 09:31:39.616113 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:39.616093 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mw7t9\""
Apr 23 09:31:39.618053 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:39.618015 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2mjdq"
Apr 23 09:31:39.624504 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:39.624043 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rj967"
Apr 23 09:31:39.713671 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:39.713640 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ftn6j"]
Apr 23 09:31:39.737347 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:39.737321 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-554d8f4b86-p7vqw"]
Apr 23 09:31:39.740607 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:31:39.740582 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod164f4228_f916_4e95_9a43_29cf40bacf4a.slice/crio-38ff9bbd7d674ed69d3dc697f63b0b087b4e0e819a7bb290e99fcaaf1a3e5079 WatchSource:0}: Error finding container 38ff9bbd7d674ed69d3dc697f63b0b087b4e0e819a7bb290e99fcaaf1a3e5079: Status 404 returned error can't find the container with id 38ff9bbd7d674ed69d3dc697f63b0b087b4e0e819a7bb290e99fcaaf1a3e5079
Apr 23 09:31:39.781516 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:39.781489 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rj967"]
Apr 23 09:31:39.784002 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:31:39.783977 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30c9a344_56d8_4716_84da_8665f9ee4946.slice/crio-c796c79c45e4ab51614642662365489df3d835d40e11ea154b4ef0e956e8203f WatchSource:0}: Error finding container c796c79c45e4ab51614642662365489df3d835d40e11ea154b4ef0e956e8203f: Status 404 returned error can't find the container with id c796c79c45e4ab51614642662365489df3d835d40e11ea154b4ef0e956e8203f
Apr 23 09:31:39.795980 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:39.795956 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2mjdq"]
Apr 23 09:31:39.799350 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:31:39.799322 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b8bf9f1_73a1_4060_b09a_55a78aed4470.slice/crio-138d3d00e6dff4a434ae587bc69297032a695ee850481714973df8870e1b59de WatchSource:0}: Error finding container 138d3d00e6dff4a434ae587bc69297032a695ee850481714973df8870e1b59de: Status 404 returned error can't find the container with id 138d3d00e6dff4a434ae587bc69297032a695ee850481714973df8870e1b59de
Apr 23 09:31:39.849201 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:39.849181 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c171e7cd-9c69-4ef7-9012-fad9d2b17a46-metrics-certs\") pod \"network-metrics-daemon-6528f\" (UID: \"c171e7cd-9c69-4ef7-9012-fad9d2b17a46\") " pod="openshift-multus/network-metrics-daemon-6528f"
Apr 23 09:31:39.851713 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:39.851664 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 23 09:31:39.859927 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:39.859911 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 09:31:39.859996 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:31:39.859963 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c171e7cd-9c69-4ef7-9012-fad9d2b17a46-metrics-certs podName:c171e7cd-9c69-4ef7-9012-fad9d2b17a46 nodeName:}" failed. No retries permitted until 2026-04-23 09:32:43.859948267 +0000 UTC m=+129.305010568 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c171e7cd-9c69-4ef7-9012-fad9d2b17a46-metrics-certs") pod "network-metrics-daemon-6528f" (UID: "c171e7cd-9c69-4ef7-9012-fad9d2b17a46") : secret "metrics-daemon-secret" not found
Apr 23 09:31:40.546164 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:40.546107 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2mjdq" event={"ID":"6b8bf9f1-73a1-4060-b09a-55a78aed4470","Type":"ContainerStarted","Data":"138d3d00e6dff4a434ae587bc69297032a695ee850481714973df8870e1b59de"}
Apr 23 09:31:40.547576 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:40.547545 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ftn6j" event={"ID":"25c6abb3-0bf5-4a51-9450-a29341379573","Type":"ContainerStarted","Data":"47c8d79797376b85c4df4a5bac770b6377ace5528f95b28adb5b00e86234558c"}
Apr 23 09:31:40.549163 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:40.549105 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw" event={"ID":"164f4228-f916-4e95-9a43-29cf40bacf4a","Type":"ContainerStarted","Data":"66a82bc50b6316b93bec8fc5cd90ab014882c33773bf6f761732c52b4bce7246"}
Apr 23 09:31:40.549163 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:40.549137 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw" event={"ID":"164f4228-f916-4e95-9a43-29cf40bacf4a","Type":"ContainerStarted","Data":"38ff9bbd7d674ed69d3dc697f63b0b087b4e0e819a7bb290e99fcaaf1a3e5079"}
Apr 23 09:31:40.549410 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:40.549382 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw"
Apr 23 09:31:40.550401 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:40.550366 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rj967" event={"ID":"30c9a344-56d8-4716-84da-8665f9ee4946","Type":"ContainerStarted","Data":"c796c79c45e4ab51614642662365489df3d835d40e11ea154b4ef0e956e8203f"}
Apr 23 09:31:40.575576 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:40.575536 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw" podStartSLOduration=65.575521027 podStartE2EDuration="1m5.575521027s" podCreationTimestamp="2026-04-23 09:30:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:31:40.574097663 +0000 UTC m=+66.019159987" watchObservedRunningTime="2026-04-23 09:31:40.575521027 +0000 UTC m=+66.020583352"
Apr 23 09:31:41.175343 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:41.175317 2566 scope.go:117] "RemoveContainer" containerID="302317e3f3c3d547cbd3e1d39df3e0401c1f7c7f0c993ee6e42ca920712e607d"
Apr 23 09:31:43.561510 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:43.560922 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2mjdq" event={"ID":"6b8bf9f1-73a1-4060-b09a-55a78aed4470","Type":"ContainerStarted","Data":"1b8adbde75093e775d0c38ac779a2c35b9593e517ab4e8ced35d34fbf2434ac7"}
Apr 23 09:31:43.561510 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:43.560962 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2mjdq" event={"ID":"6b8bf9f1-73a1-4060-b09a-55a78aed4470","Type":"ContainerStarted","Data":"f727b4fbb4c8677a1b43dabdacdf5b33e9d125a53048384cab874b844501fb7c"}
Apr 23 09:31:43.561510 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:43.561376 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-2mjdq"
Apr 23 09:31:43.563436 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:43.563409 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ftn6j" event={"ID":"25c6abb3-0bf5-4a51-9450-a29341379573","Type":"ContainerStarted","Data":"b457c34cb3bc241a03f396bfb1acae070a14af43510d8f92ef65c1dcf8c3dd13"}
Apr 23 09:31:43.563547 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:43.563442 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ftn6j" event={"ID":"25c6abb3-0bf5-4a51-9450-a29341379573","Type":"ContainerStarted","Data":"e5959b46939e5f8fd9910d64a9f21b4c6256ce7a1436ac9c888ddcfa732fa7c6"}
Apr 23 09:31:43.565001 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:43.564985 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bpp9r_e3b73d76-bb0a-4807-ba45-02945da31336/console-operator/1.log"
Apr 23 09:31:43.565090 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:43.565043 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-bpp9r" event={"ID":"e3b73d76-bb0a-4807-ba45-02945da31336","Type":"ContainerStarted","Data":"50f976204aec2764fa2dfe0b466e4f1486df9fc171a0c574f4817d332ca02662"}
Apr 23 09:31:43.565318 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:43.565285 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-bpp9r"
Apr 23 09:31:43.582269 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:43.582231 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-2mjdq" podStartSLOduration=33.781710942 podStartE2EDuration="36.582219894s" podCreationTimestamp="2026-04-23 09:31:07 +0000 UTC" firstStartedPulling="2026-04-23 09:31:39.800954301 +0000 UTC m=+65.246016605" lastFinishedPulling="2026-04-23 09:31:42.601463253 +0000 UTC m=+68.046525557" observedRunningTime="2026-04-23 09:31:43.580271965 +0000 UTC m=+69.025334288" watchObservedRunningTime="2026-04-23 09:31:43.582219894 +0000 UTC m=+69.027282216"
Apr 23 09:31:43.597469 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:43.597433 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ftn6j" podStartSLOduration=51.796279377 podStartE2EDuration="54.597422144s" podCreationTimestamp="2026-04-23 09:30:49 +0000 UTC" firstStartedPulling="2026-04-23 09:31:39.803769165 +0000 UTC m=+65.248831466" lastFinishedPulling="2026-04-23 09:31:42.604911931 +0000 UTC m=+68.049974233" observedRunningTime="2026-04-23 09:31:43.596793286 +0000 UTC m=+69.041855615" watchObservedRunningTime="2026-04-23 09:31:43.597422144 +0000 UTC m=+69.042484499"
Apr 23 09:31:43.612694 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:43.612659 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-bpp9r" podStartSLOduration=39.431851313 podStartE2EDuration="54.612649968s" podCreationTimestamp="2026-04-23 09:30:49 +0000 UTC" firstStartedPulling="2026-04-23 09:31:07.923811652 +0000 UTC m=+33.368873953" lastFinishedPulling="2026-04-23 09:31:23.1046103 +0000 UTC m=+48.549672608" observedRunningTime="2026-04-23 09:31:43.611982198 +0000 UTC m=+69.057044525" watchObservedRunningTime="2026-04-23 09:31:43.612649968 +0000 UTC m=+69.057712307"
Apr 23 09:31:43.927494 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:43.927464 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-bpp9r"
Apr 23 09:31:44.569474 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:44.569430 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rj967" event={"ID":"30c9a344-56d8-4716-84da-8665f9ee4946","Type":"ContainerStarted","Data":"43015aa7de24d573e9bb4bd3253f2620fd95023789faa9db990de554b709eb04"}
Apr 23 09:31:44.585810 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:44.585765 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-rj967" podStartSLOduration=33.581445152 podStartE2EDuration="37.585750785s" podCreationTimestamp="2026-04-23 09:31:07 +0000 UTC" firstStartedPulling="2026-04-23 09:31:39.785801867 +0000 UTC m=+65.230864167" lastFinishedPulling="2026-04-23 09:31:43.790107485 +0000 UTC m=+69.235169800" observedRunningTime="2026-04-23 09:31:44.58470055 +0000 UTC m=+70.029762874" watchObservedRunningTime="2026-04-23 09:31:44.585750785 +0000 UTC m=+70.030813107"
Apr 23 09:31:47.868375 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:47.868340 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-gwh72"]
Apr 23 09:31:47.910553 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:47.910527 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gwh72"]
Apr 23 09:31:47.910691 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:47.910678 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gwh72"
Apr 23 09:31:47.913327 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:47.913291 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 23 09:31:47.913950 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:47.913928 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 23 09:31:47.914120 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:47.914102 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-qqv29\""
Apr 23 09:31:48.014772 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:48.014747 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e5fae73e-630f-4593-a153-91162968e0fd-data-volume\") pod \"insights-runtime-extractor-gwh72\" (UID: \"e5fae73e-630f-4593-a153-91162968e0fd\") " pod="openshift-insights/insights-runtime-extractor-gwh72"
Apr 23 09:31:48.014886 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:48.014792 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e5fae73e-630f-4593-a153-91162968e0fd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gwh72\" (UID: \"e5fae73e-630f-4593-a153-91162968e0fd\") " pod="openshift-insights/insights-runtime-extractor-gwh72"
Apr 23 09:31:48.014886 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:48.014846 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e5fae73e-630f-4593-a153-91162968e0fd-crio-socket\") pod \"insights-runtime-extractor-gwh72\" (UID: \"e5fae73e-630f-4593-a153-91162968e0fd\") " pod="openshift-insights/insights-runtime-extractor-gwh72"
Apr 23 09:31:48.015002 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:48.014890 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldskx\" (UniqueName: \"kubernetes.io/projected/e5fae73e-630f-4593-a153-91162968e0fd-kube-api-access-ldskx\") pod \"insights-runtime-extractor-gwh72\" (UID: \"e5fae73e-630f-4593-a153-91162968e0fd\") " pod="openshift-insights/insights-runtime-extractor-gwh72"
Apr 23 09:31:48.015002 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:48.014948 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e5fae73e-630f-4593-a153-91162968e0fd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gwh72\" (UID: \"e5fae73e-630f-4593-a153-91162968e0fd\") " pod="openshift-insights/insights-runtime-extractor-gwh72"
Apr 23 09:31:48.115810 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:48.115786 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e5fae73e-630f-4593-a153-91162968e0fd-data-volume\") pod \"insights-runtime-extractor-gwh72\" (UID: \"e5fae73e-630f-4593-a153-91162968e0fd\") " pod="openshift-insights/insights-runtime-extractor-gwh72"
Apr 23 09:31:48.115938 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:48.115817 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e5fae73e-630f-4593-a153-91162968e0fd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gwh72\" (UID: \"e5fae73e-630f-4593-a153-91162968e0fd\") " pod="openshift-insights/insights-runtime-extractor-gwh72"
Apr 23 09:31:48.115938 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:48.115854 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e5fae73e-630f-4593-a153-91162968e0fd-crio-socket\") pod \"insights-runtime-extractor-gwh72\" (UID: \"e5fae73e-630f-4593-a153-91162968e0fd\") " pod="openshift-insights/insights-runtime-extractor-gwh72"
Apr 23 09:31:48.115938 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:48.115930 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e5fae73e-630f-4593-a153-91162968e0fd-crio-socket\") pod \"insights-runtime-extractor-gwh72\" (UID: \"e5fae73e-630f-4593-a153-91162968e0fd\") " pod="openshift-insights/insights-runtime-extractor-gwh72"
Apr 23 09:31:48.116125 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:48.115982 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldskx\" (UniqueName: \"kubernetes.io/projected/e5fae73e-630f-4593-a153-91162968e0fd-kube-api-access-ldskx\") pod \"insights-runtime-extractor-gwh72\" (UID: \"e5fae73e-630f-4593-a153-91162968e0fd\") " pod="openshift-insights/insights-runtime-extractor-gwh72"
Apr 23 09:31:48.116125 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:48.116036 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e5fae73e-630f-4593-a153-91162968e0fd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gwh72\" (UID: \"e5fae73e-630f-4593-a153-91162968e0fd\") " pod="openshift-insights/insights-runtime-extractor-gwh72"
Apr 23 09:31:48.116224 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:48.116194 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e5fae73e-630f-4593-a153-91162968e0fd-data-volume\") pod \"insights-runtime-extractor-gwh72\" (UID: \"e5fae73e-630f-4593-a153-91162968e0fd\") " pod="openshift-insights/insights-runtime-extractor-gwh72"
Apr 23 09:31:48.116434 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:48.116409 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e5fae73e-630f-4593-a153-91162968e0fd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gwh72\" (UID: \"e5fae73e-630f-4593-a153-91162968e0fd\") " pod="openshift-insights/insights-runtime-extractor-gwh72"
Apr 23 09:31:48.118328 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:48.118310 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e5fae73e-630f-4593-a153-91162968e0fd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gwh72\" (UID: \"e5fae73e-630f-4593-a153-91162968e0fd\") " pod="openshift-insights/insights-runtime-extractor-gwh72"
Apr 23 09:31:48.125067 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:48.125010 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldskx\" (UniqueName: \"kubernetes.io/projected/e5fae73e-630f-4593-a153-91162968e0fd-kube-api-access-ldskx\") pod \"insights-runtime-extractor-gwh72\" (UID: \"e5fae73e-630f-4593-a153-91162968e0fd\") " pod="openshift-insights/insights-runtime-extractor-gwh72"
Apr 23 09:31:48.221140 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:48.221120 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gwh72"
Apr 23 09:31:48.346217 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:48.346186 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gwh72"]
Apr 23 09:31:48.349710 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:31:48.349683 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5fae73e_630f_4593_a153_91162968e0fd.slice/crio-a1f537371276db3161e55efb510ffb70a85eef4c0a02f0a37b879965b741624d WatchSource:0}: Error finding container a1f537371276db3161e55efb510ffb70a85eef4c0a02f0a37b879965b741624d: Status 404 returned error can't find the container with id a1f537371276db3161e55efb510ffb70a85eef4c0a02f0a37b879965b741624d
Apr 23 09:31:48.580855 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:48.580830 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gwh72" event={"ID":"e5fae73e-630f-4593-a153-91162968e0fd","Type":"ContainerStarted","Data":"53298b5096d0d3d6b066b38429139b63880f18ccff63180b0091503fe99d969e"}
Apr 23 09:31:48.580963 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:48.580862 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gwh72" event={"ID":"e5fae73e-630f-4593-a153-91162968e0fd","Type":"ContainerStarted","Data":"a1f537371276db3161e55efb510ffb70a85eef4c0a02f0a37b879965b741624d"}
Apr 23 09:31:50.587951 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:50.587907 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gwh72" event={"ID":"e5fae73e-630f-4593-a153-91162968e0fd","Type":"ContainerStarted","Data":"949acfb1fe5ae504fb4293e3ac7f5880021e5191eb1f83c1e1e8712648f64e6b"}
Apr 23 09:31:53.572647 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:53.572621 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-2mjdq"
Apr 23 09:31:54.481121 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:54.480999 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-sb8zp"
Apr 23 09:31:54.600422 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:54.600387 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gwh72" event={"ID":"e5fae73e-630f-4593-a153-91162968e0fd","Type":"ContainerStarted","Data":"5e9b166d21fdf381a85cd01061b7280134cc90deea873abad4b4910d573e5273"}
Apr 23 09:31:54.620447 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:54.620400 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-gwh72" podStartSLOduration=2.408462198 podStartE2EDuration="7.620385701s" podCreationTimestamp="2026-04-23 09:31:47 +0000 UTC" firstStartedPulling="2026-04-23 09:31:48.494926757 +0000 UTC m=+73.939989058" lastFinishedPulling="2026-04-23 09:31:53.706850259 +0000 UTC m=+79.151912561" observedRunningTime="2026-04-23 09:31:54.619395606 +0000 UTC m=+80.064457929" watchObservedRunningTime="2026-04-23 09:31:54.620385701 +0000 UTC m=+80.065448024"
Apr 23 09:31:59.596345 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:59.596309 2566 patch_prober.go:28] interesting pod/image-registry-554d8f4b86-p7vqw container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 23 09:31:59.596816 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:31:59.596362 2566 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw" podUID="164f4228-f916-4e95-9a43-29cf40bacf4a" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 09:32:01.556811 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:01.556781 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw"
Apr 23 09:32:09.641698 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:09.641665 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-554d8f4b86-p7vqw"]
Apr 23 09:32:11.386000 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:11.385966 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91f0276c-d1f7-4977-a25e-107ac0756380-service-ca-bundle\") pod \"router-default-86d78b875b-rbksj\" (UID: \"91f0276c-d1f7-4977-a25e-107ac0756380\") " pod="openshift-ingress/router-default-86d78b875b-rbksj"
Apr 23 09:32:11.386500 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:11.386482 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91f0276c-d1f7-4977-a25e-107ac0756380-service-ca-bundle\") pod \"router-default-86d78b875b-rbksj\" (UID: \"91f0276c-d1f7-4977-a25e-107ac0756380\") " pod="openshift-ingress/router-default-86d78b875b-rbksj"
Apr 23 09:32:11.486643 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:11.486610 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe118deb-6994-428b-9927-b07aafd254b6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zxwg9\" (UID: \"fe118deb-6994-428b-9927-b07aafd254b6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zxwg9"
Apr 23 09:32:11.486811 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:11.486654 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5f78147d-0dbc-471a-a1c9-05e2a3cb5333-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vqvls\" (UID: \"5f78147d-0dbc-471a-a1c9-05e2a3cb5333\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vqvls"
Apr 23 09:32:11.489011 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:11.488988 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5f78147d-0dbc-471a-a1c9-05e2a3cb5333-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vqvls\" (UID: \"5f78147d-0dbc-471a-a1c9-05e2a3cb5333\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vqvls"
Apr 23 09:32:11.489011 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:11.488998 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe118deb-6994-428b-9927-b07aafd254b6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-zxwg9\" (UID: \"fe118deb-6994-428b-9927-b07aafd254b6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zxwg9"
Apr 23 09:32:11.664861 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:11.664789 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-9swqv\""
Apr 23 09:32:11.672899 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:11.672870 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-86d78b875b-rbksj"
Apr 23 09:32:11.701898 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:11.701874 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-9gsjj\""
Apr 23 09:32:11.709898 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:11.709874 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zxwg9"
Apr 23 09:32:11.764412 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:11.764388 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-vvx66\""
Apr 23 09:32:11.772346 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:11.772118 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vqvls"
Apr 23 09:32:11.823419 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:11.823374 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-86d78b875b-rbksj"]
Apr 23 09:32:11.830281 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:32:11.830231 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91f0276c_d1f7_4977_a25e_107ac0756380.slice/crio-1109bc7df1ec59236bdd2f8ba178e5d545dee188c42f1bd144b6f279f26126a1 WatchSource:0}: Error finding container 1109bc7df1ec59236bdd2f8ba178e5d545dee188c42f1bd144b6f279f26126a1: Status 404 returned error can't find the container with id 1109bc7df1ec59236bdd2f8ba178e5d545dee188c42f1bd144b6f279f26126a1
Apr 23 09:32:11.865871 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:11.865849 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-zxwg9"]
Apr 23 09:32:11.919646 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:11.919624 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-vqvls"]
Apr 23 09:32:11.922313 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:32:11.922270 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f78147d_0dbc_471a_a1c9_05e2a3cb5333.slice/crio-f99a8987342b1343f6eb92008f577498b67bd3d3841fb4dd4666a53a0b2914cf WatchSource:0}: Error finding container f99a8987342b1343f6eb92008f577498b67bd3d3841fb4dd4666a53a0b2914cf: Status 404 returned error can't find the container with id f99a8987342b1343f6eb92008f577498b67bd3d3841fb4dd4666a53a0b2914cf
Apr 23 09:32:12.653970 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:12.653924 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vqvls" event={"ID":"5f78147d-0dbc-471a-a1c9-05e2a3cb5333","Type":"ContainerStarted","Data":"f99a8987342b1343f6eb92008f577498b67bd3d3841fb4dd4666a53a0b2914cf"}
Apr 23 09:32:12.655904 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:12.655658 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-86d78b875b-rbksj" event={"ID":"91f0276c-d1f7-4977-a25e-107ac0756380","Type":"ContainerStarted","Data":"b92fbea4ccc90597da013b93c91f80e568aacd59d3f40d8d6da5a661edb54866"}
Apr 23 09:32:12.655904 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:12.655692 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-86d78b875b-rbksj" event={"ID":"91f0276c-d1f7-4977-a25e-107ac0756380","Type":"ContainerStarted","Data":"1109bc7df1ec59236bdd2f8ba178e5d545dee188c42f1bd144b6f279f26126a1"}
Apr 23 09:32:12.656911 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:12.656884 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zxwg9" event={"ID":"fe118deb-6994-428b-9927-b07aafd254b6","Type":"ContainerStarted","Data":"fea787c90eccb140d5206fd948b86462c5dd9ef237bc4d57b79d4d53e8f06ef9"}
Apr 23 09:32:12.673631 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:12.673611 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-86d78b875b-rbksj"
Apr 23 09:32:12.676557 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:12.676508 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-86d78b875b-rbksj" podStartSLOduration=83.676495284 podStartE2EDuration="1m23.676495284s" podCreationTimestamp="2026-04-23 09:30:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:32:12.67439501 +0000 UTC m=+98.119457361" watchObservedRunningTime="2026-04-23 09:32:12.676495284 +0000 UTC m=+98.121557608"
Apr 23 09:32:12.677630 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:12.677607 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-86d78b875b-rbksj"
Apr 23 09:32:13.661902 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:13.661875 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vqvls" event={"ID":"5f78147d-0dbc-471a-a1c9-05e2a3cb5333","Type":"ContainerStarted","Data":"a4cdba7911bcab4a3e329eabd3b7464750310c74afd14cb7d0a8dc578e547bb1"}
Apr 23 09:32:13.663473 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:13.663450 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-86d78b875b-rbksj"
Apr 23 09:32:13.664663 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:13.664634 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-86d78b875b-rbksj"
Apr 23 09:32:13.676054 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:13.676005 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vqvls" podStartSLOduration=69.596268147 podStartE2EDuration="1m10.675990683s" podCreationTimestamp="2026-04-23 09:31:03 +0000 UTC" firstStartedPulling="2026-04-23 09:32:11.924256815 +0000 UTC m=+97.369319123" lastFinishedPulling="2026-04-23 09:32:13.003979357 +0000 UTC m=+98.449041659" observedRunningTime="2026-04-23 09:32:13.675701279 +0000 UTC m=+99.120763615" watchObservedRunningTime="2026-04-23 09:32:13.675990683 +0000 UTC m=+99.121053010"
Apr 23 09:32:14.667385 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:14.667342 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zxwg9" event={"ID":"fe118deb-6994-428b-9927-b07aafd254b6","Type":"ContainerStarted","Data":"4fda719d514b62fed83a0dca42e489ba4402e8e8a356bbe1cd293c99103fd491"}
Apr 23 09:32:14.685416 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:14.685374 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-zxwg9" podStartSLOduration=83.959927804 podStartE2EDuration="1m25.685359674s" podCreationTimestamp="2026-04-23 09:30:49 +0000 UTC" firstStartedPulling="2026-04-23 09:32:11.874288603 +0000 UTC m=+97.319350907" lastFinishedPulling="2026-04-23 09:32:13.599720462 +0000 UTC m=+99.044782777" observedRunningTime="2026-04-23 09:32:14.685207643 +0000 UTC m=+100.130269968" watchObservedRunningTime="2026-04-23 09:32:14.685359674 +0000 UTC m=+100.130421996"
Apr 23 09:32:24.551038 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:24.551005 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-9mn8q"]
Apr 23 09:32:24.554413 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:24.554397 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-9mn8q"
Apr 23 09:32:24.556872 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:24.556846 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 23 09:32:24.557015 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:24.556988 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 23 09:32:24.557132 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:24.557114 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 23 09:32:24.557199 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:24.557147 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 23 09:32:24.557938 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:24.557924 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-cdnqx\""
Apr 23 09:32:24.598209 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:24.598179 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5f37820f-ab4f-473f-8b88-2d261fea31a3-sys\") pod \"node-exporter-9mn8q\" (UID: \"5f37820f-ab4f-473f-8b88-2d261fea31a3\") " pod="openshift-monitoring/node-exporter-9mn8q"
Apr 23 09:32:24.598387 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:24.598216 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdtgr\" (UniqueName: \"kubernetes.io/projected/5f37820f-ab4f-473f-8b88-2d261fea31a3-kube-api-access-wdtgr\") pod \"node-exporter-9mn8q\" (UID: \"5f37820f-ab4f-473f-8b88-2d261fea31a3\") " pod="openshift-monitoring/node-exporter-9mn8q"
Apr 23 09:32:24.598387 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:24.598245 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5f37820f-ab4f-473f-8b88-2d261fea31a3-root\") pod \"node-exporter-9mn8q\" (UID: \"5f37820f-ab4f-473f-8b88-2d261fea31a3\") " pod="openshift-monitoring/node-exporter-9mn8q"
Apr 23 09:32:24.598387 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:24.598343 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5f37820f-ab4f-473f-8b88-2d261fea31a3-node-exporter-wtmp\") pod \"node-exporter-9mn8q\" (UID: \"5f37820f-ab4f-473f-8b88-2d261fea31a3\") " pod="openshift-monitoring/node-exporter-9mn8q"
Apr 23 09:32:24.598387 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:24.598380 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5f37820f-ab4f-473f-8b88-2d261fea31a3-node-exporter-tls\") pod \"node-exporter-9mn8q\" (UID: \"5f37820f-ab4f-473f-8b88-2d261fea31a3\") " pod="openshift-monitoring/node-exporter-9mn8q"
Apr 23 09:32:24.598602 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:24.598409 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5f37820f-ab4f-473f-8b88-2d261fea31a3-node-exporter-textfile\") pod \"node-exporter-9mn8q\" (UID: \"5f37820f-ab4f-473f-8b88-2d261fea31a3\") " pod="openshift-monitoring/node-exporter-9mn8q"
Apr 23 09:32:24.598602 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:24.598444 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5f37820f-ab4f-473f-8b88-2d261fea31a3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9mn8q\" (UID: \"5f37820f-ab4f-473f-8b88-2d261fea31a3\") " pod="openshift-monitoring/node-exporter-9mn8q"
Apr 23 09:32:24.598602 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:24.598476 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5f37820f-ab4f-473f-8b88-2d261fea31a3-node-exporter-accelerators-collector-config\") pod \"node-exporter-9mn8q\" (UID: \"5f37820f-ab4f-473f-8b88-2d261fea31a3\") " pod="openshift-monitoring/node-exporter-9mn8q"
Apr 23 09:32:24.598602 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:24.598513 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5f37820f-ab4f-473f-8b88-2d261fea31a3-metrics-client-ca\") pod \"node-exporter-9mn8q\" (UID: \"5f37820f-ab4f-473f-8b88-2d261fea31a3\") " pod="openshift-monitoring/node-exporter-9mn8q"
Apr 23 09:32:24.699263 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:24.699239 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5f37820f-ab4f-473f-8b88-2d261fea31a3-metrics-client-ca\") pod \"node-exporter-9mn8q\" (UID: \"5f37820f-ab4f-473f-8b88-2d261fea31a3\") " pod="openshift-monitoring/node-exporter-9mn8q"
Apr 23 09:32:24.699434 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:24.699322 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5f37820f-ab4f-473f-8b88-2d261fea31a3-sys\") pod \"node-exporter-9mn8q\" (UID: \"5f37820f-ab4f-473f-8b88-2d261fea31a3\") " pod="openshift-monitoring/node-exporter-9mn8q"
Apr 23 09:32:24.699434 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:24.699342 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wdtgr\" (UniqueName: \"kubernetes.io/projected/5f37820f-ab4f-473f-8b88-2d261fea31a3-kube-api-access-wdtgr\") pod \"node-exporter-9mn8q\" (UID: \"5f37820f-ab4f-473f-8b88-2d261fea31a3\") " pod="openshift-monitoring/node-exporter-9mn8q"
Apr 23 09:32:24.699434 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:24.699359 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5f37820f-ab4f-473f-8b88-2d261fea31a3-root\") pod \"node-exporter-9mn8q\" (UID: \"5f37820f-ab4f-473f-8b88-2d261fea31a3\") " pod="openshift-monitoring/node-exporter-9mn8q"
Apr 23 09:32:24.699434 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:24.699379 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5f37820f-ab4f-473f-8b88-2d261fea31a3-node-exporter-wtmp\") pod \"node-exporter-9mn8q\" (UID: \"5f37820f-ab4f-473f-8b88-2d261fea31a3\") "
pod="openshift-monitoring/node-exporter-9mn8q" Apr 23 09:32:24.699434 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:24.699398 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5f37820f-ab4f-473f-8b88-2d261fea31a3-node-exporter-tls\") pod \"node-exporter-9mn8q\" (UID: \"5f37820f-ab4f-473f-8b88-2d261fea31a3\") " pod="openshift-monitoring/node-exporter-9mn8q" Apr 23 09:32:24.699434 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:24.699415 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5f37820f-ab4f-473f-8b88-2d261fea31a3-node-exporter-textfile\") pod \"node-exporter-9mn8q\" (UID: \"5f37820f-ab4f-473f-8b88-2d261fea31a3\") " pod="openshift-monitoring/node-exporter-9mn8q" Apr 23 09:32:24.699434 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:24.699431 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5f37820f-ab4f-473f-8b88-2d261fea31a3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9mn8q\" (UID: \"5f37820f-ab4f-473f-8b88-2d261fea31a3\") " pod="openshift-monitoring/node-exporter-9mn8q" Apr 23 09:32:24.699795 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:24.699441 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5f37820f-ab4f-473f-8b88-2d261fea31a3-sys\") pod \"node-exporter-9mn8q\" (UID: \"5f37820f-ab4f-473f-8b88-2d261fea31a3\") " pod="openshift-monitoring/node-exporter-9mn8q" Apr 23 09:32:24.699795 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:24.699454 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5f37820f-ab4f-473f-8b88-2d261fea31a3-root\") pod \"node-exporter-9mn8q\" (UID: \"5f37820f-ab4f-473f-8b88-2d261fea31a3\") " pod="openshift-monitoring/node-exporter-9mn8q" Apr 23 09:32:24.699795 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:24.699479 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5f37820f-ab4f-473f-8b88-2d261fea31a3-node-exporter-accelerators-collector-config\") pod \"node-exporter-9mn8q\" (UID: \"5f37820f-ab4f-473f-8b88-2d261fea31a3\") " pod="openshift-monitoring/node-exporter-9mn8q" Apr 23 09:32:24.699795 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:24.699546 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5f37820f-ab4f-473f-8b88-2d261fea31a3-node-exporter-wtmp\") pod \"node-exporter-9mn8q\" (UID: \"5f37820f-ab4f-473f-8b88-2d261fea31a3\") " pod="openshift-monitoring/node-exporter-9mn8q" Apr 23 09:32:24.699795 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:32:24.699610 2566 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 23 09:32:24.699795 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:32:24.699688 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f37820f-ab4f-473f-8b88-2d261fea31a3-node-exporter-tls podName:5f37820f-ab4f-473f-8b88-2d261fea31a3 nodeName:}" failed. No retries permitted until 2026-04-23 09:32:25.199669577 +0000 UTC m=+110.644731891 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/5f37820f-ab4f-473f-8b88-2d261fea31a3-node-exporter-tls") pod "node-exporter-9mn8q" (UID: "5f37820f-ab4f-473f-8b88-2d261fea31a3") : secret "node-exporter-tls" not found Apr 23 09:32:24.699795 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:24.699712 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5f37820f-ab4f-473f-8b88-2d261fea31a3-node-exporter-textfile\") pod \"node-exporter-9mn8q\" (UID: \"5f37820f-ab4f-473f-8b88-2d261fea31a3\") " pod="openshift-monitoring/node-exporter-9mn8q" Apr 23 09:32:24.700105 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:24.699964 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5f37820f-ab4f-473f-8b88-2d261fea31a3-metrics-client-ca\") pod \"node-exporter-9mn8q\" (UID: \"5f37820f-ab4f-473f-8b88-2d261fea31a3\") " pod="openshift-monitoring/node-exporter-9mn8q" Apr 23 09:32:24.700105 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:24.700005 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5f37820f-ab4f-473f-8b88-2d261fea31a3-node-exporter-accelerators-collector-config\") pod \"node-exporter-9mn8q\" (UID: \"5f37820f-ab4f-473f-8b88-2d261fea31a3\") " pod="openshift-monitoring/node-exporter-9mn8q" Apr 23 09:32:24.701702 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:24.701683 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5f37820f-ab4f-473f-8b88-2d261fea31a3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-9mn8q\" (UID: \"5f37820f-ab4f-473f-8b88-2d261fea31a3\") " pod="openshift-monitoring/node-exporter-9mn8q" Apr 23 09:32:24.710348 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:24.710321 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdtgr\" (UniqueName: \"kubernetes.io/projected/5f37820f-ab4f-473f-8b88-2d261fea31a3-kube-api-access-wdtgr\") pod \"node-exporter-9mn8q\" (UID: \"5f37820f-ab4f-473f-8b88-2d261fea31a3\") " pod="openshift-monitoring/node-exporter-9mn8q" Apr 23 09:32:25.203391 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:25.203354 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5f37820f-ab4f-473f-8b88-2d261fea31a3-node-exporter-tls\") pod \"node-exporter-9mn8q\" (UID: \"5f37820f-ab4f-473f-8b88-2d261fea31a3\") " pod="openshift-monitoring/node-exporter-9mn8q" Apr 23 09:32:25.205624 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:25.205601 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5f37820f-ab4f-473f-8b88-2d261fea31a3-node-exporter-tls\") pod \"node-exporter-9mn8q\" (UID: \"5f37820f-ab4f-473f-8b88-2d261fea31a3\") " pod="openshift-monitoring/node-exporter-9mn8q" Apr 23 09:32:25.463870 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:25.463803 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-9mn8q" Apr 23 09:32:25.472733 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:32:25.472706 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f37820f_ab4f_473f_8b88_2d261fea31a3.slice/crio-78316c654ecd9f92595438285cd6180b95b7e122444e7c02b80b2d4498157caa WatchSource:0}: Error finding container 78316c654ecd9f92595438285cd6180b95b7e122444e7c02b80b2d4498157caa: Status 404 returned error can't find the container with id 78316c654ecd9f92595438285cd6180b95b7e122444e7c02b80b2d4498157caa Apr 23 09:32:25.699576 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:25.699541 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9mn8q" event={"ID":"5f37820f-ab4f-473f-8b88-2d261fea31a3","Type":"ContainerStarted","Data":"78316c654ecd9f92595438285cd6180b95b7e122444e7c02b80b2d4498157caa"} Apr 23 09:32:27.706450 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:27.706419 2566 generic.go:358] "Generic (PLEG): container finished" podID="5f37820f-ab4f-473f-8b88-2d261fea31a3" containerID="cea5ef720d964f66bcac7d5fe62004becf4ff12685971ffffb0cda998263d88c" exitCode=0 Apr 23 09:32:27.706817 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:27.706457 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9mn8q" event={"ID":"5f37820f-ab4f-473f-8b88-2d261fea31a3","Type":"ContainerDied","Data":"cea5ef720d964f66bcac7d5fe62004becf4ff12685971ffffb0cda998263d88c"} Apr 23 09:32:28.711370 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:28.711333 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9mn8q" event={"ID":"5f37820f-ab4f-473f-8b88-2d261fea31a3","Type":"ContainerStarted","Data":"f1f8b85616e3e06e6e4950140edbbb0be78d1171d72f62f7daca11a5040e915a"} Apr 23 09:32:28.711370 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:28.711373 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-9mn8q" event={"ID":"5f37820f-ab4f-473f-8b88-2d261fea31a3","Type":"ContainerStarted","Data":"40a0df1ee4212ebc048c081553a6dd737639275a700717dd54a30beda4573a86"} Apr 23 09:32:28.731263 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:28.731214 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-9mn8q" podStartSLOduration=3.175889389 podStartE2EDuration="4.731201162s" podCreationTimestamp="2026-04-23 09:32:24 +0000 UTC" firstStartedPulling="2026-04-23 09:32:25.474441033 +0000 UTC m=+110.919503334" lastFinishedPulling="2026-04-23 09:32:27.029752807 +0000 UTC m=+112.474815107" observedRunningTime="2026-04-23 09:32:28.729528002 +0000 UTC m=+114.174590346" watchObservedRunningTime="2026-04-23 09:32:28.731201162 +0000 UTC m=+114.176263484" Apr 23 09:32:34.661357 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:34.661265 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw" podUID="164f4228-f916-4e95-9a43-29cf40bacf4a" containerName="registry" containerID="cri-o://66a82bc50b6316b93bec8fc5cd90ab014882c33773bf6f761732c52b4bce7246" gracePeriod=30 Apr 23 09:32:34.905370 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:34.905337 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw" Apr 23 09:32:34.979624 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:34.979563 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c82kr\" (UniqueName: \"kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-kube-api-access-c82kr\") pod \"164f4228-f916-4e95-9a43-29cf40bacf4a\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " Apr 23 09:32:34.979624 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:34.979596 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/164f4228-f916-4e95-9a43-29cf40bacf4a-registry-certificates\") pod \"164f4228-f916-4e95-9a43-29cf40bacf4a\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " Apr 23 09:32:34.979927 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:34.979632 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/164f4228-f916-4e95-9a43-29cf40bacf4a-ca-trust-extracted\") pod \"164f4228-f916-4e95-9a43-29cf40bacf4a\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " Apr 23 09:32:34.979927 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:34.979669 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-registry-tls\") pod \"164f4228-f916-4e95-9a43-29cf40bacf4a\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " Apr 23 09:32:34.979927 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:34.979689 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-bound-sa-token\") pod \"164f4228-f916-4e95-9a43-29cf40bacf4a\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " Apr 23 09:32:34.979927 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:34.979713 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/164f4228-f916-4e95-9a43-29cf40bacf4a-trusted-ca\") pod \"164f4228-f916-4e95-9a43-29cf40bacf4a\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " Apr 23 09:32:34.979927 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:34.979730 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/164f4228-f916-4e95-9a43-29cf40bacf4a-image-registry-private-configuration\") pod \"164f4228-f916-4e95-9a43-29cf40bacf4a\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " Apr 23 09:32:34.979927 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:34.979784 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/164f4228-f916-4e95-9a43-29cf40bacf4a-installation-pull-secrets\") pod \"164f4228-f916-4e95-9a43-29cf40bacf4a\" (UID: \"164f4228-f916-4e95-9a43-29cf40bacf4a\") " Apr 23 09:32:34.980244 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:34.980172 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/164f4228-f916-4e95-9a43-29cf40bacf4a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "164f4228-f916-4e95-9a43-29cf40bacf4a" (UID: "164f4228-f916-4e95-9a43-29cf40bacf4a"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 09:32:34.980332 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:34.980282 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/164f4228-f916-4e95-9a43-29cf40bacf4a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "164f4228-f916-4e95-9a43-29cf40bacf4a" (UID: "164f4228-f916-4e95-9a43-29cf40bacf4a"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 09:32:34.982162 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:34.982132 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "164f4228-f916-4e95-9a43-29cf40bacf4a" (UID: "164f4228-f916-4e95-9a43-29cf40bacf4a"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 09:32:34.982317 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:34.982264 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/164f4228-f916-4e95-9a43-29cf40bacf4a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "164f4228-f916-4e95-9a43-29cf40bacf4a" (UID: "164f4228-f916-4e95-9a43-29cf40bacf4a"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 09:32:34.982399 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:34.982319 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "164f4228-f916-4e95-9a43-29cf40bacf4a" (UID: "164f4228-f916-4e95-9a43-29cf40bacf4a"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 09:32:34.982502 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:34.982475 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/164f4228-f916-4e95-9a43-29cf40bacf4a-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "164f4228-f916-4e95-9a43-29cf40bacf4a" (UID: "164f4228-f916-4e95-9a43-29cf40bacf4a"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 09:32:34.982573 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:34.982556 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-kube-api-access-c82kr" (OuterVolumeSpecName: "kube-api-access-c82kr") pod "164f4228-f916-4e95-9a43-29cf40bacf4a" (UID: "164f4228-f916-4e95-9a43-29cf40bacf4a"). InnerVolumeSpecName "kube-api-access-c82kr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 09:32:34.989809 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:34.989766 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/164f4228-f916-4e95-9a43-29cf40bacf4a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "164f4228-f916-4e95-9a43-29cf40bacf4a" (UID: "164f4228-f916-4e95-9a43-29cf40bacf4a"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 09:32:35.080873 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:35.080844 2566 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/164f4228-f916-4e95-9a43-29cf40bacf4a-ca-trust-extracted\") on node \"ip-10-0-136-17.ec2.internal\" DevicePath \"\"" Apr 23 09:32:35.080873 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:35.080869 2566 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-registry-tls\") on node \"ip-10-0-136-17.ec2.internal\" DevicePath \"\"" Apr 23 09:32:35.080997 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:35.080881 2566 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-bound-sa-token\") on node \"ip-10-0-136-17.ec2.internal\" DevicePath \"\"" Apr 23 09:32:35.080997 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:35.080891 2566 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/164f4228-f916-4e95-9a43-29cf40bacf4a-trusted-ca\") on node \"ip-10-0-136-17.ec2.internal\" DevicePath \"\"" Apr 23 09:32:35.080997 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:35.080900 2566 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/164f4228-f916-4e95-9a43-29cf40bacf4a-image-registry-private-configuration\") on node \"ip-10-0-136-17.ec2.internal\" DevicePath \"\"" Apr 23 09:32:35.080997 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:35.080908 2566 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/164f4228-f916-4e95-9a43-29cf40bacf4a-installation-pull-secrets\") on node \"ip-10-0-136-17.ec2.internal\" DevicePath \"\"" Apr 23 09:32:35.080997 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:35.080918 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c82kr\" (UniqueName: \"kubernetes.io/projected/164f4228-f916-4e95-9a43-29cf40bacf4a-kube-api-access-c82kr\") on node \"ip-10-0-136-17.ec2.internal\" DevicePath \"\"" Apr 23 09:32:35.080997 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:35.080927 2566 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/164f4228-f916-4e95-9a43-29cf40bacf4a-registry-certificates\") on node \"ip-10-0-136-17.ec2.internal\" DevicePath \"\"" Apr 23 09:32:35.731876 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:35.731838 2566 generic.go:358] "Generic (PLEG): container finished" podID="164f4228-f916-4e95-9a43-29cf40bacf4a" containerID="66a82bc50b6316b93bec8fc5cd90ab014882c33773bf6f761732c52b4bce7246" exitCode=0 Apr 23 09:32:35.732398 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:35.731909 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw" Apr 23 09:32:35.732398 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:35.731915 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw" event={"ID":"164f4228-f916-4e95-9a43-29cf40bacf4a","Type":"ContainerDied","Data":"66a82bc50b6316b93bec8fc5cd90ab014882c33773bf6f761732c52b4bce7246"} Apr 23 09:32:35.732398 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:35.731955 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-554d8f4b86-p7vqw" event={"ID":"164f4228-f916-4e95-9a43-29cf40bacf4a","Type":"ContainerDied","Data":"38ff9bbd7d674ed69d3dc697f63b0b087b4e0e819a7bb290e99fcaaf1a3e5079"} Apr 23 09:32:35.732398 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:35.731971 2566 scope.go:117] "RemoveContainer" containerID="66a82bc50b6316b93bec8fc5cd90ab014882c33773bf6f761732c52b4bce7246" Apr 23 09:32:35.739847 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:35.739831 2566 scope.go:117] "RemoveContainer" containerID="66a82bc50b6316b93bec8fc5cd90ab014882c33773bf6f761732c52b4bce7246" Apr 23 09:32:35.740104 ip-10-0-136-17 kubenswrapper[2566]: E0423 09:32:35.740084 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66a82bc50b6316b93bec8fc5cd90ab014882c33773bf6f761732c52b4bce7246\": container with ID starting with 66a82bc50b6316b93bec8fc5cd90ab014882c33773bf6f761732c52b4bce7246 not found: ID does not exist" containerID="66a82bc50b6316b93bec8fc5cd90ab014882c33773bf6f761732c52b4bce7246" Apr 23 09:32:35.740151 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:35.740113 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66a82bc50b6316b93bec8fc5cd90ab014882c33773bf6f761732c52b4bce7246"} err="failed to get container status \"66a82bc50b6316b93bec8fc5cd90ab014882c33773bf6f761732c52b4bce7246\": rpc error: code = NotFound desc = could not find container \"66a82bc50b6316b93bec8fc5cd90ab014882c33773bf6f761732c52b4bce7246\": container with ID starting with 66a82bc50b6316b93bec8fc5cd90ab014882c33773bf6f761732c52b4bce7246 not found: ID does not exist" Apr 23 09:32:35.751176 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:35.751159 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-554d8f4b86-p7vqw"] Apr 23 09:32:35.754766 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:35.754745 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-554d8f4b86-p7vqw"] Apr 23 09:32:37.182667 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:37.180825 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="164f4228-f916-4e95-9a43-29cf40bacf4a" path="/var/lib/kubelet/pods/164f4228-f916-4e95-9a43-29cf40bacf4a/volumes" Apr 23 09:32:43.952957 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:43.952921 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c171e7cd-9c69-4ef7-9012-fad9d2b17a46-metrics-certs\") pod \"network-metrics-daemon-6528f\" (UID: \"c171e7cd-9c69-4ef7-9012-fad9d2b17a46\") " pod="openshift-multus/network-metrics-daemon-6528f" Apr 23 09:32:43.955187 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:43.955161 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/c171e7cd-9c69-4ef7-9012-fad9d2b17a46-metrics-certs\") pod \"network-metrics-daemon-6528f\" (UID: \"c171e7cd-9c69-4ef7-9012-fad9d2b17a46\") " pod="openshift-multus/network-metrics-daemon-6528f" Apr 23 09:32:44.036789 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:44.036766 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gg4dj\"" Apr 23 09:32:44.044747 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:44.044730 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6528f" Apr 23 09:32:44.163691 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:44.163560 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6528f"] Apr 23 09:32:44.166238 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:32:44.166217 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc171e7cd_9c69_4ef7_9012_fad9d2b17a46.slice/crio-c3f9dc0b767a113146335bfa1a8014430ef0e25ee4567e4994d53e15ba09c002 WatchSource:0}: Error finding container c3f9dc0b767a113146335bfa1a8014430ef0e25ee4567e4994d53e15ba09c002: Status 404 returned error can't find the container with id c3f9dc0b767a113146335bfa1a8014430ef0e25ee4567e4994d53e15ba09c002 Apr 23 09:32:44.760857 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:44.760819 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6528f" event={"ID":"c171e7cd-9c69-4ef7-9012-fad9d2b17a46","Type":"ContainerStarted","Data":"c3f9dc0b767a113146335bfa1a8014430ef0e25ee4567e4994d53e15ba09c002"} Apr 23 09:32:46.768071 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:46.768031 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6528f" event={"ID":"c171e7cd-9c69-4ef7-9012-fad9d2b17a46","Type":"ContainerStarted","Data":"3630bb297b2f4dfe1e594e386e53d12d1213c0dea64cb9d42b31d527ccd4deb0"} Apr 23 09:32:46.768071 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:46.768073 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6528f" event={"ID":"c171e7cd-9c69-4ef7-9012-fad9d2b17a46","Type":"ContainerStarted","Data":"369796b1c93d6efd483cc5fc2a4cd11144bc1c03039ae243dae6298d96af9f48"} Apr 23 09:32:46.786390 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:46.786344 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-6528f" podStartSLOduration=130.145941042 podStartE2EDuration="2m11.78632911s" podCreationTimestamp="2026-04-23 09:30:35 +0000 UTC" firstStartedPulling="2026-04-23 09:32:44.167932774 +0000 UTC m=+129.612995075" lastFinishedPulling="2026-04-23 09:32:45.808320839 +0000 UTC m=+131.253383143" observedRunningTime="2026-04-23 09:32:46.784191438 +0000 UTC m=+132.229253761" watchObservedRunningTime="2026-04-23 09:32:46.78632911 +0000 UTC m=+132.231391489" Apr 23 09:32:54.791386 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:54.791351 2566 generic.go:358] "Generic (PLEG): container finished" podID="bfdda918-82b7-430c-9e8e-f2555930fa85" containerID="1a470777f196876298faf2e4df7c68dd49344f92516420fc6944865dcb214258" exitCode=0 Apr 23 09:32:54.791743 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:54.791414 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-zllgx" 
event={"ID":"bfdda918-82b7-430c-9e8e-f2555930fa85","Type":"ContainerDied","Data":"1a470777f196876298faf2e4df7c68dd49344f92516420fc6944865dcb214258"} Apr 23 09:32:54.791743 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:54.791680 2566 scope.go:117] "RemoveContainer" containerID="1a470777f196876298faf2e4df7c68dd49344f92516420fc6944865dcb214258" Apr 23 09:32:55.795916 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:55.795876 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-zllgx" event={"ID":"bfdda918-82b7-430c-9e8e-f2555930fa85","Type":"ContainerStarted","Data":"c06b15333c7b9fb70077335e4fcf7173e0591fffad07d6babcfb430befed3459"} Apr 23 09:32:55.797074 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:55.797045 2566 generic.go:358] "Generic (PLEG): container finished" podID="c46256a3-d81a-45a4-b658-3a9db962e17a" containerID="6502e598f894a1f934a66f5da9adf19293a371ca157ff6b2e6bbd1ebbe1a5037" exitCode=0 Apr 23 09:32:55.797186 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:55.797122 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n72bx" event={"ID":"c46256a3-d81a-45a4-b658-3a9db962e17a","Type":"ContainerDied","Data":"6502e598f894a1f934a66f5da9adf19293a371ca157ff6b2e6bbd1ebbe1a5037"} Apr 23 09:32:55.797447 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:55.797432 2566 scope.go:117] "RemoveContainer" containerID="6502e598f894a1f934a66f5da9adf19293a371ca157ff6b2e6bbd1ebbe1a5037" Apr 23 09:32:55.798564 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:55.798529 2566 generic.go:358] "Generic (PLEG): container finished" podID="541d1fdf-75b3-4058-97be-3140f4c7fdb2" containerID="f76eca4066be561ef5b59a7e07beb1bc746be9e50a3029d89e7df33619ad2332" exitCode=0 Apr 23 09:32:55.798681 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:55.798599 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qbzv8" event={"ID":"541d1fdf-75b3-4058-97be-3140f4c7fdb2","Type":"ContainerDied","Data":"f76eca4066be561ef5b59a7e07beb1bc746be9e50a3029d89e7df33619ad2332"} Apr 23 09:32:55.799018 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:55.798999 2566 scope.go:117] "RemoveContainer" containerID="f76eca4066be561ef5b59a7e07beb1bc746be9e50a3029d89e7df33619ad2332" Apr 23 09:32:56.804039 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:56.804003 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qbzv8" event={"ID":"541d1fdf-75b3-4058-97be-3140f4c7fdb2","Type":"ContainerStarted","Data":"56914e6af9815a8c74b2ad8f2ed6fc6e637141355ce0caf1e6e72614327a9ba2"} Apr 23 09:32:56.805532 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:56.805509 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-n72bx" event={"ID":"c46256a3-d81a-45a4-b658-3a9db962e17a","Type":"ContainerStarted","Data":"b33bd424ad6e84dc257a17c7df08ad9bf5d070f86fcebc03093d5bbed1ab97d6"} Apr 23 09:32:57.884443 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:32:57.884396 2566 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9" podUID="eaa17ea4-444a-410e-b077-79a2168b8f71" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 09:33:07.883968 ip-10-0-136-17 
kubenswrapper[2566]: I0423 09:33:07.883914 2566 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9" podUID="eaa17ea4-444a-410e-b077-79a2168b8f71" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 09:33:17.884866 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:33:17.884826 2566 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9" podUID="eaa17ea4-444a-410e-b077-79a2168b8f71" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 09:33:17.885228 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:33:17.884899 2566 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9" Apr 23 09:33:17.885388 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:33:17.885350 2566 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"fa1e2091d3d6792945c95ffe7a9328c060807d9ad870a07432269d275a8ebc43"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 23 09:33:17.885439 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:33:17.885410 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9" podUID="eaa17ea4-444a-410e-b077-79a2168b8f71" containerName="service-proxy" containerID="cri-o://fa1e2091d3d6792945c95ffe7a9328c060807d9ad870a07432269d275a8ebc43" gracePeriod=30 Apr 23 09:33:18.884460 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:33:18.884428 2566 generic.go:358] "Generic (PLEG): container finished" podID="eaa17ea4-444a-410e-b077-79a2168b8f71" containerID="fa1e2091d3d6792945c95ffe7a9328c060807d9ad870a07432269d275a8ebc43" exitCode=2 Apr 23 09:33:18.884618 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:33:18.884503 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9" event={"ID":"eaa17ea4-444a-410e-b077-79a2168b8f71","Type":"ContainerDied","Data":"fa1e2091d3d6792945c95ffe7a9328c060807d9ad870a07432269d275a8ebc43"} Apr 23 09:33:18.884618 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:33:18.884542 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6b858989cb-mf2j9" event={"ID":"eaa17ea4-444a-410e-b077-79a2168b8f71","Type":"ContainerStarted","Data":"4437faeb4a887cd114519bb837076ceb052f7eb0890a725ffdb9869b967b5681"} Apr 23 09:35:35.082466 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:35:35.082433 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bpp9r_e3b73d76-bb0a-4807-ba45-02945da31336/console-operator/1.log" Apr 23 09:35:35.082984 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:35:35.082583 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bpp9r_e3b73d76-bb0a-4807-ba45-02945da31336/console-operator/1.log" Apr 23 09:35:35.091776 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:35:35.091756 2566 kubelet.go:1628] "Image garbage collection succeeded" Apr 23 09:40:35.102345 ip-10-0-136-17 
kubenswrapper[2566]: I0423 09:40:35.102285 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bpp9r_e3b73d76-bb0a-4807-ba45-02945da31336/console-operator/1.log" Apr 23 09:40:35.103486 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:40:35.103464 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bpp9r_e3b73d76-bb0a-4807-ba45-02945da31336/console-operator/1.log" Apr 23 09:41:24.547120 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:41:24.547084 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-fcqc9/progression-custom-config-node-0-0-klxdr"] Apr 23 09:41:24.549363 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:41:24.547408 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="164f4228-f916-4e95-9a43-29cf40bacf4a" containerName="registry" Apr 23 09:41:24.549363 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:41:24.547422 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="164f4228-f916-4e95-9a43-29cf40bacf4a" containerName="registry" Apr 23 09:41:24.549363 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:41:24.547487 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="164f4228-f916-4e95-9a43-29cf40bacf4a" containerName="registry" Apr 23 09:41:24.550221 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:41:24.550205 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-fcqc9/progression-custom-config-node-0-0-klxdr" Apr 23 09:41:24.553006 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:41:24.552985 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-fcqc9\"/\"default-dockercfg-zg6nr\"" Apr 23 09:41:24.553140 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:41:24.552983 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-fcqc9\"/\"openshift-service-ca.crt\"" Apr 23 09:41:24.553140 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:41:24.553041 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-fcqc9\"/\"kube-root-ca.crt\"" Apr 23 09:41:24.557784 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:41:24.557625 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-fcqc9/progression-custom-config-node-0-0-klxdr"] Apr 23 09:41:24.662492 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:41:24.662462 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs2bt\" (UniqueName: \"kubernetes.io/projected/5769fa6e-646b-4cdf-a9ce-c9b87edca603-kube-api-access-cs2bt\") pod \"progression-custom-config-node-0-0-klxdr\" (UID: \"5769fa6e-646b-4cdf-a9ce-c9b87edca603\") " pod="rhai-e2e-progression-fcqc9/progression-custom-config-node-0-0-klxdr" Apr 23 09:41:24.762890 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:41:24.762856 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cs2bt\" (UniqueName: \"kubernetes.io/projected/5769fa6e-646b-4cdf-a9ce-c9b87edca603-kube-api-access-cs2bt\") pod \"progression-custom-config-node-0-0-klxdr\" (UID: \"5769fa6e-646b-4cdf-a9ce-c9b87edca603\") " pod="rhai-e2e-progression-fcqc9/progression-custom-config-node-0-0-klxdr" Apr 23 09:41:24.771730 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:41:24.771705 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-cs2bt\" (UniqueName: \"kubernetes.io/projected/5769fa6e-646b-4cdf-a9ce-c9b87edca603-kube-api-access-cs2bt\") pod \"progression-custom-config-node-0-0-klxdr\" (UID: \"5769fa6e-646b-4cdf-a9ce-c9b87edca603\") " pod="rhai-e2e-progression-fcqc9/progression-custom-config-node-0-0-klxdr" Apr 23 09:41:24.860644 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:41:24.860588 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-fcqc9/progression-custom-config-node-0-0-klxdr" Apr 23 09:41:25.008406 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:41:25.008381 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-fcqc9/progression-custom-config-node-0-0-klxdr"] Apr 23 09:41:25.010677 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:41:25.010653 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5769fa6e_646b_4cdf_a9ce_c9b87edca603.slice/crio-8abd1ecb6942b680d373dca10a376b56f94336cafe37618d10c6583f9f44b555 WatchSource:0}: Error finding container 8abd1ecb6942b680d373dca10a376b56f94336cafe37618d10c6583f9f44b555: Status 404 returned error can't find the container with id 8abd1ecb6942b680d373dca10a376b56f94336cafe37618d10c6583f9f44b555 Apr 23 09:41:25.012668 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:41:25.012644 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 09:41:25.246438 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:41:25.246371 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fcqc9/progression-custom-config-node-0-0-klxdr" event={"ID":"5769fa6e-646b-4cdf-a9ce-c9b87edca603","Type":"ContainerStarted","Data":"8abd1ecb6942b680d373dca10a376b56f94336cafe37618d10c6583f9f44b555"} Apr 23 09:43:10.564664 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:43:10.564625 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fcqc9/progression-custom-config-node-0-0-klxdr" event={"ID":"5769fa6e-646b-4cdf-a9ce-c9b87edca603","Type":"ContainerStarted","Data":"3b43da235c50b962a243d45f1da6f2d8c648979cdae2d00349814e24eee86289"} Apr 23 09:43:10.565165 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:43:10.564705 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-fcqc9/progression-custom-config-node-0-0-klxdr" Apr 23 09:43:10.620339 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:43:10.620270 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-fcqc9/progression-custom-config-node-0-0-klxdr" podStartSLOduration=1.421988422 podStartE2EDuration="1m46.620245041s" podCreationTimestamp="2026-04-23 09:41:24 +0000 UTC" firstStartedPulling="2026-04-23 09:41:25.012778361 +0000 UTC m=+650.457840663" lastFinishedPulling="2026-04-23 09:43:10.211034975 +0000 UTC m=+755.656097282" observedRunningTime="2026-04-23 09:43:10.617956482 +0000 UTC m=+756.063018807" watchObservedRunningTime="2026-04-23 09:43:10.620245041 +0000 UTC m=+756.065307345" Apr 23 09:43:12.570948 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:43:12.570922 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-fcqc9/progression-custom-config-node-0-0-klxdr" Apr 23 09:43:33.568791 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:43:33.568753 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-fcqc9/progression-custom-config-node-0-0-klxdr" 
podUID="5769fa6e-646b-4cdf-a9ce-c9b87edca603" containerName="node" probeResult="failure" output="Get \"http://10.132.0.23:28080/metrics\": dial tcp 10.132.0.23:28080: connect: connection refused" Apr 23 09:43:33.629869 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:43:33.629809 2566 generic.go:358] "Generic (PLEG): container finished" podID="5769fa6e-646b-4cdf-a9ce-c9b87edca603" containerID="3b43da235c50b962a243d45f1da6f2d8c648979cdae2d00349814e24eee86289" exitCode=0 Apr 23 09:43:33.629969 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:43:33.629881 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fcqc9/progression-custom-config-node-0-0-klxdr" event={"ID":"5769fa6e-646b-4cdf-a9ce-c9b87edca603","Type":"ContainerDied","Data":"3b43da235c50b962a243d45f1da6f2d8c648979cdae2d00349814e24eee86289"} Apr 23 09:43:34.755955 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:43:34.755925 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-fcqc9/progression-custom-config-node-0-0-klxdr" Apr 23 09:43:34.762352 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:43:34.762330 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cs2bt\" (UniqueName: \"kubernetes.io/projected/5769fa6e-646b-4cdf-a9ce-c9b87edca603-kube-api-access-cs2bt\") pod \"5769fa6e-646b-4cdf-a9ce-c9b87edca603\" (UID: \"5769fa6e-646b-4cdf-a9ce-c9b87edca603\") " Apr 23 09:43:34.764257 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:43:34.764225 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5769fa6e-646b-4cdf-a9ce-c9b87edca603-kube-api-access-cs2bt" (OuterVolumeSpecName: "kube-api-access-cs2bt") pod "5769fa6e-646b-4cdf-a9ce-c9b87edca603" (UID: "5769fa6e-646b-4cdf-a9ce-c9b87edca603"). InnerVolumeSpecName "kube-api-access-cs2bt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 09:43:34.863580 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:43:34.863551 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cs2bt\" (UniqueName: \"kubernetes.io/projected/5769fa6e-646b-4cdf-a9ce-c9b87edca603-kube-api-access-cs2bt\") on node \"ip-10-0-136-17.ec2.internal\" DevicePath \"\"" Apr 23 09:43:35.637253 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:43:35.637221 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="rhai-e2e-progression-fcqc9/progression-custom-config-node-0-0-klxdr" Apr 23 09:43:35.637464 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:43:35.637218 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-fcqc9/progression-custom-config-node-0-0-klxdr" event={"ID":"5769fa6e-646b-4cdf-a9ce-c9b87edca603","Type":"ContainerDied","Data":"8abd1ecb6942b680d373dca10a376b56f94336cafe37618d10c6583f9f44b555"} Apr 23 09:43:35.637464 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:43:35.637326 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8abd1ecb6942b680d373dca10a376b56f94336cafe37618d10c6583f9f44b555" Apr 23 09:43:42.216350 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:43:42.216318 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-fcqc9/progression-custom-config-node-0-0-klxdr"] Apr 23 09:43:42.225594 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:43:42.225559 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-fcqc9/progression-custom-config-node-0-0-klxdr"] Apr 23 09:43:43.178673 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:43:43.178637 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5769fa6e-646b-4cdf-a9ce-c9b87edca603" path="/var/lib/kubelet/pods/5769fa6e-646b-4cdf-a9ce-c9b87edca603/volumes" Apr 23 09:44:31.262677 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:31.262647 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mpzrx/must-gather-bdblt"] Apr 23 09:44:31.263118 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:31.262903 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5769fa6e-646b-4cdf-a9ce-c9b87edca603" containerName="node" Apr 23 09:44:31.263118 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:31.262913 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="5769fa6e-646b-4cdf-a9ce-c9b87edca603" containerName="node" Apr 23 09:44:31.263118 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:31.262965 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="5769fa6e-646b-4cdf-a9ce-c9b87edca603" containerName="node" Apr 23 09:44:31.265829 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:31.265813 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mpzrx/must-gather-bdblt" Apr 23 09:44:31.268318 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:31.268272 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mpzrx\"/\"kube-root-ca.crt\"" Apr 23 09:44:31.268445 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:31.268320 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mpzrx\"/\"openshift-service-ca.crt\"" Apr 23 09:44:31.268445 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:31.268335 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-mpzrx\"/\"default-dockercfg-xkkx2\"" Apr 23 09:44:31.275683 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:31.275660 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mpzrx/must-gather-bdblt"] Apr 23 09:44:31.355347 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:31.355321 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxtht\" (UniqueName: \"kubernetes.io/projected/116320b2-752d-4314-937c-cd2c2c497b42-kube-api-access-rxtht\") pod \"must-gather-bdblt\" (UID: \"116320b2-752d-4314-937c-cd2c2c497b42\") " pod="openshift-must-gather-mpzrx/must-gather-bdblt" Apr 23 09:44:31.355455 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:31.355375 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/116320b2-752d-4314-937c-cd2c2c497b42-must-gather-output\") pod \"must-gather-bdblt\" (UID: \"116320b2-752d-4314-937c-cd2c2c497b42\") " pod="openshift-must-gather-mpzrx/must-gather-bdblt" Apr 23 09:44:31.455655 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:31.455633 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxtht\" (UniqueName: \"kubernetes.io/projected/116320b2-752d-4314-937c-cd2c2c497b42-kube-api-access-rxtht\") pod \"must-gather-bdblt\" (UID: \"116320b2-752d-4314-937c-cd2c2c497b42\") " pod="openshift-must-gather-mpzrx/must-gather-bdblt" Apr 23 09:44:31.455745 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:31.455680 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/116320b2-752d-4314-937c-cd2c2c497b42-must-gather-output\") pod \"must-gather-bdblt\" (UID: \"116320b2-752d-4314-937c-cd2c2c497b42\") " pod="openshift-must-gather-mpzrx/must-gather-bdblt" Apr 23 09:44:31.455982 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:31.455966 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/116320b2-752d-4314-937c-cd2c2c497b42-must-gather-output\") pod \"must-gather-bdblt\" (UID: \"116320b2-752d-4314-937c-cd2c2c497b42\") " pod="openshift-must-gather-mpzrx/must-gather-bdblt" Apr 23 09:44:31.462948 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:31.462930 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxtht\" (UniqueName: \"kubernetes.io/projected/116320b2-752d-4314-937c-cd2c2c497b42-kube-api-access-rxtht\") pod \"must-gather-bdblt\" (UID: \"116320b2-752d-4314-937c-cd2c2c497b42\") " pod="openshift-must-gather-mpzrx/must-gather-bdblt" Apr 23 09:44:31.574812 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:31.574793 2566 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-must-gather-mpzrx/must-gather-bdblt" Apr 23 09:44:31.688859 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:31.688830 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mpzrx/must-gather-bdblt"] Apr 23 09:44:31.691898 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:44:31.691871 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod116320b2_752d_4314_937c_cd2c2c497b42.slice/crio-87e4ab766f446cb8c1ea9c3570e3994c3d5a6229bef5fd15434446ae056c741a WatchSource:0}: Error finding container 87e4ab766f446cb8c1ea9c3570e3994c3d5a6229bef5fd15434446ae056c741a: Status 404 returned error can't find the container with id 87e4ab766f446cb8c1ea9c3570e3994c3d5a6229bef5fd15434446ae056c741a Apr 23 09:44:31.799760 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:31.799733 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mpzrx/must-gather-bdblt" event={"ID":"116320b2-752d-4314-937c-cd2c2c497b42","Type":"ContainerStarted","Data":"87e4ab766f446cb8c1ea9c3570e3994c3d5a6229bef5fd15434446ae056c741a"} Apr 23 09:44:33.808954 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:33.808845 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mpzrx/must-gather-bdblt" event={"ID":"116320b2-752d-4314-937c-cd2c2c497b42","Type":"ContainerStarted","Data":"7a3c23c59350929e8ffc69866f99c55c1a81f97e69b2b1bd0cafdd7ae3a5e17f"} Apr 23 09:44:33.808954 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:33.808892 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mpzrx/must-gather-bdblt" event={"ID":"116320b2-752d-4314-937c-cd2c2c497b42","Type":"ContainerStarted","Data":"210a17b1ff6b0a5d361e6620e4f7bafbbdf6b1e726881c3c3042527ee5800136"} Apr 23 09:44:33.826317 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:33.826244 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mpzrx/must-gather-bdblt" podStartSLOduration=1.639826669 podStartE2EDuration="2.826216697s" podCreationTimestamp="2026-04-23 09:44:31 +0000 UTC" firstStartedPulling="2026-04-23 09:44:31.69359779 +0000 UTC m=+837.138660091" lastFinishedPulling="2026-04-23 09:44:32.879987801 +0000 UTC m=+838.325050119" observedRunningTime="2026-04-23 09:44:33.825120868 +0000 UTC m=+839.270183192" watchObservedRunningTime="2026-04-23 09:44:33.826216697 +0000 UTC m=+839.271279020" Apr 23 09:44:34.231281 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:34.231197 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-459kk_05ab6e35-f200-483a-b4f6-fee2629df7f2/global-pull-secret-syncer/0.log" Apr 23 09:44:34.356510 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:34.356458 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-57rnm_9461f0b4-eff2-4028-be01-d417107cb9a8/konnectivity-agent/0.log" Apr 23 09:44:34.456220 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:34.456184 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-136-17.ec2.internal_dde61d294f6e26e3e31c5e2576f80003/haproxy/0.log" Apr 23 09:44:37.518650 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:37.518617 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-zxwg9_fe118deb-6994-428b-9927-b07aafd254b6/cluster-monitoring-operator/0.log" Apr 23 
Apr 23 09:44:37.744704 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:37.744626 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9mn8q_5f37820f-ab4f-473f-8b88-2d261fea31a3/node-exporter/0.log"
Apr 23 09:44:37.776071 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:37.776041 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9mn8q_5f37820f-ab4f-473f-8b88-2d261fea31a3/kube-rbac-proxy/0.log"
Apr 23 09:44:37.798275 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:37.798251 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-9mn8q_5f37820f-ab4f-473f-8b88-2d261fea31a3/init-textfile/0.log"
Apr 23 09:44:39.459226 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:39.459194 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-vqvls_5f78147d-0dbc-471a-a1c9-05e2a3cb5333/networking-console-plugin/0.log"
Apr 23 09:44:39.850800 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:39.850775 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bpp9r_e3b73d76-bb0a-4807-ba45-02945da31336/console-operator/1.log"
Apr 23 09:44:39.861615 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:39.861584 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bpp9r_e3b73d76-bb0a-4807-ba45-02945da31336/console-operator/2.log"
Apr 23 09:44:40.593549 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:40.593522 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-nxsf6_d01a5541-6b84-4253-a5a5-e0d82b86f84d/volume-data-source-validator/0.log"
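Each file in these "Finished parsing log file" entries is stored in the CRI container log format: one line per write, <RFC3339Nano timestamp> <stream> <tag> <message>, where the tag is F for a full line and P for a partial one that the runtime split. A minimal parser for that format; the format itself is the standard CRI one, while the type and function names here are illustrative:

    package main

    import (
        "fmt"
        "strings"
        "time"
    )

    // criLogLine holds one parsed entry from a /var/log/pods/.../N.log file.
    type criLogLine struct {
        When    time.Time
        Stream  string // "stdout" or "stderr"
        Partial bool   // tag "P": the runtime split a long line across entries
        Message string
    }

    func parseCRILine(line string) (criLogLine, error) {
        // Format: 2026-04-23T09:44:31.123456789Z stdout F some message text
        parts := strings.SplitN(line, " ", 4)
        if len(parts) != 4 {
            return criLogLine{}, fmt.Errorf("malformed CRI log line: %q", line)
        }
        ts, err := time.Parse(time.RFC3339Nano, parts[0])
        if err != nil {
            return criLogLine{}, err
        }
        return criLogLine{When: ts, Stream: parts[1], Partial: parts[2] == "P", Message: parts[3]}, nil
    }

    func main() {
        l, err := parseCRILine("2026-04-23T09:44:31.123456789Z stdout F hello from a container")
        if err != nil {
            panic(err)
        }
        fmt.Printf("%s [%s] %s\n", l.When.Format(time.RFC3339), l.Stream, l.Message)
    }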
Need to start a new one" pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-s6djz" Apr 23 09:44:41.123200 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:41.123177 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mpzrx/perf-node-gather-daemonset-s6djz"] Apr 23 09:44:41.206459 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:41.206431 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-2mjdq_6b8bf9f1-73a1-4060-b09a-55a78aed4470/dns/0.log" Apr 23 09:44:41.226225 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:41.226197 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-2mjdq_6b8bf9f1-73a1-4060-b09a-55a78aed4470/kube-rbac-proxy/0.log" Apr 23 09:44:41.241990 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:41.241958 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/20fc36fc-31c4-4515-81f8-df47d1a0570f-podres\") pod \"perf-node-gather-daemonset-s6djz\" (UID: \"20fc36fc-31c4-4515-81f8-df47d1a0570f\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-s6djz" Apr 23 09:44:41.242097 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:41.242006 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/20fc36fc-31c4-4515-81f8-df47d1a0570f-sys\") pod \"perf-node-gather-daemonset-s6djz\" (UID: \"20fc36fc-31c4-4515-81f8-df47d1a0570f\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-s6djz" Apr 23 09:44:41.242097 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:41.242029 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmhlx\" (UniqueName: \"kubernetes.io/projected/20fc36fc-31c4-4515-81f8-df47d1a0570f-kube-api-access-kmhlx\") pod \"perf-node-gather-daemonset-s6djz\" (UID: \"20fc36fc-31c4-4515-81f8-df47d1a0570f\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-s6djz" Apr 23 09:44:41.242192 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:41.242111 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/20fc36fc-31c4-4515-81f8-df47d1a0570f-lib-modules\") pod \"perf-node-gather-daemonset-s6djz\" (UID: \"20fc36fc-31c4-4515-81f8-df47d1a0570f\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-s6djz" Apr 23 09:44:41.242192 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:41.242170 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/20fc36fc-31c4-4515-81f8-df47d1a0570f-proc\") pod \"perf-node-gather-daemonset-s6djz\" (UID: \"20fc36fc-31c4-4515-81f8-df47d1a0570f\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-s6djz" Apr 23 09:44:41.342631 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:41.342603 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/20fc36fc-31c4-4515-81f8-df47d1a0570f-sys\") pod \"perf-node-gather-daemonset-s6djz\" (UID: \"20fc36fc-31c4-4515-81f8-df47d1a0570f\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-s6djz" Apr 23 09:44:41.342781 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:41.342650 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kmhlx\" (UniqueName: \"kubernetes.io/projected/20fc36fc-31c4-4515-81f8-df47d1a0570f-kube-api-access-kmhlx\") pod \"perf-node-gather-daemonset-s6djz\" (UID: \"20fc36fc-31c4-4515-81f8-df47d1a0570f\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-s6djz" Apr 23 09:44:41.342781 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:41.342678 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/20fc36fc-31c4-4515-81f8-df47d1a0570f-lib-modules\") pod \"perf-node-gather-daemonset-s6djz\" (UID: \"20fc36fc-31c4-4515-81f8-df47d1a0570f\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-s6djz" Apr 23 09:44:41.342781 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:41.342719 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/20fc36fc-31c4-4515-81f8-df47d1a0570f-proc\") pod \"perf-node-gather-daemonset-s6djz\" (UID: \"20fc36fc-31c4-4515-81f8-df47d1a0570f\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-s6djz" Apr 23 09:44:41.342781 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:41.342726 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/20fc36fc-31c4-4515-81f8-df47d1a0570f-sys\") pod \"perf-node-gather-daemonset-s6djz\" (UID: \"20fc36fc-31c4-4515-81f8-df47d1a0570f\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-s6djz" Apr 23 09:44:41.342948 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:41.342804 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/20fc36fc-31c4-4515-81f8-df47d1a0570f-proc\") pod \"perf-node-gather-daemonset-s6djz\" (UID: \"20fc36fc-31c4-4515-81f8-df47d1a0570f\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-s6djz" Apr 23 09:44:41.342948 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:41.342822 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/20fc36fc-31c4-4515-81f8-df47d1a0570f-podres\") pod \"perf-node-gather-daemonset-s6djz\" (UID: \"20fc36fc-31c4-4515-81f8-df47d1a0570f\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-s6djz" Apr 23 09:44:41.342948 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:41.342857 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/20fc36fc-31c4-4515-81f8-df47d1a0570f-lib-modules\") pod \"perf-node-gather-daemonset-s6djz\" (UID: \"20fc36fc-31c4-4515-81f8-df47d1a0570f\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-s6djz" Apr 23 09:44:41.342948 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:41.342936 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/20fc36fc-31c4-4515-81f8-df47d1a0570f-podres\") pod \"perf-node-gather-daemonset-s6djz\" (UID: \"20fc36fc-31c4-4515-81f8-df47d1a0570f\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-s6djz" Apr 23 09:44:41.347345 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:41.347325 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-k5h5x_aac600fc-205b-40e3-aa53-d90a5274d7e4/dns-node-resolver/0.log" Apr 23 09:44:41.349606 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:41.349582 2566 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-kmhlx\" (UniqueName: \"kubernetes.io/projected/20fc36fc-31c4-4515-81f8-df47d1a0570f-kube-api-access-kmhlx\") pod \"perf-node-gather-daemonset-s6djz\" (UID: \"20fc36fc-31c4-4515-81f8-df47d1a0570f\") " pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-s6djz" Apr 23 09:44:41.432183 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:41.432110 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-s6djz" Apr 23 09:44:41.569018 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:41.568912 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mpzrx/perf-node-gather-daemonset-s6djz"] Apr 23 09:44:41.571499 ip-10-0-136-17 kubenswrapper[2566]: W0423 09:44:41.571469 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod20fc36fc_31c4_4515_81f8_df47d1a0570f.slice/crio-caefc8ba934901cf48e8f84a6c604d5621d4fcc5fd87bebce010aaf6fd35b9b8 WatchSource:0}: Error finding container caefc8ba934901cf48e8f84a6c604d5621d4fcc5fd87bebce010aaf6fd35b9b8: Status 404 returned error can't find the container with id caefc8ba934901cf48e8f84a6c604d5621d4fcc5fd87bebce010aaf6fd35b9b8 Apr 23 09:44:41.779888 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:41.779863 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-nd7df_1793a43e-b740-4967-92f1-e3ecda36b452/node-ca/0.log" Apr 23 09:44:41.839093 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:41.838604 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-s6djz" event={"ID":"20fc36fc-31c4-4515-81f8-df47d1a0570f","Type":"ContainerStarted","Data":"499b6be50c9a2c9c7e7f6ae4305a168fdfdabdc002f5965733c86e72912775b4"} Apr 23 09:44:41.839093 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:41.838646 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-s6djz" event={"ID":"20fc36fc-31c4-4515-81f8-df47d1a0570f","Type":"ContainerStarted","Data":"caefc8ba934901cf48e8f84a6c604d5621d4fcc5fd87bebce010aaf6fd35b9b8"} Apr 23 09:44:41.839274 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:41.839098 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-s6djz" Apr 23 09:44:41.856394 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:41.856355 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-s6djz" podStartSLOduration=0.856341493 podStartE2EDuration="856.341493ms" podCreationTimestamp="2026-04-23 09:44:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 09:44:41.855658526 +0000 UTC m=+847.300720843" watchObservedRunningTime="2026-04-23 09:44:41.856341493 +0000 UTC m=+847.301403815" Apr 23 09:44:42.444696 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:42.444665 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-86d78b875b-rbksj_91f0276c-d1f7-4977-a25e-107ac0756380/router/0.log" Apr 23 09:44:42.733592 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:42.733526 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-rj967_30c9a344-56d8-4716-84da-8665f9ee4946/serve-healthcheck-canary/0.log" Apr 23 
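Unlike the must-gather pod, the perf-node-gather daemonset pod mounts hostPath volumes (proc, sys, lib-modules, podres) so it can sample node-level state; and since no image pull was needed (firstStartedPulling is the zero time), its E2E and SLO startup durations coincide at 856ms. A sketch of a hostPath stanza consistent with the mount events above; the volume names come from the log, while the host paths are guesses:

    package main

    import (
        "fmt"

        corev1 "k8s.io/api/core/v1"
    )

    // Host paths are assumptions based on the volume names; only the names
    // themselves appear in the log.
    func perfGatherHostPaths() []corev1.Volume {
        hostPath := func(name, path string) corev1.Volume {
            return corev1.Volume{
                Name:         name,
                VolumeSource: corev1.VolumeSource{HostPath: &corev1.HostPathVolumeSource{Path: path}},
            }
        }
        return []corev1.Volume{
            hostPath("proc", "/proc"),                            // process and kernel counters
            hostPath("sys", "/sys"),                              // sysfs: topology, IRQs, etc.
            hostPath("lib-modules", "/lib/modules"),              // kernel modules
            hostPath("podres", "/var/lib/kubelet/pod-resources"), // kubelet PodResources socket (assumed)
        }
    }

    func main() {
        for _, v := range perfGatherHostPaths() {
            fmt.Println(v.Name, "->", v.HostPath.Path)
        }
    }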
Apr 23 09:44:43.087582 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:43.087554 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-zllgx_bfdda918-82b7-430c-9e8e-f2555930fa85/insights-operator/1.log"
Apr 23 09:44:43.088067 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:43.087818 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-zllgx_bfdda918-82b7-430c-9e8e-f2555930fa85/insights-operator/0.log"
Apr 23 09:44:43.163061 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:43.163028 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gwh72_e5fae73e-630f-4593-a153-91162968e0fd/kube-rbac-proxy/0.log"
Apr 23 09:44:43.181273 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:43.181253 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gwh72_e5fae73e-630f-4593-a153-91162968e0fd/exporter/0.log"
Apr 23 09:44:43.204035 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:43.204011 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gwh72_e5fae73e-630f-4593-a153-91162968e0fd/extractor/0.log"
Apr 23 09:44:47.853815 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:47.853785 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-mpzrx/perf-node-gather-daemonset-s6djz"
Apr 23 09:44:48.037404 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:48.037371 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-qbzv8_541d1fdf-75b3-4058-97be-3140f4c7fdb2/kube-storage-version-migrator-operator/1.log"
Apr 23 09:44:48.038408 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:48.038375 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-qbzv8_541d1fdf-75b3-4058-97be-3140f4c7fdb2/kube-storage-version-migrator-operator/0.log"
Apr 23 09:44:49.107906 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:49.107876 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wkpk8_495c169c-a1fe-4740-b5e4-88f23ef7e5d0/kube-multus-additional-cni-plugins/0.log"
Apr 23 09:44:49.127847 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:49.127816 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wkpk8_495c169c-a1fe-4740-b5e4-88f23ef7e5d0/egress-router-binary-copy/0.log"
Apr 23 09:44:49.145861 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:49.145842 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wkpk8_495c169c-a1fe-4740-b5e4-88f23ef7e5d0/cni-plugins/0.log"
Apr 23 09:44:49.166177 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:49.166147 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wkpk8_495c169c-a1fe-4740-b5e4-88f23ef7e5d0/bond-cni-plugin/0.log"
Apr 23 09:44:49.185408 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:49.185386 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wkpk8_495c169c-a1fe-4740-b5e4-88f23ef7e5d0/routeoverride-cni/0.log"
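The paths in all of these entries follow the kubelet layout /var/log/pods/<namespace>_<pod-name>_<pod-uid>/<container>/<restart-count>.log, which is why insights-operator and console-operator show a 1.log (and 2.log) next to 0.log: the numeric suffix is the container's restart count, not a size-based rotation index. A small helper that rebuilds such a path; the layout is the standard kubelet convention, while the helper itself is illustrative:

    package main

    import (
        "fmt"
        "path/filepath"
    )

    // podLogPath builds the kubelet's on-disk log path for one container instance.
    func podLogPath(namespace, podName, podUID, container string, restartCount int) string {
        return filepath.Join("/var/log/pods",
            fmt.Sprintf("%s_%s_%s", namespace, podName, podUID),
            container,
            fmt.Sprintf("%d.log", restartCount))
    }

    func main() {
        // Reproduces one of the paths from the entries above.
        fmt.Println(podLogPath("openshift-insights", "insights-operator-585dfdc468-zllgx",
            "bfdda918-82b7-430c-9e8e-f2555930fa85", "insights-operator", 1))
    }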
Apr 23 09:44:49.204703 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:49.204681 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wkpk8_495c169c-a1fe-4740-b5e4-88f23ef7e5d0/whereabouts-cni-bincopy/0.log"
Apr 23 09:44:49.223914 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:49.223888 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wkpk8_495c169c-a1fe-4740-b5e4-88f23ef7e5d0/whereabouts-cni/0.log"
Apr 23 09:44:49.251530 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:49.251498 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-nblz4_23461d04-1ac2-4f06-bec9-42875ddaa8aa/kube-multus/0.log"
Apr 23 09:44:49.325019 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:49.324989 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6528f_c171e7cd-9c69-4ef7-9012-fad9d2b17a46/network-metrics-daemon/0.log"
Apr 23 09:44:49.345082 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:49.345036 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6528f_c171e7cd-9c69-4ef7-9012-fad9d2b17a46/kube-rbac-proxy/0.log"
Apr 23 09:44:50.569068 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:50.569039 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pf5hs_cbbe5fc8-e02f-40b7-8e74-ca04c14c0260/ovn-controller/0.log"
Apr 23 09:44:50.609650 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:50.609616 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pf5hs_cbbe5fc8-e02f-40b7-8e74-ca04c14c0260/ovn-acl-logging/0.log"
Apr 23 09:44:50.632862 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:50.632827 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pf5hs_cbbe5fc8-e02f-40b7-8e74-ca04c14c0260/kube-rbac-proxy-node/0.log"
Apr 23 09:44:50.656863 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:50.656833 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pf5hs_cbbe5fc8-e02f-40b7-8e74-ca04c14c0260/kube-rbac-proxy-ovn-metrics/0.log"
Apr 23 09:44:50.676704 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:50.676673 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pf5hs_cbbe5fc8-e02f-40b7-8e74-ca04c14c0260/northd/0.log"
Apr 23 09:44:50.695788 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:50.695768 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pf5hs_cbbe5fc8-e02f-40b7-8e74-ca04c14c0260/nbdb/0.log"
Apr 23 09:44:50.716237 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:50.716185 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pf5hs_cbbe5fc8-e02f-40b7-8e74-ca04c14c0260/sbdb/0.log"
Apr 23 09:44:50.823524 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:50.823491 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-pf5hs_cbbe5fc8-e02f-40b7-8e74-ca04c14c0260/ovnkube-controller/0.log"
Apr 23 09:44:52.102580 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:52.102502 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-zvsn4_b2f33469-6027-43ec-be9b-4bebb5500631/check-endpoints/0.log"
"Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-sb8zp_11f39ddc-72f6-4699-8329-bbb34ab9a9f0/network-check-target-container/0.log" Apr 23 09:44:53.011692 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:53.011666 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-tdc8p_70691076-656b-493e-863d-2a367fb8eccf/iptables-alerter/0.log" Apr 23 09:44:53.561561 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:53.561531 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-rn5xp_50de6365-8798-4807-b392-6cc780c49635/tuned/0.log" Apr 23 09:44:55.231696 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:55.231671 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-ftn6j_25c6abb3-0bf5-4a51-9450-a29341379573/cluster-samples-operator/0.log" Apr 23 09:44:55.245781 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:55.245760 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-ftn6j_25c6abb3-0bf5-4a51-9450-a29341379573/cluster-samples-operator-watch/0.log" Apr 23 09:44:56.065143 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:56.065091 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-n72bx_c46256a3-d81a-45a4-b658-3a9db962e17a/service-ca-operator/1.log" Apr 23 09:44:56.066116 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:56.066094 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-n72bx_c46256a3-d81a-45a4-b658-3a9db962e17a/service-ca-operator/0.log" Apr 23 09:44:56.741158 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:56.741132 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-mjhnv_b967352d-3f1f-4e67-b468-5d9326f772ea/csi-driver/0.log" Apr 23 09:44:56.759278 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:56.759256 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-mjhnv_b967352d-3f1f-4e67-b468-5d9326f772ea/csi-node-driver-registrar/0.log" Apr 23 09:44:56.778020 ip-10-0-136-17 kubenswrapper[2566]: I0423 09:44:56.777995 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-mjhnv_b967352d-3f1f-4e67-b468-5d9326f772ea/csi-liveness-probe/0.log"