Apr 24 21:27:02.606361 ip-10-0-136-201 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 21:27:03.113773 ip-10-0-136-201 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:27:03.113773 ip-10-0-136-201 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 21:27:03.113773 ip-10-0-136-201 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:27:03.113773 ip-10-0-136-201 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 21:27:03.113773 ip-10-0-136-201 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
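Every deprecation warning above points at the kubelet config file referenced by --config (shown later in this log as /etc/kubernetes/kubelet.conf). A minimal sketch of moving the affected flags into that file might look like the following; all values are illustrative placeholders, not taken from this node:

```yaml
# Sketch only: KubeletConfiguration fields replacing the deprecated flags.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved (example reservation, adjust per node size)
systemReserved:
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration is superseded by eviction settings
evictionHard:
  memory.available: 100Mi
```

On OpenShift, this file is rendered by the Machine Config Operator rather than edited by hand, so treat the snippet as a reference for the flag-to-field mapping only.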
Apr 24 21:27:03.115768 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.115676 2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 21:27:03.121106 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121079 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:03.121106 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121107 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:03.121180 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121111 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:03.121180 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121115 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:03.121180 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121118 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:03.121180 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121121 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:03.121180 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121124 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:03.121180 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121126 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:03.121180 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121130 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:03.121180 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121133 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:03.121180 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121136 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:03.121180 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121140 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:03.121180 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121143 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:03.121180 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121145 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:03.121180 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121154 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:03.121180 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121157 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:03.121180 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121160 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:03.121180 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121162 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:03.121180 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121165 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:03.121180 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121168 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:03.121180 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121170 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:03.121180 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121173 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:03.121662 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121175 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:03.121662 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121179 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:03.121662 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121182 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:03.121662 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121184 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:03.121662 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121187 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:03.121662 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121190 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:03.121662 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121193 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:03.121662 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121196 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:03.121662 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121198 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:03.121662 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121201 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:03.121662 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121203 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:03.121662 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121206 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:03.121662 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121208 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:03.121662 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121211 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:03.121662 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121214 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:03.121662 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121216 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:03.121662 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121218 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:03.121662 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121221 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:03.121662 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121224 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:03.121662 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121229 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:03.122161 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121233 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:03.122161 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121235 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:03.122161 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121239 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:27:03.122161 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121241 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:03.122161 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121244 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:03.122161 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121248 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:03.122161 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121250 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:03.122161 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121253 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:03.122161 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121256 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:03.122161 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121259 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:03.122161 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121261 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:03.122161 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121264 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:03.122161 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121266 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:03.122161 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121270 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:03.122161 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121272 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:03.122161 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121275 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:03.122161 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121278 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:03.122161 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121282 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:27:03.122161 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121286 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:03.122697 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121289 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:03.122697 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121291 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:03.122697 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121294 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:03.122697 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121297 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:03.122697 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121299 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:03.122697 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121303 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:03.122697 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121305 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:03.122697 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121308 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:03.122697 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121310 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:03.122697 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121313 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:03.122697 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121315 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:03.122697 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121318 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:03.122697 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121320 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:03.122697 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121323 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:03.122697 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121325 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:03.122697 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121328 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:03.122697 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121330 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:03.122697 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121333 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:03.122697 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121335 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:03.122697 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121338 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:03.123187 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121340 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:03.123187 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121343 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:03.123187 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121345 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:03.123187 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121348 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:03.123187 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.121350 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:03.123187 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122541 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:03.123187 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122548 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:03.123187 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122551 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:03.123187 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122553 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:03.123187 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122557 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:03.123187 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122559 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:03.123187 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122562 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:03.123187 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122564 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:03.123187 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122568 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:03.123187 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122570 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:03.123187 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122573 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:03.123187 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122576 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:03.123187 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122583 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:03.123187 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122586 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:03.123187 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122589 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:03.123656 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122593 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:03.123656 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122596 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:03.123656 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122599 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:03.123656 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122602 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:03.123656 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122605 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:03.123656 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122608 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:03.123656 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122610 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:03.123656 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122613 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:03.123656 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122615 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:03.123656 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122617 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:03.123656 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122620 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:03.123656 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122622 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:03.123656 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122625 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:03.123656 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122627 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:03.123656 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122630 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:03.123656 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122632 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:03.123656 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122635 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:03.123656 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122637 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:03.123656 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122640 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:03.124124 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122642 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:03.124124 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122645 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:03.124124 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122647 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:03.124124 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122649 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:03.124124 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122652 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:03.124124 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122654 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:03.124124 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122658 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:03.124124 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122661 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:03.124124 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122663 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:03.124124 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122666 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:03.124124 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122668 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:03.124124 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122671 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:03.124124 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122673 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:03.124124 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122676 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:03.124124 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122678 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:03.124124 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122681 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:03.124124 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122683 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:03.124124 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122685 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:03.124124 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122688 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:03.124124 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122691 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:03.124644 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122695 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:27:03.124644 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122698 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:03.124644 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122701 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:03.124644 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122703 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:27:03.124644 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122706 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:03.124644 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122709 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:03.124644 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122711 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:03.124644 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122714 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:03.124644 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122716 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:03.124644 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122719 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:03.124644 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122722 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:03.124644 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122724 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:03.124644 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122726 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:03.124644 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122729 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:03.124644 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122731 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:03.124644 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122734 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:03.124644 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122736 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:03.124644 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122739 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:03.124644 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122741 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:03.125148 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122744 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:03.125148 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122746 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:03.125148 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122749 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:03.125148 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122751 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:03.125148 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122754 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:03.125148 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122756 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:03.125148 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122759 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:03.125148 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122761 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:03.125148 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122763 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:03.125148 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122766 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:03.125148 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122769 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:03.125148 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122772 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:03.125148 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.122774 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:03.125148 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122856 2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 21:27:03.125148 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122866 2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 21:27:03.125148 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122876 2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 21:27:03.125148 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122881 2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 21:27:03.125148 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122886 2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 21:27:03.125148 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122889 2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 21:27:03.125148 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122893 2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 21:27:03.125148 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122897 2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 21:27:03.125148 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122901 2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 21:27:03.125696 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122904 2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 21:27:03.125696 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122908 2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 21:27:03.125696 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122911 2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 21:27:03.125696 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122914 2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 21:27:03.125696 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122917 2573 flags.go:64] FLAG: --cgroup-root=""
Apr 24 21:27:03.125696 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122920 2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 21:27:03.125696 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122923 2573 flags.go:64] FLAG: --client-ca-file=""
Apr 24 21:27:03.125696 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122926 2573 flags.go:64] FLAG: --cloud-config=""
Apr 24 21:27:03.125696 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122929 2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 21:27:03.125696 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122932 2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 21:27:03.125696 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122936 2573 flags.go:64] FLAG: --cluster-domain=""
Apr 24 21:27:03.125696 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122939 2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 21:27:03.125696 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122942 2573 flags.go:64] FLAG: --config-dir=""
Apr 24 21:27:03.125696 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122945 2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 21:27:03.125696 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122948 2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 21:27:03.125696 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122952 2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 21:27:03.125696 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122957 2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 21:27:03.125696 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122960 2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 21:27:03.125696 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122963 2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 21:27:03.125696 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122966 2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 21:27:03.125696 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122969 2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 21:27:03.125696 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122972 2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 21:27:03.125696 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122975 2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 21:27:03.125696 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122978 2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 21:27:03.125696 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122982 2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 21:27:03.126314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122985 2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 21:27:03.126314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122988 2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 21:27:03.126314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122991 2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 21:27:03.126314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122994 2573 flags.go:64] FLAG: --enable-server="true"
Apr 24 21:27:03.126314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.122997 2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 21:27:03.126314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123002 2573 flags.go:64] FLAG: --event-burst="100"
Apr 24 21:27:03.126314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123005 2573 flags.go:64] FLAG: --event-qps="50"
Apr 24 21:27:03.126314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123009 2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 21:27:03.126314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123012 2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 21:27:03.126314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123015 2573 flags.go:64] FLAG: --eviction-hard=""
Apr 24 21:27:03.126314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123019 2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 21:27:03.126314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123022 2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 21:27:03.126314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123025 2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 21:27:03.126314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123028 2573 flags.go:64] FLAG: --eviction-soft=""
Apr 24 21:27:03.126314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123031 2573 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 21:27:03.126314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123034 2573 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 21:27:03.126314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123037 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 21:27:03.126314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123040 2573 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 21:27:03.126314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123043 2573 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 21:27:03.126314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123045 2573 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 21:27:03.126314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123048 2573 flags.go:64] FLAG: --feature-gates=""
Apr 24 21:27:03.126314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123052 2573 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 24 21:27:03.126314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123055 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 24 21:27:03.126314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123058 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 24 21:27:03.126314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123062 2573 flags.go:64] FLAG:
--healthz-bind-address="127.0.0.1" Apr 24 21:27:03.126933 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123065 2573 flags.go:64] FLAG: --healthz-port="10248" Apr 24 21:27:03.126933 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123068 2573 flags.go:64] FLAG: --help="false" Apr 24 21:27:03.126933 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123071 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-136-201.ec2.internal" Apr 24 21:27:03.126933 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123074 2573 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 21:27:03.126933 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123077 2573 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 21:27:03.126933 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123080 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 21:27:03.126933 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123084 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 21:27:03.126933 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123087 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 21:27:03.126933 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123102 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 21:27:03.126933 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123105 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 21:27:03.126933 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123108 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 21:27:03.126933 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123111 2573 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 21:27:03.126933 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123114 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 21:27:03.126933 
ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123118 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 21:27:03.126933 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123123 2573 flags.go:64] FLAG: --kube-reserved="" Apr 24 21:27:03.126933 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123126 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 21:27:03.126933 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123129 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 21:27:03.126933 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123132 2573 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 21:27:03.126933 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123135 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 21:27:03.126933 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123138 2573 flags.go:64] FLAG: --lock-file="" Apr 24 21:27:03.126933 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123141 2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 21:27:03.126933 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123144 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 21:27:03.126933 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123147 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 21:27:03.126933 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123153 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 21:27:03.127519 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123156 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 21:27:03.127519 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123159 2573 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 21:27:03.127519 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123162 2573 flags.go:64] FLAG: --logging-format="text" Apr 24 21:27:03.127519 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123164 2573 flags.go:64] FLAG: 
--machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 21:27:03.127519 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123168 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 21:27:03.127519 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123171 2573 flags.go:64] FLAG: --manifest-url="" Apr 24 21:27:03.127519 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123174 2573 flags.go:64] FLAG: --manifest-url-header="" Apr 24 21:27:03.127519 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123178 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 21:27:03.127519 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123181 2573 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 21:27:03.127519 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123186 2573 flags.go:64] FLAG: --max-pods="110" Apr 24 21:27:03.127519 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123189 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 21:27:03.127519 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123192 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 21:27:03.127519 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123195 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 21:27:03.127519 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123198 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 21:27:03.127519 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123203 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 21:27:03.127519 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123206 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 21:27:03.127519 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123209 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 21:27:03.127519 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123217 2573 flags.go:64] FLAG: 
--node-status-max-images="50" Apr 24 21:27:03.127519 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123220 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 21:27:03.127519 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123223 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 21:27:03.127519 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123226 2573 flags.go:64] FLAG: --pod-cidr="" Apr 24 21:27:03.127519 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123229 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 21:27:03.127519 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123236 2573 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 21:27:03.128112 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123238 2573 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 21:27:03.128112 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123241 2573 flags.go:64] FLAG: --pods-per-core="0" Apr 24 21:27:03.128112 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123244 2573 flags.go:64] FLAG: --port="10250" Apr 24 21:27:03.128112 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123247 2573 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 21:27:03.128112 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123250 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-054b555c1b65d4afe" Apr 24 21:27:03.128112 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123253 2573 flags.go:64] FLAG: --qos-reserved="" Apr 24 21:27:03.128112 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123269 2573 flags.go:64] FLAG: --read-only-port="10255" Apr 24 21:27:03.128112 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123273 2573 flags.go:64] FLAG: --register-node="true" Apr 24 21:27:03.128112 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123276 2573 flags.go:64] FLAG: --register-schedulable="true" 
Apr 24 21:27:03.128112 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123279 2573 flags.go:64] FLAG: --register-with-taints="" Apr 24 21:27:03.128112 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123283 2573 flags.go:64] FLAG: --registry-burst="10" Apr 24 21:27:03.128112 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123285 2573 flags.go:64] FLAG: --registry-qps="5" Apr 24 21:27:03.128112 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123288 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 24 21:27:03.128112 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123291 2573 flags.go:64] FLAG: --reserved-memory="" Apr 24 21:27:03.128112 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123295 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 21:27:03.128112 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123298 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 21:27:03.128112 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123301 2573 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 21:27:03.128112 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123304 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 21:27:03.128112 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123307 2573 flags.go:64] FLAG: --runonce="false" Apr 24 21:27:03.128112 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123310 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 21:27:03.128112 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123313 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 21:27:03.128112 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123315 2573 flags.go:64] FLAG: --seccomp-default="false" Apr 24 21:27:03.128112 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123318 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 21:27:03.128112 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123321 2573 flags.go:64] FLAG: 
--storage-driver-buffer-duration="1m0s" Apr 24 21:27:03.128112 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123326 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 21:27:03.128112 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123329 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 21:27:03.128776 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123332 2573 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 21:27:03.128776 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123335 2573 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 21:27:03.128776 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123338 2573 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 21:27:03.128776 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123340 2573 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 21:27:03.128776 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123343 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 21:27:03.128776 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123347 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 21:27:03.128776 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123350 2573 flags.go:64] FLAG: --system-cgroups="" Apr 24 21:27:03.128776 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123353 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 21:27:03.128776 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123359 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 21:27:03.128776 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123362 2573 flags.go:64] FLAG: --tls-cert-file="" Apr 24 21:27:03.128776 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123365 2573 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 21:27:03.128776 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123369 2573 flags.go:64] FLAG: --tls-min-version="" Apr 24 21:27:03.128776 
ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123372 2573 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 21:27:03.128776 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123374 2573 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 21:27:03.128776 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123377 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 21:27:03.128776 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123380 2573 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 21:27:03.128776 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123383 2573 flags.go:64] FLAG: --v="2" Apr 24 21:27:03.128776 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123387 2573 flags.go:64] FLAG: --version="false" Apr 24 21:27:03.128776 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123391 2573 flags.go:64] FLAG: --vmodule="" Apr 24 21:27:03.128776 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123395 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 21:27:03.128776 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123398 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 21:27:03.128776 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123498 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:27:03.128776 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123502 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:27:03.128776 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123505 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:27:03.129400 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123508 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:27:03.129400 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123511 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 
21:27:03.129400 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123514 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:27:03.129400 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123517 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:27:03.129400 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123520 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:27:03.129400 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123522 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:27:03.129400 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123526 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:27:03.129400 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123529 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:27:03.129400 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123532 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:27:03.129400 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123534 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:27:03.129400 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123537 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:27:03.129400 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123539 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:27:03.129400 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123542 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:27:03.129400 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123545 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:27:03.129400 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123548 2573 feature_gate.go:328] 
unrecognized feature gate: MixedCPUsAllocation Apr 24 21:27:03.129400 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123550 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:27:03.129400 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123553 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:27:03.129400 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123555 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:27:03.129400 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123558 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:27:03.129400 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123560 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:27:03.129947 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123563 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:27:03.129947 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123565 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:27:03.129947 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123568 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:27:03.129947 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123570 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:27:03.129947 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123572 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:27:03.129947 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123576 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 21:27:03.129947 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123579 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:27:03.129947 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123582 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:27:03.129947 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123585 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:27:03.129947 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123587 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:27:03.129947 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123590 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:27:03.129947 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123593 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:27:03.129947 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123595 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:27:03.129947 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123597 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:27:03.129947 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123600 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:27:03.129947 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123603 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:27:03.129947 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123606 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:27:03.129947 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123608 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:27:03.129947 ip-10-0-136-201 kubenswrapper[2573]: W0424 
21:27:03.123612 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:27:03.130434 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123614 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:27:03.130434 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123617 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:27:03.130434 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123620 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:27:03.130434 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123622 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:27:03.130434 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123625 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:27:03.130434 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123627 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:27:03.130434 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123630 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:27:03.130434 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123632 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:27:03.130434 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123635 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:27:03.130434 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123637 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:27:03.130434 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123639 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:27:03.130434 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123642 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 
21:27:03.130434 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123645 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:27:03.130434 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123648 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:27:03.130434 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123650 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:27:03.130434 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123652 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:27:03.130434 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123655 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:27:03.130434 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123657 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:27:03.130434 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123660 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:27:03.130434 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123662 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:27:03.130921 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123665 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:27:03.130921 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123668 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:27:03.130921 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123670 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:27:03.130921 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123673 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:27:03.130921 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123675 2573 feature_gate.go:328] unrecognized feature 
gate: ClusterMonitoringConfig Apr 24 21:27:03.130921 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123677 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:27:03.130921 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123680 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:27:03.130921 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123682 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:27:03.130921 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123685 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:27:03.130921 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123688 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:27:03.130921 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123690 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:27:03.130921 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123693 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:27:03.130921 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123696 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:27:03.130921 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123699 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:27:03.130921 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123701 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:27:03.130921 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123703 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:27:03.130921 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123706 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:27:03.130921 ip-10-0-136-201 kubenswrapper[2573]: W0424 
21:27:03.123709 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:27:03.130921 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123715 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:27:03.130921 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123718 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:27:03.131424 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123720 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:27:03.131424 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123722 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:27:03.131424 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123726 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 21:27:03.131424 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.123729 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:27:03.131424 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.123735 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:27:03.131424 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.130389 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 21:27:03.131424 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.130404 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 21:27:03.131424 ip-10-0-136-201 
kubenswrapper[2573]: W0424 21:27:03.130450 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:27:03.131424 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130457 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:27:03.131424 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130460 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:27:03.131424 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130464 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:27:03.131424 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130467 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:27:03.131424 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130470 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:27:03.131424 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130473 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:27:03.131424 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130476 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:27:03.131424 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130479 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:27:03.131814 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130482 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:27:03.131814 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130485 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:27:03.131814 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130488 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:27:03.131814 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130491 2573 feature_gate.go:328] unrecognized feature gate: 
VolumeGroupSnapshot Apr 24 21:27:03.131814 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130493 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:27:03.131814 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130496 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:27:03.131814 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130498 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:27:03.131814 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130501 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:27:03.131814 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130504 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:27:03.131814 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130506 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:27:03.131814 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130509 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:27:03.131814 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130511 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:27:03.131814 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130514 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:27:03.131814 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130516 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:27:03.131814 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130519 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:27:03.131814 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130521 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:27:03.131814 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130524 2573 feature_gate.go:328] 
unrecognized feature gate: NutanixMultiSubnets Apr 24 21:27:03.131814 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130526 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:27:03.131814 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130530 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 21:27:03.131814 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130535 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:27:03.132327 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130538 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:27:03.132327 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130541 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:27:03.132327 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130545 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:27:03.132327 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130548 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:27:03.132327 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130550 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:27:03.132327 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130553 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:27:03.132327 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130556 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:27:03.132327 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130558 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:27:03.132327 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130561 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:27:03.132327 ip-10-0-136-201 
kubenswrapper[2573]: W0424 21:27:03.130563 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:27:03.132327 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130566 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:27:03.132327 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130568 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:27:03.132327 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130571 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:27:03.132327 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130573 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:27:03.132327 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130576 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:27:03.132327 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130578 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:27:03.132327 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130581 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:27:03.132327 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130583 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:27:03.132327 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130586 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:27:03.132327 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130588 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:27:03.132813 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130591 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:27:03.132813 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130593 2573 feature_gate.go:328] unrecognized feature gate: 
InsightsOnDemandDataGather Apr 24 21:27:03.132813 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130596 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:27:03.132813 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130598 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:27:03.132813 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130600 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:27:03.132813 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130603 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:27:03.132813 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130605 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:27:03.132813 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130608 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:27:03.132813 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130610 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:27:03.132813 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130613 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:27:03.132813 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130615 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:27:03.132813 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130618 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:27:03.132813 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130621 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:27:03.132813 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130623 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:27:03.132813 ip-10-0-136-201 
kubenswrapper[2573]: W0424 21:27:03.130627 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:27:03.132813 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130630 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:27:03.132813 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130633 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:27:03.132813 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130635 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:27:03.132813 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130638 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:27:03.133351 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130640 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:27:03.133351 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130643 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:27:03.133351 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130645 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:27:03.133351 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130648 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:27:03.133351 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130650 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:27:03.133351 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130652 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:27:03.133351 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130655 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:27:03.133351 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130657 
2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:27:03.133351 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130660 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:27:03.133351 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130662 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:27:03.133351 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130666 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 21:27:03.133351 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130670 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:27:03.133351 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130672 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:27:03.133351 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130675 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:27:03.133351 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130677 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:27:03.133351 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130680 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:27:03.133351 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130682 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:27:03.133351 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130684 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:27:03.133792 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.130689 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false 
MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:27:03.133792 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130782 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:27:03.133792 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130788 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:27:03.133792 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130790 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:27:03.133792 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130793 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:27:03.133792 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130796 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:27:03.133792 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130799 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:27:03.133792 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130801 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:27:03.133792 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130804 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:27:03.133792 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130808 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 21:27:03.133792 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130811 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:27:03.133792 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130815 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:27:03.133792 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130818 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:27:03.133792 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130820 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:27:03.133792 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130823 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:27:03.134181 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130825 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:27:03.134181 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130828 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:27:03.134181 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130830 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:27:03.134181 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130833 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:27:03.134181 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130835 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:27:03.134181 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130837 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:27:03.134181 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130840 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:27:03.134181 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130842 2573 feature_gate.go:328] 
unrecognized feature gate: BuildCSIVolumes Apr 24 21:27:03.134181 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130845 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:27:03.134181 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130847 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:27:03.134181 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130850 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:27:03.134181 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130852 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:27:03.134181 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130855 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:27:03.134181 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130857 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:27:03.134181 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130859 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:27:03.134181 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130862 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:27:03.134181 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130864 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:27:03.134181 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130867 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:27:03.134181 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130869 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:27:03.134181 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130872 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:27:03.134671 ip-10-0-136-201 kubenswrapper[2573]: W0424 
21:27:03.130875 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 21:27:03.134671 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130879 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:27:03.134671 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130881 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:27:03.134671 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130884 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:27:03.134671 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130886 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:27:03.134671 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130889 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:27:03.134671 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130892 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:27:03.134671 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130894 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:27:03.134671 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130897 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:27:03.134671 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130900 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:27:03.134671 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130902 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:27:03.134671 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130905 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:27:03.134671 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130907 2573 feature_gate.go:328] unrecognized feature 
gate: PinnedImages Apr 24 21:27:03.134671 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130910 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:27:03.134671 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130912 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:27:03.134671 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130915 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:27:03.134671 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130917 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:27:03.134671 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130920 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:27:03.134671 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130922 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:27:03.134671 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130925 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:27:03.135158 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130928 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:27:03.135158 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130930 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:27:03.135158 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130933 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:27:03.135158 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130935 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:27:03.135158 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130938 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:27:03.135158 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130941 2573 
feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:27:03.135158 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130943 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:27:03.135158 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130945 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:27:03.135158 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130948 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:27:03.135158 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130950 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:27:03.135158 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130953 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:27:03.135158 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130955 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:27:03.135158 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130958 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:27:03.135158 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130961 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:27:03.135158 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130963 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:27:03.135158 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130965 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:27:03.135158 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130968 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:27:03.135158 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130970 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:27:03.135158 
ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130973 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:27:03.135158 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130976 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:27:03.135627 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130978 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 21:27:03.135627 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130980 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:27:03.135627 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130983 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 21:27:03.135627 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130986 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:27:03.135627 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130988 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:27:03.135627 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130991 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:27:03.135627 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130993 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:27:03.135627 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130996 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:27:03.135627 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.130998 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:27:03.135627 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.131001 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:27:03.135627 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.131003 2573 feature_gate.go:328] unrecognized feature 
gate: MachineConfigNodes Apr 24 21:27:03.135627 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:03.131006 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:27:03.135627 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.131010 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 21:27:03.135627 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.131836 2573 server.go:962] "Client rotation is on, will bootstrap in background" Apr 24 21:27:03.135627 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.134162 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 24 21:27:03.135983 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.135244 2573 server.go:1019] "Starting client certificate rotation" Apr 24 21:27:03.135983 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.135337 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 21:27:03.136225 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.136212 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 21:27:03.165524 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.165506 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 21:27:03.173185 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.173159 2573 
dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 21:27:03.189269 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.189249 2573 log.go:25] "Validated CRI v1 runtime API" Apr 24 21:27:03.195992 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.195976 2573 log.go:25] "Validated CRI v1 image API" Apr 24 21:27:03.197422 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.197394 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 24 21:27:03.202855 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.202833 2573 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 b661f0ff-88e2-410b-82e7-d65d6a154731:/dev/nvme0n1p4 fe53105e-9bbc-4d71-8602-ec3383575146:/dev/nvme0n1p3] Apr 24 21:27:03.202932 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.202853 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 24 21:27:03.209316 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.209183 2573 manager.go:217] Machine: {Timestamp:2026-04-24 21:27:03.207047741 +0000 UTC m=+0.467616567 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3185458 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec210c3e672873ab772b7fa411081ba7 SystemUUID:ec210c3e-6728-73ab-772b-7fa411081ba7 BootID:b99843a6-e91f-48a5-bf2c-cbf4b398ad9f Filesystems:[{Device:/dev/shm 
DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:f8:ea:5f:42:51 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:f8:ea:5f:42:51 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:1e:55:82:0f:54:fa Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 24 21:27:03.209316 ip-10-0-136-201 
kubenswrapper[2573]: I0424 21:27:03.209313 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 24 21:27:03.209466 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.209450 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 24 21:27:03.210608 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.210580 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 24 21:27:03.210761 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.210609 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-136-201.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Qu
antity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 21:27:03.210807 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.210771 2573 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 21:27:03.210807 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.210779 2573 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 21:27:03.210807 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.210793 2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:27:03.211661 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.211650 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:27:03.213755 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.213743 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:27:03.213859 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.213850 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 21:27:03.216749 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.216738 2573 kubelet.go:491] "Attempting to sync node with API server" Apr 24 21:27:03.216790 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.216754 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 21:27:03.216790 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.216766 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 21:27:03.216790 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.216776 2573 
kubelet.go:397] "Adding apiserver pod source" Apr 24 21:27:03.216790 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.216784 2573 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 21:27:03.217947 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.217935 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:27:03.218001 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.217953 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:27:03.218896 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.218879 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 21:27:03.223589 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.223565 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 21:27:03.226146 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.226131 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 21:27:03.228725 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.228707 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 21:27:03.228791 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.228734 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 21:27:03.228791 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.228740 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 21:27:03.228791 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.228747 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 
21:27:03.228791 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.228752 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 21:27:03.228791 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.228759 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 21:27:03.228791 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.228764 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 21:27:03.228791 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.228770 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 21:27:03.228791 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.228777 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 21:27:03.228791 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.228783 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 21:27:03.229026 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.228798 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 21:27:03.229026 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.228807 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 21:27:03.230773 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.230761 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 21:27:03.230805 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.230777 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 21:27:03.234711 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.234691 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 21:27:03.234771 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.234745 2573 server.go:1295] "Started kubelet" Apr 24 21:27:03.234891 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.234861 2573 server.go:180] "Starting to listen" 
address="0.0.0.0" port=10250 Apr 24 21:27:03.234977 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.234860 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 21:27:03.234977 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.234928 2573 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 21:27:03.235297 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:03.235271 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-136-201.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 21:27:03.235450 ip-10-0-136-201 systemd[1]: Started Kubernetes Kubelet. Apr 24 21:27:03.235569 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.235550 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-136-201.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 21:27:03.235846 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:03.235671 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 21:27:03.236968 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.236941 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 21:27:03.236968 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.236968 2573 server.go:317] "Adding debug handlers to kubelet server" Apr 24 21:27:03.243809 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.243791 
2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 21:27:03.244526 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.244512 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 21:27:03.245326 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.245311 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 21:27:03.245448 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.245317 2573 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 21:27:03.245520 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.245512 2573 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 21:27:03.245703 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.245693 2573 reconstruct.go:97] "Volume reconstruction finished" Apr 24 21:27:03.245766 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.245760 2573 reconciler.go:26] "Reconciler: start to sync state" Apr 24 21:27:03.247003 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:03.245043 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-201.ec2.internal.18a96822015236ee default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-201.ec2.internal,UID:ip-10-0-136-201.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-136-201.ec2.internal,},FirstTimestamp:2026-04-24 21:27:03.234713326 +0000 UTC m=+0.495282151,LastTimestamp:2026-04-24 21:27:03.234713326 +0000 UTC m=+0.495282151,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-201.ec2.internal,}" Apr 24 
21:27:03.247003 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.246787 2573 factory.go:55] Registering systemd factory Apr 24 21:27:03.247003 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.246805 2573 factory.go:223] Registration of the systemd container factory successfully Apr 24 21:27:03.247003 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:03.246807 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-201.ec2.internal\" not found" Apr 24 21:27:03.247358 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.247069 2573 factory.go:153] Registering CRI-O factory Apr 24 21:27:03.247358 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.247086 2573 factory.go:223] Registration of the crio container factory successfully Apr 24 21:27:03.247358 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.247203 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 21:27:03.247358 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.247227 2573 factory.go:103] Registering Raw factory Apr 24 21:27:03.247358 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.247244 2573 manager.go:1196] Started watching for new ooms in manager Apr 24 21:27:03.247652 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.247637 2573 manager.go:319] Starting recovery of all containers Apr 24 21:27:03.247903 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:03.247880 2573 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 21:27:03.253629 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:03.253600 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 21:27:03.253768 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:03.253635 2573 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-136-201.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 21:27:03.260488 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.260471 2573 manager.go:324] Recovery completed Apr 24 21:27:03.264399 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.264387 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:03.266827 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.266810 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:03.266896 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.266839 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:03.266896 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.266849 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:03.267357 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.267344 2573 cpu_manager.go:222] "Starting CPU manager" 
policy="none" Apr 24 21:27:03.267413 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.267358 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 21:27:03.267413 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.267374 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:27:03.269287 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:03.269222 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-201.ec2.internal.18a96822033c3ab0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-201.ec2.internal,UID:ip-10-0-136-201.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-136-201.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-136-201.ec2.internal,},FirstTimestamp:2026-04-24 21:27:03.266826928 +0000 UTC m=+0.527395749,LastTimestamp:2026-04-24 21:27:03.266826928 +0000 UTC m=+0.527395749,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-201.ec2.internal,}" Apr 24 21:27:03.271190 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.271178 2573 policy_none.go:49] "None policy: Start" Apr 24 21:27:03.271241 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.271193 2573 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 21:27:03.271241 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.271203 2573 state_mem.go:35] "Initializing new in-memory state store" Apr 24 21:27:03.280358 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:03.280296 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-201.ec2.internal.18a96822033c7bae default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-201.ec2.internal,UID:ip-10-0-136-201.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-136-201.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-136-201.ec2.internal,},FirstTimestamp:2026-04-24 21:27:03.266843566 +0000 UTC m=+0.527412388,LastTimestamp:2026-04-24 21:27:03.266843566 +0000 UTC m=+0.527412388,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-201.ec2.internal,}" Apr 24 21:27:03.289078 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:03.289017 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-201.ec2.internal.18a96822033ca0b2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-201.ec2.internal,UID:ip-10-0-136-201.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-136-201.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-136-201.ec2.internal,},FirstTimestamp:2026-04-24 21:27:03.266853042 +0000 UTC m=+0.527421864,LastTimestamp:2026-04-24 21:27:03.266853042 +0000 UTC m=+0.527421864,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-201.ec2.internal,}" Apr 24 21:27:03.304372 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.304352 2573 
csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-699tp" Apr 24 21:27:03.326793 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.306672 2573 manager.go:341] "Starting Device Plugin manager" Apr 24 21:27:03.326793 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:03.306747 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 21:27:03.326793 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.306761 2573 server.go:85] "Starting device plugin registration server" Apr 24 21:27:03.326793 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.307026 2573 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 21:27:03.326793 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.307037 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 21:27:03.326793 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.307159 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 21:27:03.326793 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.307232 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 21:27:03.326793 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.307241 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 21:27:03.326793 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:03.307720 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 24 21:27:03.326793 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:03.307755 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-136-201.ec2.internal\" not found" Apr 24 21:27:03.326793 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.311066 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-699tp" Apr 24 21:27:03.350337 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.350307 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 21:27:03.351567 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.351544 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 21:27:03.351666 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.351571 2573 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 21:27:03.351666 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.351588 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 24 21:27:03.351666 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.351596 2573 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 21:27:03.351666 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:03.351627 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 21:27:03.354223 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.354207 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:03.407585 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.407519 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:03.408415 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.408399 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:03.408500 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.408430 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:03.408500 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.408441 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:03.408500 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.408465 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-136-201.ec2.internal" Apr 24 21:27:03.420808 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.420788 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-136-201.ec2.internal" Apr 24 21:27:03.420900 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:03.420814 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-136-201.ec2.internal\": node \"ip-10-0-136-201.ec2.internal\" not found" Apr 24 
21:27:03.441031 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:03.441010 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-201.ec2.internal\" not found" Apr 24 21:27:03.452156 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.452132 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-136-201.ec2.internal"] Apr 24 21:27:03.452211 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.452200 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:03.453585 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.453569 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:03.453665 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.453600 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:03.453665 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.453613 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:03.455932 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.455919 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:03.456129 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.456112 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal" Apr 24 21:27:03.456178 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.456149 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:03.456825 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.456810 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:03.456892 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.456846 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:03.456892 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.456857 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:03.456892 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.456812 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:03.456987 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.456913 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:03.456987 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.456923 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:03.458974 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.458962 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-201.ec2.internal" Apr 24 21:27:03.459024 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.458985 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:03.459587 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.459574 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:03.459634 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.459598 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:03.459634 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.459610 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:03.495748 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:03.495728 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-201.ec2.internal\" not found" node="ip-10-0-136-201.ec2.internal" Apr 24 21:27:03.500110 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:03.500078 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-201.ec2.internal\" not found" node="ip-10-0-136-201.ec2.internal" Apr 24 21:27:03.541782 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:03.541760 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-201.ec2.internal\" not found" Apr 24 21:27:03.548004 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.547985 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bcf950b6b9eef658d8ab20236281bc71-config\") pod 
\"kube-apiserver-proxy-ip-10-0-136-201.ec2.internal\" (UID: \"bcf950b6b9eef658d8ab20236281bc71\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-201.ec2.internal" Apr 24 21:27:03.548064 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.548012 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0eeb12378a06f0692d8b8888ada8c1bc-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal\" (UID: \"0eeb12378a06f0692d8b8888ada8c1bc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal" Apr 24 21:27:03.548064 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.548038 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0eeb12378a06f0692d8b8888ada8c1bc-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal\" (UID: \"0eeb12378a06f0692d8b8888ada8c1bc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal" Apr 24 21:27:03.642455 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:03.642416 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-201.ec2.internal\" not found" Apr 24 21:27:03.648814 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.648794 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bcf950b6b9eef658d8ab20236281bc71-config\") pod \"kube-apiserver-proxy-ip-10-0-136-201.ec2.internal\" (UID: \"bcf950b6b9eef658d8ab20236281bc71\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-201.ec2.internal" Apr 24 21:27:03.648869 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.648826 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/0eeb12378a06f0692d8b8888ada8c1bc-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal\" (UID: \"0eeb12378a06f0692d8b8888ada8c1bc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal" Apr 24 21:27:03.648869 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.648847 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0eeb12378a06f0692d8b8888ada8c1bc-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal\" (UID: \"0eeb12378a06f0692d8b8888ada8c1bc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal" Apr 24 21:27:03.648931 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.648884 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bcf950b6b9eef658d8ab20236281bc71-config\") pod \"kube-apiserver-proxy-ip-10-0-136-201.ec2.internal\" (UID: \"bcf950b6b9eef658d8ab20236281bc71\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-201.ec2.internal" Apr 24 21:27:03.648931 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.648927 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0eeb12378a06f0692d8b8888ada8c1bc-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal\" (UID: \"0eeb12378a06f0692d8b8888ada8c1bc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal" Apr 24 21:27:03.648994 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.648945 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0eeb12378a06f0692d8b8888ada8c1bc-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal\" (UID: \"0eeb12378a06f0692d8b8888ada8c1bc\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal" Apr 24 21:27:03.743278 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:03.743210 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-201.ec2.internal\" not found" Apr 24 21:27:03.797781 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.797755 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal" Apr 24 21:27:03.802387 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:03.802362 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-201.ec2.internal" Apr 24 21:27:03.844056 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:03.844023 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-201.ec2.internal\" not found" Apr 24 21:27:03.944599 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:03.944563 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-201.ec2.internal\" not found" Apr 24 21:27:04.045076 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:04.044993 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-201.ec2.internal\" not found" Apr 24 21:27:04.135416 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:04.135385 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 24 21:27:04.136057 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:04.135530 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 
21:27:04.145551 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:04.145520 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-201.ec2.internal\" not found" Apr 24 21:27:04.244310 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:04.244279 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 21:27:04.246523 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:04.246496 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-201.ec2.internal\" not found" Apr 24 21:27:04.258014 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:04.257990 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 21:27:04.286086 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:04.286056 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-8wzfq" Apr 24 21:27:04.293833 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:04.293813 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-8wzfq" Apr 24 21:27:04.313201 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:04.313126 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 21:22:03 +0000 UTC" deadline="2027-11-19 17:56:00.925818694 +0000 UTC" Apr 24 21:27:04.313201 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:04.313162 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13772h28m56.612664662s" Apr 24 21:27:04.347491 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:04.347465 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node 
\"ip-10-0-136-201.ec2.internal\" not found" Apr 24 21:27:04.353164 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:04.353134 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcf950b6b9eef658d8ab20236281bc71.slice/crio-2325d5091243fe4dc9aa8e1fa5fd4f8e08ae4dd8a1214de44e96504b461e275b WatchSource:0}: Error finding container 2325d5091243fe4dc9aa8e1fa5fd4f8e08ae4dd8a1214de44e96504b461e275b: Status 404 returned error can't find the container with id 2325d5091243fe4dc9aa8e1fa5fd4f8e08ae4dd8a1214de44e96504b461e275b Apr 24 21:27:04.353570 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:04.353545 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0eeb12378a06f0692d8b8888ada8c1bc.slice/crio-5e7ead3bebded7aaa7a2c0505428f7283fe109bd71a6e2454313748c7e5f74d7 WatchSource:0}: Error finding container 5e7ead3bebded7aaa7a2c0505428f7283fe109bd71a6e2454313748c7e5f74d7: Status 404 returned error can't find the container with id 5e7ead3bebded7aaa7a2c0505428f7283fe109bd71a6e2454313748c7e5f74d7 Apr 24 21:27:04.359023 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:04.359006 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:27:04.414045 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:04.414012 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:04.448161 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:04.448130 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-201.ec2.internal\" not found" Apr 24 21:27:04.548635 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:04.548597 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-201.ec2.internal\" not found" Apr 24 21:27:04.649351 ip-10-0-136-201 
kubenswrapper[2573]: E0424 21:27:04.649274 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-201.ec2.internal\" not found" Apr 24 21:27:04.656882 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:04.656856 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:04.749500 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:04.749465 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-201.ec2.internal\" not found" Apr 24 21:27:04.804318 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:04.804285 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:04.845590 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:04.845330 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal" Apr 24 21:27:04.862758 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:04.862732 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:27:04.864089 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:04.863861 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-201.ec2.internal" Apr 24 21:27:04.873609 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:04.873586 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:27:05.217306 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.217273 2573 apiserver.go:52] "Watching apiserver" Apr 24 21:27:05.222920 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.222894 2573 reflector.go:430] "Caches populated" 
type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 21:27:05.223958 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.223935 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kcl2z","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ss89p","openshift-cluster-node-tuning-operator/tuned-jhprx","openshift-dns/node-resolver-z65xz","openshift-multus/network-metrics-daemon-kpb2c","openshift-network-operator/iptables-alerter-h2b2b","kube-system/konnectivity-agent-4smd5","kube-system/kube-apiserver-proxy-ip-10-0-136-201.ec2.internal","openshift-image-registry/node-ca-gtgdx","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal","openshift-multus/multus-29bpz","openshift-multus/multus-additional-cni-plugins-5fn4d","openshift-network-diagnostics/network-check-target-6xchx"] Apr 24 21:27:05.226910 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.226891 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-4smd5" Apr 24 21:27:05.229202 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.229181 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ss89p" Apr 24 21:27:05.229617 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.229594 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-6c2sv\"" Apr 24 21:27:05.229821 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.229805 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 21:27:05.229902 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.229829 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 21:27:05.231452 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.231429 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jhprx" Apr 24 21:27:05.231903 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.231845 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 21:27:05.231903 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.231883 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 21:27:05.232057 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.231924 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 21:27:05.232057 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.231885 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-74bs9\"" Apr 24 21:27:05.233661 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.233644 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-z65xz" Apr 24 21:27:05.238291 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.238260 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpb2c" Apr 24 21:27:05.238392 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:05.238334 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpb2c" podUID="066d9094-f992-4853-86f1-b25700fe6070" Apr 24 21:27:05.238392 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.238354 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-h2b2b" Apr 24 21:27:05.240643 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.240623 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.242890 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.242827 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:27:05.242940 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.242899 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-gtgdx" Apr 24 21:27:05.243381 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.243358 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 21:27:05.243466 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.243384 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-m2s9z\"" Apr 24 21:27:05.243648 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.243626 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 21:27:05.243702 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.243669 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:27:05.245284 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.245253 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-29bpz" Apr 24 21:27:05.246205 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.246165 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 21:27:05.248994 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.248973 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 21:27:05.249356 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.249336 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 21:27:05.249492 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.249472 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 21:27:05.250144 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.249861 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 21:27:05.250272 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.250256 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 21:27:05.250473 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.250414 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8knfc\"" Apr 24 21:27:05.250473 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.250454 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-gm5vw\"" Apr 24 21:27:05.250619 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.250585 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 21:27:05.250676 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.250638 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 21:27:05.250739 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.250723 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 21:27:05.250791 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.250589 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 21:27:05.250791 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.250768 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 21:27:05.250891 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.250793 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-j52cm\"" Apr 24 21:27:05.250891 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.250831 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-44d8l\"" Apr 24 21:27:05.250983 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.250588 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-vmqdm\"" Apr 24 21:27:05.251337 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.251010 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 21:27:05.251426 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.251332 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6xchx" Apr 24 21:27:05.251426 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.251344 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5fn4d" Apr 24 21:27:05.251426 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.251267 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 21:27:05.251525 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:05.251449 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6xchx" podUID="1612b97a-f223-4e83-8710-5764a3765126" Apr 24 21:27:05.251697 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.251296 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 21:27:05.251923 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.251073 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 21:27:05.252669 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.252648 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 21:27:05.255993 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.255972 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8b726083-445b-4ee0-8797-c710268b6b65-host\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " 
pod="openshift-cluster-node-tuning-operator/tuned-jhprx" Apr 24 21:27:05.256135 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.256014 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-run-ovn\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.256135 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.256058 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b726083-445b-4ee0-8797-c710268b6b65-etc-kubernetes\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx" Apr 24 21:27:05.256135 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.256122 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8b726083-445b-4ee0-8797-c710268b6b65-etc-tuned\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx" Apr 24 21:27:05.256292 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.256149 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r5zg\" (UniqueName: \"kubernetes.io/projected/8b726083-445b-4ee0-8797-c710268b6b65-kube-api-access-7r5zg\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx" Apr 24 21:27:05.256292 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.256176 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/302e0256-bc9b-454a-8fe1-15f7d4a40459-sys-fs\") pod \"aws-ebs-csi-driver-node-ss89p\" (UID: \"302e0256-bc9b-454a-8fe1-15f7d4a40459\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ss89p" Apr 24 21:27:05.256292 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.256223 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbxpx\" (UniqueName: \"kubernetes.io/projected/701e782c-df09-4f2a-a23e-72ec0675fcb7-kube-api-access-qbxpx\") pod \"iptables-alerter-h2b2b\" (UID: \"701e782c-df09-4f2a-a23e-72ec0675fcb7\") " pod="openshift-network-operator/iptables-alerter-h2b2b" Apr 24 21:27:05.256292 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.256257 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e61e2625-9935-4ec2-88cd-d2ac5c781886-host\") pod \"node-ca-gtgdx\" (UID: \"e61e2625-9935-4ec2-88cd-d2ac5c781886\") " pod="openshift-image-registry/node-ca-gtgdx" Apr 24 21:27:05.256469 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.256297 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6p9k\" (UniqueName: \"kubernetes.io/projected/e61e2625-9935-4ec2-88cd-d2ac5c781886-kube-api-access-h6p9k\") pod \"node-ca-gtgdx\" (UID: \"e61e2625-9935-4ec2-88cd-d2ac5c781886\") " pod="openshift-image-registry/node-ca-gtgdx" Apr 24 21:27:05.256469 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.256324 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3807cdc4-74ff-4e27-bde0-2ed93b428a58-cni-binary-copy\") pod \"multus-additional-cni-plugins-5fn4d\" (UID: \"3807cdc4-74ff-4e27-bde0-2ed93b428a58\") " pod="openshift-multus/multus-additional-cni-plugins-5fn4d" Apr 24 21:27:05.256469 ip-10-0-136-201 
kubenswrapper[2573]: I0424 21:27:05.256350 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-host-cni-netd\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.256469 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.256374 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c595912d-1543-4c43-8a11-3dbcf0f15050-ovnkube-config\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.256469 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.256392 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-multus-cni-dir\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz" Apr 24 21:27:05.256469 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.256416 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3807cdc4-74ff-4e27-bde0-2ed93b428a58-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5fn4d\" (UID: \"3807cdc4-74ff-4e27-bde0-2ed93b428a58\") " pod="openshift-multus/multus-additional-cni-plugins-5fn4d" Apr 24 21:27:05.256469 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.256462 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d43e5662-c703-427c-ba4c-60f3f0029405-konnectivity-ca\") pod \"konnectivity-agent-4smd5\" 
(UID: \"d43e5662-c703-427c-ba4c-60f3f0029405\") " pod="kube-system/konnectivity-agent-4smd5" Apr 24 21:27:05.256817 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.256523 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj2tv\" (UniqueName: \"kubernetes.io/projected/302e0256-bc9b-454a-8fe1-15f7d4a40459-kube-api-access-gj2tv\") pod \"aws-ebs-csi-driver-node-ss89p\" (UID: \"302e0256-bc9b-454a-8fe1-15f7d4a40459\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ss89p" Apr 24 21:27:05.256817 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.256568 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8159f631-f735-47c0-8dd1-8342be18cbcf-hosts-file\") pod \"node-resolver-z65xz\" (UID: \"8159f631-f735-47c0-8dd1-8342be18cbcf\") " pod="openshift-dns/node-resolver-z65xz" Apr 24 21:27:05.256817 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.256604 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-host-var-lib-cni-bin\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz" Apr 24 21:27:05.256817 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.256631 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-hostroot\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz" Apr 24 21:27:05.256817 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.256656 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/8b726083-445b-4ee0-8797-c710268b6b65-run\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx" Apr 24 21:27:05.256817 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.256682 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-multus-conf-dir\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz" Apr 24 21:27:05.256817 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.256705 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/066d9094-f992-4853-86f1-b25700fe6070-metrics-certs\") pod \"network-metrics-daemon-kpb2c\" (UID: \"066d9094-f992-4853-86f1-b25700fe6070\") " pod="openshift-multus/network-metrics-daemon-kpb2c" Apr 24 21:27:05.256817 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.256727 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-log-socket\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.256817 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.256748 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh8fr\" (UniqueName: \"kubernetes.io/projected/c595912d-1543-4c43-8a11-3dbcf0f15050-kube-api-access-sh8fr\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.256817 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.256773 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-os-release\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz" Apr 24 21:27:05.256817 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.256793 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-host-run-k8s-cni-cncf-io\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz" Apr 24 21:27:05.256817 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.256815 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-host-run-netns\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz" Apr 24 21:27:05.257303 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.256847 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/102ddca3-9b65-4682-a3d0-5bf546504d17-multus-daemon-config\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz" Apr 24 21:27:05.257303 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.256880 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8b726083-445b-4ee0-8797-c710268b6b65-etc-sysctl-d\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx" Apr 24 21:27:05.257303 ip-10-0-136-201 kubenswrapper[2573]: 
I0424 21:27:05.256927 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3807cdc4-74ff-4e27-bde0-2ed93b428a58-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5fn4d\" (UID: \"3807cdc4-74ff-4e27-bde0-2ed93b428a58\") " pod="openshift-multus/multus-additional-cni-plugins-5fn4d"
Apr 24 21:27:05.257303 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.256977 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c595912d-1543-4c43-8a11-3dbcf0f15050-env-overrides\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z"
Apr 24 21:27:05.257303 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.257006 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/102ddca3-9b65-4682-a3d0-5bf546504d17-cni-binary-copy\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz"
Apr 24 21:27:05.257303 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.257031 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-multus-socket-dir-parent\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz"
Apr 24 21:27:05.257303 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.257085 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-host-var-lib-cni-multus\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz"
Apr 24 21:27:05.257303 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.257127 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8b726083-445b-4ee0-8797-c710268b6b65-etc-systemd\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx"
Apr 24 21:27:05.257303 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.257153 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3807cdc4-74ff-4e27-bde0-2ed93b428a58-system-cni-dir\") pod \"multus-additional-cni-plugins-5fn4d\" (UID: \"3807cdc4-74ff-4e27-bde0-2ed93b428a58\") " pod="openshift-multus/multus-additional-cni-plugins-5fn4d"
Apr 24 21:27:05.257303 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.257190 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qkgz\" (UniqueName: \"kubernetes.io/projected/3807cdc4-74ff-4e27-bde0-2ed93b428a58-kube-api-access-4qkgz\") pod \"multus-additional-cni-plugins-5fn4d\" (UID: \"3807cdc4-74ff-4e27-bde0-2ed93b428a58\") " pod="openshift-multus/multus-additional-cni-plugins-5fn4d"
Apr 24 21:27:05.257303 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.257217 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8159f631-f735-47c0-8dd1-8342be18cbcf-tmp-dir\") pod \"node-resolver-z65xz\" (UID: \"8159f631-f735-47c0-8dd1-8342be18cbcf\") " pod="openshift-dns/node-resolver-z65xz"
Apr 24 21:27:05.257303 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.257243 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-host-kubelet\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z"
Apr 24 21:27:05.257303 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.257281 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c595912d-1543-4c43-8a11-3dbcf0f15050-ovnkube-script-lib\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z"
Apr 24 21:27:05.257814 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.257312 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-etc-kubernetes\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz"
Apr 24 21:27:05.257814 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.257336 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-host-cni-bin\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z"
Apr 24 21:27:05.257814 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.257363 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z"
Apr 24 21:27:05.257814 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.257390 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8b726083-445b-4ee0-8797-c710268b6b65-etc-sysconfig\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx"
Apr 24 21:27:05.257814 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.257417 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8b726083-445b-4ee0-8797-c710268b6b65-etc-sysctl-conf\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx"
Apr 24 21:27:05.257814 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.257443 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8b726083-445b-4ee0-8797-c710268b6b65-lib-modules\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx"
Apr 24 21:27:05.257814 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.257466 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e61e2625-9935-4ec2-88cd-d2ac5c781886-serviceca\") pod \"node-ca-gtgdx\" (UID: \"e61e2625-9935-4ec2-88cd-d2ac5c781886\") " pod="openshift-image-registry/node-ca-gtgdx"
Apr 24 21:27:05.257814 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.257491 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/302e0256-bc9b-454a-8fe1-15f7d4a40459-etc-selinux\") pod \"aws-ebs-csi-driver-node-ss89p\" (UID: \"302e0256-bc9b-454a-8fe1-15f7d4a40459\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ss89p"
Apr 24 21:27:05.257814 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.257517 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw49d\" (UniqueName: \"kubernetes.io/projected/8159f631-f735-47c0-8dd1-8342be18cbcf-kube-api-access-lw49d\") pod \"node-resolver-z65xz\" (UID: \"8159f631-f735-47c0-8dd1-8342be18cbcf\") " pod="openshift-dns/node-resolver-z65xz"
Apr 24 21:27:05.257814 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.257543 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/701e782c-df09-4f2a-a23e-72ec0675fcb7-iptables-alerter-script\") pod \"iptables-alerter-h2b2b\" (UID: \"701e782c-df09-4f2a-a23e-72ec0675fcb7\") " pod="openshift-network-operator/iptables-alerter-h2b2b"
Apr 24 21:27:05.257814 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.257567 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-host-run-ovn-kubernetes\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z"
Apr 24 21:27:05.257814 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.257592 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/302e0256-bc9b-454a-8fe1-15f7d4a40459-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ss89p\" (UID: \"302e0256-bc9b-454a-8fe1-15f7d4a40459\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ss89p"
Apr 24 21:27:05.257814 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.257616 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-run-openvswitch\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z"
Apr 24 21:27:05.257814 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.257646 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/701e782c-df09-4f2a-a23e-72ec0675fcb7-host-slash\") pod \"iptables-alerter-h2b2b\" (UID: \"701e782c-df09-4f2a-a23e-72ec0675fcb7\") " pod="openshift-network-operator/iptables-alerter-h2b2b"
Apr 24 21:27:05.257814 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.257677 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-host-run-netns\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z"
Apr 24 21:27:05.257814 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.257701 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8b726083-445b-4ee0-8797-c710268b6b65-var-lib-kubelet\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx"
Apr 24 21:27:05.258510 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.257725 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff85k\" (UniqueName: \"kubernetes.io/projected/066d9094-f992-4853-86f1-b25700fe6070-kube-api-access-ff85k\") pod \"network-metrics-daemon-kpb2c\" (UID: \"066d9094-f992-4853-86f1-b25700fe6070\") " pod="openshift-multus/network-metrics-daemon-kpb2c"
Apr 24 21:27:05.258510 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.257759 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3807cdc4-74ff-4e27-bde0-2ed93b428a58-os-release\") pod \"multus-additional-cni-plugins-5fn4d\" (UID: \"3807cdc4-74ff-4e27-bde0-2ed93b428a58\") " pod="openshift-multus/multus-additional-cni-plugins-5fn4d"
Apr 24 21:27:05.258510 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.257776 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d43e5662-c703-427c-ba4c-60f3f0029405-agent-certs\") pod \"konnectivity-agent-4smd5\" (UID: \"d43e5662-c703-427c-ba4c-60f3f0029405\") " pod="kube-system/konnectivity-agent-4smd5"
Apr 24 21:27:05.258510 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.257798 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-systemd-units\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z"
Apr 24 21:27:05.258510 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.257826 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-system-cni-dir\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz"
Apr 24 21:27:05.258510 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.257865 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-host-run-multus-certs\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz"
Apr 24 21:27:05.258510 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.257906 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8b726083-445b-4ee0-8797-c710268b6b65-etc-modprobe-d\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx"
Apr 24 21:27:05.258510 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.257944 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/302e0256-bc9b-454a-8fe1-15f7d4a40459-socket-dir\") pod \"aws-ebs-csi-driver-node-ss89p\" (UID: \"302e0256-bc9b-454a-8fe1-15f7d4a40459\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ss89p"
Apr 24 21:27:05.258510 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.257977 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/302e0256-bc9b-454a-8fe1-15f7d4a40459-registration-dir\") pod \"aws-ebs-csi-driver-node-ss89p\" (UID: \"302e0256-bc9b-454a-8fe1-15f7d4a40459\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ss89p"
Apr 24 21:27:05.258510 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.258007 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-host-slash\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z"
Apr 24 21:27:05.258510 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.258047 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-var-lib-openvswitch\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z"
Apr 24 21:27:05.258510 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.258079 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c595912d-1543-4c43-8a11-3dbcf0f15050-ovn-node-metrics-cert\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z"
Apr 24 21:27:05.258510 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.258119 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-cnibin\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz"
Apr 24 21:27:05.258510 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.258137 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8b726083-445b-4ee0-8797-c710268b6b65-sys\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx"
Apr 24 21:27:05.258510 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.258153 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3807cdc4-74ff-4e27-bde0-2ed93b428a58-cnibin\") pod \"multus-additional-cni-plugins-5fn4d\" (UID: \"3807cdc4-74ff-4e27-bde0-2ed93b428a58\") " pod="openshift-multus/multus-additional-cni-plugins-5fn4d"
Apr 24 21:27:05.258510 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.258169 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3807cdc4-74ff-4e27-bde0-2ed93b428a58-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5fn4d\" (UID: \"3807cdc4-74ff-4e27-bde0-2ed93b428a58\") " pod="openshift-multus/multus-additional-cni-plugins-5fn4d"
Apr 24 21:27:05.259188 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.258196 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-run-systemd\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z"
Apr 24 21:27:05.259188 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.258222 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-node-log\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z"
Apr 24 21:27:05.259188 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.258239 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-host-var-lib-kubelet\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz"
Apr 24 21:27:05.259188 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.258263 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8b726083-445b-4ee0-8797-c710268b6b65-tmp\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx"
Apr 24 21:27:05.259188 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.258311 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-etc-openvswitch\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z"
Apr 24 21:27:05.259188 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.258339 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfcpn\" (UniqueName: \"kubernetes.io/projected/102ddca3-9b65-4682-a3d0-5bf546504d17-kube-api-access-vfcpn\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz"
Apr 24 21:27:05.259188 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.258365 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/302e0256-bc9b-454a-8fe1-15f7d4a40459-device-dir\") pod \"aws-ebs-csi-driver-node-ss89p\" (UID: \"302e0256-bc9b-454a-8fe1-15f7d4a40459\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ss89p"
Apr 24 21:27:05.259188 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.258624 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 24 21:27:05.259188 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.258637 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 24 21:27:05.259188 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.258662 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-l82zb\""
Apr 24 21:27:05.294800 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.294769 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:22:04 +0000 UTC" deadline="2027-11-04 05:09:44.609188795 +0000 UTC"
Apr 24 21:27:05.294800 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.294798 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13399h42m39.314394327s"
Apr 24 21:27:05.321035 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.321004 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 21:27:05.346256 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.346218 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 24 21:27:05.355299 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.355249 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-201.ec2.internal" event={"ID":"bcf950b6b9eef658d8ab20236281bc71","Type":"ContainerStarted","Data":"2325d5091243fe4dc9aa8e1fa5fd4f8e08ae4dd8a1214de44e96504b461e275b"}
Apr 24 21:27:05.356173 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.356144 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal" event={"ID":"0eeb12378a06f0692d8b8888ada8c1bc","Type":"ContainerStarted","Data":"5e7ead3bebded7aaa7a2c0505428f7283fe109bd71a6e2454313748c7e5f74d7"}
Apr 24 21:27:05.359024 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.358999 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-log-socket\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z"
Apr 24 21:27:05.359158 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.359040 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sh8fr\" (UniqueName: \"kubernetes.io/projected/c595912d-1543-4c43-8a11-3dbcf0f15050-kube-api-access-sh8fr\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z"
Apr 24 21:27:05.359158 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.359120 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-os-release\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz"
Apr 24 21:27:05.359158 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.359128 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-log-socket\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z"
Apr 24 21:27:05.359158 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.359147 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-host-run-k8s-cni-cncf-io\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz"
Apr 24 21:27:05.359323 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.359171 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-host-run-netns\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz"
Apr 24 21:27:05.359323 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.359194 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/102ddca3-9b65-4682-a3d0-5bf546504d17-multus-daemon-config\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz"
Apr 24 21:27:05.359323 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.359213 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8b726083-445b-4ee0-8797-c710268b6b65-etc-sysctl-d\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx"
Apr 24 21:27:05.359323 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.359228 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3807cdc4-74ff-4e27-bde0-2ed93b428a58-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5fn4d\" (UID: \"3807cdc4-74ff-4e27-bde0-2ed93b428a58\") " pod="openshift-multus/multus-additional-cni-plugins-5fn4d"
Apr 24 21:27:05.359323 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.359245 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c595912d-1543-4c43-8a11-3dbcf0f15050-env-overrides\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z"
Apr 24 21:27:05.359323 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.359263 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/102ddca3-9b65-4682-a3d0-5bf546504d17-cni-binary-copy\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz"
Apr 24 21:27:05.359323 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.359300 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-multus-socket-dir-parent\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz"
Apr 24 21:27:05.359575 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.359336 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-host-var-lib-cni-multus\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz"
Apr 24 21:27:05.359575 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.359369 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8b726083-445b-4ee0-8797-c710268b6b65-etc-systemd\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx"
Apr 24 21:27:05.359575 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.359388 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3807cdc4-74ff-4e27-bde0-2ed93b428a58-system-cni-dir\") pod \"multus-additional-cni-plugins-5fn4d\" (UID: \"3807cdc4-74ff-4e27-bde0-2ed93b428a58\") " pod="openshift-multus/multus-additional-cni-plugins-5fn4d"
Apr 24 21:27:05.359575 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.359419 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qkgz\" (UniqueName: \"kubernetes.io/projected/3807cdc4-74ff-4e27-bde0-2ed93b428a58-kube-api-access-4qkgz\") pod \"multus-additional-cni-plugins-5fn4d\" (UID: \"3807cdc4-74ff-4e27-bde0-2ed93b428a58\") " pod="openshift-multus/multus-additional-cni-plugins-5fn4d"
Apr 24 21:27:05.359575 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.359436 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8b726083-445b-4ee0-8797-c710268b6b65-etc-sysctl-d\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx"
Apr 24 21:27:05.359575 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.359447 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8159f631-f735-47c0-8dd1-8342be18cbcf-tmp-dir\") pod \"node-resolver-z65xz\" (UID: \"8159f631-f735-47c0-8dd1-8342be18cbcf\") " pod="openshift-dns/node-resolver-z65xz"
Apr 24 21:27:05.359575 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.359495 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-host-kubelet\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z"
Apr 24 21:27:05.359575 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.359526 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c595912d-1543-4c43-8a11-3dbcf0f15050-ovnkube-script-lib\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z"
Apr 24 21:27:05.359575 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.359558 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-etc-kubernetes\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz"
Apr 24 21:27:05.359952 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.359582 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-host-cni-bin\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z"
Apr 24 21:27:05.359952 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.359606 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z"
Apr 24 21:27:05.359952 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.359658 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z"
Apr 24 21:27:05.359952 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.359694 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8159f631-f735-47c0-8dd1-8342be18cbcf-tmp-dir\") pod \"node-resolver-z65xz\" (UID: \"8159f631-f735-47c0-8dd1-8342be18cbcf\") " pod="openshift-dns/node-resolver-z65xz"
Apr 24 21:27:05.359952 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.359722 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-multus-socket-dir-parent\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz"
Apr 24 21:27:05.359952 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.359765 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-host-var-lib-cni-multus\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz"
Apr 24 21:27:05.359952 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.359775 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8b726083-445b-4ee0-8797-c710268b6b65-etc-systemd\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx"
Apr 24 21:27:05.359952 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.359815 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3807cdc4-74ff-4e27-bde0-2ed93b428a58-system-cni-dir\") pod \"multus-additional-cni-plugins-5fn4d\" (UID: \"3807cdc4-74ff-4e27-bde0-2ed93b428a58\") " pod="openshift-multus/multus-additional-cni-plugins-5fn4d"
Apr 24 21:27:05.359952 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.359891 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3807cdc4-74ff-4e27-bde0-2ed93b428a58-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5fn4d\" (UID: \"3807cdc4-74ff-4e27-bde0-2ed93b428a58\") " pod="openshift-multus/multus-additional-cni-plugins-5fn4d"
Apr 24 21:27:05.359952 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.359228 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-os-release\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz"
Apr 24 21:27:05.360434 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.360118 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-etc-kubernetes\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz"
Apr 24 21:27:05.360434 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.360190 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-host-run-netns\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz"
Apr 24 21:27:05.360434 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.360296 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-host-cni-bin\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z"
Apr 24 21:27:05.360434 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.360348 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-host-kubelet\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z"
Apr 24 21:27:05.360621 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.360459 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8b726083-445b-4ee0-8797-c710268b6b65-etc-sysconfig\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx"
Apr 24 21:27:05.360621 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.360491 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8b726083-445b-4ee0-8797-c710268b6b65-etc-sysctl-conf\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx"
Apr 24 21:27:05.360621 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.360542 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8b726083-445b-4ee0-8797-c710268b6b65-etc-sysconfig\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx"
Apr 24 21:27:05.360621 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.360553 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-host-run-k8s-cni-cncf-io\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz"
Apr 24 21:27:05.360621 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.360549 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8b726083-445b-4ee0-8797-c710268b6b65-lib-modules\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx"
Apr 24 21:27:05.360621 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.360609 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName:
\"kubernetes.io/configmap/e61e2625-9935-4ec2-88cd-d2ac5c781886-serviceca\") pod \"node-ca-gtgdx\" (UID: \"e61e2625-9935-4ec2-88cd-d2ac5c781886\") " pod="openshift-image-registry/node-ca-gtgdx" Apr 24 21:27:05.360871 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.360633 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c595912d-1543-4c43-8a11-3dbcf0f15050-env-overrides\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.360871 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.360641 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/302e0256-bc9b-454a-8fe1-15f7d4a40459-etc-selinux\") pod \"aws-ebs-csi-driver-node-ss89p\" (UID: \"302e0256-bc9b-454a-8fe1-15f7d4a40459\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ss89p" Apr 24 21:27:05.360871 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.360657 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8b726083-445b-4ee0-8797-c710268b6b65-lib-modules\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx" Apr 24 21:27:05.360871 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.360674 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8b726083-445b-4ee0-8797-c710268b6b65-etc-sysctl-conf\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx" Apr 24 21:27:05.361050 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.360670 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lw49d\" (UniqueName: 
\"kubernetes.io/projected/8159f631-f735-47c0-8dd1-8342be18cbcf-kube-api-access-lw49d\") pod \"node-resolver-z65xz\" (UID: \"8159f631-f735-47c0-8dd1-8342be18cbcf\") " pod="openshift-dns/node-resolver-z65xz" Apr 24 21:27:05.361109 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.361068 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/701e782c-df09-4f2a-a23e-72ec0675fcb7-iptables-alerter-script\") pod \"iptables-alerter-h2b2b\" (UID: \"701e782c-df09-4f2a-a23e-72ec0675fcb7\") " pod="openshift-network-operator/iptables-alerter-h2b2b" Apr 24 21:27:05.361163 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.361143 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-host-run-ovn-kubernetes\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.361163 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.361154 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/102ddca3-9b65-4682-a3d0-5bf546504d17-cni-binary-copy\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz" Apr 24 21:27:05.361259 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.361180 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/302e0256-bc9b-454a-8fe1-15f7d4a40459-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ss89p\" (UID: \"302e0256-bc9b-454a-8fe1-15f7d4a40459\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ss89p" Apr 24 21:27:05.361259 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.361211 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-run-openvswitch\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.361259 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.361228 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/102ddca3-9b65-4682-a3d0-5bf546504d17-multus-daemon-config\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz" Apr 24 21:27:05.361259 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.361242 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/701e782c-df09-4f2a-a23e-72ec0675fcb7-host-slash\") pod \"iptables-alerter-h2b2b\" (UID: \"701e782c-df09-4f2a-a23e-72ec0675fcb7\") " pod="openshift-network-operator/iptables-alerter-h2b2b" Apr 24 21:27:05.361431 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.361299 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/701e782c-df09-4f2a-a23e-72ec0675fcb7-host-slash\") pod \"iptables-alerter-h2b2b\" (UID: \"701e782c-df09-4f2a-a23e-72ec0675fcb7\") " pod="openshift-network-operator/iptables-alerter-h2b2b" Apr 24 21:27:05.361431 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.361309 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-host-run-netns\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.361431 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.361349 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8b726083-445b-4ee0-8797-c710268b6b65-var-lib-kubelet\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx" Apr 24 21:27:05.361431 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.361386 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ff85k\" (UniqueName: \"kubernetes.io/projected/066d9094-f992-4853-86f1-b25700fe6070-kube-api-access-ff85k\") pod \"network-metrics-daemon-kpb2c\" (UID: \"066d9094-f992-4853-86f1-b25700fe6070\") " pod="openshift-multus/network-metrics-daemon-kpb2c" Apr 24 21:27:05.361431 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.361426 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c595912d-1543-4c43-8a11-3dbcf0f15050-ovnkube-script-lib\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.361636 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.361440 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e61e2625-9935-4ec2-88cd-d2ac5c781886-serviceca\") pod \"node-ca-gtgdx\" (UID: \"e61e2625-9935-4ec2-88cd-d2ac5c781886\") " pod="openshift-image-registry/node-ca-gtgdx" Apr 24 21:27:05.361636 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.361493 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/302e0256-bc9b-454a-8fe1-15f7d4a40459-etc-selinux\") pod \"aws-ebs-csi-driver-node-ss89p\" (UID: \"302e0256-bc9b-454a-8fe1-15f7d4a40459\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ss89p" Apr 24 21:27:05.361636 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.361522 
2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8b726083-445b-4ee0-8797-c710268b6b65-var-lib-kubelet\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx" Apr 24 21:27:05.361636 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.361542 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-host-run-netns\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.361636 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.361585 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-host-run-ovn-kubernetes\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.361636 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.361598 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-run-openvswitch\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.361636 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.361609 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/302e0256-bc9b-454a-8fe1-15f7d4a40459-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ss89p\" (UID: \"302e0256-bc9b-454a-8fe1-15f7d4a40459\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ss89p" Apr 24 21:27:05.361916 ip-10-0-136-201 kubenswrapper[2573]: 
I0424 21:27:05.361704 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3807cdc4-74ff-4e27-bde0-2ed93b428a58-os-release\") pod \"multus-additional-cni-plugins-5fn4d\" (UID: \"3807cdc4-74ff-4e27-bde0-2ed93b428a58\") " pod="openshift-multus/multus-additional-cni-plugins-5fn4d" Apr 24 21:27:05.361916 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.361741 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d43e5662-c703-427c-ba4c-60f3f0029405-agent-certs\") pod \"konnectivity-agent-4smd5\" (UID: \"d43e5662-c703-427c-ba4c-60f3f0029405\") " pod="kube-system/konnectivity-agent-4smd5" Apr 24 21:27:05.361916 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.361764 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-systemd-units\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.361916 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.361794 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-system-cni-dir\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz" Apr 24 21:27:05.361916 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.361825 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-host-run-multus-certs\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz" Apr 24 21:27:05.361916 ip-10-0-136-201 kubenswrapper[2573]: 
I0424 21:27:05.361855 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8b726083-445b-4ee0-8797-c710268b6b65-etc-modprobe-d\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx" Apr 24 21:27:05.361916 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.361886 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/302e0256-bc9b-454a-8fe1-15f7d4a40459-socket-dir\") pod \"aws-ebs-csi-driver-node-ss89p\" (UID: \"302e0256-bc9b-454a-8fe1-15f7d4a40459\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ss89p" Apr 24 21:27:05.362224 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.361916 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/302e0256-bc9b-454a-8fe1-15f7d4a40459-registration-dir\") pod \"aws-ebs-csi-driver-node-ss89p\" (UID: \"302e0256-bc9b-454a-8fe1-15f7d4a40459\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ss89p" Apr 24 21:27:05.362224 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.361926 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-systemd-units\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.362224 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.361946 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-host-slash\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 
21:27:05.362224 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.361984 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-var-lib-openvswitch\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.362224 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.362000 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3807cdc4-74ff-4e27-bde0-2ed93b428a58-os-release\") pod \"multus-additional-cni-plugins-5fn4d\" (UID: \"3807cdc4-74ff-4e27-bde0-2ed93b428a58\") " pod="openshift-multus/multus-additional-cni-plugins-5fn4d" Apr 24 21:27:05.362224 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.362015 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c595912d-1543-4c43-8a11-3dbcf0f15050-ovn-node-metrics-cert\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.362224 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.362047 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-cnibin\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz" Apr 24 21:27:05.362224 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.362076 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8b726083-445b-4ee0-8797-c710268b6b65-sys\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx" Apr 24 
21:27:05.362224 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.362122 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3807cdc4-74ff-4e27-bde0-2ed93b428a58-cnibin\") pod \"multus-additional-cni-plugins-5fn4d\" (UID: \"3807cdc4-74ff-4e27-bde0-2ed93b428a58\") " pod="openshift-multus/multus-additional-cni-plugins-5fn4d" Apr 24 21:27:05.362224 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.362157 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3807cdc4-74ff-4e27-bde0-2ed93b428a58-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5fn4d\" (UID: \"3807cdc4-74ff-4e27-bde0-2ed93b428a58\") " pod="openshift-multus/multus-additional-cni-plugins-5fn4d" Apr 24 21:27:05.362224 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.362176 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/302e0256-bc9b-454a-8fe1-15f7d4a40459-registration-dir\") pod \"aws-ebs-csi-driver-node-ss89p\" (UID: \"302e0256-bc9b-454a-8fe1-15f7d4a40459\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ss89p" Apr 24 21:27:05.362224 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.362188 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-run-systemd\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.362224 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.362219 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-node-log\") pod \"ovnkube-node-kcl2z\" (UID: 
\"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.362792 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.362244 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-system-cni-dir\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz" Apr 24 21:27:05.362792 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.362251 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-host-var-lib-kubelet\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz" Apr 24 21:27:05.362792 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.362294 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-host-run-multus-certs\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz" Apr 24 21:27:05.362792 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.362311 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-host-var-lib-kubelet\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz" Apr 24 21:27:05.362792 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.362342 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 21:27:05.362792 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.362372 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-host-slash\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.362792 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.362423 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-var-lib-openvswitch\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.362792 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.362673 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3807cdc4-74ff-4e27-bde0-2ed93b428a58-cnibin\") pod \"multus-additional-cni-plugins-5fn4d\" (UID: \"3807cdc4-74ff-4e27-bde0-2ed93b428a58\") " pod="openshift-multus/multus-additional-cni-plugins-5fn4d" Apr 24 21:27:05.362792 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.362747 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-cnibin\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz" Apr 24 21:27:05.363177 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.362800 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8b726083-445b-4ee0-8797-c710268b6b65-sys\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " 
pod="openshift-cluster-node-tuning-operator/tuned-jhprx" Apr 24 21:27:05.363177 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.362848 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-run-systemd\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.363262 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.363200 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/701e782c-df09-4f2a-a23e-72ec0675fcb7-iptables-alerter-script\") pod \"iptables-alerter-h2b2b\" (UID: \"701e782c-df09-4f2a-a23e-72ec0675fcb7\") " pod="openshift-network-operator/iptables-alerter-h2b2b" Apr 24 21:27:05.363344 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.363326 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8b726083-445b-4ee0-8797-c710268b6b65-etc-modprobe-d\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx" Apr 24 21:27:05.363392 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.363382 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-node-log\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.363487 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.363473 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/302e0256-bc9b-454a-8fe1-15f7d4a40459-socket-dir\") pod \"aws-ebs-csi-driver-node-ss89p\" (UID: \"302e0256-bc9b-454a-8fe1-15f7d4a40459\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ss89p" Apr 24 21:27:05.363534 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.362350 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8b726083-445b-4ee0-8797-c710268b6b65-tmp\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx" Apr 24 21:27:05.363534 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.363524 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-etc-openvswitch\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.363623 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.363555 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vfcpn\" (UniqueName: \"kubernetes.io/projected/102ddca3-9b65-4682-a3d0-5bf546504d17-kube-api-access-vfcpn\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz" Apr 24 21:27:05.363623 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.363600 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/302e0256-bc9b-454a-8fe1-15f7d4a40459-device-dir\") pod \"aws-ebs-csi-driver-node-ss89p\" (UID: \"302e0256-bc9b-454a-8fe1-15f7d4a40459\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ss89p" Apr 24 21:27:05.363714 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.363629 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8b726083-445b-4ee0-8797-c710268b6b65-host\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " 
pod="openshift-cluster-node-tuning-operator/tuned-jhprx" Apr 24 21:27:05.363714 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.363659 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-run-ovn\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.363714 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.363681 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b726083-445b-4ee0-8797-c710268b6b65-etc-kubernetes\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx" Apr 24 21:27:05.363714 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.363709 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8b726083-445b-4ee0-8797-c710268b6b65-etc-tuned\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx" Apr 24 21:27:05.363874 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.363737 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7r5zg\" (UniqueName: \"kubernetes.io/projected/8b726083-445b-4ee0-8797-c710268b6b65-kube-api-access-7r5zg\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx" Apr 24 21:27:05.363874 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.363768 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/302e0256-bc9b-454a-8fe1-15f7d4a40459-sys-fs\") pod \"aws-ebs-csi-driver-node-ss89p\" (UID: 
\"302e0256-bc9b-454a-8fe1-15f7d4a40459\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ss89p" Apr 24 21:27:05.363874 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.363796 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbxpx\" (UniqueName: \"kubernetes.io/projected/701e782c-df09-4f2a-a23e-72ec0675fcb7-kube-api-access-qbxpx\") pod \"iptables-alerter-h2b2b\" (UID: \"701e782c-df09-4f2a-a23e-72ec0675fcb7\") " pod="openshift-network-operator/iptables-alerter-h2b2b" Apr 24 21:27:05.363874 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.363819 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e61e2625-9935-4ec2-88cd-d2ac5c781886-host\") pod \"node-ca-gtgdx\" (UID: \"e61e2625-9935-4ec2-88cd-d2ac5c781886\") " pod="openshift-image-registry/node-ca-gtgdx" Apr 24 21:27:05.363874 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.363846 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6p9k\" (UniqueName: \"kubernetes.io/projected/e61e2625-9935-4ec2-88cd-d2ac5c781886-kube-api-access-h6p9k\") pod \"node-ca-gtgdx\" (UID: \"e61e2625-9935-4ec2-88cd-d2ac5c781886\") " pod="openshift-image-registry/node-ca-gtgdx" Apr 24 21:27:05.364128 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.363876 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3807cdc4-74ff-4e27-bde0-2ed93b428a58-cni-binary-copy\") pod \"multus-additional-cni-plugins-5fn4d\" (UID: \"3807cdc4-74ff-4e27-bde0-2ed93b428a58\") " pod="openshift-multus/multus-additional-cni-plugins-5fn4d" Apr 24 21:27:05.364128 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.363909 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgrqn\" (UniqueName: 
\"kubernetes.io/projected/1612b97a-f223-4e83-8710-5764a3765126-kube-api-access-xgrqn\") pod \"network-check-target-6xchx\" (UID: \"1612b97a-f223-4e83-8710-5764a3765126\") " pod="openshift-network-diagnostics/network-check-target-6xchx" Apr 24 21:27:05.364128 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.363940 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-host-cni-netd\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.364128 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.363964 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c595912d-1543-4c43-8a11-3dbcf0f15050-ovnkube-config\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.364128 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.363993 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-multus-cni-dir\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz" Apr 24 21:27:05.364128 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.364034 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3807cdc4-74ff-4e27-bde0-2ed93b428a58-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5fn4d\" (UID: \"3807cdc4-74ff-4e27-bde0-2ed93b428a58\") " pod="openshift-multus/multus-additional-cni-plugins-5fn4d" Apr 24 21:27:05.364128 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.364082 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d43e5662-c703-427c-ba4c-60f3f0029405-konnectivity-ca\") pod \"konnectivity-agent-4smd5\" (UID: \"d43e5662-c703-427c-ba4c-60f3f0029405\") " pod="kube-system/konnectivity-agent-4smd5" Apr 24 21:27:05.364424 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.364131 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gj2tv\" (UniqueName: \"kubernetes.io/projected/302e0256-bc9b-454a-8fe1-15f7d4a40459-kube-api-access-gj2tv\") pod \"aws-ebs-csi-driver-node-ss89p\" (UID: \"302e0256-bc9b-454a-8fe1-15f7d4a40459\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ss89p" Apr 24 21:27:05.364424 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.364160 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8159f631-f735-47c0-8dd1-8342be18cbcf-hosts-file\") pod \"node-resolver-z65xz\" (UID: \"8159f631-f735-47c0-8dd1-8342be18cbcf\") " pod="openshift-dns/node-resolver-z65xz" Apr 24 21:27:05.364424 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.364190 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-host-var-lib-cni-bin\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz" Apr 24 21:27:05.364424 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.364219 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-hostroot\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz" Apr 24 21:27:05.364424 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.364247 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8b726083-445b-4ee0-8797-c710268b6b65-run\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx" Apr 24 21:27:05.364424 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.364276 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-multus-conf-dir\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz" Apr 24 21:27:05.364424 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.364302 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/066d9094-f992-4853-86f1-b25700fe6070-metrics-certs\") pod \"network-metrics-daemon-kpb2c\" (UID: \"066d9094-f992-4853-86f1-b25700fe6070\") " pod="openshift-multus/network-metrics-daemon-kpb2c" Apr 24 21:27:05.364731 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:05.364454 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:05.364731 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:05.364544 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/066d9094-f992-4853-86f1-b25700fe6070-metrics-certs podName:066d9094-f992-4853-86f1-b25700fe6070 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:05.86451227 +0000 UTC m=+3.125081098 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/066d9094-f992-4853-86f1-b25700fe6070-metrics-certs") pod "network-metrics-daemon-kpb2c" (UID: "066d9094-f992-4853-86f1-b25700fe6070") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:05.364862 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.364775 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-etc-openvswitch\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.367760 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.365035 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/302e0256-bc9b-454a-8fe1-15f7d4a40459-device-dir\") pod \"aws-ebs-csi-driver-node-ss89p\" (UID: \"302e0256-bc9b-454a-8fe1-15f7d4a40459\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ss89p" Apr 24 21:27:05.367760 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.365112 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8b726083-445b-4ee0-8797-c710268b6b65-host\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx" Apr 24 21:27:05.367760 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.365163 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-run-ovn\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.367760 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.365222 2573 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b726083-445b-4ee0-8797-c710268b6b65-etc-kubernetes\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx" Apr 24 21:27:05.367760 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.365569 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-multus-cni-dir\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz" Apr 24 21:27:05.367760 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.365607 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c595912d-1543-4c43-8a11-3dbcf0f15050-ovn-node-metrics-cert\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.367760 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.365895 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3807cdc4-74ff-4e27-bde0-2ed93b428a58-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5fn4d\" (UID: \"3807cdc4-74ff-4e27-bde0-2ed93b428a58\") " pod="openshift-multus/multus-additional-cni-plugins-5fn4d" Apr 24 21:27:05.367760 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.365978 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-host-var-lib-cni-bin\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz" Apr 24 21:27:05.367760 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.366189 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3807cdc4-74ff-4e27-bde0-2ed93b428a58-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5fn4d\" (UID: \"3807cdc4-74ff-4e27-bde0-2ed93b428a58\") " pod="openshift-multus/multus-additional-cni-plugins-5fn4d" Apr 24 21:27:05.367760 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.366257 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-hostroot\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz" Apr 24 21:27:05.367760 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.366316 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8b726083-445b-4ee0-8797-c710268b6b65-run\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx" Apr 24 21:27:05.367760 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.366371 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/102ddca3-9b65-4682-a3d0-5bf546504d17-multus-conf-dir\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz" Apr 24 21:27:05.367760 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.366478 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d43e5662-c703-427c-ba4c-60f3f0029405-konnectivity-ca\") pod \"konnectivity-agent-4smd5\" (UID: \"d43e5662-c703-427c-ba4c-60f3f0029405\") " pod="kube-system/konnectivity-agent-4smd5" Apr 24 21:27:05.367760 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.366538 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" 
(UniqueName: \"kubernetes.io/host-path/e61e2625-9935-4ec2-88cd-d2ac5c781886-host\") pod \"node-ca-gtgdx\" (UID: \"e61e2625-9935-4ec2-88cd-d2ac5c781886\") " pod="openshift-image-registry/node-ca-gtgdx" Apr 24 21:27:05.367760 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.366763 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/302e0256-bc9b-454a-8fe1-15f7d4a40459-sys-fs\") pod \"aws-ebs-csi-driver-node-ss89p\" (UID: \"302e0256-bc9b-454a-8fe1-15f7d4a40459\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ss89p" Apr 24 21:27:05.367760 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.366898 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3807cdc4-74ff-4e27-bde0-2ed93b428a58-cni-binary-copy\") pod \"multus-additional-cni-plugins-5fn4d\" (UID: \"3807cdc4-74ff-4e27-bde0-2ed93b428a58\") " pod="openshift-multus/multus-additional-cni-plugins-5fn4d" Apr 24 21:27:05.367760 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.366948 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c595912d-1543-4c43-8a11-3dbcf0f15050-host-cni-netd\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.367760 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.367004 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8159f631-f735-47c0-8dd1-8342be18cbcf-hosts-file\") pod \"node-resolver-z65xz\" (UID: \"8159f631-f735-47c0-8dd1-8342be18cbcf\") " pod="openshift-dns/node-resolver-z65xz" Apr 24 21:27:05.368746 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.367446 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/c595912d-1543-4c43-8a11-3dbcf0f15050-ovnkube-config\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.368746 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.367667 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8b726083-445b-4ee0-8797-c710268b6b65-tmp\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx" Apr 24 21:27:05.368746 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.367669 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d43e5662-c703-427c-ba4c-60f3f0029405-agent-certs\") pod \"konnectivity-agent-4smd5\" (UID: \"d43e5662-c703-427c-ba4c-60f3f0029405\") " pod="kube-system/konnectivity-agent-4smd5" Apr 24 21:27:05.368908 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.368759 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh8fr\" (UniqueName: \"kubernetes.io/projected/c595912d-1543-4c43-8a11-3dbcf0f15050-kube-api-access-sh8fr\") pod \"ovnkube-node-kcl2z\" (UID: \"c595912d-1543-4c43-8a11-3dbcf0f15050\") " pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.369421 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.369381 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8b726083-445b-4ee0-8797-c710268b6b65-etc-tuned\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx" Apr 24 21:27:05.369674 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.369656 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qkgz\" (UniqueName: 
\"kubernetes.io/projected/3807cdc4-74ff-4e27-bde0-2ed93b428a58-kube-api-access-4qkgz\") pod \"multus-additional-cni-plugins-5fn4d\" (UID: \"3807cdc4-74ff-4e27-bde0-2ed93b428a58\") " pod="openshift-multus/multus-additional-cni-plugins-5fn4d" Apr 24 21:27:05.370625 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.370605 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw49d\" (UniqueName: \"kubernetes.io/projected/8159f631-f735-47c0-8dd1-8342be18cbcf-kube-api-access-lw49d\") pod \"node-resolver-z65xz\" (UID: \"8159f631-f735-47c0-8dd1-8342be18cbcf\") " pod="openshift-dns/node-resolver-z65xz" Apr 24 21:27:05.372694 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.372668 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff85k\" (UniqueName: \"kubernetes.io/projected/066d9094-f992-4853-86f1-b25700fe6070-kube-api-access-ff85k\") pod \"network-metrics-daemon-kpb2c\" (UID: \"066d9094-f992-4853-86f1-b25700fe6070\") " pod="openshift-multus/network-metrics-daemon-kpb2c" Apr 24 21:27:05.378175 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.378156 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6p9k\" (UniqueName: \"kubernetes.io/projected/e61e2625-9935-4ec2-88cd-d2ac5c781886-kube-api-access-h6p9k\") pod \"node-ca-gtgdx\" (UID: \"e61e2625-9935-4ec2-88cd-d2ac5c781886\") " pod="openshift-image-registry/node-ca-gtgdx" Apr 24 21:27:05.378175 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.378170 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r5zg\" (UniqueName: \"kubernetes.io/projected/8b726083-445b-4ee0-8797-c710268b6b65-kube-api-access-7r5zg\") pod \"tuned-jhprx\" (UID: \"8b726083-445b-4ee0-8797-c710268b6b65\") " pod="openshift-cluster-node-tuning-operator/tuned-jhprx" Apr 24 21:27:05.378552 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.378535 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gj2tv\" (UniqueName: \"kubernetes.io/projected/302e0256-bc9b-454a-8fe1-15f7d4a40459-kube-api-access-gj2tv\") pod \"aws-ebs-csi-driver-node-ss89p\" (UID: \"302e0256-bc9b-454a-8fe1-15f7d4a40459\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ss89p" Apr 24 21:27:05.378983 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.378960 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfcpn\" (UniqueName: \"kubernetes.io/projected/102ddca3-9b65-4682-a3d0-5bf546504d17-kube-api-access-vfcpn\") pod \"multus-29bpz\" (UID: \"102ddca3-9b65-4682-a3d0-5bf546504d17\") " pod="openshift-multus/multus-29bpz" Apr 24 21:27:05.379194 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.379172 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbxpx\" (UniqueName: \"kubernetes.io/projected/701e782c-df09-4f2a-a23e-72ec0675fcb7-kube-api-access-qbxpx\") pod \"iptables-alerter-h2b2b\" (UID: \"701e782c-df09-4f2a-a23e-72ec0675fcb7\") " pod="openshift-network-operator/iptables-alerter-h2b2b" Apr 24 21:27:05.465061 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.465029 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xgrqn\" (UniqueName: \"kubernetes.io/projected/1612b97a-f223-4e83-8710-5764a3765126-kube-api-access-xgrqn\") pod \"network-check-target-6xchx\" (UID: \"1612b97a-f223-4e83-8710-5764a3765126\") " pod="openshift-network-diagnostics/network-check-target-6xchx" Apr 24 21:27:05.471515 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:05.471438 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:05.471515 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:05.471467 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:05.471515 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:05.471482 2573 projected.go:194] Error preparing data for projected volume kube-api-access-xgrqn for pod openshift-network-diagnostics/network-check-target-6xchx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:05.471748 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:05.471564 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1612b97a-f223-4e83-8710-5764a3765126-kube-api-access-xgrqn podName:1612b97a-f223-4e83-8710-5764a3765126 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:05.971545604 +0000 UTC m=+3.232114415 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-xgrqn" (UniqueName: "kubernetes.io/projected/1612b97a-f223-4e83-8710-5764a3765126-kube-api-access-xgrqn") pod "network-check-target-6xchx" (UID: "1612b97a-f223-4e83-8710-5764a3765126") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:05.540317 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.540272 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-4smd5" Apr 24 21:27:05.547173 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.547144 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-jhprx" Apr 24 21:27:05.558789 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.558761 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ss89p" Apr 24 21:27:05.562826 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.562804 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-z65xz" Apr 24 21:27:05.569463 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.569439 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-h2b2b" Apr 24 21:27:05.576116 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.576081 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:27:05.581632 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.581614 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gtgdx" Apr 24 21:27:05.588204 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.588182 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-29bpz" Apr 24 21:27:05.591736 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.591716 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5fn4d" Apr 24 21:27:05.867457 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:05.867384 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/066d9094-f992-4853-86f1-b25700fe6070-metrics-certs\") pod \"network-metrics-daemon-kpb2c\" (UID: \"066d9094-f992-4853-86f1-b25700fe6070\") " pod="openshift-multus/network-metrics-daemon-kpb2c" Apr 24 21:27:05.867589 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:05.867515 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:05.867589 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:05.867571 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/066d9094-f992-4853-86f1-b25700fe6070-metrics-certs podName:066d9094-f992-4853-86f1-b25700fe6070 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:06.867556092 +0000 UTC m=+4.128124905 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/066d9094-f992-4853-86f1-b25700fe6070-metrics-certs") pod "network-metrics-daemon-kpb2c" (UID: "066d9094-f992-4853-86f1-b25700fe6070") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:05.932355 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:05.932131 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode61e2625_9935_4ec2_88cd_d2ac5c781886.slice/crio-8aadec508362bd1729906ec01f62eb380192d0c769787488d9342979d9946cf5 WatchSource:0}: Error finding container 8aadec508362bd1729906ec01f62eb380192d0c769787488d9342979d9946cf5: Status 404 returned error can't find the container with id 8aadec508362bd1729906ec01f62eb380192d0c769787488d9342979d9946cf5 Apr 24 21:27:05.933275 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:05.933240 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd43e5662_c703_427c_ba4c_60f3f0029405.slice/crio-4ae46994ab37915239cefad470a3bd91212176ca966316144792c7d4f465a3d1 WatchSource:0}: Error finding container 4ae46994ab37915239cefad470a3bd91212176ca966316144792c7d4f465a3d1: Status 404 returned error can't find the container with id 4ae46994ab37915239cefad470a3bd91212176ca966316144792c7d4f465a3d1 Apr 24 21:27:05.933864 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:05.933780 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod701e782c_df09_4f2a_a23e_72ec0675fcb7.slice/crio-5b9599f322d3da0ed361abcdd8f8e1520f786c36392b8e1856e4015b38866d06 WatchSource:0}: Error finding container 5b9599f322d3da0ed361abcdd8f8e1520f786c36392b8e1856e4015b38866d06: Status 404 returned error can't find the container with id 5b9599f322d3da0ed361abcdd8f8e1520f786c36392b8e1856e4015b38866d06 Apr 24 21:27:05.938374 
ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:05.938312 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8159f631_f735_47c0_8dd1_8342be18cbcf.slice/crio-74ab4997ae4d0fffba53534d4c2ca2537534e108e0d047caa1e4df2c3ecf56b4 WatchSource:0}: Error finding container 74ab4997ae4d0fffba53534d4c2ca2537534e108e0d047caa1e4df2c3ecf56b4: Status 404 returned error can't find the container with id 74ab4997ae4d0fffba53534d4c2ca2537534e108e0d047caa1e4df2c3ecf56b4 Apr 24 21:27:05.939202 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:05.939179 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3807cdc4_74ff_4e27_bde0_2ed93b428a58.slice/crio-98ce30305b1ada1ba9221c1d9a6eafc5ad9a42760550a9e096fed86ee456ac7f WatchSource:0}: Error finding container 98ce30305b1ada1ba9221c1d9a6eafc5ad9a42760550a9e096fed86ee456ac7f: Status 404 returned error can't find the container with id 98ce30305b1ada1ba9221c1d9a6eafc5ad9a42760550a9e096fed86ee456ac7f Apr 24 21:27:05.943464 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:05.942746 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc595912d_1543_4c43_8a11_3dbcf0f15050.slice/crio-2be05e8b80c8896c7e35e9b58ee27027a8ffbdc3124faecf8a3e9b506f831674 WatchSource:0}: Error finding container 2be05e8b80c8896c7e35e9b58ee27027a8ffbdc3124faecf8a3e9b506f831674: Status 404 returned error can't find the container with id 2be05e8b80c8896c7e35e9b58ee27027a8ffbdc3124faecf8a3e9b506f831674 Apr 24 21:27:05.944532 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:05.944508 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod302e0256_bc9b_454a_8fe1_15f7d4a40459.slice/crio-b1066af99216789d37b87e171b4801dc23b0cd9ad7947c964b9ef3002dfa22a3 WatchSource:0}: 
Error finding container b1066af99216789d37b87e171b4801dc23b0cd9ad7947c964b9ef3002dfa22a3: Status 404 returned error can't find the container with id b1066af99216789d37b87e171b4801dc23b0cd9ad7947c964b9ef3002dfa22a3 Apr 24 21:27:06.068977 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:06.068942 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xgrqn\" (UniqueName: \"kubernetes.io/projected/1612b97a-f223-4e83-8710-5764a3765126-kube-api-access-xgrqn\") pod \"network-check-target-6xchx\" (UID: \"1612b97a-f223-4e83-8710-5764a3765126\") " pod="openshift-network-diagnostics/network-check-target-6xchx" Apr 24 21:27:06.069148 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:06.069123 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:06.069148 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:06.069144 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:06.069243 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:06.069155 2573 projected.go:194] Error preparing data for projected volume kube-api-access-xgrqn for pod openshift-network-diagnostics/network-check-target-6xchx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:06.069243 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:06.069211 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1612b97a-f223-4e83-8710-5764a3765126-kube-api-access-xgrqn podName:1612b97a-f223-4e83-8710-5764a3765126 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:07.069196629 +0000 UTC m=+4.329765443 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xgrqn" (UniqueName: "kubernetes.io/projected/1612b97a-f223-4e83-8710-5764a3765126-kube-api-access-xgrqn") pod "network-check-target-6xchx" (UID: "1612b97a-f223-4e83-8710-5764a3765126") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:06.296618 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:06.296474 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:22:04 +0000 UTC" deadline="2027-12-23 00:26:11.873597912 +0000 UTC" Apr 24 21:27:06.296618 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:06.296533 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14570h59m5.577074152s" Apr 24 21:27:06.352809 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:06.352243 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6xchx" Apr 24 21:27:06.352809 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:06.352396 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6xchx" podUID="1612b97a-f223-4e83-8710-5764a3765126" Apr 24 21:27:06.369807 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:06.369732 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fn4d" event={"ID":"3807cdc4-74ff-4e27-bde0-2ed93b428a58","Type":"ContainerStarted","Data":"98ce30305b1ada1ba9221c1d9a6eafc5ad9a42760550a9e096fed86ee456ac7f"} Apr 24 21:27:06.373970 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:06.373815 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-h2b2b" event={"ID":"701e782c-df09-4f2a-a23e-72ec0675fcb7","Type":"ContainerStarted","Data":"5b9599f322d3da0ed361abcdd8f8e1520f786c36392b8e1856e4015b38866d06"} Apr 24 21:27:06.378823 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:06.378780 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gtgdx" event={"ID":"e61e2625-9935-4ec2-88cd-d2ac5c781886","Type":"ContainerStarted","Data":"8aadec508362bd1729906ec01f62eb380192d0c769787488d9342979d9946cf5"} Apr 24 21:27:06.383980 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:06.383942 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ss89p" event={"ID":"302e0256-bc9b-454a-8fe1-15f7d4a40459","Type":"ContainerStarted","Data":"b1066af99216789d37b87e171b4801dc23b0cd9ad7947c964b9ef3002dfa22a3"} Apr 24 21:27:06.387008 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:06.386943 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-z65xz" event={"ID":"8159f631-f735-47c0-8dd1-8342be18cbcf","Type":"ContainerStarted","Data":"74ab4997ae4d0fffba53534d4c2ca2537534e108e0d047caa1e4df2c3ecf56b4"} Apr 24 21:27:06.394018 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:06.393961 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4smd5" 
event={"ID":"d43e5662-c703-427c-ba4c-60f3f0029405","Type":"ContainerStarted","Data":"4ae46994ab37915239cefad470a3bd91212176ca966316144792c7d4f465a3d1"} Apr 24 21:27:06.404574 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:06.404542 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-201.ec2.internal" event={"ID":"bcf950b6b9eef658d8ab20236281bc71","Type":"ContainerStarted","Data":"0364a01e530720c6ff46daaf224dcbdeb6bc4ccf31a5577de7d6e1dd92ca2367"} Apr 24 21:27:06.412387 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:06.412353 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" event={"ID":"c595912d-1543-4c43-8a11-3dbcf0f15050","Type":"ContainerStarted","Data":"2be05e8b80c8896c7e35e9b58ee27027a8ffbdc3124faecf8a3e9b506f831674"} Apr 24 21:27:06.420281 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:06.420054 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jhprx" event={"ID":"8b726083-445b-4ee0-8797-c710268b6b65","Type":"ContainerStarted","Data":"e8e0214403cbf7108bd4cf3766eb4e7efdd70cc0a71cccf70691ba3aaea18722"} Apr 24 21:27:06.426853 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:06.426795 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-29bpz" event={"ID":"102ddca3-9b65-4682-a3d0-5bf546504d17","Type":"ContainerStarted","Data":"97d8f729e7871d6cc77bf0a518d9ccbfeb205f3f01bcdb0086f97545748c3f2e"} Apr 24 21:27:06.877154 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:06.876541 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/066d9094-f992-4853-86f1-b25700fe6070-metrics-certs\") pod \"network-metrics-daemon-kpb2c\" (UID: \"066d9094-f992-4853-86f1-b25700fe6070\") " pod="openshift-multus/network-metrics-daemon-kpb2c" Apr 24 21:27:06.877154 ip-10-0-136-201 kubenswrapper[2573]: E0424 
21:27:06.876701 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:06.877154 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:06.876768 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/066d9094-f992-4853-86f1-b25700fe6070-metrics-certs podName:066d9094-f992-4853-86f1-b25700fe6070 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:08.876748463 +0000 UTC m=+6.137317286 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/066d9094-f992-4853-86f1-b25700fe6070-metrics-certs") pod "network-metrics-daemon-kpb2c" (UID: "066d9094-f992-4853-86f1-b25700fe6070") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:07.078880 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:07.078236 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xgrqn\" (UniqueName: \"kubernetes.io/projected/1612b97a-f223-4e83-8710-5764a3765126-kube-api-access-xgrqn\") pod \"network-check-target-6xchx\" (UID: \"1612b97a-f223-4e83-8710-5764a3765126\") " pod="openshift-network-diagnostics/network-check-target-6xchx" Apr 24 21:27:07.078880 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:07.078436 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:07.078880 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:07.078459 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:07.078880 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:07.078471 2573 projected.go:194] Error preparing data for projected volume kube-api-access-xgrqn for pod 
openshift-network-diagnostics/network-check-target-6xchx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:07.078880 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:07.078531 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1612b97a-f223-4e83-8710-5764a3765126-kube-api-access-xgrqn podName:1612b97a-f223-4e83-8710-5764a3765126 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:09.078513279 +0000 UTC m=+6.339082102 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-xgrqn" (UniqueName: "kubernetes.io/projected/1612b97a-f223-4e83-8710-5764a3765126-kube-api-access-xgrqn") pod "network-check-target-6xchx" (UID: "1612b97a-f223-4e83-8710-5764a3765126") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:07.354916 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:07.354437 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpb2c" Apr 24 21:27:07.354916 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:07.354579 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpb2c" podUID="066d9094-f992-4853-86f1-b25700fe6070" Apr 24 21:27:07.443339 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:07.443250 2573 generic.go:358] "Generic (PLEG): container finished" podID="0eeb12378a06f0692d8b8888ada8c1bc" containerID="8cae188831454bd270c053eb73fee029415215eb5165427fda12a17c64e0d95f" exitCode=0 Apr 24 21:27:07.444214 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:07.444153 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal" event={"ID":"0eeb12378a06f0692d8b8888ada8c1bc","Type":"ContainerDied","Data":"8cae188831454bd270c053eb73fee029415215eb5165427fda12a17c64e0d95f"} Apr 24 21:27:07.459393 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:07.459342 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-201.ec2.internal" podStartSLOduration=3.459323724 podStartE2EDuration="3.459323724s" podCreationTimestamp="2026-04-24 21:27:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:27:06.420231551 +0000 UTC m=+3.680800387" watchObservedRunningTime="2026-04-24 21:27:07.459323724 +0000 UTC m=+4.719892560" Apr 24 21:27:08.352130 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:08.352081 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6xchx" Apr 24 21:27:08.352332 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:08.352231 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6xchx" podUID="1612b97a-f223-4e83-8710-5764a3765126" Apr 24 21:27:08.456961 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:08.456923 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal" event={"ID":"0eeb12378a06f0692d8b8888ada8c1bc","Type":"ContainerStarted","Data":"4ea1ad71ecc589de670a7095692dec6076b0088c928b32ad720e1de6d69bb24e"} Apr 24 21:27:08.471933 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:08.471877 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-201.ec2.internal" podStartSLOduration=4.471857577 podStartE2EDuration="4.471857577s" podCreationTimestamp="2026-04-24 21:27:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:27:08.471016624 +0000 UTC m=+5.731585455" watchObservedRunningTime="2026-04-24 21:27:08.471857577 +0000 UTC m=+5.732426409" Apr 24 21:27:08.897587 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:08.897558 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/066d9094-f992-4853-86f1-b25700fe6070-metrics-certs\") pod \"network-metrics-daemon-kpb2c\" (UID: \"066d9094-f992-4853-86f1-b25700fe6070\") " pod="openshift-multus/network-metrics-daemon-kpb2c" Apr 24 21:27:08.902780 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:08.902280 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:08.902780 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:08.902376 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/066d9094-f992-4853-86f1-b25700fe6070-metrics-certs 
podName:066d9094-f992-4853-86f1-b25700fe6070 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:12.902355236 +0000 UTC m=+10.162924049 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/066d9094-f992-4853-86f1-b25700fe6070-metrics-certs") pod "network-metrics-daemon-kpb2c" (UID: "066d9094-f992-4853-86f1-b25700fe6070") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:09.100153 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:09.100111 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xgrqn\" (UniqueName: \"kubernetes.io/projected/1612b97a-f223-4e83-8710-5764a3765126-kube-api-access-xgrqn\") pod \"network-check-target-6xchx\" (UID: \"1612b97a-f223-4e83-8710-5764a3765126\") " pod="openshift-network-diagnostics/network-check-target-6xchx" Apr 24 21:27:09.100334 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:09.100306 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:09.100334 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:09.100333 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:09.100450 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:09.100347 2573 projected.go:194] Error preparing data for projected volume kube-api-access-xgrqn for pod openshift-network-diagnostics/network-check-target-6xchx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:09.100450 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:09.100403 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/1612b97a-f223-4e83-8710-5764a3765126-kube-api-access-xgrqn podName:1612b97a-f223-4e83-8710-5764a3765126 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:13.100384143 +0000 UTC m=+10.360952956 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-xgrqn" (UniqueName: "kubernetes.io/projected/1612b97a-f223-4e83-8710-5764a3765126-kube-api-access-xgrqn") pod "network-check-target-6xchx" (UID: "1612b97a-f223-4e83-8710-5764a3765126") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:09.353545 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:09.353460 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpb2c" Apr 24 21:27:09.353699 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:09.353623 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpb2c" podUID="066d9094-f992-4853-86f1-b25700fe6070" Apr 24 21:27:10.351874 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:10.351840 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6xchx" Apr 24 21:27:10.352394 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:10.351969 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6xchx" podUID="1612b97a-f223-4e83-8710-5764a3765126" Apr 24 21:27:11.352796 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:11.352746 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpb2c" Apr 24 21:27:11.353279 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:11.352886 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpb2c" podUID="066d9094-f992-4853-86f1-b25700fe6070" Apr 24 21:27:12.352219 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:12.352188 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6xchx" Apr 24 21:27:12.352401 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:12.352301 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6xchx" podUID="1612b97a-f223-4e83-8710-5764a3765126" Apr 24 21:27:12.934199 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:12.934112 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/066d9094-f992-4853-86f1-b25700fe6070-metrics-certs\") pod \"network-metrics-daemon-kpb2c\" (UID: \"066d9094-f992-4853-86f1-b25700fe6070\") " pod="openshift-multus/network-metrics-daemon-kpb2c" Apr 24 21:27:12.934648 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:12.934329 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:12.934648 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:12.934411 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/066d9094-f992-4853-86f1-b25700fe6070-metrics-certs podName:066d9094-f992-4853-86f1-b25700fe6070 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:20.934390487 +0000 UTC m=+18.194959297 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/066d9094-f992-4853-86f1-b25700fe6070-metrics-certs") pod "network-metrics-daemon-kpb2c" (UID: "066d9094-f992-4853-86f1-b25700fe6070") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:13.136371 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:13.136332 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xgrqn\" (UniqueName: \"kubernetes.io/projected/1612b97a-f223-4e83-8710-5764a3765126-kube-api-access-xgrqn\") pod \"network-check-target-6xchx\" (UID: \"1612b97a-f223-4e83-8710-5764a3765126\") " pod="openshift-network-diagnostics/network-check-target-6xchx" Apr 24 21:27:13.136546 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:13.136506 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:13.136546 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:13.136528 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:13.136546 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:13.136541 2573 projected.go:194] Error preparing data for projected volume kube-api-access-xgrqn for pod openshift-network-diagnostics/network-check-target-6xchx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:13.136731 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:13.136614 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1612b97a-f223-4e83-8710-5764a3765126-kube-api-access-xgrqn podName:1612b97a-f223-4e83-8710-5764a3765126 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:27:21.136592404 +0000 UTC m=+18.397161217 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-xgrqn" (UniqueName: "kubernetes.io/projected/1612b97a-f223-4e83-8710-5764a3765126-kube-api-access-xgrqn") pod "network-check-target-6xchx" (UID: "1612b97a-f223-4e83-8710-5764a3765126") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:13.353382 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:13.353252 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpb2c" Apr 24 21:27:13.353545 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:13.353391 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpb2c" podUID="066d9094-f992-4853-86f1-b25700fe6070" Apr 24 21:27:14.352379 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:14.352321 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6xchx" Apr 24 21:27:14.352867 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:14.352471 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6xchx" podUID="1612b97a-f223-4e83-8710-5764a3765126" Apr 24 21:27:15.351820 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:15.351783 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpb2c" Apr 24 21:27:15.352012 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:15.351901 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpb2c" podUID="066d9094-f992-4853-86f1-b25700fe6070" Apr 24 21:27:16.352166 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:16.352126 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6xchx" Apr 24 21:27:16.352592 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:16.352282 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6xchx" podUID="1612b97a-f223-4e83-8710-5764a3765126" Apr 24 21:27:17.355199 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:17.355164 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpb2c" Apr 24 21:27:17.355605 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:17.355293 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpb2c" podUID="066d9094-f992-4853-86f1-b25700fe6070" Apr 24 21:27:18.352206 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:18.352166 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6xchx" Apr 24 21:27:18.352383 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:18.352295 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6xchx" podUID="1612b97a-f223-4e83-8710-5764a3765126" Apr 24 21:27:19.352347 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:19.352305 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpb2c" Apr 24 21:27:19.352903 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:19.352448 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpb2c" podUID="066d9094-f992-4853-86f1-b25700fe6070" Apr 24 21:27:20.352243 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:20.352209 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6xchx" Apr 24 21:27:20.352442 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:20.352325 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6xchx" podUID="1612b97a-f223-4e83-8710-5764a3765126" Apr 24 21:27:20.992456 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:20.992412 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/066d9094-f992-4853-86f1-b25700fe6070-metrics-certs\") pod \"network-metrics-daemon-kpb2c\" (UID: \"066d9094-f992-4853-86f1-b25700fe6070\") " pod="openshift-multus/network-metrics-daemon-kpb2c" Apr 24 21:27:20.992630 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:20.992568 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:20.992630 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:20.992626 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/066d9094-f992-4853-86f1-b25700fe6070-metrics-certs podName:066d9094-f992-4853-86f1-b25700fe6070 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:36.992612202 +0000 UTC m=+34.253181017 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/066d9094-f992-4853-86f1-b25700fe6070-metrics-certs") pod "network-metrics-daemon-kpb2c" (UID: "066d9094-f992-4853-86f1-b25700fe6070") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:21.194563 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:21.194529 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xgrqn\" (UniqueName: \"kubernetes.io/projected/1612b97a-f223-4e83-8710-5764a3765126-kube-api-access-xgrqn\") pod \"network-check-target-6xchx\" (UID: \"1612b97a-f223-4e83-8710-5764a3765126\") " pod="openshift-network-diagnostics/network-check-target-6xchx" Apr 24 21:27:21.194754 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:21.194688 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:21.194754 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:21.194705 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:21.194754 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:21.194715 2573 projected.go:194] Error preparing data for projected volume kube-api-access-xgrqn for pod openshift-network-diagnostics/network-check-target-6xchx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:21.194925 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:21.194770 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1612b97a-f223-4e83-8710-5764a3765126-kube-api-access-xgrqn podName:1612b97a-f223-4e83-8710-5764a3765126 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:27:37.19475362 +0000 UTC m=+34.455322436 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-xgrqn" (UniqueName: "kubernetes.io/projected/1612b97a-f223-4e83-8710-5764a3765126-kube-api-access-xgrqn") pod "network-check-target-6xchx" (UID: "1612b97a-f223-4e83-8710-5764a3765126") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:21.352204 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:21.352178 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpb2c" Apr 24 21:27:21.352381 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:21.352291 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpb2c" podUID="066d9094-f992-4853-86f1-b25700fe6070" Apr 24 21:27:21.808722 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:21.808625 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-8bd94"] Apr 24 21:27:21.909346 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:21.909309 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8bd94" Apr 24 21:27:21.909508 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:21.909403 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-8bd94" podUID="e9999088-0209-4da1-b3c6-92c0d2e48409" Apr 24 21:27:22.001827 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:22.001784 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e9999088-0209-4da1-b3c6-92c0d2e48409-dbus\") pod \"global-pull-secret-syncer-8bd94\" (UID: \"e9999088-0209-4da1-b3c6-92c0d2e48409\") " pod="kube-system/global-pull-secret-syncer-8bd94" Apr 24 21:27:22.002007 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:22.001848 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e9999088-0209-4da1-b3c6-92c0d2e48409-kubelet-config\") pod \"global-pull-secret-syncer-8bd94\" (UID: \"e9999088-0209-4da1-b3c6-92c0d2e48409\") " pod="kube-system/global-pull-secret-syncer-8bd94" Apr 24 21:27:22.002007 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:22.001878 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e9999088-0209-4da1-b3c6-92c0d2e48409-original-pull-secret\") pod \"global-pull-secret-syncer-8bd94\" (UID: \"e9999088-0209-4da1-b3c6-92c0d2e48409\") " pod="kube-system/global-pull-secret-syncer-8bd94" Apr 24 21:27:22.103145 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:22.103103 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e9999088-0209-4da1-b3c6-92c0d2e48409-kubelet-config\") pod \"global-pull-secret-syncer-8bd94\" (UID: \"e9999088-0209-4da1-b3c6-92c0d2e48409\") " pod="kube-system/global-pull-secret-syncer-8bd94" Apr 24 21:27:22.103328 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:22.103161 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" 
(UniqueName: \"kubernetes.io/secret/e9999088-0209-4da1-b3c6-92c0d2e48409-original-pull-secret\") pod \"global-pull-secret-syncer-8bd94\" (UID: \"e9999088-0209-4da1-b3c6-92c0d2e48409\") " pod="kube-system/global-pull-secret-syncer-8bd94" Apr 24 21:27:22.103328 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:22.103215 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e9999088-0209-4da1-b3c6-92c0d2e48409-kubelet-config\") pod \"global-pull-secret-syncer-8bd94\" (UID: \"e9999088-0209-4da1-b3c6-92c0d2e48409\") " pod="kube-system/global-pull-secret-syncer-8bd94" Apr 24 21:27:22.103328 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:22.103247 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e9999088-0209-4da1-b3c6-92c0d2e48409-dbus\") pod \"global-pull-secret-syncer-8bd94\" (UID: \"e9999088-0209-4da1-b3c6-92c0d2e48409\") " pod="kube-system/global-pull-secret-syncer-8bd94" Apr 24 21:27:22.103475 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:22.103339 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:22.103475 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:22.103403 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9999088-0209-4da1-b3c6-92c0d2e48409-original-pull-secret podName:e9999088-0209-4da1-b3c6-92c0d2e48409 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:22.60338416 +0000 UTC m=+19.863952970 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e9999088-0209-4da1-b3c6-92c0d2e48409-original-pull-secret") pod "global-pull-secret-syncer-8bd94" (UID: "e9999088-0209-4da1-b3c6-92c0d2e48409") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:22.103475 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:22.103437 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e9999088-0209-4da1-b3c6-92c0d2e48409-dbus\") pod \"global-pull-secret-syncer-8bd94\" (UID: \"e9999088-0209-4da1-b3c6-92c0d2e48409\") " pod="kube-system/global-pull-secret-syncer-8bd94"
Apr 24 21:27:22.352806 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:22.352777 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6xchx"
Apr 24 21:27:22.352953 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:22.352880 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6xchx" podUID="1612b97a-f223-4e83-8710-5764a3765126"
Apr 24 21:27:22.605982 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:22.605541 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e9999088-0209-4da1-b3c6-92c0d2e48409-original-pull-secret\") pod \"global-pull-secret-syncer-8bd94\" (UID: \"e9999088-0209-4da1-b3c6-92c0d2e48409\") " pod="kube-system/global-pull-secret-syncer-8bd94"
Apr 24 21:27:22.605982 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:22.605945 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:22.606146 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:22.606009 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9999088-0209-4da1-b3c6-92c0d2e48409-original-pull-secret podName:e9999088-0209-4da1-b3c6-92c0d2e48409 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:23.60599014 +0000 UTC m=+20.866558952 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e9999088-0209-4da1-b3c6-92c0d2e48409-original-pull-secret") pod "global-pull-secret-syncer-8bd94" (UID: "e9999088-0209-4da1-b3c6-92c0d2e48409") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:23.353078 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:23.352856 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8bd94"
Apr 24 21:27:23.353968 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:23.352937 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpb2c"
Apr 24 21:27:23.353968 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:23.353228 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8bd94" podUID="e9999088-0209-4da1-b3c6-92c0d2e48409"
Apr 24 21:27:23.353968 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:23.353360 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpb2c" podUID="066d9094-f992-4853-86f1-b25700fe6070"
Apr 24 21:27:23.486437 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:23.486405 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" event={"ID":"c595912d-1543-4c43-8a11-3dbcf0f15050","Type":"ContainerStarted","Data":"69eedaee88e764fc072875f22c847e19f10ff3406d654c9c11dd90a04bb73d20"}
Apr 24 21:27:23.486582 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:23.486445 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" event={"ID":"c595912d-1543-4c43-8a11-3dbcf0f15050","Type":"ContainerStarted","Data":"91fa4d6f9af56fe9a99d4e960b2babf9d1d253de4d8a6acadc23d034666b4c81"}
Apr 24 21:27:23.486582 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:23.486457 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" event={"ID":"c595912d-1543-4c43-8a11-3dbcf0f15050","Type":"ContainerStarted","Data":"0cb3d1d35555c01022594e94ee5202cc970cf82bec10bfc32beded85f69c7a67"}
Apr 24 21:27:23.486582 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:23.486469 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" event={"ID":"c595912d-1543-4c43-8a11-3dbcf0f15050","Type":"ContainerStarted","Data":"6f124bd9dd07910e029e4892cafd656458233277a4eb0582b8095d4bd3f1149a"}
Apr 24 21:27:23.486582 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:23.486479 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" event={"ID":"c595912d-1543-4c43-8a11-3dbcf0f15050","Type":"ContainerStarted","Data":"91908493eb8d6a37a9b3d3bf0794c1926b6c2a9e33d15aa1c06ec2f3d81939d4"}
Apr 24 21:27:23.486582 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:23.486489 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" event={"ID":"c595912d-1543-4c43-8a11-3dbcf0f15050","Type":"ContainerStarted","Data":"86d6c100b2ed1bac6a769488e1fe9e0eeffc6f5806b5ae038bb38800ebaadbca"}
Apr 24 21:27:23.488082 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:23.488034 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-jhprx" event={"ID":"8b726083-445b-4ee0-8797-c710268b6b65","Type":"ContainerStarted","Data":"bd2e4cfa29f0892e7bc9d5d2f8895942f082beff8b1bddbb21bcbed14e02787d"}
Apr 24 21:27:23.489721 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:23.489692 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-29bpz" event={"ID":"102ddca3-9b65-4682-a3d0-5bf546504d17","Type":"ContainerStarted","Data":"c8fd28d69dc9c1fabca4d35075c437a040e3d10d58221d4e19fd8468535a8777"}
Apr 24 21:27:23.491322 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:23.491298 2573 generic.go:358] "Generic (PLEG): container finished" podID="3807cdc4-74ff-4e27-bde0-2ed93b428a58" containerID="7033964ec7422dc41330e2c3e6613759205540e0e50ed33429897f0c9d0a8aac" exitCode=0
Apr 24 21:27:23.491421 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:23.491363 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fn4d" event={"ID":"3807cdc4-74ff-4e27-bde0-2ed93b428a58","Type":"ContainerDied","Data":"7033964ec7422dc41330e2c3e6613759205540e0e50ed33429897f0c9d0a8aac"}
Apr 24 21:27:23.492744 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:23.492724 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gtgdx" event={"ID":"e61e2625-9935-4ec2-88cd-d2ac5c781886","Type":"ContainerStarted","Data":"953c52166fde98e4a3f9f343dd1da348a9f3fd47a15bc10955ab0bc8d0e501e8"}
Apr 24 21:27:23.494515 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:23.494493 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ss89p" event={"ID":"302e0256-bc9b-454a-8fe1-15f7d4a40459","Type":"ContainerStarted","Data":"14d484eddd36c8f63551f9ba282d1b17b190701c4944a90302194533d1cf0bba"}
Apr 24 21:27:23.495875 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:23.495851 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-z65xz" event={"ID":"8159f631-f735-47c0-8dd1-8342be18cbcf","Type":"ContainerStarted","Data":"0c9046bd927313dc2d674e22880003895132daedd5adb219f193436a81d1855e"}
Apr 24 21:27:23.497360 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:23.497340 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4smd5" event={"ID":"d43e5662-c703-427c-ba4c-60f3f0029405","Type":"ContainerStarted","Data":"c621ece1651a1a2c424f8e4b14f1a303a1b521ac8b8a1ce71c8a9c34baec2f5c"}
Apr 24 21:27:23.513559 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:23.513517 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-jhprx" podStartSLOduration=4.036303954 podStartE2EDuration="20.51350482s" podCreationTimestamp="2026-04-24 21:27:03 +0000 UTC" firstStartedPulling="2026-04-24 21:27:05.942922506 +0000 UTC m=+3.203491326" lastFinishedPulling="2026-04-24 21:27:22.420123379 +0000 UTC m=+19.680692192" observedRunningTime="2026-04-24 21:27:23.513088392 +0000 UTC m=+20.773657225" watchObservedRunningTime="2026-04-24 21:27:23.51350482 +0000 UTC m=+20.774073650"
Apr 24 21:27:23.531711 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:23.531655 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-29bpz" podStartSLOduration=4.018639486 podStartE2EDuration="20.531639365s" podCreationTimestamp="2026-04-24 21:27:03 +0000 UTC" firstStartedPulling="2026-04-24 21:27:05.941859117 +0000 UTC m=+3.202427931" lastFinishedPulling="2026-04-24 21:27:22.45485899 +0000 UTC m=+19.715427810" observedRunningTime="2026-04-24 21:27:23.531638816 +0000 UTC m=+20.792207647" watchObservedRunningTime="2026-04-24 21:27:23.531639365 +0000 UTC m=+20.792208197"
Apr 24 21:27:23.581216 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:23.581164 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-4smd5" podStartSLOduration=8.375103879 podStartE2EDuration="20.581148524s" podCreationTimestamp="2026-04-24 21:27:03 +0000 UTC" firstStartedPulling="2026-04-24 21:27:05.935797666 +0000 UTC m=+3.196366481" lastFinishedPulling="2026-04-24 21:27:18.141842315 +0000 UTC m=+15.402411126" observedRunningTime="2026-04-24 21:27:23.580868161 +0000 UTC m=+20.841436989" watchObservedRunningTime="2026-04-24 21:27:23.581148524 +0000 UTC m=+20.841717333"
Apr 24 21:27:23.594956 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:23.594912 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-z65xz" podStartSLOduration=4.115998737 podStartE2EDuration="20.594898954s" podCreationTimestamp="2026-04-24 21:27:03 +0000 UTC" firstStartedPulling="2026-04-24 21:27:05.939885766 +0000 UTC m=+3.200454575" lastFinishedPulling="2026-04-24 21:27:22.418785982 +0000 UTC m=+19.679354792" observedRunningTime="2026-04-24 21:27:23.59475432 +0000 UTC m=+20.855323174" watchObservedRunningTime="2026-04-24 21:27:23.594898954 +0000 UTC m=+20.855467783"
Apr 24 21:27:23.613764 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:23.613726 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e9999088-0209-4da1-b3c6-92c0d2e48409-original-pull-secret\") pod \"global-pull-secret-syncer-8bd94\" (UID: \"e9999088-0209-4da1-b3c6-92c0d2e48409\") " pod="kube-system/global-pull-secret-syncer-8bd94"
Apr 24 21:27:23.613914 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:23.613879 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:23.613992 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:23.613948 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9999088-0209-4da1-b3c6-92c0d2e48409-original-pull-secret podName:e9999088-0209-4da1-b3c6-92c0d2e48409 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:25.613930606 +0000 UTC m=+22.874499427 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e9999088-0209-4da1-b3c6-92c0d2e48409-original-pull-secret") pod "global-pull-secret-syncer-8bd94" (UID: "e9999088-0209-4da1-b3c6-92c0d2e48409") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:23.929643 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:23.929482 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 24 21:27:24.319837 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:24.319712 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T21:27:23.929637765Z","UUID":"e77bef18-af98-47ac-b5e7-f08c2090151d","Handler":null,"Name":"","Endpoint":""}
Apr 24 21:27:24.322118 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:24.322083 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 24 21:27:24.322259 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:24.322127 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 24 21:27:24.352127 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:24.352077 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6xchx"
Apr 24 21:27:24.352312 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:24.352223 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6xchx" podUID="1612b97a-f223-4e83-8710-5764a3765126"
Apr 24 21:27:24.501173 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:24.501137 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-h2b2b" event={"ID":"701e782c-df09-4f2a-a23e-72ec0675fcb7","Type":"ContainerStarted","Data":"bb6e6f513d26cda82df71a922e08599a1fc107146ecaaf11c187404dba7ae415"}
Apr 24 21:27:24.502878 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:24.502779 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ss89p" event={"ID":"302e0256-bc9b-454a-8fe1-15f7d4a40459","Type":"ContainerStarted","Data":"b324f4532a7d3ecf65104894a3577049eeb52329ab2d380a78b5b5dd00ff4e29"}
Apr 24 21:27:24.513527 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:24.513480 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-gtgdx" podStartSLOduration=5.028906661 podStartE2EDuration="21.513462816s" podCreationTimestamp="2026-04-24 21:27:03 +0000 UTC" firstStartedPulling="2026-04-24 21:27:05.933835976 +0000 UTC m=+3.194404791" lastFinishedPulling="2026-04-24 21:27:22.418392132 +0000 UTC m=+19.678960946" observedRunningTime="2026-04-24 21:27:23.609435737 +0000 UTC m=+20.870004567" watchObservedRunningTime="2026-04-24 21:27:24.513462816 +0000 UTC m=+21.774031648"
Apr 24 21:27:25.249170 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:25.249134 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-4smd5"
Apr 24 21:27:25.249878 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:25.249851 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-4smd5"
Apr 24 21:27:25.289350 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:25.289302 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-h2b2b" podStartSLOduration=5.807532156 podStartE2EDuration="22.28928565s" podCreationTimestamp="2026-04-24 21:27:03 +0000 UTC" firstStartedPulling="2026-04-24 21:27:05.936658553 +0000 UTC m=+3.197227367" lastFinishedPulling="2026-04-24 21:27:22.418412037 +0000 UTC m=+19.678980861" observedRunningTime="2026-04-24 21:27:24.513382147 +0000 UTC m=+21.773950992" watchObservedRunningTime="2026-04-24 21:27:25.28928565 +0000 UTC m=+22.549854533"
Apr 24 21:27:25.351834 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:25.351792 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpb2c"
Apr 24 21:27:25.351834 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:25.351825 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8bd94"
Apr 24 21:27:25.352032 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:25.351928 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpb2c" podUID="066d9094-f992-4853-86f1-b25700fe6070"
Apr 24 21:27:25.352123 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:25.352084 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8bd94" podUID="e9999088-0209-4da1-b3c6-92c0d2e48409"
Apr 24 21:27:25.507364 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:25.507328 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ss89p" event={"ID":"302e0256-bc9b-454a-8fe1-15f7d4a40459","Type":"ContainerStarted","Data":"022af0ec003bfcc307d0431f11d073ea1dc2cd0bbb4c9e53cfb3db0fdcb1fc41"}
Apr 24 21:27:25.508111 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:25.508076 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-4smd5"
Apr 24 21:27:25.508600 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:25.508566 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-4smd5"
Apr 24 21:27:25.534215 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:25.534168 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ss89p" podStartSLOduration=3.649144501 podStartE2EDuration="22.534151134s" podCreationTimestamp="2026-04-24 21:27:03 +0000 UTC" firstStartedPulling="2026-04-24 21:27:05.946835018 +0000 UTC m=+3.207403840" lastFinishedPulling="2026-04-24 21:27:24.831841664 +0000 UTC m=+22.092410473" observedRunningTime="2026-04-24 21:27:25.533867691 +0000 UTC m=+22.794436521" watchObservedRunningTime="2026-04-24 21:27:25.534151134 +0000 UTC m=+22.794719966"
Apr 24 21:27:25.631205 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:25.631169 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e9999088-0209-4da1-b3c6-92c0d2e48409-original-pull-secret\") pod \"global-pull-secret-syncer-8bd94\" (UID: \"e9999088-0209-4da1-b3c6-92c0d2e48409\") " pod="kube-system/global-pull-secret-syncer-8bd94"
Apr 24 21:27:25.631383 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:25.631300 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:25.631383 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:25.631383 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9999088-0209-4da1-b3c6-92c0d2e48409-original-pull-secret podName:e9999088-0209-4da1-b3c6-92c0d2e48409 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:29.631363759 +0000 UTC m=+26.891932570 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e9999088-0209-4da1-b3c6-92c0d2e48409-original-pull-secret") pod "global-pull-secret-syncer-8bd94" (UID: "e9999088-0209-4da1-b3c6-92c0d2e48409") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:26.351856 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:26.351827 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6xchx"
Apr 24 21:27:26.352030 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:26.351940 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6xchx" podUID="1612b97a-f223-4e83-8710-5764a3765126"
Apr 24 21:27:26.513136 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:26.513083 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" event={"ID":"c595912d-1543-4c43-8a11-3dbcf0f15050","Type":"ContainerStarted","Data":"c1c246a51cc2f03c6c0fcb7b92dbb255fb2c60c1942c8c3cb0b61c1a60b7f9b4"}
Apr 24 21:27:27.351924 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:27.351891 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8bd94"
Apr 24 21:27:27.352135 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:27.351894 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpb2c"
Apr 24 21:27:27.352135 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:27.351998 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8bd94" podUID="e9999088-0209-4da1-b3c6-92c0d2e48409"
Apr 24 21:27:27.352135 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:27.352083 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpb2c" podUID="066d9094-f992-4853-86f1-b25700fe6070"
Apr 24 21:27:27.516035 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:27.515993 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fn4d" event={"ID":"3807cdc4-74ff-4e27-bde0-2ed93b428a58","Type":"ContainerStarted","Data":"f5fcc833edec30ca0605e52130a5b8e75549ecbe3a1a00082a9733660a9ca7f3"}
Apr 24 21:27:28.352444 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.352258 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6xchx"
Apr 24 21:27:28.352624 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:28.352522 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6xchx" podUID="1612b97a-f223-4e83-8710-5764a3765126"
Apr 24 21:27:28.520705 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.520668 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" event={"ID":"c595912d-1543-4c43-8a11-3dbcf0f15050","Type":"ContainerStarted","Data":"6762e331a6f6cac0d79b8d78180b2b3cd436c30640dc1e1b54dc942dc34c6c65"}
Apr 24 21:27:28.521200 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.520987 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z"
Apr 24 21:27:28.521200 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.521023 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z"
Apr 24 21:27:28.521200 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.521037 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z"
Apr 24 21:27:28.522337 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.522309 2573 generic.go:358] "Generic (PLEG): container finished" podID="3807cdc4-74ff-4e27-bde0-2ed93b428a58" containerID="f5fcc833edec30ca0605e52130a5b8e75549ecbe3a1a00082a9733660a9ca7f3" exitCode=0
Apr 24 21:27:28.522430 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.522362 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fn4d" event={"ID":"3807cdc4-74ff-4e27-bde0-2ed93b428a58","Type":"ContainerDied","Data":"f5fcc833edec30ca0605e52130a5b8e75549ecbe3a1a00082a9733660a9ca7f3"}
Apr 24 21:27:28.536354 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.536327 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z"
Apr 24 21:27:28.536480 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.536397 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z"
Apr 24 21:27:28.551044 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:28.550999 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" podStartSLOduration=8.788710788 podStartE2EDuration="25.550986395s" podCreationTimestamp="2026-04-24 21:27:03 +0000 UTC" firstStartedPulling="2026-04-24 21:27:05.945320997 +0000 UTC m=+3.205889816" lastFinishedPulling="2026-04-24 21:27:22.70759661 +0000 UTC m=+19.968165423" observedRunningTime="2026-04-24 21:27:28.549834504 +0000 UTC m=+25.810403334" watchObservedRunningTime="2026-04-24 21:27:28.550986395 +0000 UTC m=+25.811555225"
Apr 24 21:27:29.352175 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:29.352132 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpb2c"
Apr 24 21:27:29.352328 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:29.352140 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8bd94"
Apr 24 21:27:29.352328 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:29.352240 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpb2c" podUID="066d9094-f992-4853-86f1-b25700fe6070"
Apr 24 21:27:29.352434 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:29.352336 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8bd94" podUID="e9999088-0209-4da1-b3c6-92c0d2e48409"
Apr 24 21:27:29.526350 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:29.526265 2573 generic.go:358] "Generic (PLEG): container finished" podID="3807cdc4-74ff-4e27-bde0-2ed93b428a58" containerID="5aefc3b7f7a3ea8d866390050d3504fc51fb3e6149c77697cca6ee2dc7b1f28a" exitCode=0
Apr 24 21:27:29.526741 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:29.526362 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fn4d" event={"ID":"3807cdc4-74ff-4e27-bde0-2ed93b428a58","Type":"ContainerDied","Data":"5aefc3b7f7a3ea8d866390050d3504fc51fb3e6149c77697cca6ee2dc7b1f28a"}
Apr 24 21:27:29.583280 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:29.583251 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8bd94"]
Apr 24 21:27:29.583421 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:29.583339 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8bd94"
Apr 24 21:27:29.583456 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:29.583415 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8bd94" podUID="e9999088-0209-4da1-b3c6-92c0d2e48409"
Apr 24 21:27:29.588999 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:29.588963 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kpb2c"]
Apr 24 21:27:29.589151 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:29.589132 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpb2c"
Apr 24 21:27:29.589868 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:29.589291 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpb2c" podUID="066d9094-f992-4853-86f1-b25700fe6070"
Apr 24 21:27:29.599357 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:29.599312 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-6xchx"]
Apr 24 21:27:29.599488 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:29.599424 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6xchx"
Apr 24 21:27:29.599550 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:29.599510 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6xchx" podUID="1612b97a-f223-4e83-8710-5764a3765126"
Apr 24 21:27:29.665196 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:29.665163 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e9999088-0209-4da1-b3c6-92c0d2e48409-original-pull-secret\") pod \"global-pull-secret-syncer-8bd94\" (UID: \"e9999088-0209-4da1-b3c6-92c0d2e48409\") " pod="kube-system/global-pull-secret-syncer-8bd94"
Apr 24 21:27:29.665616 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:29.665589 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:29.665725 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:29.665668 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9999088-0209-4da1-b3c6-92c0d2e48409-original-pull-secret podName:e9999088-0209-4da1-b3c6-92c0d2e48409 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:37.665647595 +0000 UTC m=+34.926216419 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e9999088-0209-4da1-b3c6-92c0d2e48409-original-pull-secret") pod "global-pull-secret-syncer-8bd94" (UID: "e9999088-0209-4da1-b3c6-92c0d2e48409") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:30.529984 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:30.529891 2573 generic.go:358] "Generic (PLEG): container finished" podID="3807cdc4-74ff-4e27-bde0-2ed93b428a58" containerID="b62c034051c6c4d4ac15ae66b76991e23b3ea3db3cf9da9db5e8b01ba4f916ba" exitCode=0
Apr 24 21:27:30.530439 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:30.529978 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fn4d" event={"ID":"3807cdc4-74ff-4e27-bde0-2ed93b428a58","Type":"ContainerDied","Data":"b62c034051c6c4d4ac15ae66b76991e23b3ea3db3cf9da9db5e8b01ba4f916ba"}
Apr 24 21:27:31.352843 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:31.352804 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6xchx"
Apr 24 21:27:31.353061 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:31.352929 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpb2c"
Apr 24 21:27:31.353061 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:31.352936 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6xchx" podUID="1612b97a-f223-4e83-8710-5764a3765126"
Apr 24 21:27:31.353061 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:31.353030 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpb2c" podUID="066d9094-f992-4853-86f1-b25700fe6070"
Apr 24 21:27:31.353214 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:31.353076 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8bd94"
Apr 24 21:27:31.353214 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:31.353166 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8bd94" podUID="e9999088-0209-4da1-b3c6-92c0d2e48409"
Apr 24 21:27:33.352881 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:33.352678 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6xchx"
Apr 24 21:27:33.353493 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:33.352767 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpb2c"
Apr 24 21:27:33.353493 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:33.352976 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6xchx" podUID="1612b97a-f223-4e83-8710-5764a3765126"
Apr 24 21:27:33.353493 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:33.352798 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8bd94"
Apr 24 21:27:33.353493 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:33.353075 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpb2c" podUID="066d9094-f992-4853-86f1-b25700fe6070"
Apr 24 21:27:33.353493 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:33.353163 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8bd94" podUID="e9999088-0209-4da1-b3c6-92c0d2e48409"
Apr 24 21:27:35.352277 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:35.352243 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8bd94"
Apr 24 21:27:35.352940 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:35.352244 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpb2c"
Apr 24 21:27:35.352940 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:35.352369 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8bd94" podUID="e9999088-0209-4da1-b3c6-92c0d2e48409"
Apr 24 21:27:35.352940 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:35.352244 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6xchx"
Apr 24 21:27:35.352940 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:35.352459 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpb2c" podUID="066d9094-f992-4853-86f1-b25700fe6070"
Apr 24 21:27:35.352940 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:35.352509 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6xchx" podUID="1612b97a-f223-4e83-8710-5764a3765126" Apr 24 21:27:36.543272 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.543232 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fn4d" event={"ID":"3807cdc4-74ff-4e27-bde0-2ed93b428a58","Type":"ContainerStarted","Data":"6ce996e9cbf9549483a5cc8dd32dd26f433c0a38ca00a93920c9324378ff0676"} Apr 24 21:27:36.545310 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.545287 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-201.ec2.internal" event="NodeReady" Apr 24 21:27:36.548526 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.546126 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 21:27:36.587871 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.587835 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7qths"] Apr 24 21:27:36.602919 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.602892 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-4nztl"] Apr 24 21:27:36.603066 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.603042 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7qths" Apr 24 21:27:36.605481 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.605460 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:27:36.605639 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.605623 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 24 21:27:36.611145 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.611123 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-l7c9n\"" Apr 24 21:27:36.615657 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.615633 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5cd4fd4699-r7szn"] Apr 24 21:27:36.615832 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.615810 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-4nztl" Apr 24 21:27:36.627954 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.627924 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-44vlp"] Apr 24 21:27:36.628173 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.628152 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" Apr 24 21:27:36.631785 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.631762 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 24 21:27:36.632354 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.632336 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-hc897\"" Apr 24 21:27:36.633984 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.633549 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 24 21:27:36.636050 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.635792 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:27:36.640361 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.640209 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 24 21:27:36.640361 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.640236 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-bckfb\"" Apr 24 21:27:36.647724 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.642195 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 21:27:36.647724 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.642404 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 21:27:36.647724 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.646297 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsh7k"] Apr 24 21:27:36.647724 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.647192 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 21:27:36.647724 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.647495 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 21:27:36.648230 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.648019 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-44vlp" Apr 24 21:27:36.651125 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.651055 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 24 21:27:36.654246 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.653181 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 21:27:36.654246 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.653349 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-kxrnt\"" Apr 24 21:27:36.654246 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.653597 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 21:27:36.654246 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.653781 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 24 21:27:36.656850 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.656551 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 24 21:27:36.661675 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.661597 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 24 21:27:36.676229 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.676204 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-58974b8966-z2xx8"] Apr 24 21:27:36.676370 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.676352 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsh7k" Apr 24 21:27:36.678638 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.678617 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 24 21:27:36.679275 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.679257 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-795kl\"" Apr 24 21:27:36.679380 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.679309 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 24 21:27:36.679729 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.679714 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:27:36.691551 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.691530 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7qths"] Apr 24 21:27:36.691551 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.691557 2573 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-4nztl"] Apr 24 21:27:36.691718 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.691568 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-59fzh"] Apr 24 21:27:36.691718 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.691703 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-58974b8966-z2xx8" Apr 24 21:27:36.694025 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.694008 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 24 21:27:36.694120 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.694053 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 24 21:27:36.694557 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.694536 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 24 21:27:36.694599 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.694583 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 24 21:27:36.695081 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.695066 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 24 21:27:36.695656 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.695634 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 24 21:27:36.696017 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.696002 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-99vck\"" Apr 24 21:27:36.716667 
ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.716641 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-registry-certificates\") pod \"image-registry-5cd4fd4699-r7szn\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" Apr 24 21:27:36.716776 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.716695 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3e83451-d37f-4c5b-837f-6440bed57b91-trusted-ca\") pod \"console-operator-9d4b6777b-4nztl\" (UID: \"b3e83451-d37f-4c5b-837f-6440bed57b91\") " pod="openshift-console-operator/console-operator-9d4b6777b-4nztl" Apr 24 21:27:36.716776 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.716753 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c4788265-16ce-4770-b838-025f0f7d06aa-tmp\") pod \"insights-operator-585dfdc468-44vlp\" (UID: \"c4788265-16ce-4770-b838-025f0f7d06aa\") " pod="openshift-insights/insights-operator-585dfdc468-44vlp" Apr 24 21:27:36.716887 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.716789 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4788265-16ce-4770-b838-025f0f7d06aa-serving-cert\") pod \"insights-operator-585dfdc468-44vlp\" (UID: \"c4788265-16ce-4770-b838-025f0f7d06aa\") " pod="openshift-insights/insights-operator-585dfdc468-44vlp" Apr 24 21:27:36.716887 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.716839 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b3e83451-d37f-4c5b-837f-6440bed57b91-config\") pod \"console-operator-9d4b6777b-4nztl\" (UID: \"b3e83451-d37f-4c5b-837f-6440bed57b91\") " pod="openshift-console-operator/console-operator-9d4b6777b-4nztl" Apr 24 21:27:36.716887 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.716860 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3e83451-d37f-4c5b-837f-6440bed57b91-serving-cert\") pod \"console-operator-9d4b6777b-4nztl\" (UID: \"b3e83451-d37f-4c5b-837f-6440bed57b91\") " pod="openshift-console-operator/console-operator-9d4b6777b-4nztl" Apr 24 21:27:36.716887 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.716877 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f7md\" (UniqueName: \"kubernetes.io/projected/b3e83451-d37f-4c5b-837f-6440bed57b91-kube-api-access-7f7md\") pod \"console-operator-9d4b6777b-4nztl\" (UID: \"b3e83451-d37f-4c5b-837f-6440bed57b91\") " pod="openshift-console-operator/console-operator-9d4b6777b-4nztl" Apr 24 21:27:36.717078 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.716901 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4788265-16ce-4770-b838-025f0f7d06aa-service-ca-bundle\") pod \"insights-operator-585dfdc468-44vlp\" (UID: \"c4788265-16ce-4770-b838-025f0f7d06aa\") " pod="openshift-insights/insights-operator-585dfdc468-44vlp" Apr 24 21:27:36.717078 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.716937 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-image-registry-private-configuration\") pod \"image-registry-5cd4fd4699-r7szn\" (UID: 
\"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" Apr 24 21:27:36.717078 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.716987 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c4788265-16ce-4770-b838-025f0f7d06aa-snapshots\") pod \"insights-operator-585dfdc468-44vlp\" (UID: \"c4788265-16ce-4770-b838-025f0f7d06aa\") " pod="openshift-insights/insights-operator-585dfdc468-44vlp" Apr 24 21:27:36.717078 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.717011 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-registry-tls\") pod \"image-registry-5cd4fd4699-r7szn\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" Apr 24 21:27:36.717078 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.717027 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-ca-trust-extracted\") pod \"image-registry-5cd4fd4699-r7szn\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" Apr 24 21:27:36.717078 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.717042 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-trusted-ca\") pod \"image-registry-5cd4fd4699-r7szn\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" Apr 24 21:27:36.717078 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.717062 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p96wc\" (UniqueName: \"kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-kube-api-access-p96wc\") pod \"image-registry-5cd4fd4699-r7szn\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" Apr 24 21:27:36.717078 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.717079 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrwkv\" (UniqueName: \"kubernetes.io/projected/c4788265-16ce-4770-b838-025f0f7d06aa-kube-api-access-vrwkv\") pod \"insights-operator-585dfdc468-44vlp\" (UID: \"c4788265-16ce-4770-b838-025f0f7d06aa\") " pod="openshift-insights/insights-operator-585dfdc468-44vlp" Apr 24 21:27:36.717496 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.717117 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-installation-pull-secrets\") pod \"image-registry-5cd4fd4699-r7szn\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" Apr 24 21:27:36.717496 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.717156 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-bound-sa-token\") pod \"image-registry-5cd4fd4699-r7szn\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" Apr 24 21:27:36.717496 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.717183 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c4788265-16ce-4770-b838-025f0f7d06aa-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-44vlp\" (UID: \"c4788265-16ce-4770-b838-025f0f7d06aa\") " pod="openshift-insights/insights-operator-585dfdc468-44vlp" Apr 24 21:27:36.717496 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.717211 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxxl8\" (UniqueName: \"kubernetes.io/projected/f2cf6003-5a3e-497b-82b1-afb7021314fc-kube-api-access-qxxl8\") pod \"volume-data-source-validator-7c6cbb6c87-7qths\" (UID: \"f2cf6003-5a3e-497b-82b1-afb7021314fc\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7qths" Apr 24 21:27:36.727906 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.727884 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v4rfn"] Apr 24 21:27:36.728070 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.728034 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59fzh" Apr 24 21:27:36.730846 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.730825 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 24 21:27:36.730846 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.730840 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-4c5kw\"" Apr 24 21:27:36.731113 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.730848 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 21:27:36.731174 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.731150 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 21:27:36.731255 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.731240 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 24 21:27:36.743569 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.743548 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-bxds4"] Apr 24 21:27:36.744011 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.743986 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v4rfn" Apr 24 21:27:36.746259 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.746239 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 24 21:27:36.746361 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.746321 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 24 21:27:36.746411 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.746373 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:27:36.746411 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.746391 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 24 21:27:36.746550 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.746535 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-fjw92\"" Apr 24 21:27:36.755392 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.755375 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsh7k"] Apr 24 21:27:36.755392 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.755395 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-wh284"] Apr 24 21:27:36.755524 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.755511 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-bxds4" Apr 24 21:27:36.758494 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.758475 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 21:27:36.759162 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.759112 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 21:27:36.759522 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.759510 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wtdzs\"" Apr 24 21:27:36.767611 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.767594 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-746f4dc565-zwj7p"] Apr 24 21:27:36.767726 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.767712 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wh284" Apr 24 21:27:36.770150 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.770135 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-vsgnj\"" Apr 24 21:27:36.770657 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.770642 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 24 21:27:36.770902 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.770890 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 24 21:27:36.779827 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.779810 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8rjd5"] Apr 24 21:27:36.779950 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.779936 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-746f4dc565-zwj7p" Apr 24 21:27:36.782257 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.782237 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 24 21:27:36.782339 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.782285 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 24 21:27:36.782527 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.782511 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 24 21:27:36.782808 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.782792 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 24 21:27:36.782853 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.782839 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-5qblf\"" Apr 24 21:27:36.791735 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.791713 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-vlq8x"] Apr 24 21:27:36.791869 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.791854 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8rjd5" Apr 24 21:27:36.794302 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.794285 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 21:27:36.794387 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.794303 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 21:27:36.794387 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.794292 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 21:27:36.794660 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.794648 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gftjb\"" Apr 24 21:27:36.806275 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.806260 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9"] Apr 24 21:27:36.806492 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.806470 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vlq8x" Apr 24 21:27:36.808910 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.808890 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-sk9w9\"" Apr 24 21:27:36.809001 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.808929 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 21:27:36.809001 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.808971 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 21:27:36.817649 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.817633 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3e83451-d37f-4c5b-837f-6440bed57b91-serving-cert\") pod \"console-operator-9d4b6777b-4nztl\" (UID: \"b3e83451-d37f-4c5b-837f-6440bed57b91\") " pod="openshift-console-operator/console-operator-9d4b6777b-4nztl" Apr 24 21:27:36.817741 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.817659 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7f7md\" (UniqueName: \"kubernetes.io/projected/b3e83451-d37f-4c5b-837f-6440bed57b91-kube-api-access-7f7md\") pod \"console-operator-9d4b6777b-4nztl\" (UID: \"b3e83451-d37f-4c5b-837f-6440bed57b91\") " pod="openshift-console-operator/console-operator-9d4b6777b-4nztl" Apr 24 21:27:36.817741 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.817679 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-image-registry-private-configuration\") pod 
\"image-registry-5cd4fd4699-r7szn\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" Apr 24 21:27:36.817741 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.817697 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-registry-tls\") pod \"image-registry-5cd4fd4699-r7szn\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" Apr 24 21:27:36.817741 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.817715 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c4788265-16ce-4770-b838-025f0f7d06aa-snapshots\") pod \"insights-operator-585dfdc468-44vlp\" (UID: \"c4788265-16ce-4770-b838-025f0f7d06aa\") " pod="openshift-insights/insights-operator-585dfdc468-44vlp" Apr 24 21:27:36.817937 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.817744 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/494a20d0-ace3-481d-ae39-780b958f7150-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-59fzh\" (UID: \"494a20d0-ace3-481d-ae39-780b958f7150\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59fzh" Apr 24 21:27:36.817937 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:36.817796 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:27:36.817937 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:36.817809 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cd4fd4699-r7szn: secret "image-registry-tls" not found Apr 24 21:27:36.817937 ip-10-0-136-201 kubenswrapper[2573]: 
I0424 21:27:36.817838 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-ca-trust-extracted\") pod \"image-registry-5cd4fd4699-r7szn\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" Apr 24 21:27:36.817937 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:36.817858 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-registry-tls podName:4c9bfa95-cfa8-4d7f-9dda-3d100a02d251 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:37.317842602 +0000 UTC m=+34.578411427 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-registry-tls") pod "image-registry-5cd4fd4699-r7szn" (UID: "4c9bfa95-cfa8-4d7f-9dda-3d100a02d251") : secret "image-registry-tls" not found Apr 24 21:27:36.817937 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.817877 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-trusted-ca\") pod \"image-registry-5cd4fd4699-r7szn\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" Apr 24 21:27:36.817937 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.817897 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp4vq\" (UniqueName: \"kubernetes.io/projected/1768c1b9-8390-4833-9613-4efec510f36b-kube-api-access-cp4vq\") pod \"router-default-58974b8966-z2xx8\" (UID: \"1768c1b9-8390-4833-9613-4efec510f36b\") " pod="openshift-ingress/router-default-58974b8966-z2xx8" Apr 24 21:27:36.817937 ip-10-0-136-201 kubenswrapper[2573]: I0424 
21:27:36.817929 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vrwkv\" (UniqueName: \"kubernetes.io/projected/c4788265-16ce-4770-b838-025f0f7d06aa-kube-api-access-vrwkv\") pod \"insights-operator-585dfdc468-44vlp\" (UID: \"c4788265-16ce-4770-b838-025f0f7d06aa\") " pod="openshift-insights/insights-operator-585dfdc468-44vlp" Apr 24 21:27:36.819377 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.817954 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1768c1b9-8390-4833-9613-4efec510f36b-metrics-certs\") pod \"router-default-58974b8966-z2xx8\" (UID: \"1768c1b9-8390-4833-9613-4efec510f36b\") " pod="openshift-ingress/router-default-58974b8966-z2xx8" Apr 24 21:27:36.819377 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.817983 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4788265-16ce-4770-b838-025f0f7d06aa-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-44vlp\" (UID: \"c4788265-16ce-4770-b838-025f0f7d06aa\") " pod="openshift-insights/insights-operator-585dfdc468-44vlp" Apr 24 21:27:36.819377 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.818012 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x7cp\" (UniqueName: \"kubernetes.io/projected/d8768ef0-e03a-46c3-97b6-ca6035eec03f-kube-api-access-4x7cp\") pod \"cluster-samples-operator-6dc5bdb6b4-fsh7k\" (UID: \"d8768ef0-e03a-46c3-97b6-ca6035eec03f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsh7k" Apr 24 21:27:36.819377 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.818043 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/1768c1b9-8390-4833-9613-4efec510f36b-default-certificate\") pod \"router-default-58974b8966-z2xx8\" (UID: \"1768c1b9-8390-4833-9613-4efec510f36b\") " pod="openshift-ingress/router-default-58974b8966-z2xx8" Apr 24 21:27:36.819377 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.818067 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k229z\" (UniqueName: \"kubernetes.io/projected/bd2084eb-fcc8-42e3-b526-171c67ac7a71-kube-api-access-k229z\") pod \"kube-storage-version-migrator-operator-6769c5d45-v4rfn\" (UID: \"bd2084eb-fcc8-42e3-b526-171c67ac7a71\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v4rfn" Apr 24 21:27:36.819377 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.818115 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-registry-certificates\") pod \"image-registry-5cd4fd4699-r7szn\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" Apr 24 21:27:36.819377 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.818151 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3e83451-d37f-4c5b-837f-6440bed57b91-trusted-ca\") pod \"console-operator-9d4b6777b-4nztl\" (UID: \"b3e83451-d37f-4c5b-837f-6440bed57b91\") " pod="openshift-console-operator/console-operator-9d4b6777b-4nztl" Apr 24 21:27:36.819377 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.818236 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-ca-trust-extracted\") pod \"image-registry-5cd4fd4699-r7szn\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " 
pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" Apr 24 21:27:36.819377 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.818293 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3e83451-d37f-4c5b-837f-6440bed57b91-config\") pod \"console-operator-9d4b6777b-4nztl\" (UID: \"b3e83451-d37f-4c5b-837f-6440bed57b91\") " pod="openshift-console-operator/console-operator-9d4b6777b-4nztl" Apr 24 21:27:36.819377 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.818324 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4788265-16ce-4770-b838-025f0f7d06aa-service-ca-bundle\") pod \"insights-operator-585dfdc468-44vlp\" (UID: \"c4788265-16ce-4770-b838-025f0f7d06aa\") " pod="openshift-insights/insights-operator-585dfdc468-44vlp" Apr 24 21:27:36.819377 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.818338 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c4788265-16ce-4770-b838-025f0f7d06aa-snapshots\") pod \"insights-operator-585dfdc468-44vlp\" (UID: \"c4788265-16ce-4770-b838-025f0f7d06aa\") " pod="openshift-insights/insights-operator-585dfdc468-44vlp" Apr 24 21:27:36.819377 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.818406 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrzp8\" (UniqueName: \"kubernetes.io/projected/494a20d0-ace3-481d-ae39-780b958f7150-kube-api-access-wrzp8\") pod \"cluster-monitoring-operator-75587bd455-59fzh\" (UID: \"494a20d0-ace3-481d-ae39-780b958f7150\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59fzh" Apr 24 21:27:36.819377 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.818460 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1768c1b9-8390-4833-9613-4efec510f36b-service-ca-bundle\") pod \"router-default-58974b8966-z2xx8\" (UID: \"1768c1b9-8390-4833-9613-4efec510f36b\") " pod="openshift-ingress/router-default-58974b8966-z2xx8" Apr 24 21:27:36.819377 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.818747 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-installation-pull-secrets\") pod \"image-registry-5cd4fd4699-r7szn\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" Apr 24 21:27:36.819377 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.818898 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-trusted-ca\") pod \"image-registry-5cd4fd4699-r7szn\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" Apr 24 21:27:36.820116 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.818930 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3e83451-d37f-4c5b-837f-6440bed57b91-trusted-ca\") pod \"console-operator-9d4b6777b-4nztl\" (UID: \"b3e83451-d37f-4c5b-837f-6440bed57b91\") " pod="openshift-console-operator/console-operator-9d4b6777b-4nztl" Apr 24 21:27:36.820116 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.818933 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4788265-16ce-4770-b838-025f0f7d06aa-service-ca-bundle\") pod \"insights-operator-585dfdc468-44vlp\" (UID: \"c4788265-16ce-4770-b838-025f0f7d06aa\") " pod="openshift-insights/insights-operator-585dfdc468-44vlp" Apr 24 21:27:36.820116 
ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.818985 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4788265-16ce-4770-b838-025f0f7d06aa-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-44vlp\" (UID: \"c4788265-16ce-4770-b838-025f0f7d06aa\") " pod="openshift-insights/insights-operator-585dfdc468-44vlp" Apr 24 21:27:36.820116 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.819026 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p96wc\" (UniqueName: \"kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-kube-api-access-p96wc\") pod \"image-registry-5cd4fd4699-r7szn\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" Apr 24 21:27:36.820116 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.819113 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd2084eb-fcc8-42e3-b526-171c67ac7a71-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-v4rfn\" (UID: \"bd2084eb-fcc8-42e3-b526-171c67ac7a71\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v4rfn" Apr 24 21:27:36.820116 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.819137 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1768c1b9-8390-4833-9613-4efec510f36b-stats-auth\") pod \"router-default-58974b8966-z2xx8\" (UID: \"1768c1b9-8390-4833-9613-4efec510f36b\") " pod="openshift-ingress/router-default-58974b8966-z2xx8" Apr 24 21:27:36.820116 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.819163 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/494a20d0-ace3-481d-ae39-780b958f7150-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-59fzh\" (UID: \"494a20d0-ace3-481d-ae39-780b958f7150\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59fzh" Apr 24 21:27:36.820116 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.819189 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd2084eb-fcc8-42e3-b526-171c67ac7a71-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-v4rfn\" (UID: \"bd2084eb-fcc8-42e3-b526-171c67ac7a71\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v4rfn" Apr 24 21:27:36.820116 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.819226 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-bound-sa-token\") pod \"image-registry-5cd4fd4699-r7szn\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" Apr 24 21:27:36.820116 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.819261 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qxxl8\" (UniqueName: \"kubernetes.io/projected/f2cf6003-5a3e-497b-82b1-afb7021314fc-kube-api-access-qxxl8\") pod \"volume-data-source-validator-7c6cbb6c87-7qths\" (UID: \"f2cf6003-5a3e-497b-82b1-afb7021314fc\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7qths" Apr 24 21:27:36.820116 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.819295 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/d8768ef0-e03a-46c3-97b6-ca6035eec03f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fsh7k\" (UID: \"d8768ef0-e03a-46c3-97b6-ca6035eec03f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsh7k" Apr 24 21:27:36.820116 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.819316 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3e83451-d37f-4c5b-837f-6440bed57b91-config\") pod \"console-operator-9d4b6777b-4nztl\" (UID: \"b3e83451-d37f-4c5b-837f-6440bed57b91\") " pod="openshift-console-operator/console-operator-9d4b6777b-4nztl" Apr 24 21:27:36.820116 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.819340 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c4788265-16ce-4770-b838-025f0f7d06aa-tmp\") pod \"insights-operator-585dfdc468-44vlp\" (UID: \"c4788265-16ce-4770-b838-025f0f7d06aa\") " pod="openshift-insights/insights-operator-585dfdc468-44vlp" Apr 24 21:27:36.820116 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.819369 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4788265-16ce-4770-b838-025f0f7d06aa-serving-cert\") pod \"insights-operator-585dfdc468-44vlp\" (UID: \"c4788265-16ce-4770-b838-025f0f7d06aa\") " pod="openshift-insights/insights-operator-585dfdc468-44vlp" Apr 24 21:27:36.820116 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.819560 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c4788265-16ce-4770-b838-025f0f7d06aa-tmp\") pod \"insights-operator-585dfdc468-44vlp\" (UID: \"c4788265-16ce-4770-b838-025f0f7d06aa\") " pod="openshift-insights/insights-operator-585dfdc468-44vlp" Apr 24 21:27:36.822297 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.822278 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3e83451-d37f-4c5b-837f-6440bed57b91-serving-cert\") pod \"console-operator-9d4b6777b-4nztl\" (UID: \"b3e83451-d37f-4c5b-837f-6440bed57b91\") " pod="openshift-console-operator/console-operator-9d4b6777b-4nztl" Apr 24 21:27:36.822395 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.822345 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-installation-pull-secrets\") pod \"image-registry-5cd4fd4699-r7szn\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" Apr 24 21:27:36.822395 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.822369 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4788265-16ce-4770-b838-025f0f7d06aa-serving-cert\") pod \"insights-operator-585dfdc468-44vlp\" (UID: \"c4788265-16ce-4770-b838-025f0f7d06aa\") " pod="openshift-insights/insights-operator-585dfdc468-44vlp" Apr 24 21:27:36.822475 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.822436 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-image-registry-private-configuration\") pod \"image-registry-5cd4fd4699-r7szn\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" Apr 24 21:27:36.823664 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.823644 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cbb4986b-6xzmw"] Apr 24 21:27:36.823802 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.823786 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9" Apr 24 21:27:36.825768 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.825744 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-registry-certificates\") pod \"image-registry-5cd4fd4699-r7szn\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" Apr 24 21:27:36.826586 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.826547 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 24 21:27:36.828914 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.827283 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 24 21:27:36.828914 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.827525 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrwkv\" (UniqueName: \"kubernetes.io/projected/c4788265-16ce-4770-b838-025f0f7d06aa-kube-api-access-vrwkv\") pod \"insights-operator-585dfdc468-44vlp\" (UID: \"c4788265-16ce-4770-b838-025f0f7d06aa\") " pod="openshift-insights/insights-operator-585dfdc468-44vlp" Apr 24 21:27:36.828914 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.827551 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f7md\" (UniqueName: \"kubernetes.io/projected/b3e83451-d37f-4c5b-837f-6440bed57b91-kube-api-access-7f7md\") pod \"console-operator-9d4b6777b-4nztl\" (UID: \"b3e83451-d37f-4c5b-837f-6440bed57b91\") " pod="openshift-console-operator/console-operator-9d4b6777b-4nztl" Apr 24 21:27:36.828914 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.827956 
2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 24 21:27:36.828914 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.827971 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 24 21:27:36.830865 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.830341 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxxl8\" (UniqueName: \"kubernetes.io/projected/f2cf6003-5a3e-497b-82b1-afb7021314fc-kube-api-access-qxxl8\") pod \"volume-data-source-validator-7c6cbb6c87-7qths\" (UID: \"f2cf6003-5a3e-497b-82b1-afb7021314fc\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7qths" Apr 24 21:27:36.831773 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.831494 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-bound-sa-token\") pod \"image-registry-5cd4fd4699-r7szn\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" Apr 24 21:27:36.832840 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.832819 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p96wc\" (UniqueName: \"kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-kube-api-access-p96wc\") pod \"image-registry-5cd4fd4699-r7szn\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" Apr 24 21:27:36.844822 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.844800 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-44vlp"] Apr 24 21:27:36.844822 ip-10-0-136-201 kubenswrapper[2573]: I0424 
21:27:36.844824 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jnvn"] Apr 24 21:27:36.844968 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.844952 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cbb4986b-6xzmw" Apr 24 21:27:36.847524 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.847506 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 24 21:27:36.853375 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.853322 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-58974b8966-z2xx8"] Apr 24 21:27:36.853375 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.853340 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5cd4fd4699-r7szn"] Apr 24 21:27:36.853375 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.853348 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v4rfn"] Apr 24 21:27:36.853375 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.853357 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-59fzh"] Apr 24 21:27:36.853375 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.853366 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bxds4"] Apr 24 21:27:36.853375 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.853374 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-wh284"] Apr 24 21:27:36.853375 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.853381 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-canary/ingress-canary-8rjd5"] Apr 24 21:27:36.853608 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.853389 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-746f4dc565-zwj7p"] Apr 24 21:27:36.853608 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.853399 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cbb4986b-6xzmw"] Apr 24 21:27:36.853608 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.853406 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jnvn"] Apr 24 21:27:36.853608 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.853413 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-vlq8x"] Apr 24 21:27:36.853608 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.853421 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9"] Apr 24 21:27:36.853608 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.853455 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jnvn" Apr 24 21:27:36.855571 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.855557 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:27:36.856145 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.856132 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 24 21:27:36.856183 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.856157 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 24 21:27:36.856218 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.856175 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 24 21:27:36.856361 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.856349 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-dk8m6\"" Apr 24 21:27:36.913472 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.913446 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7qths" Apr 24 21:27:36.920377 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.920351 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4x7cp\" (UniqueName: \"kubernetes.io/projected/d8768ef0-e03a-46c3-97b6-ca6035eec03f-kube-api-access-4x7cp\") pod \"cluster-samples-operator-6dc5bdb6b4-fsh7k\" (UID: \"d8768ef0-e03a-46c3-97b6-ca6035eec03f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsh7k" Apr 24 21:27:36.920501 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.920389 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/c78eaad7-e680-4b37-9286-453234917ab8-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-84459467dd-t4mr9\" (UID: \"c78eaad7-e680-4b37-9286-453234917ab8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9" Apr 24 21:27:36.920501 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.920419 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1768c1b9-8390-4833-9613-4efec510f36b-default-certificate\") pod \"router-default-58974b8966-z2xx8\" (UID: \"1768c1b9-8390-4833-9613-4efec510f36b\") " pod="openshift-ingress/router-default-58974b8966-z2xx8" Apr 24 21:27:36.920501 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.920445 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k229z\" (UniqueName: \"kubernetes.io/projected/bd2084eb-fcc8-42e3-b526-171c67ac7a71-kube-api-access-k229z\") pod \"kube-storage-version-migrator-operator-6769c5d45-v4rfn\" (UID: \"bd2084eb-fcc8-42e3-b526-171c67ac7a71\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v4rfn" Apr 24 21:27:36.920501 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.920472 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/c78eaad7-e680-4b37-9286-453234917ab8-ca\") pod \"cluster-proxy-proxy-agent-84459467dd-t4mr9\" (UID: \"c78eaad7-e680-4b37-9286-453234917ab8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9" Apr 24 21:27:36.920501 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.920496 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/c78eaad7-e680-4b37-9286-453234917ab8-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-84459467dd-t4mr9\" (UID: \"c78eaad7-e680-4b37-9286-453234917ab8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9" Apr 24 21:27:36.920761 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.920532 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wrzp8\" (UniqueName: \"kubernetes.io/projected/494a20d0-ace3-481d-ae39-780b958f7150-kube-api-access-wrzp8\") pod \"cluster-monitoring-operator-75587bd455-59fzh\" (UID: \"494a20d0-ace3-481d-ae39-780b958f7150\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59fzh" Apr 24 21:27:36.920935 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.920747 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wh284\" (UID: \"5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4\") " 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-wh284" Apr 24 21:27:36.920935 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.920793 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7xpj\" (UniqueName: \"kubernetes.io/projected/2cabf83e-0da3-4fe7-ae79-15b6bc9e5179-kube-api-access-h7xpj\") pod \"managed-serviceaccount-addon-agent-746f4dc565-zwj7p\" (UID: \"2cabf83e-0da3-4fe7-ae79-15b6bc9e5179\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-746f4dc565-zwj7p" Apr 24 21:27:36.920935 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.920814 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/69c189c7-8879-4a31-8147-83424613b103-tmp\") pod \"klusterlet-addon-workmgr-7cbb4986b-6xzmw\" (UID: \"69c189c7-8879-4a31-8147-83424613b103\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cbb4986b-6xzmw" Apr 24 21:27:36.920935 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.920841 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1768c1b9-8390-4833-9613-4efec510f36b-service-ca-bundle\") pod \"router-default-58974b8966-z2xx8\" (UID: \"1768c1b9-8390-4833-9613-4efec510f36b\") " pod="openshift-ingress/router-default-58974b8966-z2xx8" Apr 24 21:27:36.920935 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.920873 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c78eaad7-e680-4b37-9286-453234917ab8-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-84459467dd-t4mr9\" (UID: \"c78eaad7-e680-4b37-9286-453234917ab8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9" Apr 24 21:27:36.920935 ip-10-0-136-201 
kubenswrapper[2573]: I0424 21:27:36.920928 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/c78eaad7-e680-4b37-9286-453234917ab8-hub\") pod \"cluster-proxy-proxy-agent-84459467dd-t4mr9\" (UID: \"c78eaad7-e680-4b37-9286-453234917ab8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9" Apr 24 21:27:36.921180 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:36.920971 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1768c1b9-8390-4833-9613-4efec510f36b-service-ca-bundle podName:1768c1b9-8390-4833-9613-4efec510f36b nodeName:}" failed. No retries permitted until 2026-04-24 21:27:37.420948137 +0000 UTC m=+34.681516947 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1768c1b9-8390-4833-9613-4efec510f36b-service-ca-bundle") pod "router-default-58974b8966-z2xx8" (UID: "1768c1b9-8390-4833-9613-4efec510f36b") : configmap references non-existent config key: service-ca.crt Apr 24 21:27:36.921180 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.920997 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlddc\" (UniqueName: \"kubernetes.io/projected/9b1819a7-8889-48bc-9e09-7f5fef3c6672-kube-api-access-zlddc\") pod \"network-check-source-8894fc9bd-vlq8x\" (UID: \"9b1819a7-8889-48bc-9e09-7f5fef3c6672\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vlq8x" Apr 24 21:27:36.921180 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.921024 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd2084eb-fcc8-42e3-b526-171c67ac7a71-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-v4rfn\" (UID: \"bd2084eb-fcc8-42e3-b526-171c67ac7a71\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v4rfn" Apr 24 21:27:36.921180 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.921058 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1768c1b9-8390-4833-9613-4efec510f36b-stats-auth\") pod \"router-default-58974b8966-z2xx8\" (UID: \"1768c1b9-8390-4833-9613-4efec510f36b\") " pod="openshift-ingress/router-default-58974b8966-z2xx8" Apr 24 21:27:36.921180 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.921086 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/69c189c7-8879-4a31-8147-83424613b103-klusterlet-config\") pod \"klusterlet-addon-workmgr-7cbb4986b-6xzmw\" (UID: \"69c189c7-8879-4a31-8147-83424613b103\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cbb4986b-6xzmw" Apr 24 21:27:36.921180 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.921131 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/494a20d0-ace3-481d-ae39-780b958f7150-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-59fzh\" (UID: \"494a20d0-ace3-481d-ae39-780b958f7150\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59fzh" Apr 24 21:27:36.921180 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.921158 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd2084eb-fcc8-42e3-b526-171c67ac7a71-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-v4rfn\" (UID: \"bd2084eb-fcc8-42e3-b526-171c67ac7a71\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v4rfn" Apr 24 21:27:36.921517 
ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.921209 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj4xt\" (UniqueName: \"kubernetes.io/projected/49cf726e-12df-4887-9b95-ee4375115c11-kube-api-access-nj4xt\") pod \"dns-default-bxds4\" (UID: \"49cf726e-12df-4887-9b95-ee4375115c11\") " pod="openshift-dns/dns-default-bxds4" Apr 24 21:27:36.921517 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:36.921274 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:27:36.921517 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:36.921321 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/494a20d0-ace3-481d-ae39-780b958f7150-cluster-monitoring-operator-tls podName:494a20d0-ace3-481d-ae39-780b958f7150 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:37.421307535 +0000 UTC m=+34.681876351 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/494a20d0-ace3-481d-ae39-780b958f7150-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-59fzh" (UID: "494a20d0-ace3-481d-ae39-780b958f7150") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:27:36.921517 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.921340 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ghd8\" (UniqueName: \"kubernetes.io/projected/6024e94b-b234-46ca-80d6-f505949e48ac-kube-api-access-9ghd8\") pod \"ingress-canary-8rjd5\" (UID: \"6024e94b-b234-46ca-80d6-f505949e48ac\") " pod="openshift-ingress-canary/ingress-canary-8rjd5" Apr 24 21:27:36.921517 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.921391 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d8768ef0-e03a-46c3-97b6-ca6035eec03f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fsh7k\" (UID: \"d8768ef0-e03a-46c3-97b6-ca6035eec03f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsh7k" Apr 24 21:27:36.921517 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.921460 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/49cf726e-12df-4887-9b95-ee4375115c11-config-volume\") pod \"dns-default-bxds4\" (UID: \"49cf726e-12df-4887-9b95-ee4375115c11\") " pod="openshift-dns/dns-default-bxds4" Apr 24 21:27:36.921517 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:36.921478 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:27:36.921517 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.921509 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6024e94b-b234-46ca-80d6-f505949e48ac-cert\") pod \"ingress-canary-8rjd5\" (UID: \"6024e94b-b234-46ca-80d6-f505949e48ac\") " pod="openshift-ingress-canary/ingress-canary-8rjd5" Apr 24 21:27:36.921877 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.921589 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/49cf726e-12df-4887-9b95-ee4375115c11-tmp-dir\") pod \"dns-default-bxds4\" (UID: \"49cf726e-12df-4887-9b95-ee4375115c11\") " pod="openshift-dns/dns-default-bxds4" Apr 24 21:27:36.921877 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:36.921598 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8768ef0-e03a-46c3-97b6-ca6035eec03f-samples-operator-tls podName:d8768ef0-e03a-46c3-97b6-ca6035eec03f nodeName:}" failed. No retries permitted until 2026-04-24 21:27:37.421581 +0000 UTC m=+34.682149818 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d8768ef0-e03a-46c3-97b6-ca6035eec03f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fsh7k" (UID: "d8768ef0-e03a-46c3-97b6-ca6035eec03f") : secret "samples-operator-tls" not found Apr 24 21:27:36.921877 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.921628 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-wh284\" (UID: \"5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wh284" Apr 24 21:27:36.921877 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.921656 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2cabf83e-0da3-4fe7-ae79-15b6bc9e5179-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-746f4dc565-zwj7p\" (UID: \"2cabf83e-0da3-4fe7-ae79-15b6bc9e5179\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-746f4dc565-zwj7p" Apr 24 21:27:36.921877 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.921689 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/494a20d0-ace3-481d-ae39-780b958f7150-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-59fzh\" (UID: \"494a20d0-ace3-481d-ae39-780b958f7150\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59fzh" Apr 24 21:27:36.921877 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.921716 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/49cf726e-12df-4887-9b95-ee4375115c11-metrics-tls\") pod 
\"dns-default-bxds4\" (UID: \"49cf726e-12df-4887-9b95-ee4375115c11\") " pod="openshift-dns/dns-default-bxds4" Apr 24 21:27:36.921877 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.921746 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkhds\" (UniqueName: \"kubernetes.io/projected/c78eaad7-e680-4b37-9286-453234917ab8-kube-api-access-pkhds\") pod \"cluster-proxy-proxy-agent-84459467dd-t4mr9\" (UID: \"c78eaad7-e680-4b37-9286-453234917ab8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9" Apr 24 21:27:36.921877 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.921779 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cp4vq\" (UniqueName: \"kubernetes.io/projected/1768c1b9-8390-4833-9613-4efec510f36b-kube-api-access-cp4vq\") pod \"router-default-58974b8966-z2xx8\" (UID: \"1768c1b9-8390-4833-9613-4efec510f36b\") " pod="openshift-ingress/router-default-58974b8966-z2xx8" Apr 24 21:27:36.921877 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.921821 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1768c1b9-8390-4833-9613-4efec510f36b-metrics-certs\") pod \"router-default-58974b8966-z2xx8\" (UID: \"1768c1b9-8390-4833-9613-4efec510f36b\") " pod="openshift-ingress/router-default-58974b8966-z2xx8" Apr 24 21:27:36.921877 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.921850 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8b6q\" (UniqueName: \"kubernetes.io/projected/69c189c7-8879-4a31-8147-83424613b103-kube-api-access-k8b6q\") pod \"klusterlet-addon-workmgr-7cbb4986b-6xzmw\" (UID: \"69c189c7-8879-4a31-8147-83424613b103\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cbb4986b-6xzmw" Apr 24 21:27:36.922352 ip-10-0-136-201 
kubenswrapper[2573]: E0424 21:27:36.922188 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:27:36.922352 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:36.922250 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1768c1b9-8390-4833-9613-4efec510f36b-metrics-certs podName:1768c1b9-8390-4833-9613-4efec510f36b nodeName:}" failed. No retries permitted until 2026-04-24 21:27:37.422233664 +0000 UTC m=+34.682802487 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1768c1b9-8390-4833-9613-4efec510f36b-metrics-certs") pod "router-default-58974b8966-z2xx8" (UID: "1768c1b9-8390-4833-9613-4efec510f36b") : secret "router-metrics-certs-default" not found Apr 24 21:27:36.922537 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.922513 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/494a20d0-ace3-481d-ae39-780b958f7150-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-59fzh\" (UID: \"494a20d0-ace3-481d-ae39-780b958f7150\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59fzh" Apr 24 21:27:36.923419 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.923396 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1768c1b9-8390-4833-9613-4efec510f36b-default-certificate\") pod \"router-default-58974b8966-z2xx8\" (UID: \"1768c1b9-8390-4833-9613-4efec510f36b\") " pod="openshift-ingress/router-default-58974b8966-z2xx8" Apr 24 21:27:36.923724 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.923707 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1768c1b9-8390-4833-9613-4efec510f36b-stats-auth\") pod 
\"router-default-58974b8966-z2xx8\" (UID: \"1768c1b9-8390-4833-9613-4efec510f36b\") " pod="openshift-ingress/router-default-58974b8966-z2xx8" Apr 24 21:27:36.926385 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.926366 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-4nztl" Apr 24 21:27:36.933766 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.933729 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd2084eb-fcc8-42e3-b526-171c67ac7a71-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-v4rfn\" (UID: \"bd2084eb-fcc8-42e3-b526-171c67ac7a71\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v4rfn" Apr 24 21:27:36.933859 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.933838 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd2084eb-fcc8-42e3-b526-171c67ac7a71-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-v4rfn\" (UID: \"bd2084eb-fcc8-42e3-b526-171c67ac7a71\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v4rfn" Apr 24 21:27:36.934973 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.934952 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrzp8\" (UniqueName: \"kubernetes.io/projected/494a20d0-ace3-481d-ae39-780b958f7150-kube-api-access-wrzp8\") pod \"cluster-monitoring-operator-75587bd455-59fzh\" (UID: \"494a20d0-ace3-481d-ae39-780b958f7150\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59fzh" Apr 24 21:27:36.937150 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.937125 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k229z\" (UniqueName: 
\"kubernetes.io/projected/bd2084eb-fcc8-42e3-b526-171c67ac7a71-kube-api-access-k229z\") pod \"kube-storage-version-migrator-operator-6769c5d45-v4rfn\" (UID: \"bd2084eb-fcc8-42e3-b526-171c67ac7a71\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v4rfn" Apr 24 21:27:36.939817 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.939792 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp4vq\" (UniqueName: \"kubernetes.io/projected/1768c1b9-8390-4833-9613-4efec510f36b-kube-api-access-cp4vq\") pod \"router-default-58974b8966-z2xx8\" (UID: \"1768c1b9-8390-4833-9613-4efec510f36b\") " pod="openshift-ingress/router-default-58974b8966-z2xx8" Apr 24 21:27:36.941636 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:36.941616 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x7cp\" (UniqueName: \"kubernetes.io/projected/d8768ef0-e03a-46c3-97b6-ca6035eec03f-kube-api-access-4x7cp\") pod \"cluster-samples-operator-6dc5bdb6b4-fsh7k\" (UID: \"d8768ef0-e03a-46c3-97b6-ca6035eec03f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsh7k" Apr 24 21:27:37.023160 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.022350 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-44vlp" Apr 24 21:27:37.023160 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:37.022832 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:27:37.023160 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:37.022919 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4-networking-console-plugin-cert podName:5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:27:37.522891642 +0000 UTC m=+34.783460466 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wh284" (UID: "5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4") : secret "networking-console-plugin-cert" not found Apr 24 21:27:37.023416 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.023242 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wh284\" (UID: \"5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wh284" Apr 24 21:27:37.023416 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.023294 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7xpj\" (UniqueName: \"kubernetes.io/projected/2cabf83e-0da3-4fe7-ae79-15b6bc9e5179-kube-api-access-h7xpj\") pod \"managed-serviceaccount-addon-agent-746f4dc565-zwj7p\" (UID: \"2cabf83e-0da3-4fe7-ae79-15b6bc9e5179\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-746f4dc565-zwj7p" Apr 24 21:27:37.023416 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.023325 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/69c189c7-8879-4a31-8147-83424613b103-tmp\") pod \"klusterlet-addon-workmgr-7cbb4986b-6xzmw\" (UID: \"69c189c7-8879-4a31-8147-83424613b103\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cbb4986b-6xzmw" Apr 24 21:27:37.023416 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.023388 2573 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c78eaad7-e680-4b37-9286-453234917ab8-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-84459467dd-t4mr9\" (UID: \"c78eaad7-e680-4b37-9286-453234917ab8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9" Apr 24 21:27:37.023708 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.023422 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/c78eaad7-e680-4b37-9286-453234917ab8-hub\") pod \"cluster-proxy-proxy-agent-84459467dd-t4mr9\" (UID: \"c78eaad7-e680-4b37-9286-453234917ab8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9" Apr 24 21:27:37.023708 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.023449 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zlddc\" (UniqueName: \"kubernetes.io/projected/9b1819a7-8889-48bc-9e09-7f5fef3c6672-kube-api-access-zlddc\") pod \"network-check-source-8894fc9bd-vlq8x\" (UID: \"9b1819a7-8889-48bc-9e09-7f5fef3c6672\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vlq8x" Apr 24 21:27:37.023708 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.023481 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/69c189c7-8879-4a31-8147-83424613b103-klusterlet-config\") pod \"klusterlet-addon-workmgr-7cbb4986b-6xzmw\" (UID: \"69c189c7-8879-4a31-8147-83424613b103\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cbb4986b-6xzmw" Apr 24 21:27:37.023708 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.023540 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nj4xt\" (UniqueName: \"kubernetes.io/projected/49cf726e-12df-4887-9b95-ee4375115c11-kube-api-access-nj4xt\") pod \"dns-default-bxds4\" (UID: 
\"49cf726e-12df-4887-9b95-ee4375115c11\") " pod="openshift-dns/dns-default-bxds4" Apr 24 21:27:37.024777 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.024348 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ghd8\" (UniqueName: \"kubernetes.io/projected/6024e94b-b234-46ca-80d6-f505949e48ac-kube-api-access-9ghd8\") pod \"ingress-canary-8rjd5\" (UID: \"6024e94b-b234-46ca-80d6-f505949e48ac\") " pod="openshift-ingress-canary/ingress-canary-8rjd5" Apr 24 21:27:37.024777 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.024391 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/066d9094-f992-4853-86f1-b25700fe6070-metrics-certs\") pod \"network-metrics-daemon-kpb2c\" (UID: \"066d9094-f992-4853-86f1-b25700fe6070\") " pod="openshift-multus/network-metrics-daemon-kpb2c" Apr 24 21:27:37.024777 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.024452 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/49cf726e-12df-4887-9b95-ee4375115c11-config-volume\") pod \"dns-default-bxds4\" (UID: \"49cf726e-12df-4887-9b95-ee4375115c11\") " pod="openshift-dns/dns-default-bxds4" Apr 24 21:27:37.024777 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.024479 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6024e94b-b234-46ca-80d6-f505949e48ac-cert\") pod \"ingress-canary-8rjd5\" (UID: \"6024e94b-b234-46ca-80d6-f505949e48ac\") " pod="openshift-ingress-canary/ingress-canary-8rjd5" Apr 24 21:27:37.024777 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.024526 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/49cf726e-12df-4887-9b95-ee4375115c11-tmp-dir\") pod \"dns-default-bxds4\" (UID: 
\"49cf726e-12df-4887-9b95-ee4375115c11\") " pod="openshift-dns/dns-default-bxds4" Apr 24 21:27:37.024777 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.024552 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-wh284\" (UID: \"5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wh284" Apr 24 21:27:37.024777 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.024584 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7fe0f94-c8a8-42d9-b783-cfecce93ab05-serving-cert\") pod \"service-ca-operator-d6fc45fc5-9jnvn\" (UID: \"b7fe0f94-c8a8-42d9-b783-cfecce93ab05\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jnvn" Apr 24 21:27:37.024777 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.024619 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2cabf83e-0da3-4fe7-ae79-15b6bc9e5179-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-746f4dc565-zwj7p\" (UID: \"2cabf83e-0da3-4fe7-ae79-15b6bc9e5179\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-746f4dc565-zwj7p" Apr 24 21:27:37.024777 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.024653 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/49cf726e-12df-4887-9b95-ee4375115c11-metrics-tls\") pod \"dns-default-bxds4\" (UID: \"49cf726e-12df-4887-9b95-ee4375115c11\") " pod="openshift-dns/dns-default-bxds4" Apr 24 21:27:37.024777 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.024685 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-pkhds\" (UniqueName: \"kubernetes.io/projected/c78eaad7-e680-4b37-9286-453234917ab8-kube-api-access-pkhds\") pod \"cluster-proxy-proxy-agent-84459467dd-t4mr9\" (UID: \"c78eaad7-e680-4b37-9286-453234917ab8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9" Apr 24 21:27:37.024777 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.024718 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f26h\" (UniqueName: \"kubernetes.io/projected/b7fe0f94-c8a8-42d9-b783-cfecce93ab05-kube-api-access-9f26h\") pod \"service-ca-operator-d6fc45fc5-9jnvn\" (UID: \"b7fe0f94-c8a8-42d9-b783-cfecce93ab05\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jnvn" Apr 24 21:27:37.024777 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.024777 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8b6q\" (UniqueName: \"kubernetes.io/projected/69c189c7-8879-4a31-8147-83424613b103-kube-api-access-k8b6q\") pod \"klusterlet-addon-workmgr-7cbb4986b-6xzmw\" (UID: \"69c189c7-8879-4a31-8147-83424613b103\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cbb4986b-6xzmw" Apr 24 21:27:37.025425 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.024810 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/c78eaad7-e680-4b37-9286-453234917ab8-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-84459467dd-t4mr9\" (UID: \"c78eaad7-e680-4b37-9286-453234917ab8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9" Apr 24 21:27:37.025425 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.024843 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b7fe0f94-c8a8-42d9-b783-cfecce93ab05-config\") pod \"service-ca-operator-d6fc45fc5-9jnvn\" (UID: \"b7fe0f94-c8a8-42d9-b783-cfecce93ab05\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jnvn" Apr 24 21:27:37.025425 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.024862 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/69c189c7-8879-4a31-8147-83424613b103-tmp\") pod \"klusterlet-addon-workmgr-7cbb4986b-6xzmw\" (UID: \"69c189c7-8879-4a31-8147-83424613b103\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cbb4986b-6xzmw" Apr 24 21:27:37.025425 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.024878 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/c78eaad7-e680-4b37-9286-453234917ab8-ca\") pod \"cluster-proxy-proxy-agent-84459467dd-t4mr9\" (UID: \"c78eaad7-e680-4b37-9286-453234917ab8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9" Apr 24 21:27:37.025425 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.024905 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/c78eaad7-e680-4b37-9286-453234917ab8-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-84459467dd-t4mr9\" (UID: \"c78eaad7-e680-4b37-9286-453234917ab8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9" Apr 24 21:27:37.025666 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.025651 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/c78eaad7-e680-4b37-9286-453234917ab8-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-84459467dd-t4mr9\" (UID: \"c78eaad7-e680-4b37-9286-453234917ab8\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9" Apr 24 21:27:37.026083 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:37.026024 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:27:37.026083 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:37.026084 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6024e94b-b234-46ca-80d6-f505949e48ac-cert podName:6024e94b-b234-46ca-80d6-f505949e48ac nodeName:}" failed. No retries permitted until 2026-04-24 21:27:37.526065865 +0000 UTC m=+34.786634680 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6024e94b-b234-46ca-80d6-f505949e48ac-cert") pod "ingress-canary-8rjd5" (UID: "6024e94b-b234-46ca-80d6-f505949e48ac") : secret "canary-serving-cert" not found Apr 24 21:27:37.026305 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.026283 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/49cf726e-12df-4887-9b95-ee4375115c11-config-volume\") pod \"dns-default-bxds4\" (UID: \"49cf726e-12df-4887-9b95-ee4375115c11\") " pod="openshift-dns/dns-default-bxds4" Apr 24 21:27:37.026385 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.026341 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/49cf726e-12df-4887-9b95-ee4375115c11-tmp-dir\") pod \"dns-default-bxds4\" (UID: \"49cf726e-12df-4887-9b95-ee4375115c11\") " pod="openshift-dns/dns-default-bxds4" Apr 24 21:27:37.026851 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:37.026578 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:37.026851 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:37.026628 2573 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/066d9094-f992-4853-86f1-b25700fe6070-metrics-certs podName:066d9094-f992-4853-86f1-b25700fe6070 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:09.026613357 +0000 UTC m=+66.287182171 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/066d9094-f992-4853-86f1-b25700fe6070-metrics-certs") pod "network-metrics-daemon-kpb2c" (UID: "066d9094-f992-4853-86f1-b25700fe6070") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:37.027048 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.027002 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-wh284\" (UID: \"5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wh284" Apr 24 21:27:37.027445 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:37.027133 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:27:37.027445 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:37.027191 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49cf726e-12df-4887-9b95-ee4375115c11-metrics-tls podName:49cf726e-12df-4887-9b95-ee4375115c11 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:37.527172925 +0000 UTC m=+34.787741742 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/49cf726e-12df-4887-9b95-ee4375115c11-metrics-tls") pod "dns-default-bxds4" (UID: "49cf726e-12df-4887-9b95-ee4375115c11") : secret "dns-default-metrics-tls" not found Apr 24 21:27:37.031213 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.030233 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/c78eaad7-e680-4b37-9286-453234917ab8-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-84459467dd-t4mr9\" (UID: \"c78eaad7-e680-4b37-9286-453234917ab8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9" Apr 24 21:27:37.031213 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.030650 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/c78eaad7-e680-4b37-9286-453234917ab8-hub\") pod \"cluster-proxy-proxy-agent-84459467dd-t4mr9\" (UID: \"c78eaad7-e680-4b37-9286-453234917ab8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9" Apr 24 21:27:37.031213 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.031164 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2cabf83e-0da3-4fe7-ae79-15b6bc9e5179-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-746f4dc565-zwj7p\" (UID: \"2cabf83e-0da3-4fe7-ae79-15b6bc9e5179\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-746f4dc565-zwj7p" Apr 24 21:27:37.031574 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.031536 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c78eaad7-e680-4b37-9286-453234917ab8-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-84459467dd-t4mr9\" (UID: \"c78eaad7-e680-4b37-9286-453234917ab8\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9" Apr 24 21:27:37.032459 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.032419 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/69c189c7-8879-4a31-8147-83424613b103-klusterlet-config\") pod \"klusterlet-addon-workmgr-7cbb4986b-6xzmw\" (UID: \"69c189c7-8879-4a31-8147-83424613b103\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cbb4986b-6xzmw" Apr 24 21:27:37.032683 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.032662 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/c78eaad7-e680-4b37-9286-453234917ab8-ca\") pod \"cluster-proxy-proxy-agent-84459467dd-t4mr9\" (UID: \"c78eaad7-e680-4b37-9286-453234917ab8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9" Apr 24 21:27:37.037795 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.037763 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlddc\" (UniqueName: \"kubernetes.io/projected/9b1819a7-8889-48bc-9e09-7f5fef3c6672-kube-api-access-zlddc\") pod \"network-check-source-8894fc9bd-vlq8x\" (UID: \"9b1819a7-8889-48bc-9e09-7f5fef3c6672\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vlq8x" Apr 24 21:27:37.038872 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.038837 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7xpj\" (UniqueName: \"kubernetes.io/projected/2cabf83e-0da3-4fe7-ae79-15b6bc9e5179-kube-api-access-h7xpj\") pod \"managed-serviceaccount-addon-agent-746f4dc565-zwj7p\" (UID: \"2cabf83e-0da3-4fe7-ae79-15b6bc9e5179\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-746f4dc565-zwj7p" Apr 24 21:27:37.039643 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.039621 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkhds\" (UniqueName: \"kubernetes.io/projected/c78eaad7-e680-4b37-9286-453234917ab8-kube-api-access-pkhds\") pod \"cluster-proxy-proxy-agent-84459467dd-t4mr9\" (UID: \"c78eaad7-e680-4b37-9286-453234917ab8\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9" Apr 24 21:27:37.041114 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.041064 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj4xt\" (UniqueName: \"kubernetes.io/projected/49cf726e-12df-4887-9b95-ee4375115c11-kube-api-access-nj4xt\") pod \"dns-default-bxds4\" (UID: \"49cf726e-12df-4887-9b95-ee4375115c11\") " pod="openshift-dns/dns-default-bxds4" Apr 24 21:27:37.042360 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.042337 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8b6q\" (UniqueName: \"kubernetes.io/projected/69c189c7-8879-4a31-8147-83424613b103-kube-api-access-k8b6q\") pod \"klusterlet-addon-workmgr-7cbb4986b-6xzmw\" (UID: \"69c189c7-8879-4a31-8147-83424613b103\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cbb4986b-6xzmw" Apr 24 21:27:37.042679 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.042662 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ghd8\" (UniqueName: \"kubernetes.io/projected/6024e94b-b234-46ca-80d6-f505949e48ac-kube-api-access-9ghd8\") pod \"ingress-canary-8rjd5\" (UID: \"6024e94b-b234-46ca-80d6-f505949e48ac\") " pod="openshift-ingress-canary/ingress-canary-8rjd5" Apr 24 21:27:37.051650 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.051621 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v4rfn" Apr 24 21:27:37.100883 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.096958 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7qths"] Apr 24 21:27:37.100883 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.097009 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-4nztl"] Apr 24 21:27:37.102389 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.101417 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-746f4dc565-zwj7p" Apr 24 21:27:37.113663 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.113639 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vlq8x" Apr 24 21:27:37.125954 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.125639 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7fe0f94-c8a8-42d9-b783-cfecce93ab05-serving-cert\") pod \"service-ca-operator-d6fc45fc5-9jnvn\" (UID: \"b7fe0f94-c8a8-42d9-b783-cfecce93ab05\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jnvn" Apr 24 21:27:37.125954 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.125696 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9f26h\" (UniqueName: \"kubernetes.io/projected/b7fe0f94-c8a8-42d9-b783-cfecce93ab05-kube-api-access-9f26h\") pod \"service-ca-operator-d6fc45fc5-9jnvn\" (UID: \"b7fe0f94-c8a8-42d9-b783-cfecce93ab05\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jnvn" Apr 24 21:27:37.125954 ip-10-0-136-201 kubenswrapper[2573]: 
I0424 21:27:37.125751 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7fe0f94-c8a8-42d9-b783-cfecce93ab05-config\") pod \"service-ca-operator-d6fc45fc5-9jnvn\" (UID: \"b7fe0f94-c8a8-42d9-b783-cfecce93ab05\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jnvn" Apr 24 21:27:37.126468 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.126419 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7fe0f94-c8a8-42d9-b783-cfecce93ab05-config\") pod \"service-ca-operator-d6fc45fc5-9jnvn\" (UID: \"b7fe0f94-c8a8-42d9-b783-cfecce93ab05\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jnvn" Apr 24 21:27:37.129375 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.129328 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7fe0f94-c8a8-42d9-b783-cfecce93ab05-serving-cert\") pod \"service-ca-operator-d6fc45fc5-9jnvn\" (UID: \"b7fe0f94-c8a8-42d9-b783-cfecce93ab05\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jnvn" Apr 24 21:27:37.138222 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.138110 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f26h\" (UniqueName: \"kubernetes.io/projected/b7fe0f94-c8a8-42d9-b783-cfecce93ab05-kube-api-access-9f26h\") pod \"service-ca-operator-d6fc45fc5-9jnvn\" (UID: \"b7fe0f94-c8a8-42d9-b783-cfecce93ab05\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jnvn" Apr 24 21:27:37.151151 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.148925 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9" Apr 24 21:27:37.167583 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.167546 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cbb4986b-6xzmw" Apr 24 21:27:37.174523 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.172631 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jnvn" Apr 24 21:27:37.203685 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.203628 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-44vlp"] Apr 24 21:27:37.231635 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.227421 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xgrqn\" (UniqueName: \"kubernetes.io/projected/1612b97a-f223-4e83-8710-5764a3765126-kube-api-access-xgrqn\") pod \"network-check-target-6xchx\" (UID: \"1612b97a-f223-4e83-8710-5764a3765126\") " pod="openshift-network-diagnostics/network-check-target-6xchx" Apr 24 21:27:37.243448 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.243331 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgrqn\" (UniqueName: \"kubernetes.io/projected/1612b97a-f223-4e83-8710-5764a3765126-kube-api-access-xgrqn\") pod \"network-check-target-6xchx\" (UID: \"1612b97a-f223-4e83-8710-5764a3765126\") " pod="openshift-network-diagnostics/network-check-target-6xchx" Apr 24 21:27:37.271653 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.271349 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v4rfn"] Apr 24 21:27:37.287332 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:37.287280 2573 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd2084eb_fcc8_42e3_b526_171c67ac7a71.slice/crio-be42bacc322d6e6da7b2f983137c1c43f848bd0f9546f764ee704476af9b550d WatchSource:0}: Error finding container be42bacc322d6e6da7b2f983137c1c43f848bd0f9546f764ee704476af9b550d: Status 404 returned error can't find the container with id be42bacc322d6e6da7b2f983137c1c43f848bd0f9546f764ee704476af9b550d Apr 24 21:27:37.326398 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.326366 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-746f4dc565-zwj7p"] Apr 24 21:27:37.328543 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.328518 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-vlq8x"] Apr 24 21:27:37.328664 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.328570 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-registry-tls\") pod \"image-registry-5cd4fd4699-r7szn\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" Apr 24 21:27:37.328738 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:37.328708 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:27:37.328738 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:37.328725 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cd4fd4699-r7szn: secret "image-registry-tls" not found Apr 24 21:27:37.328934 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:37.328789 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-registry-tls podName:4c9bfa95-cfa8-4d7f-9dda-3d100a02d251 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:38.328769012 +0000 UTC m=+35.589337826 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-registry-tls") pod "image-registry-5cd4fd4699-r7szn" (UID: "4c9bfa95-cfa8-4d7f-9dda-3d100a02d251") : secret "image-registry-tls" not found Apr 24 21:27:37.329831 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:37.329791 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cabf83e_0da3_4fe7_ae79_15b6bc9e5179.slice/crio-1ea20b8683c93770bdbd909aae0008267a5b112d39f6003de1a62832c899655c WatchSource:0}: Error finding container 1ea20b8683c93770bdbd909aae0008267a5b112d39f6003de1a62832c899655c: Status 404 returned error can't find the container with id 1ea20b8683c93770bdbd909aae0008267a5b112d39f6003de1a62832c899655c Apr 24 21:27:37.334245 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.334206 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9"] Apr 24 21:27:37.341624 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:37.341600 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b1819a7_8889_48bc_9e09_7f5fef3c6672.slice/crio-4f189537b089625ba4f0b03df32fb32a93c67e3046871d4b3c346e303b3e135a WatchSource:0}: Error finding container 4f189537b089625ba4f0b03df32fb32a93c67e3046871d4b3c346e303b3e135a: Status 404 returned error can't find the container with id 4f189537b089625ba4f0b03df32fb32a93c67e3046871d4b3c346e303b3e135a Apr 24 21:27:37.342625 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:37.342583 2573 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc78eaad7_e680_4b37_9286_453234917ab8.slice/crio-d5763507f343a7b4747bb37eb93639ba11e737b7f7053c341255e110055480f7 WatchSource:0}: Error finding container d5763507f343a7b4747bb37eb93639ba11e737b7f7053c341255e110055480f7: Status 404 returned error can't find the container with id d5763507f343a7b4747bb37eb93639ba11e737b7f7053c341255e110055480f7 Apr 24 21:27:37.352502 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.352480 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6xchx" Apr 24 21:27:37.352595 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.352485 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8bd94" Apr 24 21:27:37.352665 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.352484 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpb2c" Apr 24 21:27:37.356314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.356291 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-kgq96\"" Apr 24 21:27:37.356418 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.356329 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 21:27:37.356418 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.356373 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:27:37.356646 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.356631 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-cbm9w\"" Apr 24 21:27:37.364415 ip-10-0-136-201 kubenswrapper[2573]: I0424 
21:27:37.364399 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6xchx" Apr 24 21:27:37.369650 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.369625 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jnvn"] Apr 24 21:27:37.372696 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:37.372671 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7fe0f94_c8a8_42d9_b783_cfecce93ab05.slice/crio-ca561f538e2a6d167882fe74edc5a5ba8be2fe92610f5659f4b790afe2b5832b WatchSource:0}: Error finding container ca561f538e2a6d167882fe74edc5a5ba8be2fe92610f5659f4b790afe2b5832b: Status 404 returned error can't find the container with id ca561f538e2a6d167882fe74edc5a5ba8be2fe92610f5659f4b790afe2b5832b Apr 24 21:27:37.373191 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.373172 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cbb4986b-6xzmw"] Apr 24 21:27:37.381537 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:37.381514 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69c189c7_8879_4a31_8147_83424613b103.slice/crio-e72cb6c619dc7666a1121b6e8ac67f3e9b11fcfc7d483a287f8bd85ddae3552d WatchSource:0}: Error finding container e72cb6c619dc7666a1121b6e8ac67f3e9b11fcfc7d483a287f8bd85ddae3552d: Status 404 returned error can't find the container with id e72cb6c619dc7666a1121b6e8ac67f3e9b11fcfc7d483a287f8bd85ddae3552d Apr 24 21:27:37.430237 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.430208 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1768c1b9-8390-4833-9613-4efec510f36b-metrics-certs\") pod 
\"router-default-58974b8966-z2xx8\" (UID: \"1768c1b9-8390-4833-9613-4efec510f36b\") " pod="openshift-ingress/router-default-58974b8966-z2xx8" Apr 24 21:27:37.430398 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.430267 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1768c1b9-8390-4833-9613-4efec510f36b-service-ca-bundle\") pod \"router-default-58974b8966-z2xx8\" (UID: \"1768c1b9-8390-4833-9613-4efec510f36b\") " pod="openshift-ingress/router-default-58974b8966-z2xx8" Apr 24 21:27:37.430398 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.430309 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/494a20d0-ace3-481d-ae39-780b958f7150-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-59fzh\" (UID: \"494a20d0-ace3-481d-ae39-780b958f7150\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59fzh" Apr 24 21:27:37.430398 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.430357 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d8768ef0-e03a-46c3-97b6-ca6035eec03f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fsh7k\" (UID: \"d8768ef0-e03a-46c3-97b6-ca6035eec03f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsh7k" Apr 24 21:27:37.430398 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:37.430369 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:27:37.430615 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:37.430444 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1768c1b9-8390-4833-9613-4efec510f36b-metrics-certs 
podName:1768c1b9-8390-4833-9613-4efec510f36b nodeName:}" failed. No retries permitted until 2026-04-24 21:27:38.430422361 +0000 UTC m=+35.690991191 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1768c1b9-8390-4833-9613-4efec510f36b-metrics-certs") pod "router-default-58974b8966-z2xx8" (UID: "1768c1b9-8390-4833-9613-4efec510f36b") : secret "router-metrics-certs-default" not found Apr 24 21:27:37.430615 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:37.430465 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1768c1b9-8390-4833-9613-4efec510f36b-service-ca-bundle podName:1768c1b9-8390-4833-9613-4efec510f36b nodeName:}" failed. No retries permitted until 2026-04-24 21:27:38.430455132 +0000 UTC m=+35.691023944 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1768c1b9-8390-4833-9613-4efec510f36b-service-ca-bundle") pod "router-default-58974b8966-z2xx8" (UID: "1768c1b9-8390-4833-9613-4efec510f36b") : configmap references non-existent config key: service-ca.crt Apr 24 21:27:37.430615 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:37.430464 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:27:37.430615 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:37.430479 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:27:37.430615 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:37.430507 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8768ef0-e03a-46c3-97b6-ca6035eec03f-samples-operator-tls podName:d8768ef0-e03a-46c3-97b6-ca6035eec03f nodeName:}" failed. 
No retries permitted until 2026-04-24 21:27:38.43049737 +0000 UTC m=+35.691066180 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d8768ef0-e03a-46c3-97b6-ca6035eec03f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fsh7k" (UID: "d8768ef0-e03a-46c3-97b6-ca6035eec03f") : secret "samples-operator-tls" not found Apr 24 21:27:37.430615 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:37.430532 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/494a20d0-ace3-481d-ae39-780b958f7150-cluster-monitoring-operator-tls podName:494a20d0-ace3-481d-ae39-780b958f7150 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:38.43051638 +0000 UTC m=+35.691085196 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/494a20d0-ace3-481d-ae39-780b958f7150-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-59fzh" (UID: "494a20d0-ace3-481d-ae39-780b958f7150") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:27:37.489294 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.489257 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-6xchx"] Apr 24 21:27:37.493375 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:27:37.493345 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1612b97a_f223_4e83_8710_5764a3765126.slice/crio-226e00258c757b6213e5d2805551b17f1c7dbea72cebf0e52d1712316cbcc755 WatchSource:0}: Error finding container 226e00258c757b6213e5d2805551b17f1c7dbea72cebf0e52d1712316cbcc755: Status 404 returned error can't find the container with id 226e00258c757b6213e5d2805551b17f1c7dbea72cebf0e52d1712316cbcc755 Apr 24 21:27:37.531031 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.530972 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wh284\" (UID: \"5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wh284" Apr 24 21:27:37.531196 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.531139 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6024e94b-b234-46ca-80d6-f505949e48ac-cert\") pod \"ingress-canary-8rjd5\" (UID: \"6024e94b-b234-46ca-80d6-f505949e48ac\") " pod="openshift-ingress-canary/ingress-canary-8rjd5" Apr 24 21:27:37.531196 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:37.531145 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:27:37.531196 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.531183 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/49cf726e-12df-4887-9b95-ee4375115c11-metrics-tls\") pod \"dns-default-bxds4\" (UID: \"49cf726e-12df-4887-9b95-ee4375115c11\") " pod="openshift-dns/dns-default-bxds4" Apr 24 21:27:37.531334 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:37.531229 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4-networking-console-plugin-cert podName:5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:38.531206119 +0000 UTC m=+35.791774944 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wh284" (UID: "5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4") : secret "networking-console-plugin-cert" not found Apr 24 21:27:37.531334 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:37.531288 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:27:37.531334 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:37.531295 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:27:37.531480 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:37.531337 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6024e94b-b234-46ca-80d6-f505949e48ac-cert podName:6024e94b-b234-46ca-80d6-f505949e48ac nodeName:}" failed. No retries permitted until 2026-04-24 21:27:38.531320843 +0000 UTC m=+35.791889658 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6024e94b-b234-46ca-80d6-f505949e48ac-cert") pod "ingress-canary-8rjd5" (UID: "6024e94b-b234-46ca-80d6-f505949e48ac") : secret "canary-serving-cert" not found Apr 24 21:27:37.531480 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:37.531357 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49cf726e-12df-4887-9b95-ee4375115c11-metrics-tls podName:49cf726e-12df-4887-9b95-ee4375115c11 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:38.531347924 +0000 UTC m=+35.791916738 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/49cf726e-12df-4887-9b95-ee4375115c11-metrics-tls") pod "dns-default-bxds4" (UID: "49cf726e-12df-4887-9b95-ee4375115c11") : secret "dns-default-metrics-tls" not found Apr 24 21:27:37.546695 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.546668 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6xchx" event={"ID":"1612b97a-f223-4e83-8710-5764a3765126","Type":"ContainerStarted","Data":"226e00258c757b6213e5d2805551b17f1c7dbea72cebf0e52d1712316cbcc755"} Apr 24 21:27:37.547717 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.547694 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cbb4986b-6xzmw" event={"ID":"69c189c7-8879-4a31-8147-83424613b103","Type":"ContainerStarted","Data":"e72cb6c619dc7666a1121b6e8ac67f3e9b11fcfc7d483a287f8bd85ddae3552d"} Apr 24 21:27:37.548664 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.548633 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vlq8x" event={"ID":"9b1819a7-8889-48bc-9e09-7f5fef3c6672","Type":"ContainerStarted","Data":"4f189537b089625ba4f0b03df32fb32a93c67e3046871d4b3c346e303b3e135a"} Apr 24 21:27:37.549575 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.549553 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v4rfn" event={"ID":"bd2084eb-fcc8-42e3-b526-171c67ac7a71","Type":"ContainerStarted","Data":"be42bacc322d6e6da7b2f983137c1c43f848bd0f9546f764ee704476af9b550d"} Apr 24 21:27:37.550609 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.550588 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jnvn" 
event={"ID":"b7fe0f94-c8a8-42d9-b783-cfecce93ab05","Type":"ContainerStarted","Data":"ca561f538e2a6d167882fe74edc5a5ba8be2fe92610f5659f4b790afe2b5832b"} Apr 24 21:27:37.552996 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.552973 2573 generic.go:358] "Generic (PLEG): container finished" podID="3807cdc4-74ff-4e27-bde0-2ed93b428a58" containerID="6ce996e9cbf9549483a5cc8dd32dd26f433c0a38ca00a93920c9324378ff0676" exitCode=0 Apr 24 21:27:37.553077 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.553052 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fn4d" event={"ID":"3807cdc4-74ff-4e27-bde0-2ed93b428a58","Type":"ContainerDied","Data":"6ce996e9cbf9549483a5cc8dd32dd26f433c0a38ca00a93920c9324378ff0676"} Apr 24 21:27:37.554116 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.554083 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9" event={"ID":"c78eaad7-e680-4b37-9286-453234917ab8","Type":"ContainerStarted","Data":"d5763507f343a7b4747bb37eb93639ba11e737b7f7053c341255e110055480f7"} Apr 24 21:27:37.555182 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.555163 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-746f4dc565-zwj7p" event={"ID":"2cabf83e-0da3-4fe7-ae79-15b6bc9e5179","Type":"ContainerStarted","Data":"1ea20b8683c93770bdbd909aae0008267a5b112d39f6003de1a62832c899655c"} Apr 24 21:27:37.556153 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.556131 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-44vlp" event={"ID":"c4788265-16ce-4770-b838-025f0f7d06aa","Type":"ContainerStarted","Data":"330d80dd0e03010b8434e46d2e8c4844aa865428fb7c895058f09ebb692aec1a"} Apr 24 21:27:37.556917 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.556899 2573 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-4nztl" event={"ID":"b3e83451-d37f-4c5b-837f-6440bed57b91","Type":"ContainerStarted","Data":"01cd545a1a98a1ac706b91218638170f171673dbe71fb7f9e73ee9db36d39c38"} Apr 24 21:27:37.557798 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.557779 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7qths" event={"ID":"f2cf6003-5a3e-497b-82b1-afb7021314fc","Type":"ContainerStarted","Data":"7b224abea3d491e8d29e4182713180e1b67c07f8b4df657dc0510a22e0b85f3a"} Apr 24 21:27:37.732752 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.732545 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e9999088-0209-4da1-b3c6-92c0d2e48409-original-pull-secret\") pod \"global-pull-secret-syncer-8bd94\" (UID: \"e9999088-0209-4da1-b3c6-92c0d2e48409\") " pod="kube-system/global-pull-secret-syncer-8bd94" Apr 24 21:27:37.739049 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.739021 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e9999088-0209-4da1-b3c6-92c0d2e48409-original-pull-secret\") pod \"global-pull-secret-syncer-8bd94\" (UID: \"e9999088-0209-4da1-b3c6-92c0d2e48409\") " pod="kube-system/global-pull-secret-syncer-8bd94" Apr 24 21:27:37.973200 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:37.972727 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-8bd94" Apr 24 21:27:38.152269 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:38.152221 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8bd94"] Apr 24 21:27:38.341517 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:38.340239 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-registry-tls\") pod \"image-registry-5cd4fd4699-r7szn\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" Apr 24 21:27:38.341517 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:38.340453 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:27:38.341517 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:38.340471 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cd4fd4699-r7szn: secret "image-registry-tls" not found Apr 24 21:27:38.341517 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:38.340533 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-registry-tls podName:4c9bfa95-cfa8-4d7f-9dda-3d100a02d251 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:40.340513161 +0000 UTC m=+37.601081976 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-registry-tls") pod "image-registry-5cd4fd4699-r7szn" (UID: "4c9bfa95-cfa8-4d7f-9dda-3d100a02d251") : secret "image-registry-tls" not found Apr 24 21:27:38.447314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:38.446265 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1768c1b9-8390-4833-9613-4efec510f36b-metrics-certs\") pod \"router-default-58974b8966-z2xx8\" (UID: \"1768c1b9-8390-4833-9613-4efec510f36b\") " pod="openshift-ingress/router-default-58974b8966-z2xx8" Apr 24 21:27:38.447314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:38.446356 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1768c1b9-8390-4833-9613-4efec510f36b-service-ca-bundle\") pod \"router-default-58974b8966-z2xx8\" (UID: \"1768c1b9-8390-4833-9613-4efec510f36b\") " pod="openshift-ingress/router-default-58974b8966-z2xx8" Apr 24 21:27:38.447314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:38.446411 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/494a20d0-ace3-481d-ae39-780b958f7150-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-59fzh\" (UID: \"494a20d0-ace3-481d-ae39-780b958f7150\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59fzh" Apr 24 21:27:38.447314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:38.446467 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d8768ef0-e03a-46c3-97b6-ca6035eec03f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fsh7k\" (UID: \"d8768ef0-e03a-46c3-97b6-ca6035eec03f\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsh7k" Apr 24 21:27:38.447314 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:38.446598 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:27:38.447314 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:38.446658 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8768ef0-e03a-46c3-97b6-ca6035eec03f-samples-operator-tls podName:d8768ef0-e03a-46c3-97b6-ca6035eec03f nodeName:}" failed. No retries permitted until 2026-04-24 21:27:40.446639239 +0000 UTC m=+37.707208060 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d8768ef0-e03a-46c3-97b6-ca6035eec03f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fsh7k" (UID: "d8768ef0-e03a-46c3-97b6-ca6035eec03f") : secret "samples-operator-tls" not found Apr 24 21:27:38.447314 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:38.447059 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:27:38.447314 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:38.447129 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1768c1b9-8390-4833-9613-4efec510f36b-metrics-certs podName:1768c1b9-8390-4833-9613-4efec510f36b nodeName:}" failed. No retries permitted until 2026-04-24 21:27:40.447113432 +0000 UTC m=+37.707682246 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1768c1b9-8390-4833-9613-4efec510f36b-metrics-certs") pod "router-default-58974b8966-z2xx8" (UID: "1768c1b9-8390-4833-9613-4efec510f36b") : secret "router-metrics-certs-default" not found Apr 24 21:27:38.447314 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:38.447196 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1768c1b9-8390-4833-9613-4efec510f36b-service-ca-bundle podName:1768c1b9-8390-4833-9613-4efec510f36b nodeName:}" failed. No retries permitted until 2026-04-24 21:27:40.447185521 +0000 UTC m=+37.707754331 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1768c1b9-8390-4833-9613-4efec510f36b-service-ca-bundle") pod "router-default-58974b8966-z2xx8" (UID: "1768c1b9-8390-4833-9613-4efec510f36b") : configmap references non-existent config key: service-ca.crt Apr 24 21:27:38.447314 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:38.447257 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:27:38.447314 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:38.447286 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/494a20d0-ace3-481d-ae39-780b958f7150-cluster-monitoring-operator-tls podName:494a20d0-ace3-481d-ae39-780b958f7150 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:40.447277224 +0000 UTC m=+37.707846036 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/494a20d0-ace3-481d-ae39-780b958f7150-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-59fzh" (UID: "494a20d0-ace3-481d-ae39-780b958f7150") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:27:38.548442 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:38.547372 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wh284\" (UID: \"5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wh284" Apr 24 21:27:38.548442 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:38.547530 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6024e94b-b234-46ca-80d6-f505949e48ac-cert\") pod \"ingress-canary-8rjd5\" (UID: \"6024e94b-b234-46ca-80d6-f505949e48ac\") " pod="openshift-ingress-canary/ingress-canary-8rjd5" Apr 24 21:27:38.548442 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:38.547577 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/49cf726e-12df-4887-9b95-ee4375115c11-metrics-tls\") pod \"dns-default-bxds4\" (UID: \"49cf726e-12df-4887-9b95-ee4375115c11\") " pod="openshift-dns/dns-default-bxds4" Apr 24 21:27:38.548442 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:38.547730 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:27:38.548442 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:38.547790 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49cf726e-12df-4887-9b95-ee4375115c11-metrics-tls 
podName:49cf726e-12df-4887-9b95-ee4375115c11 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:40.547770973 +0000 UTC m=+37.808339799 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/49cf726e-12df-4887-9b95-ee4375115c11-metrics-tls") pod "dns-default-bxds4" (UID: "49cf726e-12df-4887-9b95-ee4375115c11") : secret "dns-default-metrics-tls" not found Apr 24 21:27:38.548442 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:38.548143 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:27:38.548442 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:38.548193 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6024e94b-b234-46ca-80d6-f505949e48ac-cert podName:6024e94b-b234-46ca-80d6-f505949e48ac nodeName:}" failed. No retries permitted until 2026-04-24 21:27:40.548178241 +0000 UTC m=+37.808747055 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6024e94b-b234-46ca-80d6-f505949e48ac-cert") pod "ingress-canary-8rjd5" (UID: "6024e94b-b234-46ca-80d6-f505949e48ac") : secret "canary-serving-cert" not found Apr 24 21:27:38.548442 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:38.548337 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:27:38.548442 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:38.548379 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4-networking-console-plugin-cert podName:5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:40.548365619 +0000 UTC m=+37.808934434 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wh284" (UID: "5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4") : secret "networking-console-plugin-cert" not found Apr 24 21:27:38.575912 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:38.574976 2573 generic.go:358] "Generic (PLEG): container finished" podID="3807cdc4-74ff-4e27-bde0-2ed93b428a58" containerID="5ecfd10067b8deec223f96fdeaadc9ca6a004e3bb26fc67b2ed04116acb96c78" exitCode=0 Apr 24 21:27:38.576342 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:38.576289 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fn4d" event={"ID":"3807cdc4-74ff-4e27-bde0-2ed93b428a58","Type":"ContainerDied","Data":"5ecfd10067b8deec223f96fdeaadc9ca6a004e3bb26fc67b2ed04116acb96c78"} Apr 24 21:27:38.582562 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:38.582483 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8bd94" event={"ID":"e9999088-0209-4da1-b3c6-92c0d2e48409","Type":"ContainerStarted","Data":"7667e2c3f31f11128d76b0cd3afbe92625a6a7f4eae7365a4341c345c44df6bb"} Apr 24 21:27:39.622782 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:39.621550 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5fn4d" event={"ID":"3807cdc4-74ff-4e27-bde0-2ed93b428a58","Type":"ContainerStarted","Data":"dff45ef0107d0c0ed520a37c63a438adc2ab915402112ac332396f4d403a84b7"} Apr 24 21:27:39.665922 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:39.665689 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5fn4d" podStartSLOduration=6.261129584 podStartE2EDuration="36.665669748s" podCreationTimestamp="2026-04-24 21:27:03 +0000 UTC" 
firstStartedPulling="2026-04-24 21:27:05.941944497 +0000 UTC m=+3.202513319" lastFinishedPulling="2026-04-24 21:27:36.346484674 +0000 UTC m=+33.607053483" observedRunningTime="2026-04-24 21:27:39.663278662 +0000 UTC m=+36.923847493" watchObservedRunningTime="2026-04-24 21:27:39.665669748 +0000 UTC m=+36.926238583" Apr 24 21:27:40.371750 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:40.371007 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-registry-tls\") pod \"image-registry-5cd4fd4699-r7szn\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" Apr 24 21:27:40.371750 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:40.371321 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:27:40.371750 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:40.371338 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cd4fd4699-r7szn: secret "image-registry-tls" not found Apr 24 21:27:40.371750 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:40.371398 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-registry-tls podName:4c9bfa95-cfa8-4d7f-9dda-3d100a02d251 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:44.371380424 +0000 UTC m=+41.631949241 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-registry-tls") pod "image-registry-5cd4fd4699-r7szn" (UID: "4c9bfa95-cfa8-4d7f-9dda-3d100a02d251") : secret "image-registry-tls" not found Apr 24 21:27:40.473525 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:40.472247 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d8768ef0-e03a-46c3-97b6-ca6035eec03f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fsh7k\" (UID: \"d8768ef0-e03a-46c3-97b6-ca6035eec03f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsh7k" Apr 24 21:27:40.473525 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:40.472374 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1768c1b9-8390-4833-9613-4efec510f36b-metrics-certs\") pod \"router-default-58974b8966-z2xx8\" (UID: \"1768c1b9-8390-4833-9613-4efec510f36b\") " pod="openshift-ingress/router-default-58974b8966-z2xx8" Apr 24 21:27:40.473525 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:40.472451 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1768c1b9-8390-4833-9613-4efec510f36b-service-ca-bundle\") pod \"router-default-58974b8966-z2xx8\" (UID: \"1768c1b9-8390-4833-9613-4efec510f36b\") " pod="openshift-ingress/router-default-58974b8966-z2xx8" Apr 24 21:27:40.473525 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:40.472495 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/494a20d0-ace3-481d-ae39-780b958f7150-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-59fzh\" (UID: \"494a20d0-ace3-481d-ae39-780b958f7150\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59fzh" Apr 24 21:27:40.473525 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:40.472713 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:27:40.473525 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:40.472786 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/494a20d0-ace3-481d-ae39-780b958f7150-cluster-monitoring-operator-tls podName:494a20d0-ace3-481d-ae39-780b958f7150 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:44.472765388 +0000 UTC m=+41.733334202 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/494a20d0-ace3-481d-ae39-780b958f7150-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-59fzh" (UID: "494a20d0-ace3-481d-ae39-780b958f7150") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:27:40.473525 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:40.473204 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:27:40.473525 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:40.473294 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8768ef0-e03a-46c3-97b6-ca6035eec03f-samples-operator-tls podName:d8768ef0-e03a-46c3-97b6-ca6035eec03f nodeName:}" failed. No retries permitted until 2026-04-24 21:27:44.473276856 +0000 UTC m=+41.733845668 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d8768ef0-e03a-46c3-97b6-ca6035eec03f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fsh7k" (UID: "d8768ef0-e03a-46c3-97b6-ca6035eec03f") : secret "samples-operator-tls" not found Apr 24 21:27:40.473525 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:40.473374 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:27:40.473525 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:40.473412 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1768c1b9-8390-4833-9613-4efec510f36b-metrics-certs podName:1768c1b9-8390-4833-9613-4efec510f36b nodeName:}" failed. No retries permitted until 2026-04-24 21:27:44.473399771 +0000 UTC m=+41.733968594 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1768c1b9-8390-4833-9613-4efec510f36b-metrics-certs") pod "router-default-58974b8966-z2xx8" (UID: "1768c1b9-8390-4833-9613-4efec510f36b") : secret "router-metrics-certs-default" not found Apr 24 21:27:40.473525 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:40.473480 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1768c1b9-8390-4833-9613-4efec510f36b-service-ca-bundle podName:1768c1b9-8390-4833-9613-4efec510f36b nodeName:}" failed. No retries permitted until 2026-04-24 21:27:44.47346913 +0000 UTC m=+41.734037940 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1768c1b9-8390-4833-9613-4efec510f36b-service-ca-bundle") pod "router-default-58974b8966-z2xx8" (UID: "1768c1b9-8390-4833-9613-4efec510f36b") : configmap references non-existent config key: service-ca.crt Apr 24 21:27:40.574447 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:40.573732 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/49cf726e-12df-4887-9b95-ee4375115c11-metrics-tls\") pod \"dns-default-bxds4\" (UID: \"49cf726e-12df-4887-9b95-ee4375115c11\") " pod="openshift-dns/dns-default-bxds4" Apr 24 21:27:40.574447 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:40.573817 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wh284\" (UID: \"5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wh284" Apr 24 21:27:40.574447 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:40.573974 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6024e94b-b234-46ca-80d6-f505949e48ac-cert\") pod \"ingress-canary-8rjd5\" (UID: \"6024e94b-b234-46ca-80d6-f505949e48ac\") " pod="openshift-ingress-canary/ingress-canary-8rjd5" Apr 24 21:27:40.574447 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:40.574164 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:27:40.574447 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:40.574256 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6024e94b-b234-46ca-80d6-f505949e48ac-cert 
podName:6024e94b-b234-46ca-80d6-f505949e48ac nodeName:}" failed. No retries permitted until 2026-04-24 21:27:44.574234503 +0000 UTC m=+41.834803331 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6024e94b-b234-46ca-80d6-f505949e48ac-cert") pod "ingress-canary-8rjd5" (UID: "6024e94b-b234-46ca-80d6-f505949e48ac") : secret "canary-serving-cert" not found Apr 24 21:27:40.574447 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:40.574255 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:27:40.574447 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:40.574292 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49cf726e-12df-4887-9b95-ee4375115c11-metrics-tls podName:49cf726e-12df-4887-9b95-ee4375115c11 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:44.574282488 +0000 UTC m=+41.834851299 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/49cf726e-12df-4887-9b95-ee4375115c11-metrics-tls") pod "dns-default-bxds4" (UID: "49cf726e-12df-4887-9b95-ee4375115c11") : secret "dns-default-metrics-tls" not found Apr 24 21:27:40.574447 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:40.574306 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:27:40.574447 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:40.574340 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4-networking-console-plugin-cert podName:5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:44.574329209 +0000 UTC m=+41.834898023 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wh284" (UID: "5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4") : secret "networking-console-plugin-cert" not found Apr 24 21:27:44.415816 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:44.415778 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-registry-tls\") pod \"image-registry-5cd4fd4699-r7szn\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" Apr 24 21:27:44.416421 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:44.415953 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:27:44.416421 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:44.415978 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cd4fd4699-r7szn: secret "image-registry-tls" not found Apr 24 21:27:44.416421 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:44.416088 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-registry-tls podName:4c9bfa95-cfa8-4d7f-9dda-3d100a02d251 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:52.416066341 +0000 UTC m=+49.676635152 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-registry-tls") pod "image-registry-5cd4fd4699-r7szn" (UID: "4c9bfa95-cfa8-4d7f-9dda-3d100a02d251") : secret "image-registry-tls" not found Apr 24 21:27:44.516721 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:44.516684 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1768c1b9-8390-4833-9613-4efec510f36b-service-ca-bundle\") pod \"router-default-58974b8966-z2xx8\" (UID: \"1768c1b9-8390-4833-9613-4efec510f36b\") " pod="openshift-ingress/router-default-58974b8966-z2xx8" Apr 24 21:27:44.516901 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:44.516762 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/494a20d0-ace3-481d-ae39-780b958f7150-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-59fzh\" (UID: \"494a20d0-ace3-481d-ae39-780b958f7150\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59fzh" Apr 24 21:27:44.516901 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:44.516824 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d8768ef0-e03a-46c3-97b6-ca6035eec03f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fsh7k\" (UID: \"d8768ef0-e03a-46c3-97b6-ca6035eec03f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsh7k" Apr 24 21:27:44.516901 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:44.516851 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1768c1b9-8390-4833-9613-4efec510f36b-service-ca-bundle podName:1768c1b9-8390-4833-9613-4efec510f36b nodeName:}" failed. 
No retries permitted until 2026-04-24 21:27:52.51683279 +0000 UTC m=+49.777401600 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1768c1b9-8390-4833-9613-4efec510f36b-service-ca-bundle") pod "router-default-58974b8966-z2xx8" (UID: "1768c1b9-8390-4833-9613-4efec510f36b") : configmap references non-existent config key: service-ca.crt
Apr 24 21:27:44.517049 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:44.516912 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 21:27:44.517049 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:44.516920 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 21:27:44.517049 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:44.516960 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1768c1b9-8390-4833-9613-4efec510f36b-metrics-certs\") pod \"router-default-58974b8966-z2xx8\" (UID: \"1768c1b9-8390-4833-9613-4efec510f36b\") " pod="openshift-ingress/router-default-58974b8966-z2xx8"
Apr 24 21:27:44.517049 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:44.516971 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8768ef0-e03a-46c3-97b6-ca6035eec03f-samples-operator-tls podName:d8768ef0-e03a-46c3-97b6-ca6035eec03f nodeName:}" failed. No retries permitted until 2026-04-24 21:27:52.516956442 +0000 UTC m=+49.777525269 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d8768ef0-e03a-46c3-97b6-ca6035eec03f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fsh7k" (UID: "d8768ef0-e03a-46c3-97b6-ca6035eec03f") : secret "samples-operator-tls" not found
Apr 24 21:27:44.517049 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:44.517014 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 21:27:44.517049 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:44.517016 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/494a20d0-ace3-481d-ae39-780b958f7150-cluster-monitoring-operator-tls podName:494a20d0-ace3-481d-ae39-780b958f7150 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:52.517007752 +0000 UTC m=+49.777576567 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/494a20d0-ace3-481d-ae39-780b958f7150-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-59fzh" (UID: "494a20d0-ace3-481d-ae39-780b958f7150") : secret "cluster-monitoring-operator-tls" not found
Apr 24 21:27:44.517270 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:44.517054 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1768c1b9-8390-4833-9613-4efec510f36b-metrics-certs podName:1768c1b9-8390-4833-9613-4efec510f36b nodeName:}" failed. No retries permitted until 2026-04-24 21:27:52.517046466 +0000 UTC m=+49.777615276 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1768c1b9-8390-4833-9613-4efec510f36b-metrics-certs") pod "router-default-58974b8966-z2xx8" (UID: "1768c1b9-8390-4833-9613-4efec510f36b") : secret "router-metrics-certs-default" not found
Apr 24 21:27:44.617989 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:44.617958 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6024e94b-b234-46ca-80d6-f505949e48ac-cert\") pod \"ingress-canary-8rjd5\" (UID: \"6024e94b-b234-46ca-80d6-f505949e48ac\") " pod="openshift-ingress-canary/ingress-canary-8rjd5"
Apr 24 21:27:44.618188 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:44.618005 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/49cf726e-12df-4887-9b95-ee4375115c11-metrics-tls\") pod \"dns-default-bxds4\" (UID: \"49cf726e-12df-4887-9b95-ee4375115c11\") " pod="openshift-dns/dns-default-bxds4"
Apr 24 21:27:44.618188 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:44.618121 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:27:44.618188 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:44.618169 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wh284\" (UID: \"5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wh284"
Apr 24 21:27:44.618327 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:44.618194 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6024e94b-b234-46ca-80d6-f505949e48ac-cert podName:6024e94b-b234-46ca-80d6-f505949e48ac nodeName:}"
failed. No retries permitted until 2026-04-24 21:27:52.618172056 +0000 UTC m=+49.878740871 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6024e94b-b234-46ca-80d6-f505949e48ac-cert") pod "ingress-canary-8rjd5" (UID: "6024e94b-b234-46ca-80d6-f505949e48ac") : secret "canary-serving-cert" not found
Apr 24 21:27:44.618327 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:44.618121 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:44.618327 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:44.618212 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 21:27:44.618327 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:44.618252 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4-networking-console-plugin-cert podName:5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:52.618240044 +0000 UTC m=+49.878808859 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wh284" (UID: "5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4") : secret "networking-console-plugin-cert" not found
Apr 24 21:27:44.618327 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:44.618273 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49cf726e-12df-4887-9b95-ee4375115c11-metrics-tls podName:49cf726e-12df-4887-9b95-ee4375115c11 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:52.618258433 +0000 UTC m=+49.878827242 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/49cf726e-12df-4887-9b95-ee4375115c11-metrics-tls") pod "dns-default-bxds4" (UID: "49cf726e-12df-4887-9b95-ee4375115c11") : secret "dns-default-metrics-tls" not found
Apr 24 21:27:51.658049 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:51.657711 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jnvn" event={"ID":"b7fe0f94-c8a8-42d9-b783-cfecce93ab05","Type":"ContainerStarted","Data":"4d43b54ae970f3cd7055c1a54618da275a0940fc9688f05141d4e28f9fbfc828"}
Apr 24 21:27:51.659411 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:51.659384 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-4nztl" event={"ID":"b3e83451-d37f-4c5b-837f-6440bed57b91","Type":"ContainerStarted","Data":"81ac5c57b0931d9c0e371e90ddb82e0020bd94476b43ec7c54a9b24cad064f7b"}
Apr 24 21:27:51.660844 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:51.660819 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7qths" event={"ID":"f2cf6003-5a3e-497b-82b1-afb7021314fc","Type":"ContainerStarted","Data":"62d4abce93d87ad3f947b95804ee3cb8cf4acf963c56854ad253933c620f3639"}
Apr 24 21:27:51.712298 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:51.712254 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-7qths" podStartSLOduration=15.511634548 podStartE2EDuration="29.712237859s" podCreationTimestamp="2026-04-24 21:27:22 +0000 UTC" firstStartedPulling="2026-04-24 21:27:37.123541939 +0000 UTC m=+34.384110748" lastFinishedPulling="2026-04-24 21:27:51.324145234 +0000 UTC m=+48.584714059" observedRunningTime="2026-04-24 21:27:51.711149446 +0000 UTC m=+48.971718278" watchObservedRunningTime="2026-04-24
21:27:51.712237859 +0000 UTC m=+48.972806692"
Apr 24 21:27:51.738140 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:51.738036 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-4nztl" podStartSLOduration=14.542187554 podStartE2EDuration="28.738021301s" podCreationTimestamp="2026-04-24 21:27:23 +0000 UTC" firstStartedPulling="2026-04-24 21:27:37.128322777 +0000 UTC m=+34.388891600" lastFinishedPulling="2026-04-24 21:27:51.324156526 +0000 UTC m=+48.584725347" observedRunningTime="2026-04-24 21:27:51.736622126 +0000 UTC m=+48.997190977" watchObservedRunningTime="2026-04-24 21:27:51.738021301 +0000 UTC m=+48.998590128"
Apr 24 21:27:52.499001 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:52.498948 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-registry-tls\") pod \"image-registry-5cd4fd4699-r7szn\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn"
Apr 24 21:27:52.499325 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:52.499150 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:27:52.499325 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:52.499177 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cd4fd4699-r7szn: secret "image-registry-tls" not found
Apr 24 21:27:52.499325 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:52.499253 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-registry-tls podName:4c9bfa95-cfa8-4d7f-9dda-3d100a02d251 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:08.499227454 +0000 UTC m=+65.759796278 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-registry-tls") pod "image-registry-5cd4fd4699-r7szn" (UID: "4c9bfa95-cfa8-4d7f-9dda-3d100a02d251") : secret "image-registry-tls" not found
Apr 24 21:27:52.600660 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:52.600598 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1768c1b9-8390-4833-9613-4efec510f36b-service-ca-bundle\") pod \"router-default-58974b8966-z2xx8\" (UID: \"1768c1b9-8390-4833-9613-4efec510f36b\") " pod="openshift-ingress/router-default-58974b8966-z2xx8"
Apr 24 21:27:52.600972 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:52.600944 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1768c1b9-8390-4833-9613-4efec510f36b-service-ca-bundle podName:1768c1b9-8390-4833-9613-4efec510f36b nodeName:}" failed. No retries permitted until 2026-04-24 21:28:08.600920163 +0000 UTC m=+65.861488988 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1768c1b9-8390-4833-9613-4efec510f36b-service-ca-bundle") pod "router-default-58974b8966-z2xx8" (UID: "1768c1b9-8390-4833-9613-4efec510f36b") : configmap references non-existent config key: service-ca.crt
Apr 24 21:27:52.601147 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:52.601024 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/494a20d0-ace3-481d-ae39-780b958f7150-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-59fzh\" (UID: \"494a20d0-ace3-481d-ae39-780b958f7150\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59fzh"
Apr 24 21:27:52.601147 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:52.601137 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d8768ef0-e03a-46c3-97b6-ca6035eec03f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fsh7k\" (UID: \"d8768ef0-e03a-46c3-97b6-ca6035eec03f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsh7k"
Apr 24 21:27:52.601265 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:52.601204 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 21:27:52.601317 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:52.601270 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1768c1b9-8390-4833-9613-4efec510f36b-metrics-certs\") pod \"router-default-58974b8966-z2xx8\" (UID: \"1768c1b9-8390-4833-9613-4efec510f36b\") " pod="openshift-ingress/router-default-58974b8966-z2xx8"
Apr 24 21:27:52.601317 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:52.601282 2573
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/494a20d0-ace3-481d-ae39-780b958f7150-cluster-monitoring-operator-tls podName:494a20d0-ace3-481d-ae39-780b958f7150 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:08.601263982 +0000 UTC m=+65.861832798 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/494a20d0-ace3-481d-ae39-780b958f7150-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-59fzh" (UID: "494a20d0-ace3-481d-ae39-780b958f7150") : secret "cluster-monitoring-operator-tls" not found
Apr 24 21:27:52.601416 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:52.601345 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 21:27:52.601416 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:52.601378 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8768ef0-e03a-46c3-97b6-ca6035eec03f-samples-operator-tls podName:d8768ef0-e03a-46c3-97b6-ca6035eec03f nodeName:}" failed. No retries permitted until 2026-04-24 21:28:08.60136519 +0000 UTC m=+65.861934003 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d8768ef0-e03a-46c3-97b6-ca6035eec03f-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fsh7k" (UID: "d8768ef0-e03a-46c3-97b6-ca6035eec03f") : secret "samples-operator-tls" not found
Apr 24 21:27:52.601416 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:52.601391 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 21:27:52.601595 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:52.601586 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1768c1b9-8390-4833-9613-4efec510f36b-metrics-certs podName:1768c1b9-8390-4833-9613-4efec510f36b nodeName:}" failed. No retries permitted until 2026-04-24 21:28:08.601568847 +0000 UTC m=+65.862137670 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1768c1b9-8390-4833-9613-4efec510f36b-metrics-certs") pod "router-default-58974b8966-z2xx8" (UID: "1768c1b9-8390-4833-9613-4efec510f36b") : secret "router-metrics-certs-default" not found
Apr 24 21:27:52.669798 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:52.669065 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-746f4dc565-zwj7p" event={"ID":"2cabf83e-0da3-4fe7-ae79-15b6bc9e5179","Type":"ContainerStarted","Data":"ea859389c6ef0c3552d5eb97ce5621b785f2f7b86e64b06c712940cfc8f195ca"}
Apr 24 21:27:52.672669 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:52.672181 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-44vlp" event={"ID":"c4788265-16ce-4770-b838-025f0f7d06aa","Type":"ContainerStarted","Data":"5c5169b16bed226741e9e3a918e76842986613f7d9ca6f5212bf5afc2f765842"}
Apr 24 21:27:52.676068 ip-10-0-136-201 kubenswrapper[2573]: I0424
21:27:52.675968 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4nztl_b3e83451-d37f-4c5b-837f-6440bed57b91/console-operator/0.log"
Apr 24 21:27:52.676068 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:52.676011 2573 generic.go:358] "Generic (PLEG): container finished" podID="b3e83451-d37f-4c5b-837f-6440bed57b91" containerID="81ac5c57b0931d9c0e371e90ddb82e0020bd94476b43ec7c54a9b24cad064f7b" exitCode=255
Apr 24 21:27:52.676231 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:52.676109 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-4nztl" event={"ID":"b3e83451-d37f-4c5b-837f-6440bed57b91","Type":"ContainerDied","Data":"81ac5c57b0931d9c0e371e90ddb82e0020bd94476b43ec7c54a9b24cad064f7b"}
Apr 24 21:27:52.676409 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:52.676391 2573 scope.go:117] "RemoveContainer" containerID="81ac5c57b0931d9c0e371e90ddb82e0020bd94476b43ec7c54a9b24cad064f7b"
Apr 24 21:27:52.679664 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:52.679628 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6xchx" event={"ID":"1612b97a-f223-4e83-8710-5764a3765126","Type":"ContainerStarted","Data":"86e1ee34dc6d373f23f26d5e82b7078ffa943e3314d8b332682c9f0cbe8fd28c"}
Apr 24 21:27:52.680191 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:52.680014 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-6xchx"
Apr 24 21:27:52.683269 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:52.681362 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cbb4986b-6xzmw" event={"ID":"69c189c7-8879-4a31-8147-83424613b103","Type":"ContainerStarted","Data":"fa01143a85b3b272fe1d9845497ccda9e92153f7500d7347e6c4909612fdea37"}
Apr 24 21:27:52.683269
ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:52.682015 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cbb4986b-6xzmw"
Apr 24 21:27:52.684149 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:52.684121 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cbb4986b-6xzmw"
Apr 24 21:27:52.685622 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:52.685589 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vlq8x" event={"ID":"9b1819a7-8889-48bc-9e09-7f5fef3c6672","Type":"ContainerStarted","Data":"df7782d31435ad149f16007a543732cbf54bd02923de04fe244a8324d422374a"}
Apr 24 21:27:52.689327 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:52.688787 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v4rfn" event={"ID":"bd2084eb-fcc8-42e3-b526-171c67ac7a71","Type":"ContainerStarted","Data":"84017bbc52481564a11d160cd80464f7235ad4ca09979bbf38df46a514fe3166"}
Apr 24 21:27:52.693157 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:52.693132 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8bd94" event={"ID":"e9999088-0209-4da1-b3c6-92c0d2e48409","Type":"ContainerStarted","Data":"cbbe587d9f1ae4f2693ec45162b232d158f61678dd96dbb0197dd1ae30ad5421"}
Apr 24 21:27:52.695477 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:52.695459 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9" event={"ID":"c78eaad7-e680-4b37-9286-453234917ab8","Type":"ContainerStarted","Data":"190877af16b83bd80d0368984789128767a3a092e45b3db5aec65e2de80bc908"}
Apr 24 21:27:52.702214 ip-10-0-136-201 kubenswrapper[2573]: I0424
21:27:52.702191 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wh284\" (UID: \"5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wh284"
Apr 24 21:27:52.702330 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:52.702314 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6024e94b-b234-46ca-80d6-f505949e48ac-cert\") pod \"ingress-canary-8rjd5\" (UID: \"6024e94b-b234-46ca-80d6-f505949e48ac\") " pod="openshift-ingress-canary/ingress-canary-8rjd5"
Apr 24 21:27:52.702391 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:52.702366 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/49cf726e-12df-4887-9b95-ee4375115c11-metrics-tls\") pod \"dns-default-bxds4\" (UID: \"49cf726e-12df-4887-9b95-ee4375115c11\") " pod="openshift-dns/dns-default-bxds4"
Apr 24 21:27:52.702512 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:52.702496 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:52.702571 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:52.702560 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49cf726e-12df-4887-9b95-ee4375115c11-metrics-tls podName:49cf726e-12df-4887-9b95-ee4375115c11 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:08.702541479 +0000 UTC m=+65.963110300 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/49cf726e-12df-4887-9b95-ee4375115c11-metrics-tls") pod "dns-default-bxds4" (UID: "49cf726e-12df-4887-9b95-ee4375115c11") : secret "dns-default-metrics-tls" not found
Apr 24 21:27:52.702921 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:52.702901 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 21:27:52.703016 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:52.702956 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4-networking-console-plugin-cert podName:5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:08.702941696 +0000 UTC m=+65.963510505 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wh284" (UID: "5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4") : secret "networking-console-plugin-cert" not found
Apr 24 21:27:52.703016 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:52.703011 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:27:52.703245 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:52.703043 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6024e94b-b234-46ca-80d6-f505949e48ac-cert podName:6024e94b-b234-46ca-80d6-f505949e48ac nodeName:}" failed. No retries permitted until 2026-04-24 21:28:08.703032971 +0000 UTC m=+65.963601783 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6024e94b-b234-46ca-80d6-f505949e48ac-cert") pod "ingress-canary-8rjd5" (UID: "6024e94b-b234-46ca-80d6-f505949e48ac") : secret "canary-serving-cert" not found
Apr 24 21:27:52.711549 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:52.711503 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-746f4dc565-zwj7p" podStartSLOduration=32.586858355 podStartE2EDuration="46.711487664s" podCreationTimestamp="2026-04-24 21:27:06 +0000 UTC" firstStartedPulling="2026-04-24 21:27:37.338252031 +0000 UTC m=+34.598820841" lastFinishedPulling="2026-04-24 21:27:51.462881326 +0000 UTC m=+48.723450150" observedRunningTime="2026-04-24 21:27:52.689031957 +0000 UTC m=+49.949600790" watchObservedRunningTime="2026-04-24 21:27:52.711487664 +0000 UTC m=+49.972056499"
Apr 24 21:27:52.735559 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:52.734697 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v4rfn" podStartSLOduration=15.700470052 podStartE2EDuration="29.734678603s" podCreationTimestamp="2026-04-24 21:27:23 +0000 UTC" firstStartedPulling="2026-04-24 21:27:37.290018475 +0000 UTC m=+34.550587292" lastFinishedPulling="2026-04-24 21:27:51.324227028 +0000 UTC m=+48.584795843" observedRunningTime="2026-04-24 21:27:52.711922124 +0000 UTC m=+49.972490954" watchObservedRunningTime="2026-04-24 21:27:52.734678603 +0000 UTC m=+49.995247435"
Apr 24 21:27:52.735559 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:52.735419 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cbb4986b-6xzmw" podStartSLOduration=32.655393564 podStartE2EDuration="46.735411579s" podCreationTimestamp="2026-04-24 21:27:06 +0000 UTC"
firstStartedPulling="2026-04-24 21:27:37.383115713 +0000 UTC m=+34.643684533" lastFinishedPulling="2026-04-24 21:27:51.463133732 +0000 UTC m=+48.723702548" observedRunningTime="2026-04-24 21:27:52.73353792 +0000 UTC m=+49.994106753" watchObservedRunningTime="2026-04-24 21:27:52.735411579 +0000 UTC m=+49.995980412"
Apr 24 21:27:52.754748 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:52.754680 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vlq8x" podStartSLOduration=15.635979426 podStartE2EDuration="29.754663908s" podCreationTimestamp="2026-04-24 21:27:23 +0000 UTC" firstStartedPulling="2026-04-24 21:27:37.343982771 +0000 UTC m=+34.604551580" lastFinishedPulling="2026-04-24 21:27:51.462667248 +0000 UTC m=+48.723236062" observedRunningTime="2026-04-24 21:27:52.752188259 +0000 UTC m=+50.012757088" watchObservedRunningTime="2026-04-24 21:27:52.754663908 +0000 UTC m=+50.015232744"
Apr 24 21:27:52.784286 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:52.780936 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-6xchx" podStartSLOduration=35.803798523 podStartE2EDuration="49.780916195s" podCreationTimestamp="2026-04-24 21:27:03 +0000 UTC" firstStartedPulling="2026-04-24 21:27:37.495276798 +0000 UTC m=+34.755845607" lastFinishedPulling="2026-04-24 21:27:51.47239447 +0000 UTC m=+48.732963279" observedRunningTime="2026-04-24 21:27:52.777989748 +0000 UTC m=+50.038558577" watchObservedRunningTime="2026-04-24 21:27:52.780916195 +0000 UTC m=+50.041485028"
Apr 24 21:27:52.839120 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:52.838526 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-44vlp" podStartSLOduration=15.735929743 podStartE2EDuration="29.838507942s" podCreationTimestamp="2026-04-24 21:27:23 +0000 UTC"
firstStartedPulling="2026-04-24 21:27:37.221577101 +0000 UTC m=+34.482145925" lastFinishedPulling="2026-04-24 21:27:51.324155303 +0000 UTC m=+48.584724124" observedRunningTime="2026-04-24 21:27:52.803190201 +0000 UTC m=+50.063759032" watchObservedRunningTime="2026-04-24 21:27:52.838507942 +0000 UTC m=+50.099076774"
Apr 24 21:27:52.857157 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:52.856841 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-8bd94" podStartSLOduration=18.561295818 podStartE2EDuration="31.856821458s" podCreationTimestamp="2026-04-24 21:27:21 +0000 UTC" firstStartedPulling="2026-04-24 21:27:38.16938868 +0000 UTC m=+35.429957495" lastFinishedPulling="2026-04-24 21:27:51.464914307 +0000 UTC m=+48.725483135" observedRunningTime="2026-04-24 21:27:52.854916196 +0000 UTC m=+50.115485028" watchObservedRunningTime="2026-04-24 21:27:52.856821458 +0000 UTC m=+50.117390288"
Apr 24 21:27:53.700859 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:53.700828 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4nztl_b3e83451-d37f-4c5b-837f-6440bed57b91/console-operator/1.log"
Apr 24 21:27:53.701519 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:53.701498 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4nztl_b3e83451-d37f-4c5b-837f-6440bed57b91/console-operator/0.log"
Apr 24 21:27:53.701644 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:53.701539 2573 generic.go:358] "Generic (PLEG): container finished" podID="b3e83451-d37f-4c5b-837f-6440bed57b91" containerID="a153b8d7bba5aa0c2c5c1963d3ea3716c30bbd5aab233b4aab855a180422dac0" exitCode=255
Apr 24 21:27:53.702281 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:53.701889 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-4nztl"
event={"ID":"b3e83451-d37f-4c5b-837f-6440bed57b91","Type":"ContainerDied","Data":"a153b8d7bba5aa0c2c5c1963d3ea3716c30bbd5aab233b4aab855a180422dac0"}
Apr 24 21:27:53.702281 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:53.701938 2573 scope.go:117] "RemoveContainer" containerID="81ac5c57b0931d9c0e371e90ddb82e0020bd94476b43ec7c54a9b24cad064f7b"
Apr 24 21:27:53.702281 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:53.702023 2573 scope.go:117] "RemoveContainer" containerID="a153b8d7bba5aa0c2c5c1963d3ea3716c30bbd5aab233b4aab855a180422dac0"
Apr 24 21:27:53.702281 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:53.702218 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-4nztl_openshift-console-operator(b3e83451-d37f-4c5b-837f-6440bed57b91)\"" pod="openshift-console-operator/console-operator-9d4b6777b-4nztl" podUID="b3e83451-d37f-4c5b-837f-6440bed57b91"
Apr 24 21:27:53.732005 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:53.731942 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jnvn" podStartSLOduration=16.782492733 podStartE2EDuration="30.73192512s" podCreationTimestamp="2026-04-24 21:27:23 +0000 UTC" firstStartedPulling="2026-04-24 21:27:37.374713491 +0000 UTC m=+34.635282319" lastFinishedPulling="2026-04-24 21:27:51.32414589 +0000 UTC m=+48.584714706" observedRunningTime="2026-04-24 21:27:52.873334312 +0000 UTC m=+50.133903160" watchObservedRunningTime="2026-04-24 21:27:53.73192512 +0000 UTC m=+50.992493993"
Apr 24 21:27:54.706439 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:54.706388 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9"
event={"ID":"c78eaad7-e680-4b37-9286-453234917ab8","Type":"ContainerStarted","Data":"da159ba341e4a364adcdb309c0b41aaf28adbcf581bbd62cb53a7069d3506e9a"} Apr 24 21:27:54.706439 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:54.706441 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9" event={"ID":"c78eaad7-e680-4b37-9286-453234917ab8","Type":"ContainerStarted","Data":"0de56cc930a7f328c0ec850692ba6998822d0131b8c52bea90422371fd90c51f"} Apr 24 21:27:54.707801 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:54.707783 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4nztl_b3e83451-d37f-4c5b-837f-6440bed57b91/console-operator/1.log" Apr 24 21:27:54.708264 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:54.708237 2573 scope.go:117] "RemoveContainer" containerID="a153b8d7bba5aa0c2c5c1963d3ea3716c30bbd5aab233b4aab855a180422dac0" Apr 24 21:27:54.708402 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:54.708384 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-4nztl_openshift-console-operator(b3e83451-d37f-4c5b-837f-6440bed57b91)\"" pod="openshift-console-operator/console-operator-9d4b6777b-4nztl" podUID="b3e83451-d37f-4c5b-837f-6440bed57b91" Apr 24 21:27:54.783429 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:54.783372 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9" podStartSLOduration=31.803238374 podStartE2EDuration="48.783354172s" podCreationTimestamp="2026-04-24 21:27:06 +0000 UTC" firstStartedPulling="2026-04-24 21:27:37.344621816 +0000 UTC m=+34.605190626" lastFinishedPulling="2026-04-24 21:27:54.324737611 +0000 UTC 
m=+51.585306424" observedRunningTime="2026-04-24 21:27:54.782060846 +0000 UTC m=+52.042629697" watchObservedRunningTime="2026-04-24 21:27:54.783354172 +0000 UTC m=+52.043923004" Apr 24 21:27:56.721627 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:56.721598 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-z65xz_8159f631-f735-47c0-8dd1-8342be18cbcf/dns-node-resolver/0.log" Apr 24 21:27:56.927314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:56.927281 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-4nztl" Apr 24 21:27:56.927718 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:56.927701 2573 scope.go:117] "RemoveContainer" containerID="a153b8d7bba5aa0c2c5c1963d3ea3716c30bbd5aab233b4aab855a180422dac0" Apr 24 21:27:56.927917 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:27:56.927899 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-4nztl_openshift-console-operator(b3e83451-d37f-4c5b-837f-6440bed57b91)\"" pod="openshift-console-operator/console-operator-9d4b6777b-4nztl" podUID="b3e83451-d37f-4c5b-837f-6440bed57b91" Apr 24 21:27:57.518649 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:27:57.518621 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-gtgdx_e61e2625-9935-4ec2-88cd-d2ac5c781886/node-ca/0.log" Apr 24 21:28:00.546598 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:00.546569 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kcl2z" Apr 24 21:28:01.694863 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:01.694829 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console-operator/console-operator-9d4b6777b-4nztl" Apr 24 21:28:01.695266 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:01.695240 2573 scope.go:117] "RemoveContainer" containerID="a153b8d7bba5aa0c2c5c1963d3ea3716c30bbd5aab233b4aab855a180422dac0" Apr 24 21:28:01.695426 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:28:01.695408 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-4nztl_openshift-console-operator(b3e83451-d37f-4c5b-837f-6440bed57b91)\"" pod="openshift-console-operator/console-operator-9d4b6777b-4nztl" podUID="b3e83451-d37f-4c5b-837f-6440bed57b91" Apr 24 21:28:08.567558 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:08.567523 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-registry-tls\") pod \"image-registry-5cd4fd4699-r7szn\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" Apr 24 21:28:08.570664 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:08.570631 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-registry-tls\") pod \"image-registry-5cd4fd4699-r7szn\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" Apr 24 21:28:08.668788 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:08.668754 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1768c1b9-8390-4833-9613-4efec510f36b-metrics-certs\") pod \"router-default-58974b8966-z2xx8\" (UID: \"1768c1b9-8390-4833-9613-4efec510f36b\") " 
pod="openshift-ingress/router-default-58974b8966-z2xx8" Apr 24 21:28:08.668967 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:08.668811 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1768c1b9-8390-4833-9613-4efec510f36b-service-ca-bundle\") pod \"router-default-58974b8966-z2xx8\" (UID: \"1768c1b9-8390-4833-9613-4efec510f36b\") " pod="openshift-ingress/router-default-58974b8966-z2xx8" Apr 24 21:28:08.668967 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:08.668838 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/494a20d0-ace3-481d-ae39-780b958f7150-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-59fzh\" (UID: \"494a20d0-ace3-481d-ae39-780b958f7150\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59fzh" Apr 24 21:28:08.668967 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:08.668876 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d8768ef0-e03a-46c3-97b6-ca6035eec03f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fsh7k\" (UID: \"d8768ef0-e03a-46c3-97b6-ca6035eec03f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsh7k" Apr 24 21:28:08.669450 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:08.669421 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1768c1b9-8390-4833-9613-4efec510f36b-service-ca-bundle\") pod \"router-default-58974b8966-z2xx8\" (UID: \"1768c1b9-8390-4833-9613-4efec510f36b\") " pod="openshift-ingress/router-default-58974b8966-z2xx8" Apr 24 21:28:08.671119 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:08.671077 2573 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/494a20d0-ace3-481d-ae39-780b958f7150-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-59fzh\" (UID: \"494a20d0-ace3-481d-ae39-780b958f7150\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59fzh" Apr 24 21:28:08.671268 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:08.671246 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d8768ef0-e03a-46c3-97b6-ca6035eec03f-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fsh7k\" (UID: \"d8768ef0-e03a-46c3-97b6-ca6035eec03f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsh7k" Apr 24 21:28:08.671331 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:08.671317 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1768c1b9-8390-4833-9613-4efec510f36b-metrics-certs\") pod \"router-default-58974b8966-z2xx8\" (UID: \"1768c1b9-8390-4833-9613-4efec510f36b\") " pod="openshift-ingress/router-default-58974b8966-z2xx8" Apr 24 21:28:08.750420 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:08.750388 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-bckfb\"" Apr 24 21:28:08.757845 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:08.757826 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" Apr 24 21:28:08.769800 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:08.769774 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6024e94b-b234-46ca-80d6-f505949e48ac-cert\") pod \"ingress-canary-8rjd5\" (UID: \"6024e94b-b234-46ca-80d6-f505949e48ac\") " pod="openshift-ingress-canary/ingress-canary-8rjd5" Apr 24 21:28:08.769895 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:08.769813 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/49cf726e-12df-4887-9b95-ee4375115c11-metrics-tls\") pod \"dns-default-bxds4\" (UID: \"49cf726e-12df-4887-9b95-ee4375115c11\") " pod="openshift-dns/dns-default-bxds4" Apr 24 21:28:08.769895 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:08.769867 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wh284\" (UID: \"5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wh284" Apr 24 21:28:08.769979 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:28:08.769954 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 21:28:08.770036 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:28:08.770025 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4-networking-console-plugin-cert podName:5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:40.770004418 +0000 UTC m=+98.030573239 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-wh284" (UID: "5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4") : secret "networking-console-plugin-cert" not found Apr 24 21:28:08.772224 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:08.772203 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6024e94b-b234-46ca-80d6-f505949e48ac-cert\") pod \"ingress-canary-8rjd5\" (UID: \"6024e94b-b234-46ca-80d6-f505949e48ac\") " pod="openshift-ingress-canary/ingress-canary-8rjd5" Apr 24 21:28:08.772632 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:08.772611 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/49cf726e-12df-4887-9b95-ee4375115c11-metrics-tls\") pod \"dns-default-bxds4\" (UID: \"49cf726e-12df-4887-9b95-ee4375115c11\") " pod="openshift-dns/dns-default-bxds4" Apr 24 21:28:08.819688 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:08.819427 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-795kl\"" Apr 24 21:28:08.836147 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:08.829315 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsh7k" Apr 24 21:28:08.836147 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:08.829990 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-99vck\"" Apr 24 21:28:08.836147 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:08.830181 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-58974b8966-z2xx8" Apr 24 21:28:08.847172 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:08.847115 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-4c5kw\"" Apr 24 21:28:08.847332 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:08.847189 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59fzh" Apr 24 21:28:08.866345 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:08.866086 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wtdzs\"" Apr 24 21:28:08.874856 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:08.873751 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-bxds4" Apr 24 21:28:08.903547 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:08.903499 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gftjb\"" Apr 24 21:28:08.911495 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:08.911455 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8rjd5" Apr 24 21:28:08.931302 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:08.930833 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5cd4fd4699-r7szn"] Apr 24 21:28:08.943967 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:28:08.943831 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c9bfa95_cfa8_4d7f_9dda_3d100a02d251.slice/crio-e4a0e5218d28fbb90ecd0545e46d1246f1bfec1e9d32abd2f4a5c392d0bb76ae WatchSource:0}: Error finding container e4a0e5218d28fbb90ecd0545e46d1246f1bfec1e9d32abd2f4a5c392d0bb76ae: Status 404 returned error can't find the container with id e4a0e5218d28fbb90ecd0545e46d1246f1bfec1e9d32abd2f4a5c392d0bb76ae Apr 24 21:28:09.047977 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:09.047938 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-58974b8966-z2xx8"] Apr 24 21:28:09.054640 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:28:09.054613 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1768c1b9_8390_4833_9613_4efec510f36b.slice/crio-4686eb7904f74fc88d483cd5ea013a3f014a0d4df83a96b6cf2dd088e1dcf88e WatchSource:0}: Error finding container 4686eb7904f74fc88d483cd5ea013a3f014a0d4df83a96b6cf2dd088e1dcf88e: Status 404 returned error can't find the container with id 4686eb7904f74fc88d483cd5ea013a3f014a0d4df83a96b6cf2dd088e1dcf88e Apr 24 21:28:09.070372 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:09.070312 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bxds4"] Apr 24 21:28:09.072844 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:09.072818 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/066d9094-f992-4853-86f1-b25700fe6070-metrics-certs\") pod \"network-metrics-daemon-kpb2c\" (UID: \"066d9094-f992-4853-86f1-b25700fe6070\") " pod="openshift-multus/network-metrics-daemon-kpb2c" Apr 24 21:28:09.072973 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:28:09.072845 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49cf726e_12df_4887_9b95_ee4375115c11.slice/crio-af89567dc9a1bdeadda6a7a7f1f5381cf6ae711bfabd1bba670abb8d55f62259 WatchSource:0}: Error finding container af89567dc9a1bdeadda6a7a7f1f5381cf6ae711bfabd1bba670abb8d55f62259: Status 404 returned error can't find the container with id af89567dc9a1bdeadda6a7a7f1f5381cf6ae711bfabd1bba670abb8d55f62259 Apr 24 21:28:09.075341 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:09.075325 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 21:28:09.086506 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:09.086462 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/066d9094-f992-4853-86f1-b25700fe6070-metrics-certs\") pod \"network-metrics-daemon-kpb2c\" (UID: \"066d9094-f992-4853-86f1-b25700fe6070\") " pod="openshift-multus/network-metrics-daemon-kpb2c" Apr 24 21:28:09.086620 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:09.086607 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8rjd5"] Apr 24 21:28:09.099282 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:28:09.099258 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6024e94b_b234_46ca_80d6_f505949e48ac.slice/crio-49e664b6f3e3e7538cd45e43869ef9c690a4ea41be617220342f6b7c960e5ad3 WatchSource:0}: Error finding container 
49e664b6f3e3e7538cd45e43869ef9c690a4ea41be617220342f6b7c960e5ad3: Status 404 returned error can't find the container with id 49e664b6f3e3e7538cd45e43869ef9c690a4ea41be617220342f6b7c960e5ad3 Apr 24 21:28:09.226147 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:09.226119 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsh7k"] Apr 24 21:28:09.229427 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:09.229399 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-59fzh"] Apr 24 21:28:09.231132 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:09.231112 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-cbm9w\"" Apr 24 21:28:09.233768 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:28:09.233737 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod494a20d0_ace3_481d_ae39_780b958f7150.slice/crio-f2b4272b216c8f3090fafb9a6374a9998d599a1a797b3618a5817f7493a66b15 WatchSource:0}: Error finding container f2b4272b216c8f3090fafb9a6374a9998d599a1a797b3618a5817f7493a66b15: Status 404 returned error can't find the container with id f2b4272b216c8f3090fafb9a6374a9998d599a1a797b3618a5817f7493a66b15 Apr 24 21:28:09.238860 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:09.238840 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpb2c" Apr 24 21:28:09.378532 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:09.378506 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kpb2c"] Apr 24 21:28:09.380648 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:28:09.380619 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod066d9094_f992_4853_86f1_b25700fe6070.slice/crio-e9bd1b114612b6d031f913db72e8face5a28d9965d566dda8e4298eaf5118a87 WatchSource:0}: Error finding container e9bd1b114612b6d031f913db72e8face5a28d9965d566dda8e4298eaf5118a87: Status 404 returned error can't find the container with id e9bd1b114612b6d031f913db72e8face5a28d9965d566dda8e4298eaf5118a87 Apr 24 21:28:09.754213 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:09.754153 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsh7k" event={"ID":"d8768ef0-e03a-46c3-97b6-ca6035eec03f","Type":"ContainerStarted","Data":"bd28243aa457573f8c5259a835914dfb35b2ca158645c540fe6cc4da259a2357"} Apr 24 21:28:09.757020 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:09.756911 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8rjd5" event={"ID":"6024e94b-b234-46ca-80d6-f505949e48ac","Type":"ContainerStarted","Data":"49e664b6f3e3e7538cd45e43869ef9c690a4ea41be617220342f6b7c960e5ad3"} Apr 24 21:28:09.760173 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:09.760130 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59fzh" event={"ID":"494a20d0-ace3-481d-ae39-780b958f7150","Type":"ContainerStarted","Data":"f2b4272b216c8f3090fafb9a6374a9998d599a1a797b3618a5817f7493a66b15"} Apr 24 21:28:09.762370 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:09.762307 2573 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" event={"ID":"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251","Type":"ContainerStarted","Data":"747eb5c82aff256d97e27dabd1cfc6d763e1197109c99021fd4fa22ed0facd11"} Apr 24 21:28:09.762370 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:09.762342 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" event={"ID":"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251","Type":"ContainerStarted","Data":"e4a0e5218d28fbb90ecd0545e46d1246f1bfec1e9d32abd2f4a5c392d0bb76ae"} Apr 24 21:28:09.762634 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:09.762586 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" Apr 24 21:28:09.767501 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:09.765843 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-58974b8966-z2xx8" event={"ID":"1768c1b9-8390-4833-9613-4efec510f36b","Type":"ContainerStarted","Data":"7afddaed14069f4550b4e7cebdb90c512daa09e7806d587bd54f703c6db8a495"} Apr 24 21:28:09.767501 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:09.765988 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-58974b8966-z2xx8" event={"ID":"1768c1b9-8390-4833-9613-4efec510f36b","Type":"ContainerStarted","Data":"4686eb7904f74fc88d483cd5ea013a3f014a0d4df83a96b6cf2dd088e1dcf88e"} Apr 24 21:28:09.769608 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:09.769567 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kpb2c" event={"ID":"066d9094-f992-4853-86f1-b25700fe6070","Type":"ContainerStarted","Data":"e9bd1b114612b6d031f913db72e8face5a28d9965d566dda8e4298eaf5118a87"} Apr 24 21:28:09.770973 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:09.770937 2573 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-dns/dns-default-bxds4" event={"ID":"49cf726e-12df-4887-9b95-ee4375115c11","Type":"ContainerStarted","Data":"af89567dc9a1bdeadda6a7a7f1f5381cf6ae711bfabd1bba670abb8d55f62259"} Apr 24 21:28:09.789493 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:09.788890 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" podStartSLOduration=66.788864187 podStartE2EDuration="1m6.788864187s" podCreationTimestamp="2026-04-24 21:27:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:28:09.786844626 +0000 UTC m=+67.047413457" watchObservedRunningTime="2026-04-24 21:28:09.788864187 +0000 UTC m=+67.049433020" Apr 24 21:28:09.808741 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:09.807962 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-58974b8966-z2xx8" podStartSLOduration=46.807941771 podStartE2EDuration="46.807941771s" podCreationTimestamp="2026-04-24 21:27:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:28:09.807325589 +0000 UTC m=+67.067894423" watchObservedRunningTime="2026-04-24 21:28:09.807941771 +0000 UTC m=+67.068510603" Apr 24 21:28:09.831428 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:09.831344 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-58974b8966-z2xx8" Apr 24 21:28:09.834836 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:09.834636 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-58974b8966-z2xx8" Apr 24 21:28:10.775126 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:10.775073 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-ingress/router-default-58974b8966-z2xx8" Apr 24 21:28:10.776496 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:10.776474 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-58974b8966-z2xx8" Apr 24 21:28:13.357456 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:13.357428 2573 scope.go:117] "RemoveContainer" containerID="a153b8d7bba5aa0c2c5c1963d3ea3716c30bbd5aab233b4aab855a180422dac0" Apr 24 21:28:14.787747 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:14.787656 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59fzh" event={"ID":"494a20d0-ace3-481d-ae39-780b958f7150","Type":"ContainerStarted","Data":"9f231cdf0bbda3582aebaa2669b97c3f0e2c021480afe6542d4bf321382845aa"} Apr 24 21:28:14.789508 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:14.789483 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kpb2c" event={"ID":"066d9094-f992-4853-86f1-b25700fe6070","Type":"ContainerStarted","Data":"fd70730d5886ffcd93e5e800e579ba51601e6e7f57daa050fb2a16cd91ec68a1"} Apr 24 21:28:14.789657 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:14.789512 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kpb2c" event={"ID":"066d9094-f992-4853-86f1-b25700fe6070","Type":"ContainerStarted","Data":"89c8c73fcd594058b80a35d735901aeeb96b579a1aab29993c713312d4f552ed"} Apr 24 21:28:14.791058 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:14.791029 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bxds4" event={"ID":"49cf726e-12df-4887-9b95-ee4375115c11","Type":"ContainerStarted","Data":"6baeee702c3d498e4a1affca238d9d8ae4a1bbc946a5448a4bfb30a83dd3cfe4"} Apr 24 21:28:14.791187 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:14.791064 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-bxds4" event={"ID":"49cf726e-12df-4887-9b95-ee4375115c11","Type":"ContainerStarted","Data":"a3215a918bf16a4580a7c049ece4d3654f5090cde5cf03628e17822c84ba4e19"} Apr 24 21:28:14.791246 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:14.791220 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-bxds4" Apr 24 21:28:14.792740 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:14.792723 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4nztl_b3e83451-d37f-4c5b-837f-6440bed57b91/console-operator/1.log" Apr 24 21:28:14.792836 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:14.792791 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-4nztl" event={"ID":"b3e83451-d37f-4c5b-837f-6440bed57b91","Type":"ContainerStarted","Data":"1c571cdcba207c6ea2808443365a11f7641efa5c66d7f721e8aa734e7e518198"} Apr 24 21:28:14.793054 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:14.793031 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-4nztl" Apr 24 21:28:14.794542 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:14.794515 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsh7k" event={"ID":"d8768ef0-e03a-46c3-97b6-ca6035eec03f","Type":"ContainerStarted","Data":"bd29f7f1c8da5e8db520223796139ee372f1f6a8fded74e5015bc0272c027a7a"} Apr 24 21:28:14.794646 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:14.794547 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsh7k" event={"ID":"d8768ef0-e03a-46c3-97b6-ca6035eec03f","Type":"ContainerStarted","Data":"45b6bc91d96bd135980c179a2352d9f595cd69b2e0a67206cddfb74c1c122674"} Apr 24 
21:28:14.795885 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:14.795851 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8rjd5" event={"ID":"6024e94b-b234-46ca-80d6-f505949e48ac","Type":"ContainerStarted","Data":"1ae27d9b9872313cfae387679ece27112412929707a256e7ba55f3c9a3609ec2"}
Apr 24 21:28:14.811156 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:14.811086 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59fzh" podStartSLOduration=47.038744051 podStartE2EDuration="51.811067986s" podCreationTimestamp="2026-04-24 21:27:23 +0000 UTC" firstStartedPulling="2026-04-24 21:28:09.235490976 +0000 UTC m=+66.496059805" lastFinishedPulling="2026-04-24 21:28:14.007814926 +0000 UTC m=+71.268383740" observedRunningTime="2026-04-24 21:28:14.809231586 +0000 UTC m=+72.069800445" watchObservedRunningTime="2026-04-24 21:28:14.811067986 +0000 UTC m=+72.071636822"
Apr 24 21:28:14.830141 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:14.830076 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-kpb2c" podStartSLOduration=67.208483344 podStartE2EDuration="1m11.830060411s" podCreationTimestamp="2026-04-24 21:27:03 +0000 UTC" firstStartedPulling="2026-04-24 21:28:09.382552611 +0000 UTC m=+66.643121422" lastFinishedPulling="2026-04-24 21:28:14.004129674 +0000 UTC m=+71.264698489" observedRunningTime="2026-04-24 21:28:14.828958646 +0000 UTC m=+72.089527517" watchObservedRunningTime="2026-04-24 21:28:14.830060411 +0000 UTC m=+72.090629243"
Apr 24 21:28:14.868857 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:14.868795 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8rjd5" podStartSLOduration=33.966738371 podStartE2EDuration="38.868776827s" podCreationTimestamp="2026-04-24 21:27:36 +0000 UTC" firstStartedPulling="2026-04-24 21:28:09.102089049 +0000 UTC m=+66.362657858" lastFinishedPulling="2026-04-24 21:28:14.004127487 +0000 UTC m=+71.264696314" observedRunningTime="2026-04-24 21:28:14.868433048 +0000 UTC m=+72.129001881" watchObservedRunningTime="2026-04-24 21:28:14.868776827 +0000 UTC m=+72.129345660"
Apr 24 21:28:14.889124 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:14.889035 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fsh7k" podStartSLOduration=47.159657388 podStartE2EDuration="51.889017526s" podCreationTimestamp="2026-04-24 21:27:23 +0000 UTC" firstStartedPulling="2026-04-24 21:28:09.274752619 +0000 UTC m=+66.535321432" lastFinishedPulling="2026-04-24 21:28:14.004112753 +0000 UTC m=+71.264681570" observedRunningTime="2026-04-24 21:28:14.888280921 +0000 UTC m=+72.148849765" watchObservedRunningTime="2026-04-24 21:28:14.889017526 +0000 UTC m=+72.149586358"
Apr 24 21:28:14.908704 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:14.908651 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-bxds4" podStartSLOduration=33.979221348 podStartE2EDuration="38.908634507s" podCreationTimestamp="2026-04-24 21:27:36 +0000 UTC" firstStartedPulling="2026-04-24 21:28:09.074701467 +0000 UTC m=+66.335270276" lastFinishedPulling="2026-04-24 21:28:14.004114617 +0000 UTC m=+71.264683435" observedRunningTime="2026-04-24 21:28:14.907778515 +0000 UTC m=+72.168347348" watchObservedRunningTime="2026-04-24 21:28:14.908634507 +0000 UTC m=+72.169203366"
Apr 24 21:28:15.323274 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:15.323215 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-jkxl9"]
Apr 24 21:28:15.344567 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:15.344535 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-4nztl"
Apr 24 21:28:15.344567 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:15.344573 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-jkxl9"]
Apr 24 21:28:15.344775 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:15.344741 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-jkxl9"
Apr 24 21:28:15.347081 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:15.347053 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 24 21:28:15.347231 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:15.347077 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-6qqhw\""
Apr 24 21:28:15.347231 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:15.347129 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 24 21:28:15.430866 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:15.430766 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6847a7b4-b0b1-42ab-8771-88161eb09cc7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jkxl9\" (UID: \"6847a7b4-b0b1-42ab-8771-88161eb09cc7\") " pod="openshift-insights/insights-runtime-extractor-jkxl9"
Apr 24 21:28:15.430866 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:15.430821 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fmd7\" (UniqueName: \"kubernetes.io/projected/6847a7b4-b0b1-42ab-8771-88161eb09cc7-kube-api-access-6fmd7\") pod \"insights-runtime-extractor-jkxl9\" (UID: \"6847a7b4-b0b1-42ab-8771-88161eb09cc7\") " pod="openshift-insights/insights-runtime-extractor-jkxl9"
Apr 24 21:28:15.431085 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:15.430876 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6847a7b4-b0b1-42ab-8771-88161eb09cc7-data-volume\") pod \"insights-runtime-extractor-jkxl9\" (UID: \"6847a7b4-b0b1-42ab-8771-88161eb09cc7\") " pod="openshift-insights/insights-runtime-extractor-jkxl9"
Apr 24 21:28:15.431085 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:15.430968 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6847a7b4-b0b1-42ab-8771-88161eb09cc7-crio-socket\") pod \"insights-runtime-extractor-jkxl9\" (UID: \"6847a7b4-b0b1-42ab-8771-88161eb09cc7\") " pod="openshift-insights/insights-runtime-extractor-jkxl9"
Apr 24 21:28:15.431085 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:15.431020 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6847a7b4-b0b1-42ab-8771-88161eb09cc7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jkxl9\" (UID: \"6847a7b4-b0b1-42ab-8771-88161eb09cc7\") " pod="openshift-insights/insights-runtime-extractor-jkxl9"
Apr 24 21:28:15.531697 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:15.531656 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6fmd7\" (UniqueName: \"kubernetes.io/projected/6847a7b4-b0b1-42ab-8771-88161eb09cc7-kube-api-access-6fmd7\") pod \"insights-runtime-extractor-jkxl9\" (UID: \"6847a7b4-b0b1-42ab-8771-88161eb09cc7\") " pod="openshift-insights/insights-runtime-extractor-jkxl9"
Apr 24 21:28:15.531866 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:15.531821 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6847a7b4-b0b1-42ab-8771-88161eb09cc7-data-volume\") pod \"insights-runtime-extractor-jkxl9\" (UID: \"6847a7b4-b0b1-42ab-8771-88161eb09cc7\") " pod="openshift-insights/insights-runtime-extractor-jkxl9"
Apr 24 21:28:15.531908 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:15.531900 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6847a7b4-b0b1-42ab-8771-88161eb09cc7-crio-socket\") pod \"insights-runtime-extractor-jkxl9\" (UID: \"6847a7b4-b0b1-42ab-8771-88161eb09cc7\") " pod="openshift-insights/insights-runtime-extractor-jkxl9"
Apr 24 21:28:15.531951 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:15.531929 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6847a7b4-b0b1-42ab-8771-88161eb09cc7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jkxl9\" (UID: \"6847a7b4-b0b1-42ab-8771-88161eb09cc7\") " pod="openshift-insights/insights-runtime-extractor-jkxl9"
Apr 24 21:28:15.532049 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:15.532029 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6847a7b4-b0b1-42ab-8771-88161eb09cc7-crio-socket\") pod \"insights-runtime-extractor-jkxl9\" (UID: \"6847a7b4-b0b1-42ab-8771-88161eb09cc7\") " pod="openshift-insights/insights-runtime-extractor-jkxl9"
Apr 24 21:28:15.532144 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:15.532080 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6847a7b4-b0b1-42ab-8771-88161eb09cc7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jkxl9\" (UID: \"6847a7b4-b0b1-42ab-8771-88161eb09cc7\") "
pod="openshift-insights/insights-runtime-extractor-jkxl9"
Apr 24 21:28:15.532210 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:15.532193 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6847a7b4-b0b1-42ab-8771-88161eb09cc7-data-volume\") pod \"insights-runtime-extractor-jkxl9\" (UID: \"6847a7b4-b0b1-42ab-8771-88161eb09cc7\") " pod="openshift-insights/insights-runtime-extractor-jkxl9"
Apr 24 21:28:15.532444 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:15.532429 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6847a7b4-b0b1-42ab-8771-88161eb09cc7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jkxl9\" (UID: \"6847a7b4-b0b1-42ab-8771-88161eb09cc7\") " pod="openshift-insights/insights-runtime-extractor-jkxl9"
Apr 24 21:28:15.534635 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:15.534610 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6847a7b4-b0b1-42ab-8771-88161eb09cc7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jkxl9\" (UID: \"6847a7b4-b0b1-42ab-8771-88161eb09cc7\") " pod="openshift-insights/insights-runtime-extractor-jkxl9"
Apr 24 21:28:15.554802 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:15.554765 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fmd7\" (UniqueName: \"kubernetes.io/projected/6847a7b4-b0b1-42ab-8771-88161eb09cc7-kube-api-access-6fmd7\") pod \"insights-runtime-extractor-jkxl9\" (UID: \"6847a7b4-b0b1-42ab-8771-88161eb09cc7\") " pod="openshift-insights/insights-runtime-extractor-jkxl9"
Apr 24 21:28:15.654816 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:15.654786 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-jkxl9"
Apr 24 21:28:15.801449 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:15.801423 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-jkxl9"]
Apr 24 21:28:15.802737 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:28:15.802715 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6847a7b4_b0b1_42ab_8771_88161eb09cc7.slice/crio-386365aeb4e32b2cbb0c8c0bb8865b75838fef51fffd01337f18b8255cd81a38 WatchSource:0}: Error finding container 386365aeb4e32b2cbb0c8c0bb8865b75838fef51fffd01337f18b8255cd81a38: Status 404 returned error can't find the container with id 386365aeb4e32b2cbb0c8c0bb8865b75838fef51fffd01337f18b8255cd81a38
Apr 24 21:28:16.805268 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:16.805192 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jkxl9" event={"ID":"6847a7b4-b0b1-42ab-8771-88161eb09cc7","Type":"ContainerStarted","Data":"036dac74a9101a5d55e9c7d6eaaf88368ae9c7ef7b8eb8f178a5bcd6468536b9"}
Apr 24 21:28:16.805268 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:16.805232 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jkxl9" event={"ID":"6847a7b4-b0b1-42ab-8771-88161eb09cc7","Type":"ContainerStarted","Data":"386365aeb4e32b2cbb0c8c0bb8865b75838fef51fffd01337f18b8255cd81a38"}
Apr 24 21:28:17.814961 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:17.814832 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jkxl9" event={"ID":"6847a7b4-b0b1-42ab-8771-88161eb09cc7","Type":"ContainerStarted","Data":"a1df5cdc87b62e885a4eed443abbc1d86792ac019a4fa5f4e2c23e2c304e40eb"}
Apr 24 21:28:18.819496 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:18.819462 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jkxl9" event={"ID":"6847a7b4-b0b1-42ab-8771-88161eb09cc7","Type":"ContainerStarted","Data":"9604b597fe6e0b4fdedbdd4081b872a3bb5dda5be8000e09362bfbe94dc1118f"}
Apr 24 21:28:18.850664 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:18.850604 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-jkxl9" podStartSLOduration=1.17366718 podStartE2EDuration="3.850585527s" podCreationTimestamp="2026-04-24 21:28:15 +0000 UTC" firstStartedPulling="2026-04-24 21:28:15.944769799 +0000 UTC m=+73.205338609" lastFinishedPulling="2026-04-24 21:28:18.621688142 +0000 UTC m=+75.882256956" observedRunningTime="2026-04-24 21:28:18.849736613 +0000 UTC m=+76.110305445" watchObservedRunningTime="2026-04-24 21:28:18.850585527 +0000 UTC m=+76.111154360"
Apr 24 21:28:24.710171 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:24.710140 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-6xchx"
Apr 24 21:28:24.801228 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:24.801194 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-bxds4"
Apr 24 21:28:24.839065 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:24.839036 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-59fzh_494a20d0-ace3-481d-ae39-780b958f7150/cluster-monitoring-operator/0.log"
Apr 24 21:28:24.839237 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:24.839088 2573 generic.go:358] "Generic (PLEG): container finished" podID="494a20d0-ace3-481d-ae39-780b958f7150" containerID="9f231cdf0bbda3582aebaa2669b97c3f0e2c021480afe6542d4bf321382845aa" exitCode=2
Apr 24 21:28:24.839237 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:24.839136 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59fzh" event={"ID":"494a20d0-ace3-481d-ae39-780b958f7150","Type":"ContainerDied","Data":"9f231cdf0bbda3582aebaa2669b97c3f0e2c021480afe6542d4bf321382845aa"}
Apr 24 21:28:24.839498 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:24.839481 2573 scope.go:117] "RemoveContainer" containerID="9f231cdf0bbda3582aebaa2669b97c3f0e2c021480afe6542d4bf321382845aa"
Apr 24 21:28:25.844046 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:25.844009 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-59fzh_494a20d0-ace3-481d-ae39-780b958f7150/cluster-monitoring-operator/0.log"
Apr 24 21:28:25.844443 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:25.844150 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-59fzh" event={"ID":"494a20d0-ace3-481d-ae39-780b958f7150","Type":"ContainerStarted","Data":"68e952f8d8b3e742f170c766133f29ec405bba8b572b42aa5a450776d50022cc"}
Apr 24 21:28:28.446625 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.446585 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-zrjdd"]
Apr 24 21:28:28.450360 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.450342 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-zrjdd"
Apr 24 21:28:28.453171 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.453139 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 24 21:28:28.453323 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.453176 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 24 21:28:28.453613 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.453596 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 24 21:28:28.453714 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.453674 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 24 21:28:28.454129 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.454106 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-kvvhf\""
Apr 24 21:28:28.546022 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.545990 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d07de7db-2978-4232-a544-00d2bced602e-sys\") pod \"node-exporter-zrjdd\" (UID: \"d07de7db-2978-4232-a544-00d2bced602e\") " pod="openshift-monitoring/node-exporter-zrjdd"
Apr 24 21:28:28.546258 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.546068 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d07de7db-2978-4232-a544-00d2bced602e-node-exporter-wtmp\") pod \"node-exporter-zrjdd\" (UID: \"d07de7db-2978-4232-a544-00d2bced602e\") "
pod="openshift-monitoring/node-exporter-zrjdd"
Apr 24 21:28:28.546258 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.546121 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d07de7db-2978-4232-a544-00d2bced602e-node-exporter-textfile\") pod \"node-exporter-zrjdd\" (UID: \"d07de7db-2978-4232-a544-00d2bced602e\") " pod="openshift-monitoring/node-exporter-zrjdd"
Apr 24 21:28:28.546258 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.546148 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d07de7db-2978-4232-a544-00d2bced602e-metrics-client-ca\") pod \"node-exporter-zrjdd\" (UID: \"d07de7db-2978-4232-a544-00d2bced602e\") " pod="openshift-monitoring/node-exporter-zrjdd"
Apr 24 21:28:28.546258 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.546174 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d07de7db-2978-4232-a544-00d2bced602e-node-exporter-tls\") pod \"node-exporter-zrjdd\" (UID: \"d07de7db-2978-4232-a544-00d2bced602e\") " pod="openshift-monitoring/node-exporter-zrjdd"
Apr 24 21:28:28.546258 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.546198 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d07de7db-2978-4232-a544-00d2bced602e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zrjdd\" (UID: \"d07de7db-2978-4232-a544-00d2bced602e\") " pod="openshift-monitoring/node-exporter-zrjdd"
Apr 24 21:28:28.546258 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.546235 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d07de7db-2978-4232-a544-00d2bced602e-node-exporter-accelerators-collector-config\") pod \"node-exporter-zrjdd\" (UID: \"d07de7db-2978-4232-a544-00d2bced602e\") " pod="openshift-monitoring/node-exporter-zrjdd"
Apr 24 21:28:28.546258 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.546257 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdwq4\" (UniqueName: \"kubernetes.io/projected/d07de7db-2978-4232-a544-00d2bced602e-kube-api-access-kdwq4\") pod \"node-exporter-zrjdd\" (UID: \"d07de7db-2978-4232-a544-00d2bced602e\") " pod="openshift-monitoring/node-exporter-zrjdd"
Apr 24 21:28:28.546529 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.546364 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d07de7db-2978-4232-a544-00d2bced602e-root\") pod \"node-exporter-zrjdd\" (UID: \"d07de7db-2978-4232-a544-00d2bced602e\") " pod="openshift-monitoring/node-exporter-zrjdd"
Apr 24 21:28:28.647653 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.647619 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d07de7db-2978-4232-a544-00d2bced602e-node-exporter-wtmp\") pod \"node-exporter-zrjdd\" (UID: \"d07de7db-2978-4232-a544-00d2bced602e\") " pod="openshift-monitoring/node-exporter-zrjdd"
Apr 24 21:28:28.647653 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.647659 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d07de7db-2978-4232-a544-00d2bced602e-node-exporter-textfile\") pod \"node-exporter-zrjdd\" (UID: \"d07de7db-2978-4232-a544-00d2bced602e\") " pod="openshift-monitoring/node-exporter-zrjdd"
Apr 24 21:28:28.647895 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.647676 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d07de7db-2978-4232-a544-00d2bced602e-metrics-client-ca\") pod \"node-exporter-zrjdd\" (UID: \"d07de7db-2978-4232-a544-00d2bced602e\") " pod="openshift-monitoring/node-exporter-zrjdd"
Apr 24 21:28:28.647895 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.647693 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d07de7db-2978-4232-a544-00d2bced602e-node-exporter-tls\") pod \"node-exporter-zrjdd\" (UID: \"d07de7db-2978-4232-a544-00d2bced602e\") " pod="openshift-monitoring/node-exporter-zrjdd"
Apr 24 21:28:28.647895 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.647714 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d07de7db-2978-4232-a544-00d2bced602e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zrjdd\" (UID: \"d07de7db-2978-4232-a544-00d2bced602e\") " pod="openshift-monitoring/node-exporter-zrjdd"
Apr 24 21:28:28.647895 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.647743 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d07de7db-2978-4232-a544-00d2bced602e-node-exporter-accelerators-collector-config\") pod \"node-exporter-zrjdd\" (UID: \"d07de7db-2978-4232-a544-00d2bced602e\") " pod="openshift-monitoring/node-exporter-zrjdd"
Apr 24 21:28:28.647895 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.647766 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdwq4\" (UniqueName: \"kubernetes.io/projected/d07de7db-2978-4232-a544-00d2bced602e-kube-api-access-kdwq4\") pod \"node-exporter-zrjdd\" (UID: \"d07de7db-2978-4232-a544-00d2bced602e\") " pod="openshift-monitoring/node-exporter-zrjdd"
Apr 24 21:28:28.647895 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.647816 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d07de7db-2978-4232-a544-00d2bced602e-node-exporter-wtmp\") pod \"node-exporter-zrjdd\" (UID: \"d07de7db-2978-4232-a544-00d2bced602e\") " pod="openshift-monitoring/node-exporter-zrjdd"
Apr 24 21:28:28.647895 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:28:28.647820 2573 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 24 21:28:28.648268 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.647912 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d07de7db-2978-4232-a544-00d2bced602e-root\") pod \"node-exporter-zrjdd\" (UID: \"d07de7db-2978-4232-a544-00d2bced602e\") " pod="openshift-monitoring/node-exporter-zrjdd"
Apr 24 21:28:28.648268 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:28:28.647931 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d07de7db-2978-4232-a544-00d2bced602e-node-exporter-tls podName:d07de7db-2978-4232-a544-00d2bced602e nodeName:}" failed. No retries permitted until 2026-04-24 21:28:29.147907196 +0000 UTC m=+86.408476005 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/d07de7db-2978-4232-a544-00d2bced602e-node-exporter-tls") pod "node-exporter-zrjdd" (UID: "d07de7db-2978-4232-a544-00d2bced602e") : secret "node-exporter-tls" not found
Apr 24 21:28:28.648268 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.648001 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d07de7db-2978-4232-a544-00d2bced602e-root\") pod \"node-exporter-zrjdd\" (UID: \"d07de7db-2978-4232-a544-00d2bced602e\") " pod="openshift-monitoring/node-exporter-zrjdd"
Apr 24 21:28:28.648268 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.648013 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d07de7db-2978-4232-a544-00d2bced602e-sys\") pod \"node-exporter-zrjdd\" (UID: \"d07de7db-2978-4232-a544-00d2bced602e\") " pod="openshift-monitoring/node-exporter-zrjdd"
Apr 24 21:28:28.648268 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.648031 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d07de7db-2978-4232-a544-00d2bced602e-node-exporter-textfile\") pod \"node-exporter-zrjdd\" (UID: \"d07de7db-2978-4232-a544-00d2bced602e\") " pod="openshift-monitoring/node-exporter-zrjdd"
Apr 24 21:28:28.648268 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.648058 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d07de7db-2978-4232-a544-00d2bced602e-sys\") pod \"node-exporter-zrjdd\" (UID: \"d07de7db-2978-4232-a544-00d2bced602e\") " pod="openshift-monitoring/node-exporter-zrjdd"
Apr 24 21:28:28.648491 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.648349 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName:
\"kubernetes.io/configmap/d07de7db-2978-4232-a544-00d2bced602e-metrics-client-ca\") pod \"node-exporter-zrjdd\" (UID: \"d07de7db-2978-4232-a544-00d2bced602e\") " pod="openshift-monitoring/node-exporter-zrjdd"
Apr 24 21:28:28.648491 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.648376 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d07de7db-2978-4232-a544-00d2bced602e-node-exporter-accelerators-collector-config\") pod \"node-exporter-zrjdd\" (UID: \"d07de7db-2978-4232-a544-00d2bced602e\") " pod="openshift-monitoring/node-exporter-zrjdd"
Apr 24 21:28:28.650108 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.650073 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d07de7db-2978-4232-a544-00d2bced602e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zrjdd\" (UID: \"d07de7db-2978-4232-a544-00d2bced602e\") " pod="openshift-monitoring/node-exporter-zrjdd"
Apr 24 21:28:28.660225 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.660204 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdwq4\" (UniqueName: \"kubernetes.io/projected/d07de7db-2978-4232-a544-00d2bced602e-kube-api-access-kdwq4\") pod \"node-exporter-zrjdd\" (UID: \"d07de7db-2978-4232-a544-00d2bced602e\") " pod="openshift-monitoring/node-exporter-zrjdd"
Apr 24 21:28:28.761948 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.761864 2573 patch_prober.go:28] interesting pod/image-registry-5cd4fd4699-r7szn container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 24 21:28:28.762086 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:28.761941 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" podUID="4c9bfa95-cfa8-4d7f-9dda-3d100a02d251" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:28:29.152934 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:29.152899 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d07de7db-2978-4232-a544-00d2bced602e-node-exporter-tls\") pod \"node-exporter-zrjdd\" (UID: \"d07de7db-2978-4232-a544-00d2bced602e\") " pod="openshift-monitoring/node-exporter-zrjdd"
Apr 24 21:28:29.155262 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:29.155240 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d07de7db-2978-4232-a544-00d2bced602e-node-exporter-tls\") pod \"node-exporter-zrjdd\" (UID: \"d07de7db-2978-4232-a544-00d2bced602e\") " pod="openshift-monitoring/node-exporter-zrjdd"
Apr 24 21:28:29.360736 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:29.360707 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-zrjdd"
Apr 24 21:28:29.369793 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:28:29.369760 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd07de7db_2978_4232_a544_00d2bced602e.slice/crio-ca1af911cb04d6db9cea76795986d954a462f08e10112c433da466944e7b1559 WatchSource:0}: Error finding container ca1af911cb04d6db9cea76795986d954a462f08e10112c433da466944e7b1559: Status 404 returned error can't find the container with id ca1af911cb04d6db9cea76795986d954a462f08e10112c433da466944e7b1559
Apr 24 21:28:29.857311 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:29.857275 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zrjdd" event={"ID":"d07de7db-2978-4232-a544-00d2bced602e","Type":"ContainerStarted","Data":"ca1af911cb04d6db9cea76795986d954a462f08e10112c433da466944e7b1559"}
Apr 24 21:28:30.778903 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:30.778871 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn"
Apr 24 21:28:30.862221 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:30.862185 2573 generic.go:358] "Generic (PLEG): container finished" podID="d07de7db-2978-4232-a544-00d2bced602e" containerID="df7b4a09e9a287c27cd59c38e3a65433662f70bab9ab75c0364ca92bb2697549" exitCode=0
Apr 24 21:28:30.862585 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:30.862257 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zrjdd" event={"ID":"d07de7db-2978-4232-a544-00d2bced602e","Type":"ContainerDied","Data":"df7b4a09e9a287c27cd59c38e3a65433662f70bab9ab75c0364ca92bb2697549"}
Apr 24 21:28:31.867563 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:31.867525 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zrjdd" event={"ID":"d07de7db-2978-4232-a544-00d2bced602e","Type":"ContainerStarted","Data":"b7328ec3f712af9d191b3fb40fad99389f9af7d3e97cf6585219893f78a45470"}
Apr 24 21:28:31.867563 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:31.867564 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zrjdd" event={"ID":"d07de7db-2978-4232-a544-00d2bced602e","Type":"ContainerStarted","Data":"111da6a4cb057a2673dd2ce213e7d98667fbc7479d692678d0963c106d80f99d"}
Apr 24 21:28:31.888630 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:31.888580 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-zrjdd" podStartSLOduration=2.965292473 podStartE2EDuration="3.888565222s" podCreationTimestamp="2026-04-24 21:28:28 +0000 UTC" firstStartedPulling="2026-04-24 21:28:29.371442146 +0000 UTC m=+86.632010957" lastFinishedPulling="2026-04-24 21:28:30.294714896 +0000 UTC m=+87.555283706" observedRunningTime="2026-04-24 21:28:31.886876405 +0000 UTC m=+89.147445232" watchObservedRunningTime="2026-04-24 21:28:31.888565222 +0000 UTC m=+89.149134106"
Apr 24 21:28:38.896572 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:38.896528 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5cd4fd4699-r7szn"]
Apr 24 21:28:40.859210 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:40.859166 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wh284\" (UID: \"5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wh284"
Apr 24 21:28:40.861712 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:40.861686 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\"
(UniqueName: \"kubernetes.io/secret/5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wh284\" (UID: \"5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wh284"
Apr 24 21:28:40.978192 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:40.978156 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-vsgnj\""
Apr 24 21:28:40.986417 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:40.986370 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wh284"
Apr 24 21:28:41.142708 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:41.142630 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-wh284"]
Apr 24 21:28:41.145889 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:28:41.145859 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5153ee6e_f9de_4ac6_afdf_6f0f0bd574d4.slice/crio-55a0edcda67e3c9f8328f21862723d7e65aa69bc3af4c3d0f57975321248eb71 WatchSource:0}: Error finding container 55a0edcda67e3c9f8328f21862723d7e65aa69bc3af4c3d0f57975321248eb71: Status 404 returned error can't find the container with id 55a0edcda67e3c9f8328f21862723d7e65aa69bc3af4c3d0f57975321248eb71
Apr 24 21:28:41.896823 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:41.896780 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wh284" event={"ID":"5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4","Type":"ContainerStarted","Data":"55a0edcda67e3c9f8328f21862723d7e65aa69bc3af4c3d0f57975321248eb71"}
Apr 24 21:28:42.901380 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:42.901346 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wh284" event={"ID":"5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4","Type":"ContainerStarted","Data":"b26cd1ed2750a7dd93bba0bb4c159cdf066971baf97eece15030c2e2548b1540"}
Apr 24 21:28:42.920176 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:42.920125 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wh284" podStartSLOduration=79.100115485 podStartE2EDuration="1m19.920082708s" podCreationTimestamp="2026-04-24 21:27:23 +0000 UTC" firstStartedPulling="2026-04-24 21:28:41.147795824 +0000 UTC m=+98.408364634" lastFinishedPulling="2026-04-24 21:28:41.967763046 +0000 UTC m=+99.228331857" observedRunningTime="2026-04-24 21:28:42.919502064 +0000 UTC m=+100.180070908" watchObservedRunningTime="2026-04-24 21:28:42.920082708 +0000 UTC m=+100.180651531"
Apr 24 21:28:58.952967 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:58.952929 2573 generic.go:358] "Generic (PLEG): container finished" podID="c4788265-16ce-4770-b838-025f0f7d06aa" containerID="5c5169b16bed226741e9e3a918e76842986613f7d9ca6f5212bf5afc2f765842" exitCode=0
Apr 24 21:28:58.953451 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:58.953015 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-44vlp" event={"ID":"c4788265-16ce-4770-b838-025f0f7d06aa","Type":"ContainerDied","Data":"5c5169b16bed226741e9e3a918e76842986613f7d9ca6f5212bf5afc2f765842"}
Apr 24 21:28:58.953451 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:58.953406 2573 scope.go:117] "RemoveContainer" containerID="5c5169b16bed226741e9e3a918e76842986613f7d9ca6f5212bf5afc2f765842"
Apr 24 21:28:59.898462 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:59.898433 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-58974b8966-z2xx8_1768c1b9-8390-4833-9613-4efec510f36b/router/0.log"
Apr 24 21:28:59.910939 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:59.910913 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-8rjd5_6024e94b-b234-46ca-80d6-f505949e48ac/serve-healthcheck-canary/0.log"
Apr 24 21:28:59.957510 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:28:59.957477 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-44vlp" event={"ID":"c4788265-16ce-4770-b838-025f0f7d06aa","Type":"ContainerStarted","Data":"985dbdfbbc904952368ef817b97e3941a87bfbbe9e5dc8e59b7d10a101383adb"}
Apr 24 21:29:02.967195 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:02.967157 2573 generic.go:358] "Generic (PLEG): container finished" podID="b7fe0f94-c8a8-42d9-b783-cfecce93ab05" containerID="4d43b54ae970f3cd7055c1a54618da275a0940fc9688f05141d4e28f9fbfc828" exitCode=0
Apr 24 21:29:02.967599 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:02.967217 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jnvn" event={"ID":"b7fe0f94-c8a8-42d9-b783-cfecce93ab05","Type":"ContainerDied","Data":"4d43b54ae970f3cd7055c1a54618da275a0940fc9688f05141d4e28f9fbfc828"}
Apr 24 21:29:02.967599 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:02.967551 2573 scope.go:117] "RemoveContainer" containerID="4d43b54ae970f3cd7055c1a54618da275a0940fc9688f05141d4e28f9fbfc828"
Apr 24 21:29:03.915638 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:03.915598 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" podUID="4c9bfa95-cfa8-4d7f-9dda-3d100a02d251" containerName="registry" containerID="cri-o://747eb5c82aff256d97e27dabd1cfc6d763e1197109c99021fd4fa22ed0facd11" gracePeriod=30
Apr 24 21:29:03.971158 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:03.971126 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9jnvn" event={"ID":"b7fe0f94-c8a8-42d9-b783-cfecce93ab05","Type":"ContainerStarted","Data":"1b0ecce38400e97bcb7b6917e3eaf3ff045e65c732ebb8155f3407572606c8c7"} Apr 24 21:29:03.972500 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:03.972474 2573 generic.go:358] "Generic (PLEG): container finished" podID="bd2084eb-fcc8-42e3-b526-171c67ac7a71" containerID="84017bbc52481564a11d160cd80464f7235ad4ca09979bbf38df46a514fe3166" exitCode=0 Apr 24 21:29:03.972600 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:03.972538 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v4rfn" event={"ID":"bd2084eb-fcc8-42e3-b526-171c67ac7a71","Type":"ContainerDied","Data":"84017bbc52481564a11d160cd80464f7235ad4ca09979bbf38df46a514fe3166"} Apr 24 21:29:03.972823 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:03.972811 2573 scope.go:117] "RemoveContainer" containerID="84017bbc52481564a11d160cd80464f7235ad4ca09979bbf38df46a514fe3166" Apr 24 21:29:04.170203 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:04.170142 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" Apr 24 21:29:04.254781 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:04.254726 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-ca-trust-extracted\") pod \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " Apr 24 21:29:04.254781 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:04.254774 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-trusted-ca\") pod \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " Apr 24 21:29:04.255029 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:04.254818 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p96wc\" (UniqueName: \"kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-kube-api-access-p96wc\") pod \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " Apr 24 21:29:04.255029 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:04.254871 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-registry-tls\") pod \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " Apr 24 21:29:04.255029 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:04.254901 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-bound-sa-token\") pod \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " Apr 24 21:29:04.255029 
ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:04.254949 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-image-registry-private-configuration\") pod \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " Apr 24 21:29:04.255029 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:04.254974 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-installation-pull-secrets\") pod \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " Apr 24 21:29:04.255029 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:04.255012 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-registry-certificates\") pod \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\" (UID: \"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251\") " Apr 24 21:29:04.255355 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:04.255326 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4c9bfa95-cfa8-4d7f-9dda-3d100a02d251" (UID: "4c9bfa95-cfa8-4d7f-9dda-3d100a02d251"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:04.255644 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:04.255616 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "4c9bfa95-cfa8-4d7f-9dda-3d100a02d251" (UID: "4c9bfa95-cfa8-4d7f-9dda-3d100a02d251"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:04.257590 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:04.257541 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4c9bfa95-cfa8-4d7f-9dda-3d100a02d251" (UID: "4c9bfa95-cfa8-4d7f-9dda-3d100a02d251"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:29:04.257590 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:04.257561 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-kube-api-access-p96wc" (OuterVolumeSpecName: "kube-api-access-p96wc") pod "4c9bfa95-cfa8-4d7f-9dda-3d100a02d251" (UID: "4c9bfa95-cfa8-4d7f-9dda-3d100a02d251"). InnerVolumeSpecName "kube-api-access-p96wc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:29:04.257899 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:04.257872 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4c9bfa95-cfa8-4d7f-9dda-3d100a02d251" (UID: "4c9bfa95-cfa8-4d7f-9dda-3d100a02d251"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:29:04.257980 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:04.257896 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "4c9bfa95-cfa8-4d7f-9dda-3d100a02d251" (UID: "4c9bfa95-cfa8-4d7f-9dda-3d100a02d251"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:29:04.257980 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:04.257897 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "4c9bfa95-cfa8-4d7f-9dda-3d100a02d251" (UID: "4c9bfa95-cfa8-4d7f-9dda-3d100a02d251"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:29:04.263596 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:04.263574 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4c9bfa95-cfa8-4d7f-9dda-3d100a02d251" (UID: "4c9bfa95-cfa8-4d7f-9dda-3d100a02d251"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:29:04.356593 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:04.356551 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-registry-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:29:04.356593 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:04.356586 2573 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-bound-sa-token\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:29:04.356800 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:04.356607 2573 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-image-registry-private-configuration\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:29:04.356800 ip-10-0-136-201 
kubenswrapper[2573]: I0424 21:29:04.356618 2573 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-installation-pull-secrets\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:29:04.356800 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:04.356630 2573 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-registry-certificates\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:29:04.356800 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:04.356638 2573 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-ca-trust-extracted\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:29:04.356800 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:04.356647 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-trusted-ca\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:29:04.356800 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:04.356656 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p96wc\" (UniqueName: \"kubernetes.io/projected/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251-kube-api-access-p96wc\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:29:04.977572 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:04.977534 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v4rfn" event={"ID":"bd2084eb-fcc8-42e3-b526-171c67ac7a71","Type":"ContainerStarted","Data":"1c4b248eae801a51fc470c9ebe807286334479af523ba2f3ba8370a8a2b86c3c"} Apr 24 21:29:04.978679 
ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:04.978653 2573 generic.go:358] "Generic (PLEG): container finished" podID="4c9bfa95-cfa8-4d7f-9dda-3d100a02d251" containerID="747eb5c82aff256d97e27dabd1cfc6d763e1197109c99021fd4fa22ed0facd11" exitCode=0 Apr 24 21:29:04.978789 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:04.978699 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" Apr 24 21:29:04.978789 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:04.978697 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" event={"ID":"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251","Type":"ContainerDied","Data":"747eb5c82aff256d97e27dabd1cfc6d763e1197109c99021fd4fa22ed0facd11"} Apr 24 21:29:04.978857 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:04.978807 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5cd4fd4699-r7szn" event={"ID":"4c9bfa95-cfa8-4d7f-9dda-3d100a02d251","Type":"ContainerDied","Data":"e4a0e5218d28fbb90ecd0545e46d1246f1bfec1e9d32abd2f4a5c392d0bb76ae"} Apr 24 21:29:04.978857 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:04.978823 2573 scope.go:117] "RemoveContainer" containerID="747eb5c82aff256d97e27dabd1cfc6d763e1197109c99021fd4fa22ed0facd11" Apr 24 21:29:04.987018 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:04.986998 2573 scope.go:117] "RemoveContainer" containerID="747eb5c82aff256d97e27dabd1cfc6d763e1197109c99021fd4fa22ed0facd11" Apr 24 21:29:04.987310 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:29:04.987290 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"747eb5c82aff256d97e27dabd1cfc6d763e1197109c99021fd4fa22ed0facd11\": container with ID starting with 747eb5c82aff256d97e27dabd1cfc6d763e1197109c99021fd4fa22ed0facd11 not found: ID does not exist" 
containerID="747eb5c82aff256d97e27dabd1cfc6d763e1197109c99021fd4fa22ed0facd11" Apr 24 21:29:04.987376 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:04.987319 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"747eb5c82aff256d97e27dabd1cfc6d763e1197109c99021fd4fa22ed0facd11"} err="failed to get container status \"747eb5c82aff256d97e27dabd1cfc6d763e1197109c99021fd4fa22ed0facd11\": rpc error: code = NotFound desc = could not find container \"747eb5c82aff256d97e27dabd1cfc6d763e1197109c99021fd4fa22ed0facd11\": container with ID starting with 747eb5c82aff256d97e27dabd1cfc6d763e1197109c99021fd4fa22ed0facd11 not found: ID does not exist" Apr 24 21:29:05.025898 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:05.025870 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5cd4fd4699-r7szn"] Apr 24 21:29:05.033898 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:05.033864 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5cd4fd4699-r7szn"] Apr 24 21:29:05.356579 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:05.356548 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c9bfa95-cfa8-4d7f-9dda-3d100a02d251" path="/var/lib/kubelet/pods/4c9bfa95-cfa8-4d7f-9dda-3d100a02d251/volumes" Apr 24 21:29:07.151203 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:07.151158 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9" podUID="c78eaad7-e680-4b37-9286-453234917ab8" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 21:29:17.150914 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:17.150870 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9" podUID="c78eaad7-e680-4b37-9286-453234917ab8" 
containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 21:29:27.150759 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:27.150716 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9" podUID="c78eaad7-e680-4b37-9286-453234917ab8" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 21:29:27.151242 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:27.150796 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9" Apr 24 21:29:27.151447 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:27.151423 2573 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"da159ba341e4a364adcdb309c0b41aaf28adbcf581bbd62cb53a7069d3506e9a"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 24 21:29:27.151518 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:27.151475 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9" podUID="c78eaad7-e680-4b37-9286-453234917ab8" containerName="service-proxy" containerID="cri-o://da159ba341e4a364adcdb309c0b41aaf28adbcf581bbd62cb53a7069d3506e9a" gracePeriod=30 Apr 24 21:29:28.048308 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:28.048274 2573 generic.go:358] "Generic (PLEG): container finished" podID="c78eaad7-e680-4b37-9286-453234917ab8" containerID="da159ba341e4a364adcdb309c0b41aaf28adbcf581bbd62cb53a7069d3506e9a" exitCode=2 Apr 24 21:29:28.048486 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:28.048332 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9" event={"ID":"c78eaad7-e680-4b37-9286-453234917ab8","Type":"ContainerDied","Data":"da159ba341e4a364adcdb309c0b41aaf28adbcf581bbd62cb53a7069d3506e9a"} Apr 24 21:29:28.048486 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:29:28.048366 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-84459467dd-t4mr9" event={"ID":"c78eaad7-e680-4b37-9286-453234917ab8","Type":"ContainerStarted","Data":"3a40bfc4c65a1ff8eb032c4f951a3f167464a8d22c72302a1cb29015f66a5f5a"} Apr 24 21:32:03.256317 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:03.256283 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-59fzh_494a20d0-ace3-481d-ae39-780b958f7150/cluster-monitoring-operator/0.log" Apr 24 21:32:03.259082 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:03.259060 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-59fzh_494a20d0-ace3-481d-ae39-780b958f7150/cluster-monitoring-operator/0.log" Apr 24 21:32:03.261202 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:03.261184 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4nztl_b3e83451-d37f-4c5b-837f-6440bed57b91/console-operator/1.log" Apr 24 21:32:03.264415 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:03.264389 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4nztl_b3e83451-d37f-4c5b-837f-6440bed57b91/console-operator/1.log" Apr 24 21:32:03.277017 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:32:03.276995 2573 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 21:34:25.317442 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:25.317358 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc"] Apr 24 21:34:25.317891 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:25.317701 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c9bfa95-cfa8-4d7f-9dda-3d100a02d251" containerName="registry" Apr 24 21:34:25.317891 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:25.317714 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c9bfa95-cfa8-4d7f-9dda-3d100a02d251" containerName="registry" Apr 24 21:34:25.317891 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:25.317781 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c9bfa95-cfa8-4d7f-9dda-3d100a02d251" containerName="registry" Apr 24 21:34:25.320787 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:25.320769 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" Apr 24 21:34:25.323996 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:25.323968 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 21:34:25.324611 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:25.324586 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-f87qg\"" Apr 24 21:34:25.324854 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:25.324617 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 21:34:25.324982 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:25.324630 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-predictor-serving-cert\"" Apr 24 21:34:25.324982 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:25.324641 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-kube-rbac-proxy-sar-config\"" Apr 24 21:34:25.340434 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:25.340396 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc"] Apr 24 21:34:25.422408 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:25.422373 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4249ca4d-5581-4557-aeb2-69c27ee9c041-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-n7ckc\" (UID: \"4249ca4d-5581-4557-aeb2-69c27ee9c041\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" Apr 24 21:34:25.422607 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:25.422439 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4249ca4d-5581-4557-aeb2-69c27ee9c041-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-n7ckc\" (UID: \"4249ca4d-5581-4557-aeb2-69c27ee9c041\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" Apr 24 21:34:25.422607 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:25.422518 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m6mq\" (UniqueName: \"kubernetes.io/projected/4249ca4d-5581-4557-aeb2-69c27ee9c041-kube-api-access-2m6mq\") pod \"isvc-xgboost-graph-predictor-669d8d6456-n7ckc\" (UID: \"4249ca4d-5581-4557-aeb2-69c27ee9c041\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" Apr 24 21:34:25.422607 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:25.422559 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/4249ca4d-5581-4557-aeb2-69c27ee9c041-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-n7ckc\" (UID: \"4249ca4d-5581-4557-aeb2-69c27ee9c041\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" Apr 24 21:34:25.523190 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:25.523155 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4249ca4d-5581-4557-aeb2-69c27ee9c041-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-n7ckc\" (UID: \"4249ca4d-5581-4557-aeb2-69c27ee9c041\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" Apr 24 21:34:25.523383 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:25.523210 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4249ca4d-5581-4557-aeb2-69c27ee9c041-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-n7ckc\" (UID: \"4249ca4d-5581-4557-aeb2-69c27ee9c041\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" Apr 24 21:34:25.523383 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:25.523248 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2m6mq\" (UniqueName: \"kubernetes.io/projected/4249ca4d-5581-4557-aeb2-69c27ee9c041-kube-api-access-2m6mq\") pod \"isvc-xgboost-graph-predictor-669d8d6456-n7ckc\" (UID: \"4249ca4d-5581-4557-aeb2-69c27ee9c041\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" Apr 24 21:34:25.523383 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:25.523267 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4249ca4d-5581-4557-aeb2-69c27ee9c041-kserve-provision-location\") pod 
\"isvc-xgboost-graph-predictor-669d8d6456-n7ckc\" (UID: \"4249ca4d-5581-4557-aeb2-69c27ee9c041\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" Apr 24 21:34:25.523670 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:25.523651 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4249ca4d-5581-4557-aeb2-69c27ee9c041-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-n7ckc\" (UID: \"4249ca4d-5581-4557-aeb2-69c27ee9c041\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" Apr 24 21:34:25.523919 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:25.523899 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4249ca4d-5581-4557-aeb2-69c27ee9c041-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-n7ckc\" (UID: \"4249ca4d-5581-4557-aeb2-69c27ee9c041\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" Apr 24 21:34:25.525639 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:25.525622 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4249ca4d-5581-4557-aeb2-69c27ee9c041-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-n7ckc\" (UID: \"4249ca4d-5581-4557-aeb2-69c27ee9c041\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" Apr 24 21:34:25.538943 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:25.538915 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m6mq\" (UniqueName: \"kubernetes.io/projected/4249ca4d-5581-4557-aeb2-69c27ee9c041-kube-api-access-2m6mq\") pod \"isvc-xgboost-graph-predictor-669d8d6456-n7ckc\" (UID: \"4249ca4d-5581-4557-aeb2-69c27ee9c041\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" Apr 24 21:34:25.632293 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:25.632236 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" Apr 24 21:34:25.767754 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:25.767714 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc"] Apr 24 21:34:25.770591 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:34:25.770563 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4249ca4d_5581_4557_aeb2_69c27ee9c041.slice/crio-ba08f2a4bb65afbb5fc3ede78e2e6c696ba62a9e66570daff675eb298529a397 WatchSource:0}: Error finding container ba08f2a4bb65afbb5fc3ede78e2e6c696ba62a9e66570daff675eb298529a397: Status 404 returned error can't find the container with id ba08f2a4bb65afbb5fc3ede78e2e6c696ba62a9e66570daff675eb298529a397 Apr 24 21:34:25.772375 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:25.772358 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:34:25.897561 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:25.897468 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" event={"ID":"4249ca4d-5581-4557-aeb2-69c27ee9c041","Type":"ContainerStarted","Data":"ba08f2a4bb65afbb5fc3ede78e2e6c696ba62a9e66570daff675eb298529a397"} Apr 24 21:34:30.914308 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:30.914265 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" event={"ID":"4249ca4d-5581-4557-aeb2-69c27ee9c041","Type":"ContainerStarted","Data":"7a2d7c87b498e6a550359b96562e6d003229373297438a9929bad55dcf087be6"} Apr 24 21:34:33.924638 
ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:33.924603 2573 generic.go:358] "Generic (PLEG): container finished" podID="4249ca4d-5581-4557-aeb2-69c27ee9c041" containerID="7a2d7c87b498e6a550359b96562e6d003229373297438a9929bad55dcf087be6" exitCode=0 Apr 24 21:34:33.925034 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:33.924682 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" event={"ID":"4249ca4d-5581-4557-aeb2-69c27ee9c041","Type":"ContainerDied","Data":"7a2d7c87b498e6a550359b96562e6d003229373297438a9929bad55dcf087be6"} Apr 24 21:34:53.989137 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:53.989085 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" event={"ID":"4249ca4d-5581-4557-aeb2-69c27ee9c041","Type":"ContainerStarted","Data":"e1c5329e09832817e6d32875005e2237eff55f2193cddac11afa8cc467c98b63"} Apr 24 21:34:56.999227 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:56.999181 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" event={"ID":"4249ca4d-5581-4557-aeb2-69c27ee9c041","Type":"ContainerStarted","Data":"7700b17d42839d85104013b3d7c7edd96473e58185df1668909ee1275227a79f"} Apr 24 21:34:56.999710 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:56.999370 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" Apr 24 21:34:57.021289 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:57.021238 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" podStartSLOduration=1.296853237 podStartE2EDuration="32.021220313s" podCreationTimestamp="2026-04-24 21:34:25 +0000 UTC" firstStartedPulling="2026-04-24 21:34:25.772486114 +0000 UTC m=+443.033054923" 
lastFinishedPulling="2026-04-24 21:34:56.496853186 +0000 UTC m=+473.757421999" observedRunningTime="2026-04-24 21:34:57.019567885 +0000 UTC m=+474.280136717" watchObservedRunningTime="2026-04-24 21:34:57.021220313 +0000 UTC m=+474.281789150" Apr 24 21:34:58.002680 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:58.002643 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" Apr 24 21:34:58.003981 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:58.003949 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" podUID="4249ca4d-5581-4557-aeb2-69c27ee9c041" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 24 21:34:59.005491 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:34:59.005449 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" podUID="4249ca4d-5581-4557-aeb2-69c27ee9c041" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 24 21:35:04.010088 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:04.010059 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" Apr 24 21:35:04.010654 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:04.010628 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" podUID="4249ca4d-5581-4557-aeb2-69c27ee9c041" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 24 21:35:14.010795 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:14.010750 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" podUID="4249ca4d-5581-4557-aeb2-69c27ee9c041" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 24 21:35:24.010909 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:24.010868 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" podUID="4249ca4d-5581-4557-aeb2-69c27ee9c041" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 24 21:35:34.011275 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:34.011226 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" podUID="4249ca4d-5581-4557-aeb2-69c27ee9c041" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 24 21:35:44.010734 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:44.010691 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" podUID="4249ca4d-5581-4557-aeb2-69c27ee9c041" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 24 21:35:45.062037 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:45.061993 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-9db4b-8484996d65-m4klk"] Apr 24 21:35:45.075646 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:45.075610 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-9db4b-8484996d65-m4klk" Apr 24 21:35:45.077946 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:45.077920 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-9db4b-8484996d65-m4klk"] Apr 24 21:35:45.079266 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:45.079245 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-9db4b-kube-rbac-proxy-sar-config\"" Apr 24 21:35:45.079767 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:45.079745 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-9db4b-serving-cert\"" Apr 24 21:35:45.148858 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:45.148821 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3-openshift-service-ca-bundle\") pod \"switch-graph-9db4b-8484996d65-m4klk\" (UID: \"b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3\") " pod="kserve-ci-e2e-test/switch-graph-9db4b-8484996d65-m4klk" Apr 24 21:35:45.148858 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:45.148862 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3-proxy-tls\") pod \"switch-graph-9db4b-8484996d65-m4klk\" (UID: \"b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3\") " pod="kserve-ci-e2e-test/switch-graph-9db4b-8484996d65-m4klk" Apr 24 21:35:45.250434 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:45.250387 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3-openshift-service-ca-bundle\") pod 
\"switch-graph-9db4b-8484996d65-m4klk\" (UID: \"b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3\") " pod="kserve-ci-e2e-test/switch-graph-9db4b-8484996d65-m4klk" Apr 24 21:35:45.250626 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:45.250466 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3-proxy-tls\") pod \"switch-graph-9db4b-8484996d65-m4klk\" (UID: \"b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3\") " pod="kserve-ci-e2e-test/switch-graph-9db4b-8484996d65-m4klk" Apr 24 21:35:45.251034 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:45.251010 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3-openshift-service-ca-bundle\") pod \"switch-graph-9db4b-8484996d65-m4klk\" (UID: \"b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3\") " pod="kserve-ci-e2e-test/switch-graph-9db4b-8484996d65-m4klk" Apr 24 21:35:45.252895 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:45.252873 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3-proxy-tls\") pod \"switch-graph-9db4b-8484996d65-m4klk\" (UID: \"b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3\") " pod="kserve-ci-e2e-test/switch-graph-9db4b-8484996d65-m4klk" Apr 24 21:35:45.386719 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:45.386681 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-9db4b-8484996d65-m4klk" Apr 24 21:35:45.524327 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:45.524293 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-9db4b-8484996d65-m4klk"] Apr 24 21:35:45.527697 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:35:45.527669 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4e5e7bc_e9dd_48c1_b2b3_05cfb35fbee3.slice/crio-6bdfb6662409f473e8dc78576f3ce185c3953b6549788483ac8ee1bed612b23c WatchSource:0}: Error finding container 6bdfb6662409f473e8dc78576f3ce185c3953b6549788483ac8ee1bed612b23c: Status 404 returned error can't find the container with id 6bdfb6662409f473e8dc78576f3ce185c3953b6549788483ac8ee1bed612b23c Apr 24 21:35:46.136766 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:46.136727 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-9db4b-8484996d65-m4klk" event={"ID":"b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3","Type":"ContainerStarted","Data":"6bdfb6662409f473e8dc78576f3ce185c3953b6549788483ac8ee1bed612b23c"} Apr 24 21:35:48.144153 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:48.144117 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-9db4b-8484996d65-m4klk" event={"ID":"b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3","Type":"ContainerStarted","Data":"272566c92b915a438419dae42e47d0873d87618fc68a309ed3053db790c82b36"} Apr 24 21:35:48.144541 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:48.144273 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-9db4b-8484996d65-m4klk" Apr 24 21:35:48.165929 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:48.165875 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-9db4b-8484996d65-m4klk" 
podStartSLOduration=1.185712104 podStartE2EDuration="3.165859713s" podCreationTimestamp="2026-04-24 21:35:45 +0000 UTC" firstStartedPulling="2026-04-24 21:35:45.529510739 +0000 UTC m=+522.790079549" lastFinishedPulling="2026-04-24 21:35:47.509658345 +0000 UTC m=+524.770227158" observedRunningTime="2026-04-24 21:35:48.163979359 +0000 UTC m=+525.424548191" watchObservedRunningTime="2026-04-24 21:35:48.165859713 +0000 UTC m=+525.426428545" Apr 24 21:35:54.011410 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:54.011370 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" podUID="4249ca4d-5581-4557-aeb2-69c27ee9c041" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.23:8080: connect: connection refused" Apr 24 21:35:54.153482 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:54.153451 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-9db4b-8484996d65-m4klk" Apr 24 21:35:55.145736 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:55.145701 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-9db4b-8484996d65-m4klk"] Apr 24 21:35:55.146155 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:55.145933 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-9db4b-8484996d65-m4klk" podUID="b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3" containerName="switch-graph-9db4b" containerID="cri-o://272566c92b915a438419dae42e47d0873d87618fc68a309ed3053db790c82b36" gracePeriod=30 Apr 24 21:35:59.151633 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:35:59.151588 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-9db4b-8484996d65-m4klk" podUID="b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3" containerName="switch-graph-9db4b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 
21:36:04.011556 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:04.011521 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" Apr 24 21:36:04.151756 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:04.151707 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-9db4b-8484996d65-m4klk" podUID="b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3" containerName="switch-graph-9db4b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:36:09.151262 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:09.151219 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-9db4b-8484996d65-m4klk" podUID="b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3" containerName="switch-graph-9db4b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:36:09.151794 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:09.151334 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-9db4b-8484996d65-m4klk" Apr 24 21:36:14.151577 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:14.151538 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-9db4b-8484996d65-m4klk" podUID="b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3" containerName="switch-graph-9db4b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:36:19.151212 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:19.151159 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-9db4b-8484996d65-m4klk" podUID="b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3" containerName="switch-graph-9db4b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:36:24.151853 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:24.151811 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/switch-graph-9db4b-8484996d65-m4klk" podUID="b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3" containerName="switch-graph-9db4b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:36:25.061425 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.061387 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-85c5fc8d94-dvnrr"] Apr 24 21:36:25.067396 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.067376 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-85c5fc8d94-dvnrr" Apr 24 21:36:25.077198 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.077177 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-serving-cert\"" Apr 24 21:36:25.077315 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.077177 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-kube-rbac-proxy-sar-config\"" Apr 24 21:36:25.092494 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.092465 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-85c5fc8d94-dvnrr"] Apr 24 21:36:25.169646 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:36:25.169610 2573 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4e5e7bc_e9dd_48c1_b2b3_05cfb35fbee3.slice/crio-272566c92b915a438419dae42e47d0873d87618fc68a309ed3053db790c82b36.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4e5e7bc_e9dd_48c1_b2b3_05cfb35fbee3.slice/crio-conmon-272566c92b915a438419dae42e47d0873d87618fc68a309ed3053db790c82b36.scope\": RecentStats: unable to find data in memory cache]" Apr 24 21:36:25.170039 ip-10-0-136-201 kubenswrapper[2573]: 
E0424 21:36:25.169676 2573 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4e5e7bc_e9dd_48c1_b2b3_05cfb35fbee3.slice/crio-conmon-272566c92b915a438419dae42e47d0873d87618fc68a309ed3053db790c82b36.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4e5e7bc_e9dd_48c1_b2b3_05cfb35fbee3.slice/crio-272566c92b915a438419dae42e47d0873d87618fc68a309ed3053db790c82b36.scope\": RecentStats: unable to find data in memory cache]" Apr 24 21:36:25.170039 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:36:25.169685 2573 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4e5e7bc_e9dd_48c1_b2b3_05cfb35fbee3.slice/crio-conmon-272566c92b915a438419dae42e47d0873d87618fc68a309ed3053db790c82b36.scope\": RecentStats: unable to find data in memory cache]" Apr 24 21:36:25.175766 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.175731 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6-proxy-tls\") pod \"model-chainer-85c5fc8d94-dvnrr\" (UID: \"4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6\") " pod="kserve-ci-e2e-test/model-chainer-85c5fc8d94-dvnrr" Apr 24 21:36:25.175928 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.175879 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6-openshift-service-ca-bundle\") pod \"model-chainer-85c5fc8d94-dvnrr\" (UID: \"4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6\") " pod="kserve-ci-e2e-test/model-chainer-85c5fc8d94-dvnrr" Apr 24 21:36:25.254162 ip-10-0-136-201 
kubenswrapper[2573]: I0424 21:36:25.254123 2573 generic.go:358] "Generic (PLEG): container finished" podID="b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3" containerID="272566c92b915a438419dae42e47d0873d87618fc68a309ed3053db790c82b36" exitCode=0 Apr 24 21:36:25.254355 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.254201 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-9db4b-8484996d65-m4klk" event={"ID":"b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3","Type":"ContainerDied","Data":"272566c92b915a438419dae42e47d0873d87618fc68a309ed3053db790c82b36"} Apr 24 21:36:25.276480 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.276446 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6-openshift-service-ca-bundle\") pod \"model-chainer-85c5fc8d94-dvnrr\" (UID: \"4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6\") " pod="kserve-ci-e2e-test/model-chainer-85c5fc8d94-dvnrr" Apr 24 21:36:25.276665 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.276498 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6-proxy-tls\") pod \"model-chainer-85c5fc8d94-dvnrr\" (UID: \"4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6\") " pod="kserve-ci-e2e-test/model-chainer-85c5fc8d94-dvnrr" Apr 24 21:36:25.277107 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.277064 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6-openshift-service-ca-bundle\") pod \"model-chainer-85c5fc8d94-dvnrr\" (UID: \"4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6\") " pod="kserve-ci-e2e-test/model-chainer-85c5fc8d94-dvnrr" Apr 24 21:36:25.279111 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.279071 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6-proxy-tls\") pod \"model-chainer-85c5fc8d94-dvnrr\" (UID: \"4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6\") " pod="kserve-ci-e2e-test/model-chainer-85c5fc8d94-dvnrr" Apr 24 21:36:25.377604 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.377564 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-85c5fc8d94-dvnrr" Apr 24 21:36:25.504314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.504183 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-85c5fc8d94-dvnrr"] Apr 24 21:36:25.507165 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:36:25.507140 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d3b7b13_4461_4dac_a0c5_1d9e857b2ae6.slice/crio-ca92c8615a4886373af78649dcf0e500738bca3993143d22ac6ab43679b0df96 WatchSource:0}: Error finding container ca92c8615a4886373af78649dcf0e500738bca3993143d22ac6ab43679b0df96: Status 404 returned error can't find the container with id ca92c8615a4886373af78649dcf0e500738bca3993143d22ac6ab43679b0df96 Apr 24 21:36:25.771109 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.771072 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-9db4b-8484996d65-m4klk" Apr 24 21:36:25.881593 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.881554 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3-openshift-service-ca-bundle\") pod \"b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3\" (UID: \"b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3\") " Apr 24 21:36:25.881776 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.881606 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3-proxy-tls\") pod \"b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3\" (UID: \"b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3\") " Apr 24 21:36:25.881930 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.881908 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3" (UID: "b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:36:25.883806 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.883785 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3" (UID: "b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:36:25.982559 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.982474 2573 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3-openshift-service-ca-bundle\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:36:25.982559 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:25.982501 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3-proxy-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:36:26.264019 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:26.263916 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-9db4b-8484996d65-m4klk" event={"ID":"b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3","Type":"ContainerDied","Data":"6bdfb6662409f473e8dc78576f3ce185c3953b6549788483ac8ee1bed612b23c"} Apr 24 21:36:26.264019 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:26.263936 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-9db4b-8484996d65-m4klk" Apr 24 21:36:26.264019 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:26.263972 2573 scope.go:117] "RemoveContainer" containerID="272566c92b915a438419dae42e47d0873d87618fc68a309ed3053db790c82b36" Apr 24 21:36:26.265545 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:26.265522 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-85c5fc8d94-dvnrr" event={"ID":"4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6","Type":"ContainerStarted","Data":"a86b9eebeff58e69b78d89ce7e6b3c50b9085b783d339af0a40a8409137709e2"} Apr 24 21:36:26.265645 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:26.265550 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-85c5fc8d94-dvnrr" event={"ID":"4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6","Type":"ContainerStarted","Data":"ca92c8615a4886373af78649dcf0e500738bca3993143d22ac6ab43679b0df96"} Apr 24 21:36:26.265703 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:26.265680 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-85c5fc8d94-dvnrr" Apr 24 21:36:26.299294 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:26.299247 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-85c5fc8d94-dvnrr" podStartSLOduration=1.299230073 podStartE2EDuration="1.299230073s" podCreationTimestamp="2026-04-24 21:36:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:36:26.297686034 +0000 UTC m=+563.558254867" watchObservedRunningTime="2026-04-24 21:36:26.299230073 +0000 UTC m=+563.559798904" Apr 24 21:36:26.312592 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:26.312558 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-9db4b-8484996d65-m4klk"] Apr 24 
21:36:26.322355 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:26.322327 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-9db4b-8484996d65-m4klk"] Apr 24 21:36:27.355949 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:27.355914 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3" path="/var/lib/kubelet/pods/b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3/volumes" Apr 24 21:36:32.275020 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:32.274986 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-85c5fc8d94-dvnrr" Apr 24 21:36:35.089415 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:35.089377 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-85c5fc8d94-dvnrr"] Apr 24 21:36:35.089874 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:35.089613 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-85c5fc8d94-dvnrr" podUID="4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6" containerName="model-chainer" containerID="cri-o://a86b9eebeff58e69b78d89ce7e6b3c50b9085b783d339af0a40a8409137709e2" gracePeriod=30 Apr 24 21:36:35.208161 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:35.208121 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc"] Apr 24 21:36:35.208486 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:35.208460 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" podUID="4249ca4d-5581-4557-aeb2-69c27ee9c041" containerName="kserve-container" containerID="cri-o://e1c5329e09832817e6d32875005e2237eff55f2193cddac11afa8cc467c98b63" gracePeriod=30 Apr 24 21:36:35.208635 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:35.208569 2573 kuberuntime_container.go:864] 
"Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" podUID="4249ca4d-5581-4557-aeb2-69c27ee9c041" containerName="kube-rbac-proxy" containerID="cri-o://7700b17d42839d85104013b3d7c7edd96473e58185df1668909ee1275227a79f" gracePeriod=30
Apr 24 21:36:36.298204 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:36.298169 2573 generic.go:358] "Generic (PLEG): container finished" podID="4249ca4d-5581-4557-aeb2-69c27ee9c041" containerID="7700b17d42839d85104013b3d7c7edd96473e58185df1668909ee1275227a79f" exitCode=2
Apr 24 21:36:36.298577 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:36.298226 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" event={"ID":"4249ca4d-5581-4557-aeb2-69c27ee9c041","Type":"ContainerDied","Data":"7700b17d42839d85104013b3d7c7edd96473e58185df1668909ee1275227a79f"}
Apr 24 21:36:37.273315 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:37.273268 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-85c5fc8d94-dvnrr" podUID="4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:36:39.052494 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:39.052469 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc"
Apr 24 21:36:39.198780 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:39.198746 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4249ca4d-5581-4557-aeb2-69c27ee9c041-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"4249ca4d-5581-4557-aeb2-69c27ee9c041\" (UID: \"4249ca4d-5581-4557-aeb2-69c27ee9c041\") "
Apr 24 21:36:39.198967 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:39.198791 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m6mq\" (UniqueName: \"kubernetes.io/projected/4249ca4d-5581-4557-aeb2-69c27ee9c041-kube-api-access-2m6mq\") pod \"4249ca4d-5581-4557-aeb2-69c27ee9c041\" (UID: \"4249ca4d-5581-4557-aeb2-69c27ee9c041\") "
Apr 24 21:36:39.198967 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:39.198826 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4249ca4d-5581-4557-aeb2-69c27ee9c041-proxy-tls\") pod \"4249ca4d-5581-4557-aeb2-69c27ee9c041\" (UID: \"4249ca4d-5581-4557-aeb2-69c27ee9c041\") "
Apr 24 21:36:39.198967 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:39.198876 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4249ca4d-5581-4557-aeb2-69c27ee9c041-kserve-provision-location\") pod \"4249ca4d-5581-4557-aeb2-69c27ee9c041\" (UID: \"4249ca4d-5581-4557-aeb2-69c27ee9c041\") "
Apr 24 21:36:39.199164 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:39.199137 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4249ca4d-5581-4557-aeb2-69c27ee9c041-isvc-xgboost-graph-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-kube-rbac-proxy-sar-config") pod "4249ca4d-5581-4557-aeb2-69c27ee9c041" (UID: "4249ca4d-5581-4557-aeb2-69c27ee9c041"). InnerVolumeSpecName "isvc-xgboost-graph-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:36:39.199292 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:39.199267 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4249ca4d-5581-4557-aeb2-69c27ee9c041-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4249ca4d-5581-4557-aeb2-69c27ee9c041" (UID: "4249ca4d-5581-4557-aeb2-69c27ee9c041"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:36:39.200981 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:39.200958 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4249ca4d-5581-4557-aeb2-69c27ee9c041-kube-api-access-2m6mq" (OuterVolumeSpecName: "kube-api-access-2m6mq") pod "4249ca4d-5581-4557-aeb2-69c27ee9c041" (UID: "4249ca4d-5581-4557-aeb2-69c27ee9c041"). InnerVolumeSpecName "kube-api-access-2m6mq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:36:39.201081 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:39.200964 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4249ca4d-5581-4557-aeb2-69c27ee9c041-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4249ca4d-5581-4557-aeb2-69c27ee9c041" (UID: "4249ca4d-5581-4557-aeb2-69c27ee9c041"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:36:39.299868 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:39.299829 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4249ca4d-5581-4557-aeb2-69c27ee9c041-kserve-provision-location\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:36:39.299868 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:39.299861 2573 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4249ca4d-5581-4557-aeb2-69c27ee9c041-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:36:39.299868 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:39.299872 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2m6mq\" (UniqueName: \"kubernetes.io/projected/4249ca4d-5581-4557-aeb2-69c27ee9c041-kube-api-access-2m6mq\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:36:39.300127 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:39.299883 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4249ca4d-5581-4557-aeb2-69c27ee9c041-proxy-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:36:39.309676 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:39.309647 2573 generic.go:358] "Generic (PLEG): container finished" podID="4249ca4d-5581-4557-aeb2-69c27ee9c041" containerID="e1c5329e09832817e6d32875005e2237eff55f2193cddac11afa8cc467c98b63" exitCode=0
Apr 24 21:36:39.309824 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:39.309731 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc"
Apr 24 21:36:39.309824 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:39.309729 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" event={"ID":"4249ca4d-5581-4557-aeb2-69c27ee9c041","Type":"ContainerDied","Data":"e1c5329e09832817e6d32875005e2237eff55f2193cddac11afa8cc467c98b63"}
Apr 24 21:36:39.309824 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:39.309766 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" event={"ID":"4249ca4d-5581-4557-aeb2-69c27ee9c041","Type":"ContainerDied","Data":"ba08f2a4bb65afbb5fc3ede78e2e6c696ba62a9e66570daff675eb298529a397"}
Apr 24 21:36:39.309824 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:39.309787 2573 scope.go:117] "RemoveContainer" containerID="7700b17d42839d85104013b3d7c7edd96473e58185df1668909ee1275227a79f"
Apr 24 21:36:39.318016 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:39.317992 2573 scope.go:117] "RemoveContainer" containerID="e1c5329e09832817e6d32875005e2237eff55f2193cddac11afa8cc467c98b63"
Apr 24 21:36:39.325546 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:39.325528 2573 scope.go:117] "RemoveContainer" containerID="7a2d7c87b498e6a550359b96562e6d003229373297438a9929bad55dcf087be6"
Apr 24 21:36:39.332932 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:39.332798 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc"]
Apr 24 21:36:39.332999 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:39.332957 2573 scope.go:117] "RemoveContainer" containerID="7700b17d42839d85104013b3d7c7edd96473e58185df1668909ee1275227a79f"
Apr 24 21:36:39.333270 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:36:39.333251 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7700b17d42839d85104013b3d7c7edd96473e58185df1668909ee1275227a79f\": container with ID starting with 7700b17d42839d85104013b3d7c7edd96473e58185df1668909ee1275227a79f not found: ID does not exist" containerID="7700b17d42839d85104013b3d7c7edd96473e58185df1668909ee1275227a79f"
Apr 24 21:36:39.333313 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:39.333279 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7700b17d42839d85104013b3d7c7edd96473e58185df1668909ee1275227a79f"} err="failed to get container status \"7700b17d42839d85104013b3d7c7edd96473e58185df1668909ee1275227a79f\": rpc error: code = NotFound desc = could not find container \"7700b17d42839d85104013b3d7c7edd96473e58185df1668909ee1275227a79f\": container with ID starting with 7700b17d42839d85104013b3d7c7edd96473e58185df1668909ee1275227a79f not found: ID does not exist"
Apr 24 21:36:39.333313 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:39.333299 2573 scope.go:117] "RemoveContainer" containerID="e1c5329e09832817e6d32875005e2237eff55f2193cddac11afa8cc467c98b63"
Apr 24 21:36:39.333501 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:36:39.333484 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1c5329e09832817e6d32875005e2237eff55f2193cddac11afa8cc467c98b63\": container with ID starting with e1c5329e09832817e6d32875005e2237eff55f2193cddac11afa8cc467c98b63 not found: ID does not exist" containerID="e1c5329e09832817e6d32875005e2237eff55f2193cddac11afa8cc467c98b63"
Apr 24 21:36:39.333550 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:39.333504 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1c5329e09832817e6d32875005e2237eff55f2193cddac11afa8cc467c98b63"} err="failed to get container status \"e1c5329e09832817e6d32875005e2237eff55f2193cddac11afa8cc467c98b63\": rpc error: code = NotFound desc = could not find container \"e1c5329e09832817e6d32875005e2237eff55f2193cddac11afa8cc467c98b63\": container with ID starting with e1c5329e09832817e6d32875005e2237eff55f2193cddac11afa8cc467c98b63 not found: ID does not exist"
Apr 24 21:36:39.333550 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:39.333517 2573 scope.go:117] "RemoveContainer" containerID="7a2d7c87b498e6a550359b96562e6d003229373297438a9929bad55dcf087be6"
Apr 24 21:36:39.333741 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:36:39.333722 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a2d7c87b498e6a550359b96562e6d003229373297438a9929bad55dcf087be6\": container with ID starting with 7a2d7c87b498e6a550359b96562e6d003229373297438a9929bad55dcf087be6 not found: ID does not exist" containerID="7a2d7c87b498e6a550359b96562e6d003229373297438a9929bad55dcf087be6"
Apr 24 21:36:39.333798 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:39.333751 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a2d7c87b498e6a550359b96562e6d003229373297438a9929bad55dcf087be6"} err="failed to get container status \"7a2d7c87b498e6a550359b96562e6d003229373297438a9929bad55dcf087be6\": rpc error: code = NotFound desc = could not find container \"7a2d7c87b498e6a550359b96562e6d003229373297438a9929bad55dcf087be6\": container with ID starting with 7a2d7c87b498e6a550359b96562e6d003229373297438a9929bad55dcf087be6 not found: ID does not exist"
Apr 24 21:36:39.339261 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:39.339235 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc"]
Apr 24 21:36:39.355906 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:39.355876 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4249ca4d-5581-4557-aeb2-69c27ee9c041" path="/var/lib/kubelet/pods/4249ca4d-5581-4557-aeb2-69c27ee9c041/volumes"
Apr 24 21:36:40.007309 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:40.007260 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-n7ckc" podUID="4249ca4d-5581-4557-aeb2-69c27ee9c041" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.23:8643/healthz\": context deadline exceeded"
Apr 24 21:36:42.273305 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:42.273208 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-85c5fc8d94-dvnrr" podUID="4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:36:47.273329 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:47.273289 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-85c5fc8d94-dvnrr" podUID="4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:36:47.273719 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:47.273403 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-85c5fc8d94-dvnrr"
Apr 24 21:36:52.273548 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:52.273492 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-85c5fc8d94-dvnrr" podUID="4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:36:55.404421 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:55.404380 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-6d3ba-548c7746dd-xgm2g"]
Apr 24 21:36:55.404802 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:55.404661 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4249ca4d-5581-4557-aeb2-69c27ee9c041" containerName="kserve-container"
Apr 24 21:36:55.404802 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:55.404672 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="4249ca4d-5581-4557-aeb2-69c27ee9c041" containerName="kserve-container"
Apr 24 21:36:55.404802 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:55.404693 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4249ca4d-5581-4557-aeb2-69c27ee9c041" containerName="storage-initializer"
Apr 24 21:36:55.404802 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:55.404699 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="4249ca4d-5581-4557-aeb2-69c27ee9c041" containerName="storage-initializer"
Apr 24 21:36:55.404802 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:55.404705 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4249ca4d-5581-4557-aeb2-69c27ee9c041" containerName="kube-rbac-proxy"
Apr 24 21:36:55.404802 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:55.404711 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="4249ca4d-5581-4557-aeb2-69c27ee9c041" containerName="kube-rbac-proxy"
Apr 24 21:36:55.404802 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:55.404718 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3" containerName="switch-graph-9db4b"
Apr 24 21:36:55.404802 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:55.404724 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3" containerName="switch-graph-9db4b"
Apr 24 21:36:55.404802 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:55.404772 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="4249ca4d-5581-4557-aeb2-69c27ee9c041" containerName="kserve-container"
Apr 24 21:36:55.404802 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:55.404779 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b4e5e7bc-e9dd-48c1-b2b3-05cfb35fbee3" containerName="switch-graph-9db4b"
Apr 24 21:36:55.404802 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:55.404785 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="4249ca4d-5581-4557-aeb2-69c27ee9c041" containerName="kube-rbac-proxy"
Apr 24 21:36:55.407497 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:55.407478 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-6d3ba-548c7746dd-xgm2g"
Apr 24 21:36:55.409633 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:55.409606 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-6d3ba-serving-cert\""
Apr 24 21:36:55.409765 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:55.409608 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-6d3ba-kube-rbac-proxy-sar-config\""
Apr 24 21:36:55.416475 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:55.416449 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-6d3ba-548c7746dd-xgm2g"]
Apr 24 21:36:55.533069 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:55.533025 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9e392a8-a3cb-4106-91db-d76190d71448-openshift-service-ca-bundle\") pod \"switch-graph-6d3ba-548c7746dd-xgm2g\" (UID: \"a9e392a8-a3cb-4106-91db-d76190d71448\") " pod="kserve-ci-e2e-test/switch-graph-6d3ba-548c7746dd-xgm2g"
Apr 24 21:36:55.533301 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:55.533084 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9e392a8-a3cb-4106-91db-d76190d71448-proxy-tls\") pod \"switch-graph-6d3ba-548c7746dd-xgm2g\" (UID: \"a9e392a8-a3cb-4106-91db-d76190d71448\") " pod="kserve-ci-e2e-test/switch-graph-6d3ba-548c7746dd-xgm2g"
Apr 24 21:36:55.633809 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:55.633763 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9e392a8-a3cb-4106-91db-d76190d71448-openshift-service-ca-bundle\") pod \"switch-graph-6d3ba-548c7746dd-xgm2g\" (UID: \"a9e392a8-a3cb-4106-91db-d76190d71448\") " pod="kserve-ci-e2e-test/switch-graph-6d3ba-548c7746dd-xgm2g"
Apr 24 21:36:55.633991 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:55.633848 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9e392a8-a3cb-4106-91db-d76190d71448-proxy-tls\") pod \"switch-graph-6d3ba-548c7746dd-xgm2g\" (UID: \"a9e392a8-a3cb-4106-91db-d76190d71448\") " pod="kserve-ci-e2e-test/switch-graph-6d3ba-548c7746dd-xgm2g"
Apr 24 21:36:55.634568 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:55.634542 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9e392a8-a3cb-4106-91db-d76190d71448-openshift-service-ca-bundle\") pod \"switch-graph-6d3ba-548c7746dd-xgm2g\" (UID: \"a9e392a8-a3cb-4106-91db-d76190d71448\") " pod="kserve-ci-e2e-test/switch-graph-6d3ba-548c7746dd-xgm2g"
Apr 24 21:36:55.636431 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:55.636409 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9e392a8-a3cb-4106-91db-d76190d71448-proxy-tls\") pod \"switch-graph-6d3ba-548c7746dd-xgm2g\" (UID: \"a9e392a8-a3cb-4106-91db-d76190d71448\") " pod="kserve-ci-e2e-test/switch-graph-6d3ba-548c7746dd-xgm2g"
Apr 24 21:36:55.718087 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:55.717996 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-6d3ba-548c7746dd-xgm2g"
Apr 24 21:36:55.842701 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:55.842676 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-6d3ba-548c7746dd-xgm2g"]
Apr 24 21:36:55.845496 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:36:55.845467 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9e392a8_a3cb_4106_91db_d76190d71448.slice/crio-dbfb09e0c94d195ad23a28ab31b34596131c4d24ae811ba984842c74c96eff95 WatchSource:0}: Error finding container dbfb09e0c94d195ad23a28ab31b34596131c4d24ae811ba984842c74c96eff95: Status 404 returned error can't find the container with id dbfb09e0c94d195ad23a28ab31b34596131c4d24ae811ba984842c74c96eff95
Apr 24 21:36:56.363236 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:56.363199 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-6d3ba-548c7746dd-xgm2g" event={"ID":"a9e392a8-a3cb-4106-91db-d76190d71448","Type":"ContainerStarted","Data":"5bd4b626388442655f328f8c2c1b1bf34ffcfce34e189063992425df0e3b84be"}
Apr 24 21:36:56.363236 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:56.363236 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-6d3ba-548c7746dd-xgm2g" event={"ID":"a9e392a8-a3cb-4106-91db-d76190d71448","Type":"ContainerStarted","Data":"dbfb09e0c94d195ad23a28ab31b34596131c4d24ae811ba984842c74c96eff95"}
Apr 24 21:36:56.363464 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:56.363320 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-6d3ba-548c7746dd-xgm2g"
Apr 24 21:36:56.380137 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:56.380053 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-6d3ba-548c7746dd-xgm2g" podStartSLOduration=1.380038093 podStartE2EDuration="1.380038093s" podCreationTimestamp="2026-04-24 21:36:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:36:56.379068137 +0000 UTC m=+593.639636970" watchObservedRunningTime="2026-04-24 21:36:56.380038093 +0000 UTC m=+593.640606927"
Apr 24 21:36:57.273850 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:36:57.273810 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-85c5fc8d94-dvnrr" podUID="4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:37:02.273630 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:02.273575 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-85c5fc8d94-dvnrr" podUID="4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 21:37:02.370886 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:02.370851 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-6d3ba-548c7746dd-xgm2g"
Apr 24 21:37:03.286770 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:03.286738 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-59fzh_494a20d0-ace3-481d-ae39-780b958f7150/cluster-monitoring-operator/0.log"
Apr 24 21:37:03.287258 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:03.286946 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-59fzh_494a20d0-ace3-481d-ae39-780b958f7150/cluster-monitoring-operator/0.log"
Apr 24 21:37:03.291750 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:03.291726 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4nztl_b3e83451-d37f-4c5b-837f-6440bed57b91/console-operator/1.log"
Apr 24 21:37:03.291897 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:03.291726 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4nztl_b3e83451-d37f-4c5b-837f-6440bed57b91/console-operator/1.log"
Apr 24 21:37:05.236050 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:05.236025 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-85c5fc8d94-dvnrr"
Apr 24 21:37:05.391796 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:05.391760 2573 generic.go:358] "Generic (PLEG): container finished" podID="4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6" containerID="a86b9eebeff58e69b78d89ce7e6b3c50b9085b783d339af0a40a8409137709e2" exitCode=0
Apr 24 21:37:05.391946 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:05.391805 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-85c5fc8d94-dvnrr" event={"ID":"4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6","Type":"ContainerDied","Data":"a86b9eebeff58e69b78d89ce7e6b3c50b9085b783d339af0a40a8409137709e2"}
Apr 24 21:37:05.391946 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:05.391828 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-85c5fc8d94-dvnrr"
Apr 24 21:37:05.391946 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:05.391847 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-85c5fc8d94-dvnrr" event={"ID":"4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6","Type":"ContainerDied","Data":"ca92c8615a4886373af78649dcf0e500738bca3993143d22ac6ab43679b0df96"}
Apr 24 21:37:05.391946 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:05.391864 2573 scope.go:117] "RemoveContainer" containerID="a86b9eebeff58e69b78d89ce7e6b3c50b9085b783d339af0a40a8409137709e2"
Apr 24 21:37:05.399601 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:05.399582 2573 scope.go:117] "RemoveContainer" containerID="a86b9eebeff58e69b78d89ce7e6b3c50b9085b783d339af0a40a8409137709e2"
Apr 24 21:37:05.399877 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:37:05.399859 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a86b9eebeff58e69b78d89ce7e6b3c50b9085b783d339af0a40a8409137709e2\": container with ID starting with a86b9eebeff58e69b78d89ce7e6b3c50b9085b783d339af0a40a8409137709e2 not found: ID does not exist" containerID="a86b9eebeff58e69b78d89ce7e6b3c50b9085b783d339af0a40a8409137709e2"
Apr 24 21:37:05.399919 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:05.399889 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a86b9eebeff58e69b78d89ce7e6b3c50b9085b783d339af0a40a8409137709e2"} err="failed to get container status \"a86b9eebeff58e69b78d89ce7e6b3c50b9085b783d339af0a40a8409137709e2\": rpc error: code = NotFound desc = could not find container \"a86b9eebeff58e69b78d89ce7e6b3c50b9085b783d339af0a40a8409137709e2\": container with ID starting with a86b9eebeff58e69b78d89ce7e6b3c50b9085b783d339af0a40a8409137709e2 not found: ID does not exist"
Apr 24 21:37:05.413715 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:05.413684 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6-proxy-tls\") pod \"4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6\" (UID: \"4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6\") "
Apr 24 21:37:05.413850 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:05.413735 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6-openshift-service-ca-bundle\") pod \"4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6\" (UID: \"4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6\") "
Apr 24 21:37:05.414212 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:05.414182 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6" (UID: "4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:37:05.415789 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:05.415769 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6" (UID: "4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:37:05.515016 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:05.514956 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6-proxy-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:37:05.515016 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:05.515007 2573 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6-openshift-service-ca-bundle\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\""
Apr 24 21:37:05.714625 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:05.714590 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-85c5fc8d94-dvnrr"]
Apr 24 21:37:05.719383 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:05.719357 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-85c5fc8d94-dvnrr"]
Apr 24 21:37:07.356987 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:07.356952 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6" path="/var/lib/kubelet/pods/4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6/volumes"
Apr 24 21:37:35.283846 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:35.283811 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-54155-84d8774cb-sx4nk"]
Apr 24 21:37:35.284456 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:35.284263 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6" containerName="model-chainer"
Apr 24 21:37:35.284456 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:35.284283 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6" containerName="model-chainer"
Apr 24 21:37:35.284456 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:35.284376 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="4d3b7b13-4461-4dac-a0c5-1d9e857b2ae6" containerName="model-chainer"
Apr 24 21:37:35.287245 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:35.287222 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-54155-84d8774cb-sx4nk"
Apr 24 21:37:35.289505 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:35.289481 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-54155-serving-cert\""
Apr 24 21:37:35.289686 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:35.289664 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-54155-kube-rbac-proxy-sar-config\""
Apr 24 21:37:35.297765 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:35.297736 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-54155-84d8774cb-sx4nk"]
Apr 24 21:37:35.347149 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:35.347084 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df15de12-ba86-43e8-bd9d-8cea0f40f72a-openshift-service-ca-bundle\") pod \"sequence-graph-54155-84d8774cb-sx4nk\" (UID: \"df15de12-ba86-43e8-bd9d-8cea0f40f72a\") " pod="kserve-ci-e2e-test/sequence-graph-54155-84d8774cb-sx4nk"
Apr 24 21:37:35.347321 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:35.347215 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df15de12-ba86-43e8-bd9d-8cea0f40f72a-proxy-tls\") pod \"sequence-graph-54155-84d8774cb-sx4nk\" (UID: \"df15de12-ba86-43e8-bd9d-8cea0f40f72a\") " pod="kserve-ci-e2e-test/sequence-graph-54155-84d8774cb-sx4nk"
Apr 24 21:37:35.447915 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:35.447871 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df15de12-ba86-43e8-bd9d-8cea0f40f72a-proxy-tls\") pod \"sequence-graph-54155-84d8774cb-sx4nk\" (UID: \"df15de12-ba86-43e8-bd9d-8cea0f40f72a\") " pod="kserve-ci-e2e-test/sequence-graph-54155-84d8774cb-sx4nk"
Apr 24 21:37:35.448143 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:35.447949 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df15de12-ba86-43e8-bd9d-8cea0f40f72a-openshift-service-ca-bundle\") pod \"sequence-graph-54155-84d8774cb-sx4nk\" (UID: \"df15de12-ba86-43e8-bd9d-8cea0f40f72a\") " pod="kserve-ci-e2e-test/sequence-graph-54155-84d8774cb-sx4nk"
Apr 24 21:37:35.448576 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:35.448555 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df15de12-ba86-43e8-bd9d-8cea0f40f72a-openshift-service-ca-bundle\") pod \"sequence-graph-54155-84d8774cb-sx4nk\" (UID: \"df15de12-ba86-43e8-bd9d-8cea0f40f72a\") " pod="kserve-ci-e2e-test/sequence-graph-54155-84d8774cb-sx4nk"
Apr 24 21:37:35.450317 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:35.450297 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df15de12-ba86-43e8-bd9d-8cea0f40f72a-proxy-tls\") pod \"sequence-graph-54155-84d8774cb-sx4nk\" (UID: \"df15de12-ba86-43e8-bd9d-8cea0f40f72a\") " pod="kserve-ci-e2e-test/sequence-graph-54155-84d8774cb-sx4nk"
Apr 24 21:37:35.598698 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:35.598598 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-54155-84d8774cb-sx4nk"
Apr 24 21:37:35.734007 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:35.733970 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-54155-84d8774cb-sx4nk"]
Apr 24 21:37:35.738277 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:37:35.738232 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf15de12_ba86_43e8_bd9d_8cea0f40f72a.slice/crio-b385a9938dc81083ca77fc66f6561be9df6cd319fdf3eb54d0f638dbd7c63788 WatchSource:0}: Error finding container b385a9938dc81083ca77fc66f6561be9df6cd319fdf3eb54d0f638dbd7c63788: Status 404 returned error can't find the container with id b385a9938dc81083ca77fc66f6561be9df6cd319fdf3eb54d0f638dbd7c63788
Apr 24 21:37:36.483952 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:36.483910 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-54155-84d8774cb-sx4nk" event={"ID":"df15de12-ba86-43e8-bd9d-8cea0f40f72a","Type":"ContainerStarted","Data":"fc67847fc4001e08360434585df56cc80da3d7b899882cfce701f377431feabe"}
Apr 24 21:37:36.483952 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:36.483955 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-54155-84d8774cb-sx4nk" event={"ID":"df15de12-ba86-43e8-bd9d-8cea0f40f72a","Type":"ContainerStarted","Data":"b385a9938dc81083ca77fc66f6561be9df6cd319fdf3eb54d0f638dbd7c63788"}
Apr 24 21:37:36.484418 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:36.483986 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-54155-84d8774cb-sx4nk"
Apr 24 21:37:36.518289 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:36.518235 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-54155-84d8774cb-sx4nk" podStartSLOduration=1.5182173159999999 podStartE2EDuration="1.518217316s" podCreationTimestamp="2026-04-24 21:37:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:37:36.517270593 +0000 UTC m=+633.777839423" watchObservedRunningTime="2026-04-24 21:37:36.518217316 +0000 UTC m=+633.778786148"
Apr 24 21:37:42.492643 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:37:42.492615 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-54155-84d8774cb-sx4nk"
Apr 24 21:42:03.307818 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:42:03.307739 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-59fzh_494a20d0-ace3-481d-ae39-780b958f7150/cluster-monitoring-operator/0.log"
Apr 24 21:42:03.310222 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:42:03.310195 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-59fzh_494a20d0-ace3-481d-ae39-780b958f7150/cluster-monitoring-operator/0.log"
Apr 24 21:42:03.312926 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:42:03.312900 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4nztl_b3e83451-d37f-4c5b-837f-6440bed57b91/console-operator/1.log"
Apr 24 21:42:03.314901 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:42:03.314883 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4nztl_b3e83451-d37f-4c5b-837f-6440bed57b91/console-operator/1.log"
Apr 24 21:45:10.182330 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:45:10.182285 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-6d3ba-548c7746dd-xgm2g"]
Apr 24 21:45:10.184785 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:45:10.182569 2573
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-6d3ba-548c7746dd-xgm2g" podUID="a9e392a8-a3cb-4106-91db-d76190d71448" containerName="switch-graph-6d3ba" containerID="cri-o://5bd4b626388442655f328f8c2c1b1bf34ffcfce34e189063992425df0e3b84be" gracePeriod=30 Apr 24 21:45:12.370657 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:45:12.370610 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-6d3ba-548c7746dd-xgm2g" podUID="a9e392a8-a3cb-4106-91db-d76190d71448" containerName="switch-graph-6d3ba" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:45:17.369860 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:45:17.369814 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-6d3ba-548c7746dd-xgm2g" podUID="a9e392a8-a3cb-4106-91db-d76190d71448" containerName="switch-graph-6d3ba" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:45:22.369846 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:45:22.369805 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-6d3ba-548c7746dd-xgm2g" podUID="a9e392a8-a3cb-4106-91db-d76190d71448" containerName="switch-graph-6d3ba" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:45:22.370281 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:45:22.369954 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-6d3ba-548c7746dd-xgm2g" Apr 24 21:45:27.370043 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:45:27.370003 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-6d3ba-548c7746dd-xgm2g" podUID="a9e392a8-a3cb-4106-91db-d76190d71448" containerName="switch-graph-6d3ba" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:45:32.369634 ip-10-0-136-201 kubenswrapper[2573]: 
I0424 21:45:32.369592 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-6d3ba-548c7746dd-xgm2g" podUID="a9e392a8-a3cb-4106-91db-d76190d71448" containerName="switch-graph-6d3ba" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:45:37.369677 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:45:37.369637 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-6d3ba-548c7746dd-xgm2g" podUID="a9e392a8-a3cb-4106-91db-d76190d71448" containerName="switch-graph-6d3ba" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:45:40.324918 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:45:40.324893 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-6d3ba-548c7746dd-xgm2g" Apr 24 21:45:40.397641 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:45:40.397603 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9e392a8-a3cb-4106-91db-d76190d71448-openshift-service-ca-bundle\") pod \"a9e392a8-a3cb-4106-91db-d76190d71448\" (UID: \"a9e392a8-a3cb-4106-91db-d76190d71448\") " Apr 24 21:45:40.397830 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:45:40.397654 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9e392a8-a3cb-4106-91db-d76190d71448-proxy-tls\") pod \"a9e392a8-a3cb-4106-91db-d76190d71448\" (UID: \"a9e392a8-a3cb-4106-91db-d76190d71448\") " Apr 24 21:45:40.397988 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:45:40.397964 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9e392a8-a3cb-4106-91db-d76190d71448-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "a9e392a8-a3cb-4106-91db-d76190d71448" (UID: 
"a9e392a8-a3cb-4106-91db-d76190d71448"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:45:40.399791 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:45:40.399770 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9e392a8-a3cb-4106-91db-d76190d71448-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a9e392a8-a3cb-4106-91db-d76190d71448" (UID: "a9e392a8-a3cb-4106-91db-d76190d71448"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:45:40.498664 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:45:40.498568 2573 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9e392a8-a3cb-4106-91db-d76190d71448-openshift-service-ca-bundle\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:45:40.498664 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:45:40.498599 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a9e392a8-a3cb-4106-91db-d76190d71448-proxy-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:45:40.902978 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:45:40.902942 2573 generic.go:358] "Generic (PLEG): container finished" podID="a9e392a8-a3cb-4106-91db-d76190d71448" containerID="5bd4b626388442655f328f8c2c1b1bf34ffcfce34e189063992425df0e3b84be" exitCode=0 Apr 24 21:45:40.903290 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:45:40.903002 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-6d3ba-548c7746dd-xgm2g" Apr 24 21:45:40.903290 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:45:40.903031 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-6d3ba-548c7746dd-xgm2g" event={"ID":"a9e392a8-a3cb-4106-91db-d76190d71448","Type":"ContainerDied","Data":"5bd4b626388442655f328f8c2c1b1bf34ffcfce34e189063992425df0e3b84be"} Apr 24 21:45:40.903290 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:45:40.903077 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-6d3ba-548c7746dd-xgm2g" event={"ID":"a9e392a8-a3cb-4106-91db-d76190d71448","Type":"ContainerDied","Data":"dbfb09e0c94d195ad23a28ab31b34596131c4d24ae811ba984842c74c96eff95"} Apr 24 21:45:40.903290 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:45:40.903116 2573 scope.go:117] "RemoveContainer" containerID="5bd4b626388442655f328f8c2c1b1bf34ffcfce34e189063992425df0e3b84be" Apr 24 21:45:40.911309 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:45:40.911292 2573 scope.go:117] "RemoveContainer" containerID="5bd4b626388442655f328f8c2c1b1bf34ffcfce34e189063992425df0e3b84be" Apr 24 21:45:40.911570 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:45:40.911549 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bd4b626388442655f328f8c2c1b1bf34ffcfce34e189063992425df0e3b84be\": container with ID starting with 5bd4b626388442655f328f8c2c1b1bf34ffcfce34e189063992425df0e3b84be not found: ID does not exist" containerID="5bd4b626388442655f328f8c2c1b1bf34ffcfce34e189063992425df0e3b84be" Apr 24 21:45:40.911641 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:45:40.911578 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bd4b626388442655f328f8c2c1b1bf34ffcfce34e189063992425df0e3b84be"} err="failed to get container status 
\"5bd4b626388442655f328f8c2c1b1bf34ffcfce34e189063992425df0e3b84be\": rpc error: code = NotFound desc = could not find container \"5bd4b626388442655f328f8c2c1b1bf34ffcfce34e189063992425df0e3b84be\": container with ID starting with 5bd4b626388442655f328f8c2c1b1bf34ffcfce34e189063992425df0e3b84be not found: ID does not exist" Apr 24 21:45:40.923973 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:45:40.923948 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-6d3ba-548c7746dd-xgm2g"] Apr 24 21:45:40.928223 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:45:40.928198 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-6d3ba-548c7746dd-xgm2g"] Apr 24 21:45:41.357079 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:45:41.357047 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9e392a8-a3cb-4106-91db-d76190d71448" path="/var/lib/kubelet/pods/a9e392a8-a3cb-4106-91db-d76190d71448/volumes" Apr 24 21:45:50.062902 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:45:50.062869 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-54155-84d8774cb-sx4nk"] Apr 24 21:45:50.063356 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:45:50.063109 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-54155-84d8774cb-sx4nk" podUID="df15de12-ba86-43e8-bd9d-8cea0f40f72a" containerName="sequence-graph-54155" containerID="cri-o://fc67847fc4001e08360434585df56cc80da3d7b899882cfce701f377431feabe" gracePeriod=30 Apr 24 21:45:52.491184 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:45:52.491142 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-54155-84d8774cb-sx4nk" podUID="df15de12-ba86-43e8-bd9d-8cea0f40f72a" containerName="sequence-graph-54155" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:45:57.490535 ip-10-0-136-201 
kubenswrapper[2573]: I0424 21:45:57.490481 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-54155-84d8774cb-sx4nk" podUID="df15de12-ba86-43e8-bd9d-8cea0f40f72a" containerName="sequence-graph-54155" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:46:02.490267 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:02.490225 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-54155-84d8774cb-sx4nk" podUID="df15de12-ba86-43e8-bd9d-8cea0f40f72a" containerName="sequence-graph-54155" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:46:02.490701 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:02.490333 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-54155-84d8774cb-sx4nk" Apr 24 21:46:07.490967 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:07.490921 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-54155-84d8774cb-sx4nk" podUID="df15de12-ba86-43e8-bd9d-8cea0f40f72a" containerName="sequence-graph-54155" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:46:10.387203 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:10.387162 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc"] Apr 24 21:46:10.387606 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:10.387485 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a9e392a8-a3cb-4106-91db-d76190d71448" containerName="switch-graph-6d3ba" Apr 24 21:46:10.387606 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:10.387499 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9e392a8-a3cb-4106-91db-d76190d71448" containerName="switch-graph-6d3ba" Apr 24 21:46:10.387606 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:10.387562 2573 
memory_manager.go:356] "RemoveStaleState removing state" podUID="a9e392a8-a3cb-4106-91db-d76190d71448" containerName="switch-graph-6d3ba" Apr 24 21:46:10.391394 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:10.391376 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc" Apr 24 21:46:10.393672 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:10.393646 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-dd3fc-kube-rbac-proxy-sar-config\"" Apr 24 21:46:10.393770 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:10.393646 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-dd3fc-serving-cert\"" Apr 24 21:46:10.396933 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:10.396910 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc"] Apr 24 21:46:10.539353 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:10.539310 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/40bf3ac8-34f0-4165-8262-5138231ed11f-proxy-tls\") pod \"ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc\" (UID: \"40bf3ac8-34f0-4165-8262-5138231ed11f\") " pod="kserve-ci-e2e-test/ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc" Apr 24 21:46:10.539546 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:10.539388 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40bf3ac8-34f0-4165-8262-5138231ed11f-openshift-service-ca-bundle\") pod \"ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc\" (UID: \"40bf3ac8-34f0-4165-8262-5138231ed11f\") " pod="kserve-ci-e2e-test/ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc" Apr 24 21:46:10.640739 ip-10-0-136-201 
kubenswrapper[2573]: I0424 21:46:10.640651 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40bf3ac8-34f0-4165-8262-5138231ed11f-openshift-service-ca-bundle\") pod \"ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc\" (UID: \"40bf3ac8-34f0-4165-8262-5138231ed11f\") " pod="kserve-ci-e2e-test/ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc" Apr 24 21:46:10.640739 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:10.640712 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/40bf3ac8-34f0-4165-8262-5138231ed11f-proxy-tls\") pod \"ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc\" (UID: \"40bf3ac8-34f0-4165-8262-5138231ed11f\") " pod="kserve-ci-e2e-test/ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc" Apr 24 21:46:10.640950 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:46:10.640816 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/ensemble-graph-dd3fc-serving-cert: secret "ensemble-graph-dd3fc-serving-cert" not found Apr 24 21:46:10.640950 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:46:10.640886 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40bf3ac8-34f0-4165-8262-5138231ed11f-proxy-tls podName:40bf3ac8-34f0-4165-8262-5138231ed11f nodeName:}" failed. No retries permitted until 2026-04-24 21:46:11.140867431 +0000 UTC m=+1148.401436244 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/40bf3ac8-34f0-4165-8262-5138231ed11f-proxy-tls") pod "ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc" (UID: "40bf3ac8-34f0-4165-8262-5138231ed11f") : secret "ensemble-graph-dd3fc-serving-cert" not found Apr 24 21:46:10.641372 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:10.641354 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40bf3ac8-34f0-4165-8262-5138231ed11f-openshift-service-ca-bundle\") pod \"ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc\" (UID: \"40bf3ac8-34f0-4165-8262-5138231ed11f\") " pod="kserve-ci-e2e-test/ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc" Apr 24 21:46:11.146336 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:11.146298 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/40bf3ac8-34f0-4165-8262-5138231ed11f-proxy-tls\") pod \"ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc\" (UID: \"40bf3ac8-34f0-4165-8262-5138231ed11f\") " pod="kserve-ci-e2e-test/ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc" Apr 24 21:46:11.148762 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:11.148741 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/40bf3ac8-34f0-4165-8262-5138231ed11f-proxy-tls\") pod \"ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc\" (UID: \"40bf3ac8-34f0-4165-8262-5138231ed11f\") " pod="kserve-ci-e2e-test/ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc" Apr 24 21:46:11.302025 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:11.301991 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc" Apr 24 21:46:11.429702 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:11.429616 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc"] Apr 24 21:46:11.433062 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:46:11.433021 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40bf3ac8_34f0_4165_8262_5138231ed11f.slice/crio-b4d9ed11259a55e1cf368170421d2cce78740ed3f50c145dfa492ed588ab209b WatchSource:0}: Error finding container b4d9ed11259a55e1cf368170421d2cce78740ed3f50c145dfa492ed588ab209b: Status 404 returned error can't find the container with id b4d9ed11259a55e1cf368170421d2cce78740ed3f50c145dfa492ed588ab209b Apr 24 21:46:11.434903 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:11.434882 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:46:11.993482 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:11.993447 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc" event={"ID":"40bf3ac8-34f0-4165-8262-5138231ed11f","Type":"ContainerStarted","Data":"a1b4fb135a5f9ee1ee7cbb4cb9a66a3978bcc673e964172a1e440b8a7cd51dfc"} Apr 24 21:46:11.993482 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:11.993485 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc" event={"ID":"40bf3ac8-34f0-4165-8262-5138231ed11f","Type":"ContainerStarted","Data":"b4d9ed11259a55e1cf368170421d2cce78740ed3f50c145dfa492ed588ab209b"} Apr 24 21:46:11.993864 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:11.993515 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc" Apr 24 21:46:12.011612 ip-10-0-136-201 
kubenswrapper[2573]: I0424 21:46:12.011555 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc" podStartSLOduration=2.011541286 podStartE2EDuration="2.011541286s" podCreationTimestamp="2026-04-24 21:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:46:12.010148329 +0000 UTC m=+1149.270717162" watchObservedRunningTime="2026-04-24 21:46:12.011541286 +0000 UTC m=+1149.272110147" Apr 24 21:46:12.491261 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:12.491224 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-54155-84d8774cb-sx4nk" podUID="df15de12-ba86-43e8-bd9d-8cea0f40f72a" containerName="sequence-graph-54155" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:46:17.491025 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:17.490975 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-54155-84d8774cb-sx4nk" podUID="df15de12-ba86-43e8-bd9d-8cea0f40f72a" containerName="sequence-graph-54155" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:46:18.002654 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:18.002620 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc" Apr 24 21:46:20.449544 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:20.449507 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc"] Apr 24 21:46:20.450018 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:20.449722 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc" podUID="40bf3ac8-34f0-4165-8262-5138231ed11f" containerName="ensemble-graph-dd3fc" 
containerID="cri-o://a1b4fb135a5f9ee1ee7cbb4cb9a66a3978bcc673e964172a1e440b8a7cd51dfc" gracePeriod=30 Apr 24 21:46:20.725519 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:20.725490 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-54155-84d8774cb-sx4nk" Apr 24 21:46:20.828029 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:20.827990 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df15de12-ba86-43e8-bd9d-8cea0f40f72a-proxy-tls\") pod \"df15de12-ba86-43e8-bd9d-8cea0f40f72a\" (UID: \"df15de12-ba86-43e8-bd9d-8cea0f40f72a\") " Apr 24 21:46:20.828029 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:20.828034 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df15de12-ba86-43e8-bd9d-8cea0f40f72a-openshift-service-ca-bundle\") pod \"df15de12-ba86-43e8-bd9d-8cea0f40f72a\" (UID: \"df15de12-ba86-43e8-bd9d-8cea0f40f72a\") " Apr 24 21:46:20.828484 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:20.828456 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df15de12-ba86-43e8-bd9d-8cea0f40f72a-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "df15de12-ba86-43e8-bd9d-8cea0f40f72a" (UID: "df15de12-ba86-43e8-bd9d-8cea0f40f72a"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:46:20.830188 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:20.830164 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df15de12-ba86-43e8-bd9d-8cea0f40f72a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "df15de12-ba86-43e8-bd9d-8cea0f40f72a" (UID: "df15de12-ba86-43e8-bd9d-8cea0f40f72a"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:46:20.929316 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:20.929281 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df15de12-ba86-43e8-bd9d-8cea0f40f72a-proxy-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:46:20.929316 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:20.929313 2573 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df15de12-ba86-43e8-bd9d-8cea0f40f72a-openshift-service-ca-bundle\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:46:21.021022 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:21.020917 2573 generic.go:358] "Generic (PLEG): container finished" podID="df15de12-ba86-43e8-bd9d-8cea0f40f72a" containerID="fc67847fc4001e08360434585df56cc80da3d7b899882cfce701f377431feabe" exitCode=0 Apr 24 21:46:21.021022 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:21.020993 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-54155-84d8774cb-sx4nk" Apr 24 21:46:21.021022 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:21.021007 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-54155-84d8774cb-sx4nk" event={"ID":"df15de12-ba86-43e8-bd9d-8cea0f40f72a","Type":"ContainerDied","Data":"fc67847fc4001e08360434585df56cc80da3d7b899882cfce701f377431feabe"} Apr 24 21:46:21.021317 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:21.021043 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-54155-84d8774cb-sx4nk" event={"ID":"df15de12-ba86-43e8-bd9d-8cea0f40f72a","Type":"ContainerDied","Data":"b385a9938dc81083ca77fc66f6561be9df6cd319fdf3eb54d0f638dbd7c63788"} Apr 24 21:46:21.021317 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:21.021059 2573 scope.go:117] "RemoveContainer" containerID="fc67847fc4001e08360434585df56cc80da3d7b899882cfce701f377431feabe" Apr 24 21:46:21.035408 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:21.035381 2573 scope.go:117] "RemoveContainer" containerID="fc67847fc4001e08360434585df56cc80da3d7b899882cfce701f377431feabe" Apr 24 21:46:21.035803 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:46:21.035769 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc67847fc4001e08360434585df56cc80da3d7b899882cfce701f377431feabe\": container with ID starting with fc67847fc4001e08360434585df56cc80da3d7b899882cfce701f377431feabe not found: ID does not exist" containerID="fc67847fc4001e08360434585df56cc80da3d7b899882cfce701f377431feabe" Apr 24 21:46:21.035857 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:21.035827 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc67847fc4001e08360434585df56cc80da3d7b899882cfce701f377431feabe"} err="failed to get container status 
\"fc67847fc4001e08360434585df56cc80da3d7b899882cfce701f377431feabe\": rpc error: code = NotFound desc = could not find container \"fc67847fc4001e08360434585df56cc80da3d7b899882cfce701f377431feabe\": container with ID starting with fc67847fc4001e08360434585df56cc80da3d7b899882cfce701f377431feabe not found: ID does not exist" Apr 24 21:46:21.044869 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:21.044838 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-54155-84d8774cb-sx4nk"] Apr 24 21:46:21.048505 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:21.048474 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-54155-84d8774cb-sx4nk"] Apr 24 21:46:21.356228 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:21.356184 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df15de12-ba86-43e8-bd9d-8cea0f40f72a" path="/var/lib/kubelet/pods/df15de12-ba86-43e8-bd9d-8cea0f40f72a/volumes" Apr 24 21:46:23.000841 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:23.000802 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc" podUID="40bf3ac8-34f0-4165-8262-5138231ed11f" containerName="ensemble-graph-dd3fc" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:46:28.000565 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:28.000523 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc" podUID="40bf3ac8-34f0-4165-8262-5138231ed11f" containerName="ensemble-graph-dd3fc" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:46:33.000514 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:33.000473 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc" podUID="40bf3ac8-34f0-4165-8262-5138231ed11f" containerName="ensemble-graph-dd3fc" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:46:33.000989 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:33.000617 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc" Apr 24 21:46:38.001193 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:38.001143 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc" podUID="40bf3ac8-34f0-4165-8262-5138231ed11f" containerName="ensemble-graph-dd3fc" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:46:43.000928 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:43.000882 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc" podUID="40bf3ac8-34f0-4165-8262-5138231ed11f" containerName="ensemble-graph-dd3fc" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:46:48.000949 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:48.000901 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc" podUID="40bf3ac8-34f0-4165-8262-5138231ed11f" containerName="ensemble-graph-dd3fc" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:46:50.210284 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:50.210243 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-6445b-696c86c896-8k2tz"] Apr 24 21:46:50.210799 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:50.210684 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df15de12-ba86-43e8-bd9d-8cea0f40f72a" containerName="sequence-graph-54155" Apr 24 21:46:50.210799 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:50.210704 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="df15de12-ba86-43e8-bd9d-8cea0f40f72a" 
containerName="sequence-graph-54155" Apr 24 21:46:50.210799 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:50.210787 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="df15de12-ba86-43e8-bd9d-8cea0f40f72a" containerName="sequence-graph-54155" Apr 24 21:46:50.214008 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:50.213981 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-6445b-696c86c896-8k2tz" Apr 24 21:46:50.216648 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:50.216624 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-6445b-serving-cert\"" Apr 24 21:46:50.216782 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:50.216624 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-6445b-kube-rbac-proxy-sar-config\"" Apr 24 21:46:50.224442 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:50.224411 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-6445b-696c86c896-8k2tz"] Apr 24 21:46:50.255243 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:50.255208 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61d18dda-aba2-4aa4-a61b-5a66c6078382-openshift-service-ca-bundle\") pod \"sequence-graph-6445b-696c86c896-8k2tz\" (UID: \"61d18dda-aba2-4aa4-a61b-5a66c6078382\") " pod="kserve-ci-e2e-test/sequence-graph-6445b-696c86c896-8k2tz" Apr 24 21:46:50.255429 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:50.255257 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61d18dda-aba2-4aa4-a61b-5a66c6078382-proxy-tls\") pod \"sequence-graph-6445b-696c86c896-8k2tz\" (UID: 
\"61d18dda-aba2-4aa4-a61b-5a66c6078382\") " pod="kserve-ci-e2e-test/sequence-graph-6445b-696c86c896-8k2tz" Apr 24 21:46:50.356284 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:50.356241 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61d18dda-aba2-4aa4-a61b-5a66c6078382-proxy-tls\") pod \"sequence-graph-6445b-696c86c896-8k2tz\" (UID: \"61d18dda-aba2-4aa4-a61b-5a66c6078382\") " pod="kserve-ci-e2e-test/sequence-graph-6445b-696c86c896-8k2tz" Apr 24 21:46:50.356476 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:50.356350 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61d18dda-aba2-4aa4-a61b-5a66c6078382-openshift-service-ca-bundle\") pod \"sequence-graph-6445b-696c86c896-8k2tz\" (UID: \"61d18dda-aba2-4aa4-a61b-5a66c6078382\") " pod="kserve-ci-e2e-test/sequence-graph-6445b-696c86c896-8k2tz" Apr 24 21:46:50.356476 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:46:50.356406 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-6445b-serving-cert: secret "sequence-graph-6445b-serving-cert" not found Apr 24 21:46:50.356569 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:46:50.356479 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61d18dda-aba2-4aa4-a61b-5a66c6078382-proxy-tls podName:61d18dda-aba2-4aa4-a61b-5a66c6078382 nodeName:}" failed. No retries permitted until 2026-04-24 21:46:50.856456955 +0000 UTC m=+1188.117025772 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/61d18dda-aba2-4aa4-a61b-5a66c6078382-proxy-tls") pod "sequence-graph-6445b-696c86c896-8k2tz" (UID: "61d18dda-aba2-4aa4-a61b-5a66c6078382") : secret "sequence-graph-6445b-serving-cert" not found Apr 24 21:46:50.357012 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:50.356993 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61d18dda-aba2-4aa4-a61b-5a66c6078382-openshift-service-ca-bundle\") pod \"sequence-graph-6445b-696c86c896-8k2tz\" (UID: \"61d18dda-aba2-4aa4-a61b-5a66c6078382\") " pod="kserve-ci-e2e-test/sequence-graph-6445b-696c86c896-8k2tz" Apr 24 21:46:50.860382 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:50.860349 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61d18dda-aba2-4aa4-a61b-5a66c6078382-proxy-tls\") pod \"sequence-graph-6445b-696c86c896-8k2tz\" (UID: \"61d18dda-aba2-4aa4-a61b-5a66c6078382\") " pod="kserve-ci-e2e-test/sequence-graph-6445b-696c86c896-8k2tz" Apr 24 21:46:50.862818 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:50.862794 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61d18dda-aba2-4aa4-a61b-5a66c6078382-proxy-tls\") pod \"sequence-graph-6445b-696c86c896-8k2tz\" (UID: \"61d18dda-aba2-4aa4-a61b-5a66c6078382\") " pod="kserve-ci-e2e-test/sequence-graph-6445b-696c86c896-8k2tz" Apr 24 21:46:51.095740 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:51.095718 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc" Apr 24 21:46:51.114432 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:51.114351 2573 generic.go:358] "Generic (PLEG): container finished" podID="40bf3ac8-34f0-4165-8262-5138231ed11f" containerID="a1b4fb135a5f9ee1ee7cbb4cb9a66a3978bcc673e964172a1e440b8a7cd51dfc" exitCode=0 Apr 24 21:46:51.114432 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:51.114413 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc" Apr 24 21:46:51.114643 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:51.114433 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc" event={"ID":"40bf3ac8-34f0-4165-8262-5138231ed11f","Type":"ContainerDied","Data":"a1b4fb135a5f9ee1ee7cbb4cb9a66a3978bcc673e964172a1e440b8a7cd51dfc"} Apr 24 21:46:51.114643 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:51.114477 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc" event={"ID":"40bf3ac8-34f0-4165-8262-5138231ed11f","Type":"ContainerDied","Data":"b4d9ed11259a55e1cf368170421d2cce78740ed3f50c145dfa492ed588ab209b"} Apr 24 21:46:51.114643 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:51.114497 2573 scope.go:117] "RemoveContainer" containerID="a1b4fb135a5f9ee1ee7cbb4cb9a66a3978bcc673e964172a1e440b8a7cd51dfc" Apr 24 21:46:51.123307 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:51.123288 2573 scope.go:117] "RemoveContainer" containerID="a1b4fb135a5f9ee1ee7cbb4cb9a66a3978bcc673e964172a1e440b8a7cd51dfc" Apr 24 21:46:51.123607 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:46:51.123587 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1b4fb135a5f9ee1ee7cbb4cb9a66a3978bcc673e964172a1e440b8a7cd51dfc\": container with ID starting with 
a1b4fb135a5f9ee1ee7cbb4cb9a66a3978bcc673e964172a1e440b8a7cd51dfc not found: ID does not exist" containerID="a1b4fb135a5f9ee1ee7cbb4cb9a66a3978bcc673e964172a1e440b8a7cd51dfc" Apr 24 21:46:51.123722 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:51.123618 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1b4fb135a5f9ee1ee7cbb4cb9a66a3978bcc673e964172a1e440b8a7cd51dfc"} err="failed to get container status \"a1b4fb135a5f9ee1ee7cbb4cb9a66a3978bcc673e964172a1e440b8a7cd51dfc\": rpc error: code = NotFound desc = could not find container \"a1b4fb135a5f9ee1ee7cbb4cb9a66a3978bcc673e964172a1e440b8a7cd51dfc\": container with ID starting with a1b4fb135a5f9ee1ee7cbb4cb9a66a3978bcc673e964172a1e440b8a7cd51dfc not found: ID does not exist" Apr 24 21:46:51.124715 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:51.124694 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-6445b-696c86c896-8k2tz" Apr 24 21:46:51.162673 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:51.162637 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/40bf3ac8-34f0-4165-8262-5138231ed11f-proxy-tls\") pod \"40bf3ac8-34f0-4165-8262-5138231ed11f\" (UID: \"40bf3ac8-34f0-4165-8262-5138231ed11f\") " Apr 24 21:46:51.162874 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:51.162693 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40bf3ac8-34f0-4165-8262-5138231ed11f-openshift-service-ca-bundle\") pod \"40bf3ac8-34f0-4165-8262-5138231ed11f\" (UID: \"40bf3ac8-34f0-4165-8262-5138231ed11f\") " Apr 24 21:46:51.163065 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:51.163035 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/40bf3ac8-34f0-4165-8262-5138231ed11f-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "40bf3ac8-34f0-4165-8262-5138231ed11f" (UID: "40bf3ac8-34f0-4165-8262-5138231ed11f"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:46:51.164725 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:51.164691 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40bf3ac8-34f0-4165-8262-5138231ed11f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "40bf3ac8-34f0-4165-8262-5138231ed11f" (UID: "40bf3ac8-34f0-4165-8262-5138231ed11f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:46:51.243537 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:51.243504 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-6445b-696c86c896-8k2tz"] Apr 24 21:46:51.246599 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:46:51.246571 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61d18dda_aba2_4aa4_a61b_5a66c6078382.slice/crio-9145bfbac137b132bc2c8265a9ce84c0144ff1354a3245cca43b00330ea377b4 WatchSource:0}: Error finding container 9145bfbac137b132bc2c8265a9ce84c0144ff1354a3245cca43b00330ea377b4: Status 404 returned error can't find the container with id 9145bfbac137b132bc2c8265a9ce84c0144ff1354a3245cca43b00330ea377b4 Apr 24 21:46:51.263529 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:51.263499 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/40bf3ac8-34f0-4165-8262-5138231ed11f-proxy-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:46:51.263632 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:51.263538 2573 reconciler_common.go:299] "Volume detached for volume 
\"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40bf3ac8-34f0-4165-8262-5138231ed11f-openshift-service-ca-bundle\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:46:51.429529 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:51.429441 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc"] Apr 24 21:46:51.432355 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:51.432329 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-dd3fc-6bb9dd7f4b-fj4mc"] Apr 24 21:46:52.119032 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:52.118995 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-6445b-696c86c896-8k2tz" event={"ID":"61d18dda-aba2-4aa4-a61b-5a66c6078382","Type":"ContainerStarted","Data":"75954004cd37e1fea3c739e417a4bd9ba054a451b94b9a3d14c57ea36e37f11d"} Apr 24 21:46:52.119032 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:52.119033 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-6445b-696c86c896-8k2tz" event={"ID":"61d18dda-aba2-4aa4-a61b-5a66c6078382","Type":"ContainerStarted","Data":"9145bfbac137b132bc2c8265a9ce84c0144ff1354a3245cca43b00330ea377b4"} Apr 24 21:46:52.119288 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:52.119062 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-6445b-696c86c896-8k2tz" Apr 24 21:46:52.136970 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:52.136917 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-6445b-696c86c896-8k2tz" podStartSLOduration=2.136900217 podStartE2EDuration="2.136900217s" podCreationTimestamp="2026-04-24 21:46:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 
21:46:52.135752763 +0000 UTC m=+1189.396321596" watchObservedRunningTime="2026-04-24 21:46:52.136900217 +0000 UTC m=+1189.397469050" Apr 24 21:46:53.356303 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:53.356265 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40bf3ac8-34f0-4165-8262-5138231ed11f" path="/var/lib/kubelet/pods/40bf3ac8-34f0-4165-8262-5138231ed11f/volumes" Apr 24 21:46:58.128333 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:46:58.128301 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-6445b-696c86c896-8k2tz" Apr 24 21:47:00.294408 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:00.294348 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-6445b-696c86c896-8k2tz"] Apr 24 21:47:00.294824 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:00.294657 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-6445b-696c86c896-8k2tz" podUID="61d18dda-aba2-4aa4-a61b-5a66c6078382" containerName="sequence-graph-6445b" containerID="cri-o://75954004cd37e1fea3c739e417a4bd9ba054a451b94b9a3d14c57ea36e37f11d" gracePeriod=30 Apr 24 21:47:03.127184 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:03.127136 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-6445b-696c86c896-8k2tz" podUID="61d18dda-aba2-4aa4-a61b-5a66c6078382" containerName="sequence-graph-6445b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:47:03.329048 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:03.329009 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-59fzh_494a20d0-ace3-481d-ae39-780b958f7150/cluster-monitoring-operator/0.log" Apr 24 21:47:03.331139 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:03.331117 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-59fzh_494a20d0-ace3-481d-ae39-780b958f7150/cluster-monitoring-operator/0.log" Apr 24 21:47:03.333208 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:03.333183 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4nztl_b3e83451-d37f-4c5b-837f-6440bed57b91/console-operator/1.log" Apr 24 21:47:03.335440 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:03.335413 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4nztl_b3e83451-d37f-4c5b-837f-6440bed57b91/console-operator/1.log" Apr 24 21:47:08.127652 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:08.127610 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-6445b-696c86c896-8k2tz" podUID="61d18dda-aba2-4aa4-a61b-5a66c6078382" containerName="sequence-graph-6445b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:47:13.127380 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:13.127332 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-6445b-696c86c896-8k2tz" podUID="61d18dda-aba2-4aa4-a61b-5a66c6078382" containerName="sequence-graph-6445b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:47:13.127829 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:13.127475 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-6445b-696c86c896-8k2tz" Apr 24 21:47:18.126734 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:18.126691 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-6445b-696c86c896-8k2tz" podUID="61d18dda-aba2-4aa4-a61b-5a66c6078382" containerName="sequence-graph-6445b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 
21:47:20.641649 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:20.641613 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-a9fd5-7b8774d5b4-6s425"] Apr 24 21:47:20.642035 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:20.641910 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40bf3ac8-34f0-4165-8262-5138231ed11f" containerName="ensemble-graph-dd3fc" Apr 24 21:47:20.642035 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:20.641923 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="40bf3ac8-34f0-4165-8262-5138231ed11f" containerName="ensemble-graph-dd3fc" Apr 24 21:47:20.642035 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:20.641990 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="40bf3ac8-34f0-4165-8262-5138231ed11f" containerName="ensemble-graph-dd3fc" Apr 24 21:47:20.646208 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:20.646189 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-a9fd5-7b8774d5b4-6s425" Apr 24 21:47:20.648575 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:20.648543 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-a9fd5-serving-cert\"" Apr 24 21:47:20.648712 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:20.648543 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-a9fd5-kube-rbac-proxy-sar-config\"" Apr 24 21:47:20.651447 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:20.651405 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-a9fd5-7b8774d5b4-6s425"] Apr 24 21:47:20.807722 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:20.807680 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26969180-d212-46ae-aa32-fe5bf4e0cd40-proxy-tls\") pod \"ensemble-graph-a9fd5-7b8774d5b4-6s425\" (UID: \"26969180-d212-46ae-aa32-fe5bf4e0cd40\") " pod="kserve-ci-e2e-test/ensemble-graph-a9fd5-7b8774d5b4-6s425" Apr 24 21:47:20.807722 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:20.807730 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26969180-d212-46ae-aa32-fe5bf4e0cd40-openshift-service-ca-bundle\") pod \"ensemble-graph-a9fd5-7b8774d5b4-6s425\" (UID: \"26969180-d212-46ae-aa32-fe5bf4e0cd40\") " pod="kserve-ci-e2e-test/ensemble-graph-a9fd5-7b8774d5b4-6s425" Apr 24 21:47:20.908314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:20.908217 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26969180-d212-46ae-aa32-fe5bf4e0cd40-proxy-tls\") pod \"ensemble-graph-a9fd5-7b8774d5b4-6s425\" (UID: 
\"26969180-d212-46ae-aa32-fe5bf4e0cd40\") " pod="kserve-ci-e2e-test/ensemble-graph-a9fd5-7b8774d5b4-6s425" Apr 24 21:47:20.908314 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:20.908262 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26969180-d212-46ae-aa32-fe5bf4e0cd40-openshift-service-ca-bundle\") pod \"ensemble-graph-a9fd5-7b8774d5b4-6s425\" (UID: \"26969180-d212-46ae-aa32-fe5bf4e0cd40\") " pod="kserve-ci-e2e-test/ensemble-graph-a9fd5-7b8774d5b4-6s425" Apr 24 21:47:20.908508 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:47:20.908373 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/ensemble-graph-a9fd5-serving-cert: secret "ensemble-graph-a9fd5-serving-cert" not found Apr 24 21:47:20.908508 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:47:20.908449 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26969180-d212-46ae-aa32-fe5bf4e0cd40-proxy-tls podName:26969180-d212-46ae-aa32-fe5bf4e0cd40 nodeName:}" failed. No retries permitted until 2026-04-24 21:47:21.408426818 +0000 UTC m=+1218.668995631 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/26969180-d212-46ae-aa32-fe5bf4e0cd40-proxy-tls") pod "ensemble-graph-a9fd5-7b8774d5b4-6s425" (UID: "26969180-d212-46ae-aa32-fe5bf4e0cd40") : secret "ensemble-graph-a9fd5-serving-cert" not found Apr 24 21:47:20.908975 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:20.908955 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26969180-d212-46ae-aa32-fe5bf4e0cd40-openshift-service-ca-bundle\") pod \"ensemble-graph-a9fd5-7b8774d5b4-6s425\" (UID: \"26969180-d212-46ae-aa32-fe5bf4e0cd40\") " pod="kserve-ci-e2e-test/ensemble-graph-a9fd5-7b8774d5b4-6s425" Apr 24 21:47:21.413313 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:21.413277 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26969180-d212-46ae-aa32-fe5bf4e0cd40-proxy-tls\") pod \"ensemble-graph-a9fd5-7b8774d5b4-6s425\" (UID: \"26969180-d212-46ae-aa32-fe5bf4e0cd40\") " pod="kserve-ci-e2e-test/ensemble-graph-a9fd5-7b8774d5b4-6s425" Apr 24 21:47:21.413495 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:47:21.413387 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/ensemble-graph-a9fd5-serving-cert: secret "ensemble-graph-a9fd5-serving-cert" not found Apr 24 21:47:21.413495 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:47:21.413439 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26969180-d212-46ae-aa32-fe5bf4e0cd40-proxy-tls podName:26969180-d212-46ae-aa32-fe5bf4e0cd40 nodeName:}" failed. No retries permitted until 2026-04-24 21:47:22.413425242 +0000 UTC m=+1219.673994051 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/26969180-d212-46ae-aa32-fe5bf4e0cd40-proxy-tls") pod "ensemble-graph-a9fd5-7b8774d5b4-6s425" (UID: "26969180-d212-46ae-aa32-fe5bf4e0cd40") : secret "ensemble-graph-a9fd5-serving-cert" not found Apr 24 21:47:22.420334 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:22.420274 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26969180-d212-46ae-aa32-fe5bf4e0cd40-proxy-tls\") pod \"ensemble-graph-a9fd5-7b8774d5b4-6s425\" (UID: \"26969180-d212-46ae-aa32-fe5bf4e0cd40\") " pod="kserve-ci-e2e-test/ensemble-graph-a9fd5-7b8774d5b4-6s425" Apr 24 21:47:22.422753 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:22.422731 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26969180-d212-46ae-aa32-fe5bf4e0cd40-proxy-tls\") pod \"ensemble-graph-a9fd5-7b8774d5b4-6s425\" (UID: \"26969180-d212-46ae-aa32-fe5bf4e0cd40\") " pod="kserve-ci-e2e-test/ensemble-graph-a9fd5-7b8774d5b4-6s425" Apr 24 21:47:22.457520 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:22.457478 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-a9fd5-7b8774d5b4-6s425" Apr 24 21:47:22.581928 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:22.581881 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-a9fd5-7b8774d5b4-6s425"] Apr 24 21:47:23.127744 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:23.127700 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-6445b-696c86c896-8k2tz" podUID="61d18dda-aba2-4aa4-a61b-5a66c6078382" containerName="sequence-graph-6445b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:47:23.211185 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:23.211146 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-a9fd5-7b8774d5b4-6s425" event={"ID":"26969180-d212-46ae-aa32-fe5bf4e0cd40","Type":"ContainerStarted","Data":"02201ccaf057120b3987f33ad55afdf226e0f12cd850a25304681a94023b5bac"} Apr 24 21:47:23.211185 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:23.211187 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-a9fd5-7b8774d5b4-6s425" event={"ID":"26969180-d212-46ae-aa32-fe5bf4e0cd40","Type":"ContainerStarted","Data":"7da79035d6cf380105aaa3d1e1e69cc388c106c014e18e2a45ee4bfad11805b8"} Apr 24 21:47:23.211404 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:23.211302 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-a9fd5-7b8774d5b4-6s425" Apr 24 21:47:23.228215 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:23.228162 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-a9fd5-7b8774d5b4-6s425" podStartSLOduration=3.228145879 podStartE2EDuration="3.228145879s" podCreationTimestamp="2026-04-24 21:47:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:47:23.226854292 +0000 UTC m=+1220.487423123" watchObservedRunningTime="2026-04-24 21:47:23.228145879 +0000 UTC m=+1220.488714764" Apr 24 21:47:28.126586 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:28.126543 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-6445b-696c86c896-8k2tz" podUID="61d18dda-aba2-4aa4-a61b-5a66c6078382" containerName="sequence-graph-6445b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:47:29.219466 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:29.219436 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-a9fd5-7b8774d5b4-6s425" Apr 24 21:47:30.319600 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:47:30.319552 2573 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61d18dda_aba2_4aa4_a61b_5a66c6078382.slice/crio-conmon-75954004cd37e1fea3c739e417a4bd9ba054a451b94b9a3d14c57ea36e37f11d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61d18dda_aba2_4aa4_a61b_5a66c6078382.slice/crio-75954004cd37e1fea3c739e417a4bd9ba054a451b94b9a3d14c57ea36e37f11d.scope\": RecentStats: unable to find data in memory cache]" Apr 24 21:47:30.320029 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:47:30.319566 2573 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61d18dda_aba2_4aa4_a61b_5a66c6078382.slice/crio-75954004cd37e1fea3c739e417a4bd9ba054a451b94b9a3d14c57ea36e37f11d.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61d18dda_aba2_4aa4_a61b_5a66c6078382.slice/crio-conmon-75954004cd37e1fea3c739e417a4bd9ba054a451b94b9a3d14c57ea36e37f11d.scope\": RecentStats: unable to find data in memory cache]" Apr 24 21:47:30.445900 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:30.445873 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-6445b-696c86c896-8k2tz" Apr 24 21:47:30.585534 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:30.585427 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61d18dda-aba2-4aa4-a61b-5a66c6078382-openshift-service-ca-bundle\") pod \"61d18dda-aba2-4aa4-a61b-5a66c6078382\" (UID: \"61d18dda-aba2-4aa4-a61b-5a66c6078382\") " Apr 24 21:47:30.585534 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:30.585503 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61d18dda-aba2-4aa4-a61b-5a66c6078382-proxy-tls\") pod \"61d18dda-aba2-4aa4-a61b-5a66c6078382\" (UID: \"61d18dda-aba2-4aa4-a61b-5a66c6078382\") " Apr 24 21:47:30.585830 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:30.585802 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61d18dda-aba2-4aa4-a61b-5a66c6078382-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "61d18dda-aba2-4aa4-a61b-5a66c6078382" (UID: "61d18dda-aba2-4aa4-a61b-5a66c6078382"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:47:30.587571 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:30.587544 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61d18dda-aba2-4aa4-a61b-5a66c6078382-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "61d18dda-aba2-4aa4-a61b-5a66c6078382" (UID: "61d18dda-aba2-4aa4-a61b-5a66c6078382"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:47:30.686353 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:30.686307 2573 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/61d18dda-aba2-4aa4-a61b-5a66c6078382-openshift-service-ca-bundle\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:47:30.686538 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:30.686347 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61d18dda-aba2-4aa4-a61b-5a66c6078382-proxy-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:47:31.234543 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:31.234510 2573 generic.go:358] "Generic (PLEG): container finished" podID="61d18dda-aba2-4aa4-a61b-5a66c6078382" containerID="75954004cd37e1fea3c739e417a4bd9ba054a451b94b9a3d14c57ea36e37f11d" exitCode=0 Apr 24 21:47:31.234758 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:31.234561 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-6445b-696c86c896-8k2tz" event={"ID":"61d18dda-aba2-4aa4-a61b-5a66c6078382","Type":"ContainerDied","Data":"75954004cd37e1fea3c739e417a4bd9ba054a451b94b9a3d14c57ea36e37f11d"} Apr 24 21:47:31.234758 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:31.234583 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-6445b-696c86c896-8k2tz" 
event={"ID":"61d18dda-aba2-4aa4-a61b-5a66c6078382","Type":"ContainerDied","Data":"9145bfbac137b132bc2c8265a9ce84c0144ff1354a3245cca43b00330ea377b4"} Apr 24 21:47:31.234758 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:31.234598 2573 scope.go:117] "RemoveContainer" containerID="75954004cd37e1fea3c739e417a4bd9ba054a451b94b9a3d14c57ea36e37f11d" Apr 24 21:47:31.234758 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:31.234596 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-6445b-696c86c896-8k2tz" Apr 24 21:47:31.242818 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:31.242799 2573 scope.go:117] "RemoveContainer" containerID="75954004cd37e1fea3c739e417a4bd9ba054a451b94b9a3d14c57ea36e37f11d" Apr 24 21:47:31.243144 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:47:31.243128 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75954004cd37e1fea3c739e417a4bd9ba054a451b94b9a3d14c57ea36e37f11d\": container with ID starting with 75954004cd37e1fea3c739e417a4bd9ba054a451b94b9a3d14c57ea36e37f11d not found: ID does not exist" containerID="75954004cd37e1fea3c739e417a4bd9ba054a451b94b9a3d14c57ea36e37f11d" Apr 24 21:47:31.243203 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:31.243151 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75954004cd37e1fea3c739e417a4bd9ba054a451b94b9a3d14c57ea36e37f11d"} err="failed to get container status \"75954004cd37e1fea3c739e417a4bd9ba054a451b94b9a3d14c57ea36e37f11d\": rpc error: code = NotFound desc = could not find container \"75954004cd37e1fea3c739e417a4bd9ba054a451b94b9a3d14c57ea36e37f11d\": container with ID starting with 75954004cd37e1fea3c739e417a4bd9ba054a451b94b9a3d14c57ea36e37f11d not found: ID does not exist" Apr 24 21:47:31.257527 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:31.257494 2573 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/sequence-graph-6445b-696c86c896-8k2tz"] Apr 24 21:47:31.265988 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:31.265960 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-6445b-696c86c896-8k2tz"] Apr 24 21:47:31.356292 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:47:31.356259 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61d18dda-aba2-4aa4-a61b-5a66c6078382" path="/var/lib/kubelet/pods/61d18dda-aba2-4aa4-a61b-5a66c6078382/volumes" Apr 24 21:48:00.486531 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:00.486448 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-d1134-69b44c49f9-tm66r"] Apr 24 21:48:00.486981 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:00.486777 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="61d18dda-aba2-4aa4-a61b-5a66c6078382" containerName="sequence-graph-6445b" Apr 24 21:48:00.486981 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:00.486788 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="61d18dda-aba2-4aa4-a61b-5a66c6078382" containerName="sequence-graph-6445b" Apr 24 21:48:00.486981 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:00.486862 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="61d18dda-aba2-4aa4-a61b-5a66c6078382" containerName="sequence-graph-6445b" Apr 24 21:48:00.489626 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:00.489609 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-d1134-69b44c49f9-tm66r" Apr 24 21:48:00.491875 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:00.491852 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-d1134-serving-cert\"" Apr 24 21:48:00.492002 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:00.491913 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-d1134-kube-rbac-proxy-sar-config\"" Apr 24 21:48:00.499923 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:00.499900 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-d1134-69b44c49f9-tm66r"] Apr 24 21:48:00.517232 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:00.517197 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0483f79-b854-42da-a3b9-ac80a7066269-proxy-tls\") pod \"sequence-graph-d1134-69b44c49f9-tm66r\" (UID: \"c0483f79-b854-42da-a3b9-ac80a7066269\") " pod="kserve-ci-e2e-test/sequence-graph-d1134-69b44c49f9-tm66r" Apr 24 21:48:00.517232 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:00.517234 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0483f79-b854-42da-a3b9-ac80a7066269-openshift-service-ca-bundle\") pod \"sequence-graph-d1134-69b44c49f9-tm66r\" (UID: \"c0483f79-b854-42da-a3b9-ac80a7066269\") " pod="kserve-ci-e2e-test/sequence-graph-d1134-69b44c49f9-tm66r" Apr 24 21:48:00.618009 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:00.617963 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0483f79-b854-42da-a3b9-ac80a7066269-proxy-tls\") pod \"sequence-graph-d1134-69b44c49f9-tm66r\" (UID: 
\"c0483f79-b854-42da-a3b9-ac80a7066269\") " pod="kserve-ci-e2e-test/sequence-graph-d1134-69b44c49f9-tm66r" Apr 24 21:48:00.618009 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:00.618014 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0483f79-b854-42da-a3b9-ac80a7066269-openshift-service-ca-bundle\") pod \"sequence-graph-d1134-69b44c49f9-tm66r\" (UID: \"c0483f79-b854-42da-a3b9-ac80a7066269\") " pod="kserve-ci-e2e-test/sequence-graph-d1134-69b44c49f9-tm66r" Apr 24 21:48:00.618268 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:48:00.618139 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-d1134-serving-cert: secret "sequence-graph-d1134-serving-cert" not found Apr 24 21:48:00.618268 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:48:00.618213 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0483f79-b854-42da-a3b9-ac80a7066269-proxy-tls podName:c0483f79-b854-42da-a3b9-ac80a7066269 nodeName:}" failed. No retries permitted until 2026-04-24 21:48:01.118194669 +0000 UTC m=+1258.378763478 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c0483f79-b854-42da-a3b9-ac80a7066269-proxy-tls") pod "sequence-graph-d1134-69b44c49f9-tm66r" (UID: "c0483f79-b854-42da-a3b9-ac80a7066269") : secret "sequence-graph-d1134-serving-cert" not found Apr 24 21:48:00.618649 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:00.618630 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0483f79-b854-42da-a3b9-ac80a7066269-openshift-service-ca-bundle\") pod \"sequence-graph-d1134-69b44c49f9-tm66r\" (UID: \"c0483f79-b854-42da-a3b9-ac80a7066269\") " pod="kserve-ci-e2e-test/sequence-graph-d1134-69b44c49f9-tm66r" Apr 24 21:48:01.122388 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:01.122347 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0483f79-b854-42da-a3b9-ac80a7066269-proxy-tls\") pod \"sequence-graph-d1134-69b44c49f9-tm66r\" (UID: \"c0483f79-b854-42da-a3b9-ac80a7066269\") " pod="kserve-ci-e2e-test/sequence-graph-d1134-69b44c49f9-tm66r" Apr 24 21:48:01.124874 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:01.124840 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0483f79-b854-42da-a3b9-ac80a7066269-proxy-tls\") pod \"sequence-graph-d1134-69b44c49f9-tm66r\" (UID: \"c0483f79-b854-42da-a3b9-ac80a7066269\") " pod="kserve-ci-e2e-test/sequence-graph-d1134-69b44c49f9-tm66r" Apr 24 21:48:01.400708 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:01.400610 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-d1134-69b44c49f9-tm66r" Apr 24 21:48:01.524949 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:01.524921 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-d1134-69b44c49f9-tm66r"] Apr 24 21:48:01.527478 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:48:01.527451 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0483f79_b854_42da_a3b9_ac80a7066269.slice/crio-9a5e8a604e1c135c83a1fe0a31a3c58bac525378ebec9d37c6ab54dc556dc22e WatchSource:0}: Error finding container 9a5e8a604e1c135c83a1fe0a31a3c58bac525378ebec9d37c6ab54dc556dc22e: Status 404 returned error can't find the container with id 9a5e8a604e1c135c83a1fe0a31a3c58bac525378ebec9d37c6ab54dc556dc22e Apr 24 21:48:02.326310 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:02.326271 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-d1134-69b44c49f9-tm66r" event={"ID":"c0483f79-b854-42da-a3b9-ac80a7066269","Type":"ContainerStarted","Data":"c1e5b541652f0550429021125523ad2ffd143dede0a4ad2c4ca10e01e804606a"} Apr 24 21:48:02.326310 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:02.326314 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-d1134-69b44c49f9-tm66r" event={"ID":"c0483f79-b854-42da-a3b9-ac80a7066269","Type":"ContainerStarted","Data":"9a5e8a604e1c135c83a1fe0a31a3c58bac525378ebec9d37c6ab54dc556dc22e"} Apr 24 21:48:02.326539 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:02.326359 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-d1134-69b44c49f9-tm66r" Apr 24 21:48:02.347758 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:02.347705 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-d1134-69b44c49f9-tm66r" 
podStartSLOduration=2.347691078 podStartE2EDuration="2.347691078s" podCreationTimestamp="2026-04-24 21:48:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:48:02.345463349 +0000 UTC m=+1259.606032185" watchObservedRunningTime="2026-04-24 21:48:02.347691078 +0000 UTC m=+1259.608259909" Apr 24 21:48:08.336531 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:48:08.336502 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-d1134-69b44c49f9-tm66r" Apr 24 21:52:03.349749 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:52:03.349714 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-59fzh_494a20d0-ace3-481d-ae39-780b958f7150/cluster-monitoring-operator/0.log" Apr 24 21:52:03.352724 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:52:03.352700 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-59fzh_494a20d0-ace3-481d-ae39-780b958f7150/cluster-monitoring-operator/0.log" Apr 24 21:52:03.355087 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:52:03.355064 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4nztl_b3e83451-d37f-4c5b-837f-6440bed57b91/console-operator/1.log" Apr 24 21:52:03.357917 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:52:03.357898 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4nztl_b3e83451-d37f-4c5b-837f-6440bed57b91/console-operator/1.log" Apr 24 21:55:35.412990 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:55:35.412954 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-a9fd5-7b8774d5b4-6s425"] Apr 24 21:55:35.415607 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:55:35.413241 2573 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-a9fd5-7b8774d5b4-6s425" podUID="26969180-d212-46ae-aa32-fe5bf4e0cd40" containerName="ensemble-graph-a9fd5" containerID="cri-o://02201ccaf057120b3987f33ad55afdf226e0f12cd850a25304681a94023b5bac" gracePeriod=30 Apr 24 21:55:39.217738 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:55:39.217697 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-a9fd5-7b8774d5b4-6s425" podUID="26969180-d212-46ae-aa32-fe5bf4e0cd40" containerName="ensemble-graph-a9fd5" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:55:44.218200 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:55:44.218160 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-a9fd5-7b8774d5b4-6s425" podUID="26969180-d212-46ae-aa32-fe5bf4e0cd40" containerName="ensemble-graph-a9fd5" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:55:49.218651 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:55:49.218609 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-a9fd5-7b8774d5b4-6s425" podUID="26969180-d212-46ae-aa32-fe5bf4e0cd40" containerName="ensemble-graph-a9fd5" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:55:49.219134 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:55:49.218745 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-a9fd5-7b8774d5b4-6s425" Apr 24 21:55:54.217838 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:55:54.217793 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-a9fd5-7b8774d5b4-6s425" podUID="26969180-d212-46ae-aa32-fe5bf4e0cd40" containerName="ensemble-graph-a9fd5" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:55:59.217920 ip-10-0-136-201 
kubenswrapper[2573]: I0424 21:55:59.217878 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-a9fd5-7b8774d5b4-6s425" podUID="26969180-d212-46ae-aa32-fe5bf4e0cd40" containerName="ensemble-graph-a9fd5" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:56:04.217910 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:04.217869 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-a9fd5-7b8774d5b4-6s425" podUID="26969180-d212-46ae-aa32-fe5bf4e0cd40" containerName="ensemble-graph-a9fd5" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:56:05.556339 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:05.556315 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-a9fd5-7b8774d5b4-6s425" Apr 24 21:56:05.654833 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:05.654790 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26969180-d212-46ae-aa32-fe5bf4e0cd40-openshift-service-ca-bundle\") pod \"26969180-d212-46ae-aa32-fe5bf4e0cd40\" (UID: \"26969180-d212-46ae-aa32-fe5bf4e0cd40\") " Apr 24 21:56:05.654833 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:05.654836 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26969180-d212-46ae-aa32-fe5bf4e0cd40-proxy-tls\") pod \"26969180-d212-46ae-aa32-fe5bf4e0cd40\" (UID: \"26969180-d212-46ae-aa32-fe5bf4e0cd40\") " Apr 24 21:56:05.655242 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:05.655216 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26969180-d212-46ae-aa32-fe5bf4e0cd40-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod 
"26969180-d212-46ae-aa32-fe5bf4e0cd40" (UID: "26969180-d212-46ae-aa32-fe5bf4e0cd40"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:56:05.657153 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:05.657124 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26969180-d212-46ae-aa32-fe5bf4e0cd40-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "26969180-d212-46ae-aa32-fe5bf4e0cd40" (UID: "26969180-d212-46ae-aa32-fe5bf4e0cd40"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:56:05.726867 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:05.726775 2573 generic.go:358] "Generic (PLEG): container finished" podID="26969180-d212-46ae-aa32-fe5bf4e0cd40" containerID="02201ccaf057120b3987f33ad55afdf226e0f12cd850a25304681a94023b5bac" exitCode=0 Apr 24 21:56:05.726867 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:05.726843 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-a9fd5-7b8774d5b4-6s425" Apr 24 21:56:05.727060 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:05.726862 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-a9fd5-7b8774d5b4-6s425" event={"ID":"26969180-d212-46ae-aa32-fe5bf4e0cd40","Type":"ContainerDied","Data":"02201ccaf057120b3987f33ad55afdf226e0f12cd850a25304681a94023b5bac"} Apr 24 21:56:05.727060 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:05.726903 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-a9fd5-7b8774d5b4-6s425" event={"ID":"26969180-d212-46ae-aa32-fe5bf4e0cd40","Type":"ContainerDied","Data":"7da79035d6cf380105aaa3d1e1e69cc388c106c014e18e2a45ee4bfad11805b8"} Apr 24 21:56:05.727060 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:05.726919 2573 scope.go:117] "RemoveContainer" containerID="02201ccaf057120b3987f33ad55afdf226e0f12cd850a25304681a94023b5bac" Apr 24 21:56:05.734779 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:05.734761 2573 scope.go:117] "RemoveContainer" containerID="02201ccaf057120b3987f33ad55afdf226e0f12cd850a25304681a94023b5bac" Apr 24 21:56:05.735076 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:56:05.735056 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02201ccaf057120b3987f33ad55afdf226e0f12cd850a25304681a94023b5bac\": container with ID starting with 02201ccaf057120b3987f33ad55afdf226e0f12cd850a25304681a94023b5bac not found: ID does not exist" containerID="02201ccaf057120b3987f33ad55afdf226e0f12cd850a25304681a94023b5bac" Apr 24 21:56:05.735140 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:05.735085 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02201ccaf057120b3987f33ad55afdf226e0f12cd850a25304681a94023b5bac"} err="failed to get container status 
\"02201ccaf057120b3987f33ad55afdf226e0f12cd850a25304681a94023b5bac\": rpc error: code = NotFound desc = could not find container \"02201ccaf057120b3987f33ad55afdf226e0f12cd850a25304681a94023b5bac\": container with ID starting with 02201ccaf057120b3987f33ad55afdf226e0f12cd850a25304681a94023b5bac not found: ID does not exist" Apr 24 21:56:05.755670 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:05.755634 2573 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26969180-d212-46ae-aa32-fe5bf4e0cd40-openshift-service-ca-bundle\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:56:05.755670 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:05.755673 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26969180-d212-46ae-aa32-fe5bf4e0cd40-proxy-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:56:05.757477 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:05.757454 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-a9fd5-7b8774d5b4-6s425"] Apr 24 21:56:05.765220 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:05.765194 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-a9fd5-7b8774d5b4-6s425"] Apr 24 21:56:07.356326 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:07.356288 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26969180-d212-46ae-aa32-fe5bf4e0cd40" path="/var/lib/kubelet/pods/26969180-d212-46ae-aa32-fe5bf4e0cd40/volumes" Apr 24 21:56:15.114563 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:15.114529 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-d1134-69b44c49f9-tm66r"] Apr 24 21:56:15.114978 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:15.114753 2573 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/sequence-graph-d1134-69b44c49f9-tm66r" podUID="c0483f79-b854-42da-a3b9-ac80a7066269" containerName="sequence-graph-d1134" containerID="cri-o://c1e5b541652f0550429021125523ad2ffd143dede0a4ad2c4ca10e01e804606a" gracePeriod=30 Apr 24 21:56:18.334696 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:18.334630 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-d1134-69b44c49f9-tm66r" podUID="c0483f79-b854-42da-a3b9-ac80a7066269" containerName="sequence-graph-d1134" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:56:23.334601 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:23.334537 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-d1134-69b44c49f9-tm66r" podUID="c0483f79-b854-42da-a3b9-ac80a7066269" containerName="sequence-graph-d1134" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:56:28.334310 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:28.334266 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-d1134-69b44c49f9-tm66r" podUID="c0483f79-b854-42da-a3b9-ac80a7066269" containerName="sequence-graph-d1134" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:56:28.334826 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:28.334392 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-d1134-69b44c49f9-tm66r" Apr 24 21:56:33.333939 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:33.333899 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-d1134-69b44c49f9-tm66r" podUID="c0483f79-b854-42da-a3b9-ac80a7066269" containerName="sequence-graph-d1134" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:56:35.629587 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:35.629549 2573 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-6842b-55bb6cfccf-g8lvd"] Apr 24 21:56:35.630061 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:35.629840 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26969180-d212-46ae-aa32-fe5bf4e0cd40" containerName="ensemble-graph-a9fd5" Apr 24 21:56:35.630061 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:35.629852 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="26969180-d212-46ae-aa32-fe5bf4e0cd40" containerName="ensemble-graph-a9fd5" Apr 24 21:56:35.630061 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:35.629909 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="26969180-d212-46ae-aa32-fe5bf4e0cd40" containerName="ensemble-graph-a9fd5" Apr 24 21:56:35.632714 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:35.632695 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-6842b-55bb6cfccf-g8lvd" Apr 24 21:56:35.636136 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:35.636113 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-6842b-serving-cert\"" Apr 24 21:56:35.636271 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:35.636124 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-6842b-kube-rbac-proxy-sar-config\"" Apr 24 21:56:35.647904 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:35.647876 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-6842b-55bb6cfccf-g8lvd"] Apr 24 21:56:35.655311 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:35.655287 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3da9271-340b-4dd7-9eb3-9bbb1c51ec42-proxy-tls\") pod \"splitter-graph-6842b-55bb6cfccf-g8lvd\" (UID: 
\"e3da9271-340b-4dd7-9eb3-9bbb1c51ec42\") " pod="kserve-ci-e2e-test/splitter-graph-6842b-55bb6cfccf-g8lvd" Apr 24 21:56:35.655444 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:35.655418 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3da9271-340b-4dd7-9eb3-9bbb1c51ec42-openshift-service-ca-bundle\") pod \"splitter-graph-6842b-55bb6cfccf-g8lvd\" (UID: \"e3da9271-340b-4dd7-9eb3-9bbb1c51ec42\") " pod="kserve-ci-e2e-test/splitter-graph-6842b-55bb6cfccf-g8lvd" Apr 24 21:56:35.756266 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:35.756229 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3da9271-340b-4dd7-9eb3-9bbb1c51ec42-proxy-tls\") pod \"splitter-graph-6842b-55bb6cfccf-g8lvd\" (UID: \"e3da9271-340b-4dd7-9eb3-9bbb1c51ec42\") " pod="kserve-ci-e2e-test/splitter-graph-6842b-55bb6cfccf-g8lvd" Apr 24 21:56:35.756434 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:35.756311 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3da9271-340b-4dd7-9eb3-9bbb1c51ec42-openshift-service-ca-bundle\") pod \"splitter-graph-6842b-55bb6cfccf-g8lvd\" (UID: \"e3da9271-340b-4dd7-9eb3-9bbb1c51ec42\") " pod="kserve-ci-e2e-test/splitter-graph-6842b-55bb6cfccf-g8lvd" Apr 24 21:56:35.756434 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:56:35.756388 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/splitter-graph-6842b-serving-cert: secret "splitter-graph-6842b-serving-cert" not found Apr 24 21:56:35.756505 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:56:35.756467 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3da9271-340b-4dd7-9eb3-9bbb1c51ec42-proxy-tls podName:e3da9271-340b-4dd7-9eb3-9bbb1c51ec42 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:56:36.256450569 +0000 UTC m=+1773.517019380 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/e3da9271-340b-4dd7-9eb3-9bbb1c51ec42-proxy-tls") pod "splitter-graph-6842b-55bb6cfccf-g8lvd" (UID: "e3da9271-340b-4dd7-9eb3-9bbb1c51ec42") : secret "splitter-graph-6842b-serving-cert" not found Apr 24 21:56:35.756886 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:35.756868 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3da9271-340b-4dd7-9eb3-9bbb1c51ec42-openshift-service-ca-bundle\") pod \"splitter-graph-6842b-55bb6cfccf-g8lvd\" (UID: \"e3da9271-340b-4dd7-9eb3-9bbb1c51ec42\") " pod="kserve-ci-e2e-test/splitter-graph-6842b-55bb6cfccf-g8lvd" Apr 24 21:56:36.260428 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:36.260386 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3da9271-340b-4dd7-9eb3-9bbb1c51ec42-proxy-tls\") pod \"splitter-graph-6842b-55bb6cfccf-g8lvd\" (UID: \"e3da9271-340b-4dd7-9eb3-9bbb1c51ec42\") " pod="kserve-ci-e2e-test/splitter-graph-6842b-55bb6cfccf-g8lvd" Apr 24 21:56:36.262813 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:36.262792 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3da9271-340b-4dd7-9eb3-9bbb1c51ec42-proxy-tls\") pod \"splitter-graph-6842b-55bb6cfccf-g8lvd\" (UID: \"e3da9271-340b-4dd7-9eb3-9bbb1c51ec42\") " pod="kserve-ci-e2e-test/splitter-graph-6842b-55bb6cfccf-g8lvd" Apr 24 21:56:36.543531 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:36.543428 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-6842b-55bb6cfccf-g8lvd" Apr 24 21:56:36.667816 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:36.667651 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-6842b-55bb6cfccf-g8lvd"] Apr 24 21:56:36.670386 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:56:36.670358 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3da9271_340b_4dd7_9eb3_9bbb1c51ec42.slice/crio-e0291e4c052142d1adaa9ca219e405f1368dc609ed5e81191ad77fb563f3f6fe WatchSource:0}: Error finding container e0291e4c052142d1adaa9ca219e405f1368dc609ed5e81191ad77fb563f3f6fe: Status 404 returned error can't find the container with id e0291e4c052142d1adaa9ca219e405f1368dc609ed5e81191ad77fb563f3f6fe Apr 24 21:56:36.672287 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:36.672271 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:56:36.817911 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:36.817812 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-6842b-55bb6cfccf-g8lvd" event={"ID":"e3da9271-340b-4dd7-9eb3-9bbb1c51ec42","Type":"ContainerStarted","Data":"087064ed0e16357fc60546e8b41dc761d9d1b12c1e63601e3f296c01dd0ee2b5"} Apr 24 21:56:36.817911 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:36.817847 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-6842b-55bb6cfccf-g8lvd" event={"ID":"e3da9271-340b-4dd7-9eb3-9bbb1c51ec42","Type":"ContainerStarted","Data":"e0291e4c052142d1adaa9ca219e405f1368dc609ed5e81191ad77fb563f3f6fe"} Apr 24 21:56:36.817911 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:36.817875 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-6842b-55bb6cfccf-g8lvd" Apr 24 21:56:36.843340 ip-10-0-136-201 
kubenswrapper[2573]: I0424 21:56:36.843292 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-6842b-55bb6cfccf-g8lvd" podStartSLOduration=1.843275014 podStartE2EDuration="1.843275014s" podCreationTimestamp="2026-04-24 21:56:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:56:36.842484993 +0000 UTC m=+1774.103053824" watchObservedRunningTime="2026-04-24 21:56:36.843275014 +0000 UTC m=+1774.103843845" Apr 24 21:56:38.334685 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:38.334637 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-d1134-69b44c49f9-tm66r" podUID="c0483f79-b854-42da-a3b9-ac80a7066269" containerName="sequence-graph-d1134" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:56:42.825899 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:42.825869 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-6842b-55bb6cfccf-g8lvd" Apr 24 21:56:43.334203 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:43.334165 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-d1134-69b44c49f9-tm66r" podUID="c0483f79-b854-42da-a3b9-ac80a7066269" containerName="sequence-graph-d1134" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:56:45.261925 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:45.261900 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-d1134-69b44c49f9-tm66r" Apr 24 21:56:45.337182 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:45.337143 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0483f79-b854-42da-a3b9-ac80a7066269-openshift-service-ca-bundle\") pod \"c0483f79-b854-42da-a3b9-ac80a7066269\" (UID: \"c0483f79-b854-42da-a3b9-ac80a7066269\") " Apr 24 21:56:45.337378 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:45.337215 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0483f79-b854-42da-a3b9-ac80a7066269-proxy-tls\") pod \"c0483f79-b854-42da-a3b9-ac80a7066269\" (UID: \"c0483f79-b854-42da-a3b9-ac80a7066269\") " Apr 24 21:56:45.337499 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:45.337470 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0483f79-b854-42da-a3b9-ac80a7066269-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "c0483f79-b854-42da-a3b9-ac80a7066269" (UID: "c0483f79-b854-42da-a3b9-ac80a7066269"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:56:45.339350 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:45.339327 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0483f79-b854-42da-a3b9-ac80a7066269-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c0483f79-b854-42da-a3b9-ac80a7066269" (UID: "c0483f79-b854-42da-a3b9-ac80a7066269"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:56:45.438298 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:45.438204 2573 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0483f79-b854-42da-a3b9-ac80a7066269-openshift-service-ca-bundle\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:56:45.438298 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:45.438233 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c0483f79-b854-42da-a3b9-ac80a7066269-proxy-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:56:45.707605 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:45.707508 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-6842b-55bb6cfccf-g8lvd"] Apr 24 21:56:45.707790 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:45.707751 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-6842b-55bb6cfccf-g8lvd" podUID="e3da9271-340b-4dd7-9eb3-9bbb1c51ec42" containerName="splitter-graph-6842b" containerID="cri-o://087064ed0e16357fc60546e8b41dc761d9d1b12c1e63601e3f296c01dd0ee2b5" gracePeriod=30 Apr 24 21:56:45.847399 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:45.847360 2573 generic.go:358] "Generic (PLEG): container finished" podID="c0483f79-b854-42da-a3b9-ac80a7066269" containerID="c1e5b541652f0550429021125523ad2ffd143dede0a4ad2c4ca10e01e804606a" exitCode=0 Apr 24 21:56:45.847570 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:45.847426 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-d1134-69b44c49f9-tm66r" Apr 24 21:56:45.847570 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:45.847440 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-d1134-69b44c49f9-tm66r" event={"ID":"c0483f79-b854-42da-a3b9-ac80a7066269","Type":"ContainerDied","Data":"c1e5b541652f0550429021125523ad2ffd143dede0a4ad2c4ca10e01e804606a"} Apr 24 21:56:45.847570 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:45.847484 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-d1134-69b44c49f9-tm66r" event={"ID":"c0483f79-b854-42da-a3b9-ac80a7066269","Type":"ContainerDied","Data":"9a5e8a604e1c135c83a1fe0a31a3c58bac525378ebec9d37c6ab54dc556dc22e"} Apr 24 21:56:45.847570 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:45.847499 2573 scope.go:117] "RemoveContainer" containerID="c1e5b541652f0550429021125523ad2ffd143dede0a4ad2c4ca10e01e804606a" Apr 24 21:56:45.855715 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:45.855691 2573 scope.go:117] "RemoveContainer" containerID="c1e5b541652f0550429021125523ad2ffd143dede0a4ad2c4ca10e01e804606a" Apr 24 21:56:45.856003 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:56:45.855984 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1e5b541652f0550429021125523ad2ffd143dede0a4ad2c4ca10e01e804606a\": container with ID starting with c1e5b541652f0550429021125523ad2ffd143dede0a4ad2c4ca10e01e804606a not found: ID does not exist" containerID="c1e5b541652f0550429021125523ad2ffd143dede0a4ad2c4ca10e01e804606a" Apr 24 21:56:45.856082 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:45.856012 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1e5b541652f0550429021125523ad2ffd143dede0a4ad2c4ca10e01e804606a"} err="failed to get container status 
\"c1e5b541652f0550429021125523ad2ffd143dede0a4ad2c4ca10e01e804606a\": rpc error: code = NotFound desc = could not find container \"c1e5b541652f0550429021125523ad2ffd143dede0a4ad2c4ca10e01e804606a\": container with ID starting with c1e5b541652f0550429021125523ad2ffd143dede0a4ad2c4ca10e01e804606a not found: ID does not exist" Apr 24 21:56:45.868920 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:45.868894 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-d1134-69b44c49f9-tm66r"] Apr 24 21:56:45.875111 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:45.875071 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-d1134-69b44c49f9-tm66r"] Apr 24 21:56:47.357064 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:47.357033 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0483f79-b854-42da-a3b9-ac80a7066269" path="/var/lib/kubelet/pods/c0483f79-b854-42da-a3b9-ac80a7066269/volumes" Apr 24 21:56:47.825014 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:47.824884 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-6842b-55bb6cfccf-g8lvd" podUID="e3da9271-340b-4dd7-9eb3-9bbb1c51ec42" containerName="splitter-graph-6842b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:56:52.824751 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:52.824712 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-6842b-55bb6cfccf-g8lvd" podUID="e3da9271-340b-4dd7-9eb3-9bbb1c51ec42" containerName="splitter-graph-6842b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:56:57.824327 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:57.824286 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-6842b-55bb6cfccf-g8lvd" podUID="e3da9271-340b-4dd7-9eb3-9bbb1c51ec42" containerName="splitter-graph-6842b" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:56:57.824795 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:56:57.824386 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-6842b-55bb6cfccf-g8lvd" Apr 24 21:57:02.824064 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:02.824025 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-6842b-55bb6cfccf-g8lvd" podUID="e3da9271-340b-4dd7-9eb3-9bbb1c51ec42" containerName="splitter-graph-6842b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:57:03.371122 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:03.371081 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-59fzh_494a20d0-ace3-481d-ae39-780b958f7150/cluster-monitoring-operator/0.log" Apr 24 21:57:03.375560 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:03.375538 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-59fzh_494a20d0-ace3-481d-ae39-780b958f7150/cluster-monitoring-operator/0.log" Apr 24 21:57:03.376176 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:03.376154 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4nztl_b3e83451-d37f-4c5b-837f-6440bed57b91/console-operator/1.log" Apr 24 21:57:03.380393 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:03.380373 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4nztl_b3e83451-d37f-4c5b-837f-6440bed57b91/console-operator/1.log" Apr 24 21:57:07.825059 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:07.825016 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-6842b-55bb6cfccf-g8lvd" podUID="e3da9271-340b-4dd7-9eb3-9bbb1c51ec42" 
containerName="splitter-graph-6842b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:57:12.824875 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:12.824831 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-6842b-55bb6cfccf-g8lvd" podUID="e3da9271-340b-4dd7-9eb3-9bbb1c51ec42" containerName="splitter-graph-6842b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:57:15.318875 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:15.318833 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-fc125-5c9566c648-xg996"] Apr 24 21:57:15.319363 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:15.319304 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0483f79-b854-42da-a3b9-ac80a7066269" containerName="sequence-graph-d1134" Apr 24 21:57:15.319363 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:15.319324 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0483f79-b854-42da-a3b9-ac80a7066269" containerName="sequence-graph-d1134" Apr 24 21:57:15.319474 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:15.319398 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0483f79-b854-42da-a3b9-ac80a7066269" containerName="sequence-graph-d1134" Apr 24 21:57:15.323346 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:15.323321 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-fc125-5c9566c648-xg996" Apr 24 21:57:15.326157 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:15.326132 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-fc125-serving-cert\"" Apr 24 21:57:15.326274 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:15.326156 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-fc125-kube-rbac-proxy-sar-config\"" Apr 24 21:57:15.332236 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:15.332213 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-fc125-5c9566c648-xg996"] Apr 24 21:57:15.489237 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:15.489206 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/97a1ecf0-a7f7-49a7-8370-f7975294b638-proxy-tls\") pod \"switch-graph-fc125-5c9566c648-xg996\" (UID: \"97a1ecf0-a7f7-49a7-8370-f7975294b638\") " pod="kserve-ci-e2e-test/switch-graph-fc125-5c9566c648-xg996" Apr 24 21:57:15.489415 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:15.489290 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97a1ecf0-a7f7-49a7-8370-f7975294b638-openshift-service-ca-bundle\") pod \"switch-graph-fc125-5c9566c648-xg996\" (UID: \"97a1ecf0-a7f7-49a7-8370-f7975294b638\") " pod="kserve-ci-e2e-test/switch-graph-fc125-5c9566c648-xg996" Apr 24 21:57:15.590384 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:15.590279 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/97a1ecf0-a7f7-49a7-8370-f7975294b638-proxy-tls\") pod \"switch-graph-fc125-5c9566c648-xg996\" (UID: 
\"97a1ecf0-a7f7-49a7-8370-f7975294b638\") " pod="kserve-ci-e2e-test/switch-graph-fc125-5c9566c648-xg996" Apr 24 21:57:15.590384 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:15.590361 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97a1ecf0-a7f7-49a7-8370-f7975294b638-openshift-service-ca-bundle\") pod \"switch-graph-fc125-5c9566c648-xg996\" (UID: \"97a1ecf0-a7f7-49a7-8370-f7975294b638\") " pod="kserve-ci-e2e-test/switch-graph-fc125-5c9566c648-xg996" Apr 24 21:57:15.590958 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:15.590938 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97a1ecf0-a7f7-49a7-8370-f7975294b638-openshift-service-ca-bundle\") pod \"switch-graph-fc125-5c9566c648-xg996\" (UID: \"97a1ecf0-a7f7-49a7-8370-f7975294b638\") " pod="kserve-ci-e2e-test/switch-graph-fc125-5c9566c648-xg996" Apr 24 21:57:15.592891 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:15.592859 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/97a1ecf0-a7f7-49a7-8370-f7975294b638-proxy-tls\") pod \"switch-graph-fc125-5c9566c648-xg996\" (UID: \"97a1ecf0-a7f7-49a7-8370-f7975294b638\") " pod="kserve-ci-e2e-test/switch-graph-fc125-5c9566c648-xg996" Apr 24 21:57:15.634427 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:15.634373 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-fc125-5c9566c648-xg996" Apr 24 21:57:15.740348 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:57:15.740312 2573 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3da9271_340b_4dd7_9eb3_9bbb1c51ec42.slice/crio-conmon-087064ed0e16357fc60546e8b41dc761d9d1b12c1e63601e3f296c01dd0ee2b5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3da9271_340b_4dd7_9eb3_9bbb1c51ec42.slice/crio-087064ed0e16357fc60546e8b41dc761d9d1b12c1e63601e3f296c01dd0ee2b5.scope\": RecentStats: unable to find data in memory cache], [\"/system.slice/NetworkManager-dispatcher.service\": RecentStats: unable to find data in memory cache]" Apr 24 21:57:15.740514 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:57:15.740392 2573 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3da9271_340b_4dd7_9eb3_9bbb1c51ec42.slice/crio-conmon-087064ed0e16357fc60546e8b41dc761d9d1b12c1e63601e3f296c01dd0ee2b5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3da9271_340b_4dd7_9eb3_9bbb1c51ec42.slice/crio-087064ed0e16357fc60546e8b41dc761d9d1b12c1e63601e3f296c01dd0ee2b5.scope\": RecentStats: unable to find data in memory cache]" Apr 24 21:57:15.770523 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:15.770478 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-fc125-5c9566c648-xg996"] Apr 24 21:57:15.778502 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:57:15.778461 2573 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97a1ecf0_a7f7_49a7_8370_f7975294b638.slice/crio-5cb2aa847c99f767a74b8e40c5cdcf9ac464e2c1bb026a297fcac421403288ad WatchSource:0}: Error finding container 5cb2aa847c99f767a74b8e40c5cdcf9ac464e2c1bb026a297fcac421403288ad: Status 404 returned error can't find the container with id 5cb2aa847c99f767a74b8e40c5cdcf9ac464e2c1bb026a297fcac421403288ad Apr 24 21:57:15.853084 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:15.853063 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-6842b-55bb6cfccf-g8lvd" Apr 24 21:57:15.937396 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:15.937357 2573 generic.go:358] "Generic (PLEG): container finished" podID="e3da9271-340b-4dd7-9eb3-9bbb1c51ec42" containerID="087064ed0e16357fc60546e8b41dc761d9d1b12c1e63601e3f296c01dd0ee2b5" exitCode=0 Apr 24 21:57:15.937581 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:15.937422 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-6842b-55bb6cfccf-g8lvd" Apr 24 21:57:15.937581 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:15.937444 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-6842b-55bb6cfccf-g8lvd" event={"ID":"e3da9271-340b-4dd7-9eb3-9bbb1c51ec42","Type":"ContainerDied","Data":"087064ed0e16357fc60546e8b41dc761d9d1b12c1e63601e3f296c01dd0ee2b5"} Apr 24 21:57:15.937581 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:15.937479 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-6842b-55bb6cfccf-g8lvd" event={"ID":"e3da9271-340b-4dd7-9eb3-9bbb1c51ec42","Type":"ContainerDied","Data":"e0291e4c052142d1adaa9ca219e405f1368dc609ed5e81191ad77fb563f3f6fe"} Apr 24 21:57:15.937581 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:15.937500 2573 scope.go:117] "RemoveContainer" containerID="087064ed0e16357fc60546e8b41dc761d9d1b12c1e63601e3f296c01dd0ee2b5" Apr 24 21:57:15.938765 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:15.938745 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-fc125-5c9566c648-xg996" event={"ID":"97a1ecf0-a7f7-49a7-8370-f7975294b638","Type":"ContainerStarted","Data":"acf75c76d808fe334509e0e197641b5f2ad59865d5de595070858f02426ffe59"} Apr 24 21:57:15.938865 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:15.938770 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-fc125-5c9566c648-xg996" event={"ID":"97a1ecf0-a7f7-49a7-8370-f7975294b638","Type":"ContainerStarted","Data":"5cb2aa847c99f767a74b8e40c5cdcf9ac464e2c1bb026a297fcac421403288ad"} Apr 24 21:57:15.938958 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:15.938939 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-fc125-5c9566c648-xg996" Apr 24 21:57:15.945256 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:15.945240 2573 
scope.go:117] "RemoveContainer" containerID="087064ed0e16357fc60546e8b41dc761d9d1b12c1e63601e3f296c01dd0ee2b5" Apr 24 21:57:15.945533 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:57:15.945513 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"087064ed0e16357fc60546e8b41dc761d9d1b12c1e63601e3f296c01dd0ee2b5\": container with ID starting with 087064ed0e16357fc60546e8b41dc761d9d1b12c1e63601e3f296c01dd0ee2b5 not found: ID does not exist" containerID="087064ed0e16357fc60546e8b41dc761d9d1b12c1e63601e3f296c01dd0ee2b5" Apr 24 21:57:15.945594 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:15.945541 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"087064ed0e16357fc60546e8b41dc761d9d1b12c1e63601e3f296c01dd0ee2b5"} err="failed to get container status \"087064ed0e16357fc60546e8b41dc761d9d1b12c1e63601e3f296c01dd0ee2b5\": rpc error: code = NotFound desc = could not find container \"087064ed0e16357fc60546e8b41dc761d9d1b12c1e63601e3f296c01dd0ee2b5\": container with ID starting with 087064ed0e16357fc60546e8b41dc761d9d1b12c1e63601e3f296c01dd0ee2b5 not found: ID does not exist" Apr 24 21:57:15.957011 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:15.956964 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-fc125-5c9566c648-xg996" podStartSLOduration=0.956949347 podStartE2EDuration="956.949347ms" podCreationTimestamp="2026-04-24 21:57:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:57:15.956763256 +0000 UTC m=+1813.217332084" watchObservedRunningTime="2026-04-24 21:57:15.956949347 +0000 UTC m=+1813.217518179" Apr 24 21:57:15.993215 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:15.993183 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/e3da9271-340b-4dd7-9eb3-9bbb1c51ec42-proxy-tls\") pod \"e3da9271-340b-4dd7-9eb3-9bbb1c51ec42\" (UID: \"e3da9271-340b-4dd7-9eb3-9bbb1c51ec42\") " Apr 24 21:57:15.993388 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:15.993225 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3da9271-340b-4dd7-9eb3-9bbb1c51ec42-openshift-service-ca-bundle\") pod \"e3da9271-340b-4dd7-9eb3-9bbb1c51ec42\" (UID: \"e3da9271-340b-4dd7-9eb3-9bbb1c51ec42\") " Apr 24 21:57:15.993609 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:15.993584 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3da9271-340b-4dd7-9eb3-9bbb1c51ec42-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "e3da9271-340b-4dd7-9eb3-9bbb1c51ec42" (UID: "e3da9271-340b-4dd7-9eb3-9bbb1c51ec42"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:57:15.995380 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:15.995360 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3da9271-340b-4dd7-9eb3-9bbb1c51ec42-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e3da9271-340b-4dd7-9eb3-9bbb1c51ec42" (UID: "e3da9271-340b-4dd7-9eb3-9bbb1c51ec42"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:57:16.094005 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:16.093964 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3da9271-340b-4dd7-9eb3-9bbb1c51ec42-proxy-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:57:16.094005 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:16.093998 2573 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3da9271-340b-4dd7-9eb3-9bbb1c51ec42-openshift-service-ca-bundle\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 21:57:16.259072 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:16.259037 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-6842b-55bb6cfccf-g8lvd"] Apr 24 21:57:16.262248 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:16.262221 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-6842b-55bb6cfccf-g8lvd"] Apr 24 21:57:17.356313 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:17.356278 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3da9271-340b-4dd7-9eb3-9bbb1c51ec42" path="/var/lib/kubelet/pods/e3da9271-340b-4dd7-9eb3-9bbb1c51ec42/volumes" Apr 24 21:57:21.948823 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:21.948790 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-fc125-5c9566c648-xg996" Apr 24 21:57:45.952881 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:45.952840 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-6941b-78cf747fb6-bxvdp"] Apr 24 21:57:45.953376 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:45.953144 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e3da9271-340b-4dd7-9eb3-9bbb1c51ec42" 
containerName="splitter-graph-6842b" Apr 24 21:57:45.953376 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:45.953156 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3da9271-340b-4dd7-9eb3-9bbb1c51ec42" containerName="splitter-graph-6842b" Apr 24 21:57:45.953376 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:45.953221 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e3da9271-340b-4dd7-9eb3-9bbb1c51ec42" containerName="splitter-graph-6842b" Apr 24 21:57:45.955806 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:45.955786 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-6941b-78cf747fb6-bxvdp" Apr 24 21:57:45.959437 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:45.959411 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-6941b-serving-cert\"" Apr 24 21:57:45.959580 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:45.959557 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-6941b-kube-rbac-proxy-sar-config\"" Apr 24 21:57:45.964700 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:45.964675 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-6941b-78cf747fb6-bxvdp"] Apr 24 21:57:46.011085 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:46.011038 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/783d3997-6bfd-4ec9-bff5-91a0abf8f875-openshift-service-ca-bundle\") pod \"splitter-graph-6941b-78cf747fb6-bxvdp\" (UID: \"783d3997-6bfd-4ec9-bff5-91a0abf8f875\") " pod="kserve-ci-e2e-test/splitter-graph-6941b-78cf747fb6-bxvdp" Apr 24 21:57:46.011085 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:46.011081 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/783d3997-6bfd-4ec9-bff5-91a0abf8f875-proxy-tls\") pod \"splitter-graph-6941b-78cf747fb6-bxvdp\" (UID: \"783d3997-6bfd-4ec9-bff5-91a0abf8f875\") " pod="kserve-ci-e2e-test/splitter-graph-6941b-78cf747fb6-bxvdp" Apr 24 21:57:46.112189 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:46.112147 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/783d3997-6bfd-4ec9-bff5-91a0abf8f875-openshift-service-ca-bundle\") pod \"splitter-graph-6941b-78cf747fb6-bxvdp\" (UID: \"783d3997-6bfd-4ec9-bff5-91a0abf8f875\") " pod="kserve-ci-e2e-test/splitter-graph-6941b-78cf747fb6-bxvdp" Apr 24 21:57:46.112189 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:46.112193 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/783d3997-6bfd-4ec9-bff5-91a0abf8f875-proxy-tls\") pod \"splitter-graph-6941b-78cf747fb6-bxvdp\" (UID: \"783d3997-6bfd-4ec9-bff5-91a0abf8f875\") " pod="kserve-ci-e2e-test/splitter-graph-6941b-78cf747fb6-bxvdp" Apr 24 21:57:46.112409 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:57:46.112361 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/splitter-graph-6941b-serving-cert: secret "splitter-graph-6941b-serving-cert" not found Apr 24 21:57:46.112453 ip-10-0-136-201 kubenswrapper[2573]: E0424 21:57:46.112441 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/783d3997-6bfd-4ec9-bff5-91a0abf8f875-proxy-tls podName:783d3997-6bfd-4ec9-bff5-91a0abf8f875 nodeName:}" failed. No retries permitted until 2026-04-24 21:57:46.612423869 +0000 UTC m=+1843.872992683 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/783d3997-6bfd-4ec9-bff5-91a0abf8f875-proxy-tls") pod "splitter-graph-6941b-78cf747fb6-bxvdp" (UID: "783d3997-6bfd-4ec9-bff5-91a0abf8f875") : secret "splitter-graph-6941b-serving-cert" not found Apr 24 21:57:46.112798 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:46.112778 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/783d3997-6bfd-4ec9-bff5-91a0abf8f875-openshift-service-ca-bundle\") pod \"splitter-graph-6941b-78cf747fb6-bxvdp\" (UID: \"783d3997-6bfd-4ec9-bff5-91a0abf8f875\") " pod="kserve-ci-e2e-test/splitter-graph-6941b-78cf747fb6-bxvdp" Apr 24 21:57:46.617408 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:46.617372 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/783d3997-6bfd-4ec9-bff5-91a0abf8f875-proxy-tls\") pod \"splitter-graph-6941b-78cf747fb6-bxvdp\" (UID: \"783d3997-6bfd-4ec9-bff5-91a0abf8f875\") " pod="kserve-ci-e2e-test/splitter-graph-6941b-78cf747fb6-bxvdp" Apr 24 21:57:46.619839 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:46.619817 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/783d3997-6bfd-4ec9-bff5-91a0abf8f875-proxy-tls\") pod \"splitter-graph-6941b-78cf747fb6-bxvdp\" (UID: \"783d3997-6bfd-4ec9-bff5-91a0abf8f875\") " pod="kserve-ci-e2e-test/splitter-graph-6941b-78cf747fb6-bxvdp" Apr 24 21:57:46.866919 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:46.866886 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-6941b-78cf747fb6-bxvdp" Apr 24 21:57:46.993929 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:46.993887 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-6941b-78cf747fb6-bxvdp"] Apr 24 21:57:46.996868 ip-10-0-136-201 kubenswrapper[2573]: W0424 21:57:46.996838 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod783d3997_6bfd_4ec9_bff5_91a0abf8f875.slice/crio-07d43ef7a24ebacd6daa95b2b3ca28ea2dd149a07a79ce1040856d19546a4842 WatchSource:0}: Error finding container 07d43ef7a24ebacd6daa95b2b3ca28ea2dd149a07a79ce1040856d19546a4842: Status 404 returned error can't find the container with id 07d43ef7a24ebacd6daa95b2b3ca28ea2dd149a07a79ce1040856d19546a4842 Apr 24 21:57:47.027380 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:47.027342 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-6941b-78cf747fb6-bxvdp" event={"ID":"783d3997-6bfd-4ec9-bff5-91a0abf8f875","Type":"ContainerStarted","Data":"07d43ef7a24ebacd6daa95b2b3ca28ea2dd149a07a79ce1040856d19546a4842"} Apr 24 21:57:48.031529 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:48.031495 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-6941b-78cf747fb6-bxvdp" event={"ID":"783d3997-6bfd-4ec9-bff5-91a0abf8f875","Type":"ContainerStarted","Data":"90f2dd32560910c320819476fa5c9c69597e7c2c61896d7beead6f90508fe694"} Apr 24 21:57:48.031932 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:48.031611 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-6941b-78cf747fb6-bxvdp" Apr 24 21:57:48.049874 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:48.049821 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-6941b-78cf747fb6-bxvdp" 
podStartSLOduration=3.049804416 podStartE2EDuration="3.049804416s" podCreationTimestamp="2026-04-24 21:57:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:57:48.048427489 +0000 UTC m=+1845.308996321" watchObservedRunningTime="2026-04-24 21:57:48.049804416 +0000 UTC m=+1845.310373248" Apr 24 21:57:54.040805 ip-10-0-136-201 kubenswrapper[2573]: I0424 21:57:54.040774 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-6941b-78cf747fb6-bxvdp" Apr 24 22:02:03.391265 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:02:03.391231 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-59fzh_494a20d0-ace3-481d-ae39-780b958f7150/cluster-monitoring-operator/0.log" Apr 24 22:02:03.396311 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:02:03.396276 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4nztl_b3e83451-d37f-4c5b-837f-6440bed57b91/console-operator/1.log" Apr 24 22:02:03.397268 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:02:03.397247 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-59fzh_494a20d0-ace3-481d-ae39-780b958f7150/cluster-monitoring-operator/0.log" Apr 24 22:02:03.401869 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:02:03.401848 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4nztl_b3e83451-d37f-4c5b-837f-6440bed57b91/console-operator/1.log" Apr 24 22:06:00.467860 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:06:00.467757 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-6941b-78cf747fb6-bxvdp"] Apr 24 22:06:00.468423 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:06:00.468025 2573 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-6941b-78cf747fb6-bxvdp" podUID="783d3997-6bfd-4ec9-bff5-91a0abf8f875" containerName="splitter-graph-6941b" containerID="cri-o://90f2dd32560910c320819476fa5c9c69597e7c2c61896d7beead6f90508fe694" gracePeriod=30 Apr 24 22:06:04.038661 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:06:04.038617 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-6941b-78cf747fb6-bxvdp" podUID="783d3997-6bfd-4ec9-bff5-91a0abf8f875" containerName="splitter-graph-6941b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:06:09.038309 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:06:09.038269 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-6941b-78cf747fb6-bxvdp" podUID="783d3997-6bfd-4ec9-bff5-91a0abf8f875" containerName="splitter-graph-6941b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:06:14.038286 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:06:14.038247 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-6941b-78cf747fb6-bxvdp" podUID="783d3997-6bfd-4ec9-bff5-91a0abf8f875" containerName="splitter-graph-6941b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:06:14.038706 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:06:14.038353 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-6941b-78cf747fb6-bxvdp" Apr 24 22:06:19.038435 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:06:19.038394 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-6941b-78cf747fb6-bxvdp" podUID="783d3997-6bfd-4ec9-bff5-91a0abf8f875" containerName="splitter-graph-6941b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:06:24.037770 ip-10-0-136-201 
kubenswrapper[2573]: I0424 22:06:24.037728 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-6941b-78cf747fb6-bxvdp" podUID="783d3997-6bfd-4ec9-bff5-91a0abf8f875" containerName="splitter-graph-6941b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:06:29.038481 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:06:29.038439 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-6941b-78cf747fb6-bxvdp" podUID="783d3997-6bfd-4ec9-bff5-91a0abf8f875" containerName="splitter-graph-6941b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:06:30.559164 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:06:30.559130 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-6941b-78cf747fb6-bxvdp" event={"ID":"783d3997-6bfd-4ec9-bff5-91a0abf8f875","Type":"ContainerDied","Data":"90f2dd32560910c320819476fa5c9c69597e7c2c61896d7beead6f90508fe694"} Apr 24 22:06:30.559523 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:06:30.559136 2573 generic.go:358] "Generic (PLEG): container finished" podID="783d3997-6bfd-4ec9-bff5-91a0abf8f875" containerID="90f2dd32560910c320819476fa5c9c69597e7c2c61896d7beead6f90508fe694" exitCode=0 Apr 24 22:06:30.617508 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:06:30.617483 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-6941b-78cf747fb6-bxvdp" Apr 24 22:06:30.673907 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:06:30.673855 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/783d3997-6bfd-4ec9-bff5-91a0abf8f875-proxy-tls\") pod \"783d3997-6bfd-4ec9-bff5-91a0abf8f875\" (UID: \"783d3997-6bfd-4ec9-bff5-91a0abf8f875\") " Apr 24 22:06:30.673907 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:06:30.673920 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/783d3997-6bfd-4ec9-bff5-91a0abf8f875-openshift-service-ca-bundle\") pod \"783d3997-6bfd-4ec9-bff5-91a0abf8f875\" (UID: \"783d3997-6bfd-4ec9-bff5-91a0abf8f875\") " Apr 24 22:06:30.674359 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:06:30.674334 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/783d3997-6bfd-4ec9-bff5-91a0abf8f875-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "783d3997-6bfd-4ec9-bff5-91a0abf8f875" (UID: "783d3997-6bfd-4ec9-bff5-91a0abf8f875"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:06:30.676115 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:06:30.676065 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/783d3997-6bfd-4ec9-bff5-91a0abf8f875-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "783d3997-6bfd-4ec9-bff5-91a0abf8f875" (UID: "783d3997-6bfd-4ec9-bff5-91a0abf8f875"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:06:30.774884 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:06:30.774785 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/783d3997-6bfd-4ec9-bff5-91a0abf8f875-proxy-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 22:06:30.774884 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:06:30.774827 2573 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/783d3997-6bfd-4ec9-bff5-91a0abf8f875-openshift-service-ca-bundle\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 22:06:31.562914 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:06:31.562876 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-6941b-78cf747fb6-bxvdp" event={"ID":"783d3997-6bfd-4ec9-bff5-91a0abf8f875","Type":"ContainerDied","Data":"07d43ef7a24ebacd6daa95b2b3ca28ea2dd149a07a79ce1040856d19546a4842"} Apr 24 22:06:31.563350 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:06:31.562930 2573 scope.go:117] "RemoveContainer" containerID="90f2dd32560910c320819476fa5c9c69597e7c2c61896d7beead6f90508fe694" Apr 24 22:06:31.563350 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:06:31.562931 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-6941b-78cf747fb6-bxvdp" Apr 24 22:06:31.578008 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:06:31.577980 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-6941b-78cf747fb6-bxvdp"] Apr 24 22:06:31.579870 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:06:31.579845 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-6941b-78cf747fb6-bxvdp"] Apr 24 22:06:33.360623 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:06:33.360588 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="783d3997-6bfd-4ec9-bff5-91a0abf8f875" path="/var/lib/kubelet/pods/783d3997-6bfd-4ec9-bff5-91a0abf8f875/volumes" Apr 24 22:07:03.412123 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:07:03.411994 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-59fzh_494a20d0-ace3-481d-ae39-780b958f7150/cluster-monitoring-operator/0.log" Apr 24 22:07:03.416812 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:07:03.416792 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4nztl_b3e83451-d37f-4c5b-837f-6440bed57b91/console-operator/1.log" Apr 24 22:07:03.417517 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:07:03.417496 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-59fzh_494a20d0-ace3-481d-ae39-780b958f7150/cluster-monitoring-operator/0.log" Apr 24 22:07:03.422356 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:07:03.422336 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4nztl_b3e83451-d37f-4c5b-837f-6440bed57b91/console-operator/1.log" Apr 24 22:12:03.432004 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:12:03.431886 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-59fzh_494a20d0-ace3-481d-ae39-780b958f7150/cluster-monitoring-operator/0.log" Apr 24 22:12:03.436825 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:12:03.436803 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4nztl_b3e83451-d37f-4c5b-837f-6440bed57b91/console-operator/1.log" Apr 24 22:12:03.440051 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:12:03.440032 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-59fzh_494a20d0-ace3-481d-ae39-780b958f7150/cluster-monitoring-operator/0.log" Apr 24 22:12:03.444854 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:12:03.444837 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4nztl_b3e83451-d37f-4c5b-837f-6440bed57b91/console-operator/1.log" Apr 24 22:13:34.680127 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:13:34.680015 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-fc125-5c9566c648-xg996"] Apr 24 22:13:34.680844 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:13:34.680331 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-fc125-5c9566c648-xg996" podUID="97a1ecf0-a7f7-49a7-8370-f7975294b638" containerName="switch-graph-fc125" containerID="cri-o://acf75c76d808fe334509e0e197641b5f2ad59865d5de595070858f02426ffe59" gracePeriod=30 Apr 24 22:13:36.946358 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:13:36.946321 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-fc125-5c9566c648-xg996" podUID="97a1ecf0-a7f7-49a7-8370-f7975294b638" containerName="switch-graph-fc125" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:13:41.946146 ip-10-0-136-201 kubenswrapper[2573]: I0424 
22:13:41.946085 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-fc125-5c9566c648-xg996" podUID="97a1ecf0-a7f7-49a7-8370-f7975294b638" containerName="switch-graph-fc125" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:13:46.946280 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:13:46.946241 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-fc125-5c9566c648-xg996" podUID="97a1ecf0-a7f7-49a7-8370-f7975294b638" containerName="switch-graph-fc125" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:13:46.946707 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:13:46.946356 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-fc125-5c9566c648-xg996" Apr 24 22:13:50.101802 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:13:50.101761 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-fc125-5c9566c648-xg996_97a1ecf0-a7f7-49a7-8370-f7975294b638/switch-graph-fc125/0.log" Apr 24 22:13:50.899623 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:13:50.899589 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-fc125-5c9566c648-xg996_97a1ecf0-a7f7-49a7-8370-f7975294b638/switch-graph-fc125/0.log" Apr 24 22:13:51.694755 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:13:51.694722 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-fc125-5c9566c648-xg996_97a1ecf0-a7f7-49a7-8370-f7975294b638/switch-graph-fc125/0.log" Apr 24 22:13:51.947880 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:13:51.947786 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-fc125-5c9566c648-xg996" podUID="97a1ecf0-a7f7-49a7-8370-f7975294b638" containerName="switch-graph-fc125" probeResult="failure" output="HTTP probe failed with statuscode: 503" 
Apr 24 22:13:52.460009 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:13:52.459977 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-fc125-5c9566c648-xg996_97a1ecf0-a7f7-49a7-8370-f7975294b638/switch-graph-fc125/0.log" Apr 24 22:13:53.215436 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:13:53.215409 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-fc125-5c9566c648-xg996_97a1ecf0-a7f7-49a7-8370-f7975294b638/switch-graph-fc125/0.log" Apr 24 22:13:53.970479 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:13:53.970446 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-fc125-5c9566c648-xg996_97a1ecf0-a7f7-49a7-8370-f7975294b638/switch-graph-fc125/0.log" Apr 24 22:13:54.733583 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:13:54.733552 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-fc125-5c9566c648-xg996_97a1ecf0-a7f7-49a7-8370-f7975294b638/switch-graph-fc125/0.log" Apr 24 22:13:55.516953 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:13:55.516921 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-fc125-5c9566c648-xg996_97a1ecf0-a7f7-49a7-8370-f7975294b638/switch-graph-fc125/0.log" Apr 24 22:13:56.284511 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:13:56.284479 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-fc125-5c9566c648-xg996_97a1ecf0-a7f7-49a7-8370-f7975294b638/switch-graph-fc125/0.log" Apr 24 22:13:56.947031 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:13:56.946989 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-fc125-5c9566c648-xg996" podUID="97a1ecf0-a7f7-49a7-8370-f7975294b638" containerName="switch-graph-fc125" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:13:57.044459 ip-10-0-136-201 kubenswrapper[2573]: 
I0424 22:13:57.044427 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-fc125-5c9566c648-xg996_97a1ecf0-a7f7-49a7-8370-f7975294b638/switch-graph-fc125/0.log" Apr 24 22:13:57.790916 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:13:57.790882 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-fc125-5c9566c648-xg996_97a1ecf0-a7f7-49a7-8370-f7975294b638/switch-graph-fc125/0.log" Apr 24 22:13:58.568807 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:13:58.568775 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-fc125-5c9566c648-xg996_97a1ecf0-a7f7-49a7-8370-f7975294b638/switch-graph-fc125/0.log" Apr 24 22:14:01.946143 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:01.946086 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-fc125-5c9566c648-xg996" podUID="97a1ecf0-a7f7-49a7-8370-f7975294b638" containerName="switch-graph-fc125" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 22:14:04.827958 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:04.827932 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-fc125-5c9566c648-xg996" Apr 24 22:14:04.870114 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:04.870069 2573 generic.go:358] "Generic (PLEG): container finished" podID="97a1ecf0-a7f7-49a7-8370-f7975294b638" containerID="acf75c76d808fe334509e0e197641b5f2ad59865d5de595070858f02426ffe59" exitCode=0 Apr 24 22:14:04.870307 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:04.870130 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-fc125-5c9566c648-xg996" event={"ID":"97a1ecf0-a7f7-49a7-8370-f7975294b638","Type":"ContainerDied","Data":"acf75c76d808fe334509e0e197641b5f2ad59865d5de595070858f02426ffe59"} Apr 24 22:14:04.870307 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:04.870149 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-fc125-5c9566c648-xg996" Apr 24 22:14:04.870307 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:04.870174 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-fc125-5c9566c648-xg996" event={"ID":"97a1ecf0-a7f7-49a7-8370-f7975294b638","Type":"ContainerDied","Data":"5cb2aa847c99f767a74b8e40c5cdcf9ac464e2c1bb026a297fcac421403288ad"} Apr 24 22:14:04.870307 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:04.870195 2573 scope.go:117] "RemoveContainer" containerID="acf75c76d808fe334509e0e197641b5f2ad59865d5de595070858f02426ffe59" Apr 24 22:14:04.878232 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:04.878213 2573 scope.go:117] "RemoveContainer" containerID="acf75c76d808fe334509e0e197641b5f2ad59865d5de595070858f02426ffe59" Apr 24 22:14:04.878495 ip-10-0-136-201 kubenswrapper[2573]: E0424 22:14:04.878477 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acf75c76d808fe334509e0e197641b5f2ad59865d5de595070858f02426ffe59\": container with ID starting with 
acf75c76d808fe334509e0e197641b5f2ad59865d5de595070858f02426ffe59 not found: ID does not exist" containerID="acf75c76d808fe334509e0e197641b5f2ad59865d5de595070858f02426ffe59" Apr 24 22:14:04.878540 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:04.878504 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acf75c76d808fe334509e0e197641b5f2ad59865d5de595070858f02426ffe59"} err="failed to get container status \"acf75c76d808fe334509e0e197641b5f2ad59865d5de595070858f02426ffe59\": rpc error: code = NotFound desc = could not find container \"acf75c76d808fe334509e0e197641b5f2ad59865d5de595070858f02426ffe59\": container with ID starting with acf75c76d808fe334509e0e197641b5f2ad59865d5de595070858f02426ffe59 not found: ID does not exist" Apr 24 22:14:04.960300 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:04.960219 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/97a1ecf0-a7f7-49a7-8370-f7975294b638-proxy-tls\") pod \"97a1ecf0-a7f7-49a7-8370-f7975294b638\" (UID: \"97a1ecf0-a7f7-49a7-8370-f7975294b638\") " Apr 24 22:14:04.960300 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:04.960265 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97a1ecf0-a7f7-49a7-8370-f7975294b638-openshift-service-ca-bundle\") pod \"97a1ecf0-a7f7-49a7-8370-f7975294b638\" (UID: \"97a1ecf0-a7f7-49a7-8370-f7975294b638\") " Apr 24 22:14:04.960635 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:04.960611 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97a1ecf0-a7f7-49a7-8370-f7975294b638-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "97a1ecf0-a7f7-49a7-8370-f7975294b638" (UID: "97a1ecf0-a7f7-49a7-8370-f7975294b638"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 22:14:04.962348 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:04.962325 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97a1ecf0-a7f7-49a7-8370-f7975294b638-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "97a1ecf0-a7f7-49a7-8370-f7975294b638" (UID: "97a1ecf0-a7f7-49a7-8370-f7975294b638"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 22:14:05.061184 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:05.061146 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/97a1ecf0-a7f7-49a7-8370-f7975294b638-proxy-tls\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 22:14:05.061184 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:05.061177 2573 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97a1ecf0-a7f7-49a7-8370-f7975294b638-openshift-service-ca-bundle\") on node \"ip-10-0-136-201.ec2.internal\" DevicePath \"\"" Apr 24 22:14:05.191337 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:05.191306 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-fc125-5c9566c648-xg996"] Apr 24 22:14:05.195310 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:05.195277 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-fc125-5c9566c648-xg996"] Apr 24 22:14:05.313024 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:05.312936 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-8bd94_e9999088-0209-4da1-b3c6-92c0d2e48409/global-pull-secret-syncer/0.log" Apr 24 22:14:05.356758 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:05.356721 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97a1ecf0-a7f7-49a7-8370-f7975294b638" 
path="/var/lib/kubelet/pods/97a1ecf0-a7f7-49a7-8370-f7975294b638/volumes" Apr 24 22:14:05.445436 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:05.445406 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-4smd5_d43e5662-c703-427c-ba4c-60f3f0029405/konnectivity-agent/0.log" Apr 24 22:14:05.598168 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:05.598063 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-136-201.ec2.internal_bcf950b6b9eef658d8ab20236281bc71/haproxy/0.log" Apr 24 22:14:09.139321 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:09.139287 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-59fzh_494a20d0-ace3-481d-ae39-780b958f7150/cluster-monitoring-operator/1.log" Apr 24 22:14:09.211699 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:09.211668 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-59fzh_494a20d0-ace3-481d-ae39-780b958f7150/cluster-monitoring-operator/0.log" Apr 24 22:14:09.544158 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:09.544049 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zrjdd_d07de7db-2978-4232-a544-00d2bced602e/node-exporter/0.log" Apr 24 22:14:09.569903 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:09.569875 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zrjdd_d07de7db-2978-4232-a544-00d2bced602e/kube-rbac-proxy/0.log" Apr 24 22:14:09.602107 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:09.602077 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zrjdd_d07de7db-2978-4232-a544-00d2bced602e/init-textfile/0.log" Apr 24 22:14:11.188811 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:11.188781 2573 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-wh284_5153ee6e-f9de-4ac6-afdf-6f0f0bd574d4/networking-console-plugin/0.log" Apr 24 22:14:11.590189 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:11.590032 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4nztl_b3e83451-d37f-4c5b-837f-6440bed57b91/console-operator/1.log" Apr 24 22:14:11.594805 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:11.594777 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4nztl_b3e83451-d37f-4c5b-837f-6440bed57b91/console-operator/2.log" Apr 24 22:14:12.399472 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:12.399434 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ltmfk/perf-node-gather-daemonset-4qv9j"] Apr 24 22:14:12.399855 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:12.399723 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97a1ecf0-a7f7-49a7-8370-f7975294b638" containerName="switch-graph-fc125" Apr 24 22:14:12.399855 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:12.399735 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="97a1ecf0-a7f7-49a7-8370-f7975294b638" containerName="switch-graph-fc125" Apr 24 22:14:12.399855 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:12.399756 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="783d3997-6bfd-4ec9-bff5-91a0abf8f875" containerName="splitter-graph-6941b" Apr 24 22:14:12.399855 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:12.399762 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="783d3997-6bfd-4ec9-bff5-91a0abf8f875" containerName="splitter-graph-6941b" Apr 24 22:14:12.399855 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:12.399816 2573 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="783d3997-6bfd-4ec9-bff5-91a0abf8f875" containerName="splitter-graph-6941b" Apr 24 22:14:12.399855 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:12.399829 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="97a1ecf0-a7f7-49a7-8370-f7975294b638" containerName="switch-graph-fc125" Apr 24 22:14:12.403005 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:12.402981 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-4qv9j" Apr 24 22:14:12.405235 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:12.405214 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-ltmfk\"/\"kube-root-ca.crt\"" Apr 24 22:14:12.405365 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:12.405256 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-ltmfk\"/\"openshift-service-ca.crt\"" Apr 24 22:14:12.406051 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:12.406036 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-ltmfk\"/\"default-dockercfg-9d9fp\"" Apr 24 22:14:12.413053 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:12.413030 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ltmfk/perf-node-gather-daemonset-4qv9j"] Apr 24 22:14:12.414747 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:12.414729 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-7qths_f2cf6003-5a3e-497b-82b1-afb7021314fc/volume-data-source-validator/0.log" Apr 24 22:14:12.515735 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:12.515691 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0b6efb8f-c585-4644-ab5d-9fea7f2f94ce-proc\") pod 
\"perf-node-gather-daemonset-4qv9j\" (UID: \"0b6efb8f-c585-4644-ab5d-9fea7f2f94ce\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-4qv9j" Apr 24 22:14:12.515931 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:12.515756 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwq6m\" (UniqueName: \"kubernetes.io/projected/0b6efb8f-c585-4644-ab5d-9fea7f2f94ce-kube-api-access-rwq6m\") pod \"perf-node-gather-daemonset-4qv9j\" (UID: \"0b6efb8f-c585-4644-ab5d-9fea7f2f94ce\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-4qv9j" Apr 24 22:14:12.515931 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:12.515784 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0b6efb8f-c585-4644-ab5d-9fea7f2f94ce-podres\") pod \"perf-node-gather-daemonset-4qv9j\" (UID: \"0b6efb8f-c585-4644-ab5d-9fea7f2f94ce\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-4qv9j" Apr 24 22:14:12.515931 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:12.515816 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0b6efb8f-c585-4644-ab5d-9fea7f2f94ce-sys\") pod \"perf-node-gather-daemonset-4qv9j\" (UID: \"0b6efb8f-c585-4644-ab5d-9fea7f2f94ce\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-4qv9j" Apr 24 22:14:12.515931 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:12.515902 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0b6efb8f-c585-4644-ab5d-9fea7f2f94ce-lib-modules\") pod \"perf-node-gather-daemonset-4qv9j\" (UID: \"0b6efb8f-c585-4644-ab5d-9fea7f2f94ce\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-4qv9j" Apr 24 22:14:12.616793 ip-10-0-136-201 kubenswrapper[2573]: 
I0424 22:14:12.616754 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0b6efb8f-c585-4644-ab5d-9fea7f2f94ce-proc\") pod \"perf-node-gather-daemonset-4qv9j\" (UID: \"0b6efb8f-c585-4644-ab5d-9fea7f2f94ce\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-4qv9j"
Apr 24 22:14:12.616977 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:12.616806 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwq6m\" (UniqueName: \"kubernetes.io/projected/0b6efb8f-c585-4644-ab5d-9fea7f2f94ce-kube-api-access-rwq6m\") pod \"perf-node-gather-daemonset-4qv9j\" (UID: \"0b6efb8f-c585-4644-ab5d-9fea7f2f94ce\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-4qv9j"
Apr 24 22:14:12.616977 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:12.616832 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0b6efb8f-c585-4644-ab5d-9fea7f2f94ce-podres\") pod \"perf-node-gather-daemonset-4qv9j\" (UID: \"0b6efb8f-c585-4644-ab5d-9fea7f2f94ce\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-4qv9j"
Apr 24 22:14:12.616977 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:12.616872 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0b6efb8f-c585-4644-ab5d-9fea7f2f94ce-sys\") pod \"perf-node-gather-daemonset-4qv9j\" (UID: \"0b6efb8f-c585-4644-ab5d-9fea7f2f94ce\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-4qv9j"
Apr 24 22:14:12.616977 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:12.616884 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0b6efb8f-c585-4644-ab5d-9fea7f2f94ce-proc\") pod \"perf-node-gather-daemonset-4qv9j\" (UID: \"0b6efb8f-c585-4644-ab5d-9fea7f2f94ce\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-4qv9j"
Apr 24 22:14:12.616977 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:12.616905 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0b6efb8f-c585-4644-ab5d-9fea7f2f94ce-lib-modules\") pod \"perf-node-gather-daemonset-4qv9j\" (UID: \"0b6efb8f-c585-4644-ab5d-9fea7f2f94ce\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-4qv9j"
Apr 24 22:14:12.617235 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:12.616979 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0b6efb8f-c585-4644-ab5d-9fea7f2f94ce-sys\") pod \"perf-node-gather-daemonset-4qv9j\" (UID: \"0b6efb8f-c585-4644-ab5d-9fea7f2f94ce\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-4qv9j"
Apr 24 22:14:12.617235 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:12.617010 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0b6efb8f-c585-4644-ab5d-9fea7f2f94ce-podres\") pod \"perf-node-gather-daemonset-4qv9j\" (UID: \"0b6efb8f-c585-4644-ab5d-9fea7f2f94ce\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-4qv9j"
Apr 24 22:14:12.617235 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:12.617013 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0b6efb8f-c585-4644-ab5d-9fea7f2f94ce-lib-modules\") pod \"perf-node-gather-daemonset-4qv9j\" (UID: \"0b6efb8f-c585-4644-ab5d-9fea7f2f94ce\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-4qv9j"
Apr 24 22:14:12.625516 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:12.625491 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwq6m\" (UniqueName: \"kubernetes.io/projected/0b6efb8f-c585-4644-ab5d-9fea7f2f94ce-kube-api-access-rwq6m\") pod \"perf-node-gather-daemonset-4qv9j\" (UID: \"0b6efb8f-c585-4644-ab5d-9fea7f2f94ce\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-4qv9j"
Apr 24 22:14:12.713193 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:12.713076 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-4qv9j"
Apr 24 22:14:12.838726 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:12.838685 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ltmfk/perf-node-gather-daemonset-4qv9j"]
Apr 24 22:14:12.841886 ip-10-0-136-201 kubenswrapper[2573]: W0424 22:14:12.841849 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0b6efb8f_c585_4644_ab5d_9fea7f2f94ce.slice/crio-2254da3c7984d3cfbe6d80aa4d1a2588ecb390fa1b1dc4f6fd660414b33c9546 WatchSource:0}: Error finding container 2254da3c7984d3cfbe6d80aa4d1a2588ecb390fa1b1dc4f6fd660414b33c9546: Status 404 returned error can't find the container with id 2254da3c7984d3cfbe6d80aa4d1a2588ecb390fa1b1dc4f6fd660414b33c9546
Apr 24 22:14:12.843628 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:12.843612 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 22:14:12.894125 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:12.894080 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-4qv9j" event={"ID":"0b6efb8f-c585-4644-ab5d-9fea7f2f94ce","Type":"ContainerStarted","Data":"2254da3c7984d3cfbe6d80aa4d1a2588ecb390fa1b1dc4f6fd660414b33c9546"}
Apr 24 22:14:13.079713 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:13.079634 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-bxds4_49cf726e-12df-4887-9b95-ee4375115c11/dns/0.log"
Apr 24 22:14:13.101163 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:13.101124 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-bxds4_49cf726e-12df-4887-9b95-ee4375115c11/kube-rbac-proxy/0.log"
Apr 24 22:14:13.251377 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:13.251350 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-z65xz_8159f631-f735-47c0-8dd1-8342be18cbcf/dns-node-resolver/0.log"
Apr 24 22:14:13.712935 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:13.712899 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-gtgdx_e61e2625-9935-4ec2-88cd-d2ac5c781886/node-ca/0.log"
Apr 24 22:14:13.898032 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:13.897994 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-4qv9j" event={"ID":"0b6efb8f-c585-4644-ab5d-9fea7f2f94ce","Type":"ContainerStarted","Data":"2b212472218cdce66d53d63ed1309b0abf5c952b58c68178d80408a5f599c303"}
Apr 24 22:14:13.898242 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:13.898145 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-4qv9j"
Apr 24 22:14:13.918206 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:13.918157 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-4qv9j" podStartSLOduration=1.918142303 podStartE2EDuration="1.918142303s" podCreationTimestamp="2026-04-24 22:14:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:14:13.916196622 +0000 UTC m=+2831.176765454" watchObservedRunningTime="2026-04-24 22:14:13.918142303 +0000 UTC m=+2831.178711134"
Apr 24 22:14:14.417261 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:14.417222 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-58974b8966-z2xx8_1768c1b9-8390-4833-9613-4efec510f36b/router/0.log"
Apr 24 22:14:14.774620 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:14.774525 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-8rjd5_6024e94b-b234-46ca-80d6-f505949e48ac/serve-healthcheck-canary/0.log"
Apr 24 22:14:15.189653 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:15.189617 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-44vlp_c4788265-16ce-4770-b838-025f0f7d06aa/insights-operator/0.log"
Apr 24 22:14:15.190415 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:15.190397 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-44vlp_c4788265-16ce-4770-b838-025f0f7d06aa/insights-operator/1.log"
Apr 24 22:14:15.275175 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:15.275145 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jkxl9_6847a7b4-b0b1-42ab-8771-88161eb09cc7/kube-rbac-proxy/0.log"
Apr 24 22:14:15.296876 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:15.296851 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jkxl9_6847a7b4-b0b1-42ab-8771-88161eb09cc7/exporter/0.log"
Apr 24 22:14:15.319615 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:15.319582 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jkxl9_6847a7b4-b0b1-42ab-8771-88161eb09cc7/extractor/0.log"
Apr 24 22:14:19.911040 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:19.910998 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-4qv9j"
Apr 24 22:14:22.034652 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:22.034572 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-v4rfn_bd2084eb-fcc8-42e3-b526-171c67ac7a71/kube-storage-version-migrator-operator/1.log"
Apr 24 22:14:22.035502 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:22.035472 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-v4rfn_bd2084eb-fcc8-42e3-b526-171c67ac7a71/kube-storage-version-migrator-operator/0.log"
Apr 24 22:14:22.864870 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:22.864833 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-29bpz_102ddca3-9b65-4682-a3d0-5bf546504d17/kube-multus/0.log"
Apr 24 22:14:22.914937 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:22.914912 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5fn4d_3807cdc4-74ff-4e27-bde0-2ed93b428a58/kube-multus-additional-cni-plugins/0.log"
Apr 24 22:14:22.939385 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:22.939360 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5fn4d_3807cdc4-74ff-4e27-bde0-2ed93b428a58/egress-router-binary-copy/0.log"
Apr 24 22:14:22.960475 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:22.960448 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5fn4d_3807cdc4-74ff-4e27-bde0-2ed93b428a58/cni-plugins/0.log"
Apr 24 22:14:22.987888 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:22.987851 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5fn4d_3807cdc4-74ff-4e27-bde0-2ed93b428a58/bond-cni-plugin/0.log"
Apr 24 22:14:23.008076 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:23.008042 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5fn4d_3807cdc4-74ff-4e27-bde0-2ed93b428a58/routeoverride-cni/0.log"
Apr 24 22:14:23.030592 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:23.030569 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5fn4d_3807cdc4-74ff-4e27-bde0-2ed93b428a58/whereabouts-cni-bincopy/0.log"
Apr 24 22:14:23.055164 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:23.055137 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5fn4d_3807cdc4-74ff-4e27-bde0-2ed93b428a58/whereabouts-cni/0.log"
Apr 24 22:14:23.540410 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:23.540304 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-kpb2c_066d9094-f992-4853-86f1-b25700fe6070/network-metrics-daemon/0.log"
Apr 24 22:14:23.562878 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:23.562854 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-kpb2c_066d9094-f992-4853-86f1-b25700fe6070/kube-rbac-proxy/0.log"
Apr 24 22:14:24.892332 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:24.892300 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kcl2z_c595912d-1543-4c43-8a11-3dbcf0f15050/ovn-controller/0.log"
Apr 24 22:14:24.927323 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:24.927288 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kcl2z_c595912d-1543-4c43-8a11-3dbcf0f15050/ovn-acl-logging/0.log"
Apr 24 22:14:24.946806 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:24.946769 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kcl2z_c595912d-1543-4c43-8a11-3dbcf0f15050/kube-rbac-proxy-node/0.log"
Apr 24 22:14:24.969290 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:24.969253 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kcl2z_c595912d-1543-4c43-8a11-3dbcf0f15050/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 22:14:24.992669 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:24.992640 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kcl2z_c595912d-1543-4c43-8a11-3dbcf0f15050/northd/0.log"
Apr 24 22:14:25.025587 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:25.025564 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kcl2z_c595912d-1543-4c43-8a11-3dbcf0f15050/nbdb/0.log"
Apr 24 22:14:25.054433 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:25.054407 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kcl2z_c595912d-1543-4c43-8a11-3dbcf0f15050/sbdb/0.log"
Apr 24 22:14:25.165042 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:25.164922 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kcl2z_c595912d-1543-4c43-8a11-3dbcf0f15050/ovnkube-controller/0.log"
Apr 24 22:14:26.065359 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:26.065329 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-vlq8x_9b1819a7-8889-48bc-9e09-7f5fef3c6672/check-endpoints/0.log"
Apr 24 22:14:26.109640 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:26.109608 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-6xchx_1612b97a-f223-4e83-8710-5764a3765126/network-check-target-container/0.log"
Apr 24 22:14:26.997712 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:26.997684 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-h2b2b_701e782c-df09-4f2a-a23e-72ec0675fcb7/iptables-alerter/0.log"
Apr 24 22:14:27.553795 ip-10-0-136-201 kubenswrapper[2573]: I0424 22:14:27.553769 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-jhprx_8b726083-445b-4ee0-8797-c710268b6b65/tuned/0.log"