Apr 24 16:38:49.503650 ip-10-0-131-47 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 16:38:49.969960 ip-10-0-131-47 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 16:38:49.969960 ip-10-0-131-47 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 16:38:49.969960 ip-10-0-131-47 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 16:38:49.969960 ip-10-0-131-47 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 16:38:49.969960 ip-10-0-131-47 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
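The deprecation warnings above all point at the kubelet config file (here /etc/kubernetes/kubelet.conf, per the --config flag logged further down). A minimal sketch of moving those flags into a v1beta1 KubeletConfiguration — the containerRuntimeEndpoint value matches the FLAG dump in this log; the systemReserved and evictionHard values are illustrative placeholders, not taken from this node:

```yaml
# Sketch only: equivalent config-file settings for the deprecated flags.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: /var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved (placeholder values)
systemReserved:
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration is superseded by eviction settings
evictionHard:
  memory.available: 100Mi
```

There is no config-file equivalent for --pod-infra-container-image; per the warning, the sandbox image is now taken from the container runtime (CRI) instead.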
Apr 24 16:38:49.972655 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.972568 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 16:38:49.976129 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976111 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 16:38:49.976129 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976130 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 16:38:49.976302 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976133 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 16:38:49.976302 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976137 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 16:38:49.976302 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976140 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 16:38:49.976302 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976143 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 16:38:49.976302 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976146 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 16:38:49.976302 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976149 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 16:38:49.976302 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976153 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 16:38:49.976302 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976155 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 16:38:49.976302 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976159 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 16:38:49.976302 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976162 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 16:38:49.976302 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976165 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 16:38:49.976302 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976169 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 16:38:49.976302 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976172 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 16:38:49.976302 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976175 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 16:38:49.976302 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976177 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 16:38:49.976302 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976180 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 16:38:49.976302 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976183 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 16:38:49.976302 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976186 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 16:38:49.976302 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976189 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 16:38:49.976302 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976191 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 16:38:49.976780 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976194 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 16:38:49.976780 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976196 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 16:38:49.976780 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976199 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 16:38:49.976780 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976202 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 16:38:49.976780 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976204 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 16:38:49.976780 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976207 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 16:38:49.976780 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976209 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 16:38:49.976780 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976212 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 16:38:49.976780 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976214 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 16:38:49.976780 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976217 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 16:38:49.976780 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976219 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 16:38:49.976780 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976222 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 16:38:49.976780 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976225 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 16:38:49.976780 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976228 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 16:38:49.976780 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976231 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 16:38:49.976780 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976234 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 16:38:49.976780 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976237 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 16:38:49.976780 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976240 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 16:38:49.976780 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976243 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 16:38:49.976780 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976245 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 16:38:49.977307 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976248 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 16:38:49.977307 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976250 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 16:38:49.977307 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976253 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 16:38:49.977307 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976256 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 16:38:49.977307 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976261 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 16:38:49.977307 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976265 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 16:38:49.977307 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976268 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 16:38:49.977307 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976270 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 16:38:49.977307 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976273 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 16:38:49.977307 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976276 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 16:38:49.977307 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976279 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 16:38:49.977307 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976296 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 16:38:49.977307 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976298 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 16:38:49.977307 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976301 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 16:38:49.977307 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976304 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 16:38:49.977307 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976308 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 16:38:49.977307 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976312 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 16:38:49.977307 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976315 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 16:38:49.977307 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976318 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 16:38:49.977769 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976320 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 16:38:49.977769 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976323 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 16:38:49.977769 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976326 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 16:38:49.977769 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976328 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 16:38:49.977769 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976331 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 16:38:49.977769 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976334 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 16:38:49.977769 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976338 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 16:38:49.977769 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976341 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 16:38:49.977769 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976343 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 16:38:49.977769 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976346 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 16:38:49.977769 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976348 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 16:38:49.977769 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976351 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 16:38:49.977769 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976355 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 16:38:49.977769 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976357 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 16:38:49.977769 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976360 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 16:38:49.977769 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976363 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 16:38:49.977769 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976365 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 16:38:49.977769 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976367 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 16:38:49.977769 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976370 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 16:38:49.977769 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976372 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 16:38:49.978235 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976375 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 16:38:49.978235 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976377 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 16:38:49.978235 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976380 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 16:38:49.978235 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976383 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 16:38:49.978235 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976386 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 16:38:49.978235 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976845 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 16:38:49.978235 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976850 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 16:38:49.978235 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976853 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 16:38:49.978235 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976855 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 16:38:49.978235 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976858 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 16:38:49.978235 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976860 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 16:38:49.978235 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976863 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 16:38:49.978235 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976865 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 16:38:49.978235 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976868 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 16:38:49.978235 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976870 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 16:38:49.978235 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976873 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 16:38:49.978235 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976875 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 16:38:49.978235 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976884 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 16:38:49.978235 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976888 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 16:38:49.978235 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976891 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 16:38:49.978753 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976894 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 16:38:49.978753 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976897 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 16:38:49.978753 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976899 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 16:38:49.978753 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976902 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 16:38:49.978753 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976904 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 16:38:49.978753 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976907 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 16:38:49.978753 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976909 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 16:38:49.978753 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976912 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 16:38:49.978753 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976915 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 16:38:49.978753 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976917 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 16:38:49.978753 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976920 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 16:38:49.978753 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976922 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 16:38:49.978753 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976924 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 16:38:49.978753 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976927 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 16:38:49.978753 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976929 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 16:38:49.978753 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976932 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 16:38:49.978753 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976934 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 16:38:49.978753 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976937 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 16:38:49.978753 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976939 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 16:38:49.978753 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976942 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 16:38:49.979333 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976944 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 16:38:49.979333 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976946 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 16:38:49.979333 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976949 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 16:38:49.979333 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976951 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 16:38:49.979333 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976954 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 16:38:49.979333 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976956 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 16:38:49.979333 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976958 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 16:38:49.979333 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976961 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 16:38:49.979333 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976964 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 16:38:49.979333 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976966 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 16:38:49.979333 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976976 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 16:38:49.979333 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976979 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 16:38:49.979333 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976982 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 16:38:49.979333 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976984 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 16:38:49.979333 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976987 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 16:38:49.979333 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976989 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 16:38:49.979333 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976991 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 16:38:49.979333 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976994 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 16:38:49.979333 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976996 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 16:38:49.979807 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.976998 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 16:38:49.979807 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977001 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 16:38:49.979807 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977003 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 16:38:49.979807 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977006 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 16:38:49.979807 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977008 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 16:38:49.979807 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977011 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 16:38:49.979807 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977013 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 16:38:49.979807 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977016 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 16:38:49.979807 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977021 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 16:38:49.979807 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977025 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 16:38:49.979807 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977028 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 16:38:49.979807 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977031 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 16:38:49.979807 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977034 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 16:38:49.979807 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977036 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 16:38:49.979807 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977039 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 16:38:49.979807 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977042 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 16:38:49.979807 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977045 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 16:38:49.979807 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977049 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 16:38:49.979807 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977052 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 16:38:49.980265 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977055 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 16:38:49.980265 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977057 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 16:38:49.980265 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977060 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 16:38:49.980265 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977062 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 16:38:49.980265 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977065 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 16:38:49.980265 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977079 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 16:38:49.980265 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977082 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 16:38:49.980265 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977084 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 16:38:49.980265 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977087 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 16:38:49.980265 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977089 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 16:38:49.980265 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977092 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 16:38:49.980265 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977094 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 16:38:49.980265 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977097 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 16:38:49.980265 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977181 2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 16:38:49.980265 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977188 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 16:38:49.980265 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977202 2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 16:38:49.980265 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977207 2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 16:38:49.980265 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977211 2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 16:38:49.980265 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977214 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 16:38:49.980265 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977219 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 16:38:49.980265 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977223 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 16:38:49.980786 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977227 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 16:38:49.980786 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977230 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 16:38:49.980786 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977233 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 16:38:49.980786 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977237 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 16:38:49.980786 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977240 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 16:38:49.980786 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977243 2578 flags.go:64] FLAG: --cgroup-root=""
Apr 24 16:38:49.980786 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977246 2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 16:38:49.980786 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977249 2578 flags.go:64] FLAG: --client-ca-file=""
Apr 24 16:38:49.980786 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977252 2578 flags.go:64] FLAG: --cloud-config=""
Apr 24 16:38:49.980786 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977255 2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 16:38:49.980786 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977258 2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 16:38:49.980786 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977265 2578 flags.go:64] FLAG: --cluster-domain=""
Apr 24 16:38:49.980786 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977267 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 16:38:49.980786 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977270 2578 flags.go:64] FLAG: --config-dir=""
Apr 24 16:38:49.980786 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977273 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 16:38:49.980786 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977277 2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 16:38:49.980786 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977295 2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 16:38:49.980786 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977306 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 16:38:49.980786 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977309 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 16:38:49.980786 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977313 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 16:38:49.980786 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977316 2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 16:38:49.980786 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977319 2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 16:38:49.980786 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977322 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 16:38:49.980786 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977325 2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 16:38:49.980786 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977328 2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 16:38:49.981388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977333 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 16:38:49.981388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977335 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 16:38:49.981388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977338 2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 16:38:49.981388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977342 2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 16:38:49.981388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977345 2578 flags.go:64] FLAG: --enable-server="true"
Apr 24 16:38:49.981388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977348 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 16:38:49.981388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977358 2578 flags.go:64] FLAG: --event-burst="100"
Apr 24 16:38:49.981388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977362 2578 flags.go:64] FLAG: --event-qps="50"
Apr 24 16:38:49.981388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977365 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 16:38:49.981388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977368 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 16:38:49.981388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977371 2578 flags.go:64] FLAG: --eviction-hard=""
Apr 24 16:38:49.981388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977375 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 16:38:49.981388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977378 2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 16:38:49.981388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977381 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 16:38:49.981388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977383 2578 flags.go:64] FLAG: --eviction-soft=""
Apr 24 16:38:49.981388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977386 2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 16:38:49.981388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977389 2578 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 16:38:49.981388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977392 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 16:38:49.981388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977395 2578 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 16:38:49.981388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977397 2578 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 16:38:49.981388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977400 2578 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 16:38:49.981388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977403 2578 flags.go:64] FLAG: --feature-gates=""
Apr 24 16:38:49.981388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977411 2578 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 24
16:38:49.981388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977414 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 16:38:49.981388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977417 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 16:38:49.981997 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977428 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 16:38:49.981997 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977431 2578 flags.go:64] FLAG: --healthz-port="10248" Apr 24 16:38:49.981997 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977434 2578 flags.go:64] FLAG: --help="false" Apr 24 16:38:49.981997 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977437 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-131-47.ec2.internal" Apr 24 16:38:49.981997 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977440 2578 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 16:38:49.981997 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977443 2578 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 16:38:49.981997 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977446 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 16:38:49.981997 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977449 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 16:38:49.981997 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977453 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 16:38:49.981997 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977455 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 16:38:49.981997 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977458 2578 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 16:38:49.981997 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977461 2578 flags.go:64] 
FLAG: --kernel-memcg-notification="false" Apr 24 16:38:49.981997 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977466 2578 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 16:38:49.981997 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977469 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 16:38:49.981997 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977472 2578 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 16:38:49.981997 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977474 2578 flags.go:64] FLAG: --kube-reserved="" Apr 24 16:38:49.981997 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977477 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 16:38:49.981997 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977480 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 16:38:49.981997 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977483 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 16:38:49.981997 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977486 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 16:38:49.981997 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977488 2578 flags.go:64] FLAG: --lock-file="" Apr 24 16:38:49.981997 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977491 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 16:38:49.981997 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977494 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 16:38:49.981997 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977497 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 16:38:49.982588 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977503 2578 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 16:38:49.982588 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977505 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 16:38:49.982588 ip-10-0-131-47 kubenswrapper[2578]: I0424 
16:38:49.977508 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 16:38:49.982588 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977511 2578 flags.go:64] FLAG: --logging-format="text" Apr 24 16:38:49.982588 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977514 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 16:38:49.982588 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977518 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 16:38:49.982588 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977521 2578 flags.go:64] FLAG: --manifest-url="" Apr 24 16:38:49.982588 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977524 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 24 16:38:49.982588 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977529 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 16:38:49.982588 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977533 2578 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 16:38:49.982588 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977537 2578 flags.go:64] FLAG: --max-pods="110" Apr 24 16:38:49.982588 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977541 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 16:38:49.982588 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977543 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 16:38:49.982588 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977546 2578 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 16:38:49.982588 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977549 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 16:38:49.982588 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977552 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 16:38:49.982588 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977555 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 
16:38:49.982588 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977558 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 16:38:49.982588 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977565 2578 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 16:38:49.982588 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977568 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 16:38:49.982588 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977573 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 16:38:49.982588 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977576 2578 flags.go:64] FLAG: --pod-cidr="" Apr 24 16:38:49.982588 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977579 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 16:38:49.983150 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977585 2578 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 16:38:49.983150 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977588 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 16:38:49.983150 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977591 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 24 16:38:49.983150 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977594 2578 flags.go:64] FLAG: --port="10250" Apr 24 16:38:49.983150 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977597 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 16:38:49.983150 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977599 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0105e14821833e37f" Apr 24 16:38:49.983150 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977602 2578 flags.go:64] FLAG: --qos-reserved="" Apr 24 16:38:49.983150 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977606 2578 flags.go:64] FLAG: 
--read-only-port="10255" Apr 24 16:38:49.983150 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977608 2578 flags.go:64] FLAG: --register-node="true" Apr 24 16:38:49.983150 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977611 2578 flags.go:64] FLAG: --register-schedulable="true" Apr 24 16:38:49.983150 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977614 2578 flags.go:64] FLAG: --register-with-taints="" Apr 24 16:38:49.983150 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977618 2578 flags.go:64] FLAG: --registry-burst="10" Apr 24 16:38:49.983150 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977621 2578 flags.go:64] FLAG: --registry-qps="5" Apr 24 16:38:49.983150 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977624 2578 flags.go:64] FLAG: --reserved-cpus="" Apr 24 16:38:49.983150 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977627 2578 flags.go:64] FLAG: --reserved-memory="" Apr 24 16:38:49.983150 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977632 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 16:38:49.983150 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977635 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 16:38:49.983150 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977638 2578 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 16:38:49.983150 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977641 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 16:38:49.983150 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977645 2578 flags.go:64] FLAG: --runonce="false" Apr 24 16:38:49.983150 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977647 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 16:38:49.983150 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977650 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 16:38:49.983150 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977653 2578 flags.go:64] FLAG: 
--seccomp-default="false" Apr 24 16:38:49.983150 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977656 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 16:38:49.983150 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977659 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 16:38:49.983150 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977662 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 16:38:49.983777 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977665 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 16:38:49.983777 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977668 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 16:38:49.983777 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977671 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 16:38:49.983777 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977675 2578 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 16:38:49.983777 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977678 2578 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 16:38:49.983777 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977681 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 16:38:49.983777 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977684 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 16:38:49.983777 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977687 2578 flags.go:64] FLAG: --system-cgroups="" Apr 24 16:38:49.983777 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977690 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 16:38:49.983777 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977695 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 16:38:49.983777 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977698 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 24 16:38:49.983777 ip-10-0-131-47 
kubenswrapper[2578]: I0424 16:38:49.977700 2578 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 16:38:49.983777 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977706 2578 flags.go:64] FLAG: --tls-min-version="" Apr 24 16:38:49.983777 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977710 2578 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 16:38:49.983777 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977713 2578 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 16:38:49.983777 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977715 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 16:38:49.983777 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977718 2578 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 16:38:49.983777 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977721 2578 flags.go:64] FLAG: --v="2" Apr 24 16:38:49.983777 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977725 2578 flags.go:64] FLAG: --version="false" Apr 24 16:38:49.983777 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977729 2578 flags.go:64] FLAG: --vmodule="" Apr 24 16:38:49.983777 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977733 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 16:38:49.983777 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.977738 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 16:38:49.983777 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977838 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 16:38:49.983777 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977843 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 16:38:49.984360 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977846 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 16:38:49.984360 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977849 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 16:38:49.984360 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977852 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 16:38:49.984360 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977855 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 16:38:49.984360 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977858 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 16:38:49.984360 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977861 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 16:38:49.984360 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977864 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 16:38:49.984360 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977867 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 16:38:49.984360 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977869 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 16:38:49.984360 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977872 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 16:38:49.984360 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977874 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 16:38:49.984360 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977878 2578 
feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 16:38:49.984360 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977881 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 16:38:49.984360 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977883 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 16:38:49.984360 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977886 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 16:38:49.984360 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977888 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 16:38:49.984360 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977891 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 16:38:49.984360 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977893 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 16:38:49.984360 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977896 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 16:38:49.984360 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977898 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 16:38:49.984880 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977901 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 16:38:49.984880 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977904 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 16:38:49.984880 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977906 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 16:38:49.984880 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977909 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 16:38:49.984880 ip-10-0-131-47 
kubenswrapper[2578]: W0424 16:38:49.977911 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 16:38:49.984880 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977914 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 16:38:49.984880 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977916 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 16:38:49.984880 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977919 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 16:38:49.984880 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977921 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 16:38:49.984880 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977925 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 16:38:49.984880 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977927 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 16:38:49.984880 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977930 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 16:38:49.984880 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977932 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 16:38:49.984880 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977935 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 16:38:49.984880 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977938 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 16:38:49.984880 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977941 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 16:38:49.984880 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977944 2578 feature_gate.go:328] unrecognized feature 
gate: Example Apr 24 16:38:49.984880 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977947 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 16:38:49.984880 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977949 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 16:38:49.984880 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977952 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 16:38:49.985384 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977956 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 16:38:49.985384 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977959 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 16:38:49.985384 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977962 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 16:38:49.985384 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977969 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 16:38:49.985384 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977972 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 16:38:49.985384 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977975 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 16:38:49.985384 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977977 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 16:38:49.985384 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977980 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 16:38:49.985384 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977982 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 16:38:49.985384 ip-10-0-131-47 
kubenswrapper[2578]: W0424 16:38:49.977985 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 16:38:49.985384 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977988 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 16:38:49.985384 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977990 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 16:38:49.985384 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977993 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 16:38:49.985384 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977995 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 16:38:49.985384 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.977997 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 16:38:49.985384 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.978000 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 16:38:49.985384 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.978002 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 16:38:49.985384 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.978005 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 16:38:49.985384 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.978007 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 16:38:49.985827 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.978010 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 16:38:49.985827 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.978012 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 16:38:49.985827 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.978016 2578 feature_gate.go:328] unrecognized feature gate: 
MachineAPIOperatorDisableMachineHealthCheckController Apr 24 16:38:49.985827 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.978019 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 16:38:49.985827 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.978021 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 16:38:49.985827 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.978024 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 16:38:49.985827 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.978027 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 16:38:49.985827 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.978029 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 16:38:49.985827 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.978032 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 16:38:49.985827 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.978035 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 16:38:49.985827 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.978038 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 16:38:49.985827 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.978040 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 16:38:49.985827 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.978043 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 16:38:49.985827 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.978045 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 16:38:49.985827 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.978048 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 16:38:49.985827 ip-10-0-131-47 kubenswrapper[2578]: W0424 
16:38:49.978050 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 16:38:49.985827 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.978054 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 16:38:49.985827 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.978057 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 16:38:49.985827 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.978060 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 16:38:49.985827 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.978063 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 16:38:49.986324 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.978065 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 16:38:49.986324 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.978068 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 16:38:49.986324 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.978070 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 16:38:49.986324 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.978073 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 16:38:49.986324 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.978075 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 16:38:49.986324 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.978933 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false 
UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 16:38:49.986324 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.985611 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 16:38:49.986324 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.985631 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 16:38:49.986324 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985682 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 16:38:49.986324 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985687 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 16:38:49.986324 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985692 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 16:38:49.986324 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985697 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 16:38:49.986324 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985700 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 16:38:49.986324 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985703 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 16:38:49.986324 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985706 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 16:38:49.986709 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985709 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 16:38:49.986709 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985712 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 16:38:49.986709 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985714 2578 
feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 16:38:49.986709 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985717 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 16:38:49.986709 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985721 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 16:38:49.986709 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985723 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 16:38:49.986709 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985726 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 16:38:49.986709 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985729 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 16:38:49.986709 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985731 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 16:38:49.986709 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985734 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 16:38:49.986709 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985736 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 16:38:49.986709 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985739 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 16:38:49.986709 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985741 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 16:38:49.986709 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985744 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 16:38:49.986709 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985746 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 16:38:49.986709 ip-10-0-131-47 kubenswrapper[2578]: W0424 
16:38:49.985748 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 16:38:49.986709 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985751 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 16:38:49.986709 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985753 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 16:38:49.986709 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985756 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 16:38:49.986709 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985758 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 16:38:49.987215 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985761 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 16:38:49.987215 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985763 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 16:38:49.987215 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985766 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 16:38:49.987215 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985769 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 16:38:49.987215 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985773 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 16:38:49.987215 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985776 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 16:38:49.987215 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985778 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 16:38:49.987215 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985781 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 16:38:49.987215 ip-10-0-131-47 
kubenswrapper[2578]: W0424 16:38:49.985784 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 16:38:49.987215 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985786 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 16:38:49.987215 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985789 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 16:38:49.987215 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985791 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 16:38:49.987215 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985794 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 16:38:49.987215 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985796 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 16:38:49.987215 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985799 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 24 16:38:49.987215 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985801 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 16:38:49.987215 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985803 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 16:38:49.987215 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985806 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 16:38:49.987215 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985808 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 16:38:49.987215 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985811 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 16:38:49.987794 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985813 2578 feature_gate.go:328] 
unrecognized feature gate: ImageModeStatusReporting Apr 24 16:38:49.987794 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985816 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 16:38:49.987794 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985819 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 16:38:49.987794 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985822 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 16:38:49.987794 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985824 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 16:38:49.987794 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985827 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 16:38:49.987794 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985829 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 16:38:49.987794 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985832 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 16:38:49.987794 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985835 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 16:38:49.987794 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985838 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 16:38:49.987794 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985841 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 16:38:49.987794 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985845 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 16:38:49.987794 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985849 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 16:38:49.987794 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985851 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 16:38:49.987794 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985855 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 16:38:49.987794 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985857 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 16:38:49.987794 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985861 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 16:38:49.987794 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985864 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 16:38:49.987794 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985866 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 16:38:49.987794 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985869 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 16:38:49.988331 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985872 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 16:38:49.988331 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985874 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 16:38:49.988331 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985876 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 16:38:49.988331 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985879 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 16:38:49.988331 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985881 2578 feature_gate.go:328] unrecognized feature 
gate: ClusterAPIInstallIBMCloud Apr 24 16:38:49.988331 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985884 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 16:38:49.988331 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985886 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 16:38:49.988331 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985888 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 16:38:49.988331 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985891 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 16:38:49.988331 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985893 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 16:38:49.988331 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985896 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 16:38:49.988331 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985898 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 16:38:49.988331 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985901 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 16:38:49.988331 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985903 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 16:38:49.988331 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985906 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 16:38:49.988331 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985908 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 16:38:49.988331 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985911 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 16:38:49.988331 ip-10-0-131-47 kubenswrapper[2578]: 
W0424 16:38:49.985913 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 16:38:49.988331 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.985916 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 16:38:49.988793 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.985921 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 16:38:49.988793 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986017 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 16:38:49.988793 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986021 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 16:38:49.988793 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986024 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 16:38:49.988793 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986027 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 16:38:49.988793 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986030 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 16:38:49.988793 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986032 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 16:38:49.988793 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986034 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 
16:38:49.988793 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986037 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 16:38:49.988793 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986040 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 16:38:49.988793 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986043 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 16:38:49.988793 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986046 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 16:38:49.988793 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986049 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 16:38:49.988793 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986051 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 16:38:49.988793 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986054 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 16:38:49.989157 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986057 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 16:38:49.989157 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986059 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 16:38:49.989157 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986063 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 16:38:49.989157 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986067 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 16:38:49.989157 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986069 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 16:38:49.989157 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986072 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 16:38:49.989157 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986075 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 16:38:49.989157 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986078 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 16:38:49.989157 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986080 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 16:38:49.989157 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986083 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 16:38:49.989157 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986085 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 16:38:49.989157 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986088 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 16:38:49.989157 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986090 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 16:38:49.989157 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986093 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 16:38:49.989157 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986095 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 16:38:49.989157 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986097 2578 feature_gate.go:328] 
unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 16:38:49.989157 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986100 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 16:38:49.989157 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986103 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 24 16:38:49.989157 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986105 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 16:38:49.989618 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986107 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 16:38:49.989618 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986110 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 16:38:49.989618 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986112 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 16:38:49.989618 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986115 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 16:38:49.989618 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986117 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 16:38:49.989618 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986120 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 16:38:49.989618 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986122 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 16:38:49.989618 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986125 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 16:38:49.989618 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986128 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 16:38:49.989618 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986131 2578 
feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 16:38:49.989618 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986134 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 16:38:49.989618 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986136 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 16:38:49.989618 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986139 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 16:38:49.989618 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986141 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 16:38:49.989618 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986144 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 16:38:49.989618 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986146 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 16:38:49.989618 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986149 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 16:38:49.989618 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986152 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 16:38:49.989618 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986154 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 16:38:49.989618 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986157 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 16:38:49.990098 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986159 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 16:38:49.990098 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986162 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 16:38:49.990098 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986164 
2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 16:38:49.990098 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986166 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 16:38:49.990098 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986169 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 16:38:49.990098 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986171 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 16:38:49.990098 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986173 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 16:38:49.990098 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986176 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 16:38:49.990098 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986178 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 16:38:49.990098 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986181 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 16:38:49.990098 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986183 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 16:38:49.990098 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986186 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 16:38:49.990098 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986188 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 16:38:49.990098 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986190 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 16:38:49.990098 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986193 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 16:38:49.990098 ip-10-0-131-47 
kubenswrapper[2578]: W0424 16:38:49.986195 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 16:38:49.990098 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986197 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 16:38:49.990098 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986200 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 16:38:49.990098 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986202 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 16:38:49.990098 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986205 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 16:38:49.990597 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986208 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 16:38:49.990597 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986210 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 16:38:49.990597 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986213 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 16:38:49.990597 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986216 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 16:38:49.990597 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986218 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 16:38:49.990597 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986221 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 16:38:49.990597 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986223 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 16:38:49.990597 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986225 2578 feature_gate.go:328] unrecognized feature 
gate: ImageModeStatusReporting Apr 24 16:38:49.990597 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986228 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 16:38:49.990597 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986231 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 16:38:49.990597 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986234 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 16:38:49.990597 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986236 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 16:38:49.990597 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:49.986239 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 16:38:49.990597 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.986244 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 16:38:49.990597 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.987018 2578 server.go:962] "Client rotation is on, will bootstrap in background" Apr 24 16:38:49.990983 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.989048 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 24 16:38:49.990983 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.989975 2578 server.go:1019] "Starting client certificate rotation" Apr 24 
16:38:49.990983 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.990089 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 16:38:49.990983 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:49.990693 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 16:38:50.016988 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.016964 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 16:38:50.022842 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.022819 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 16:38:50.040917 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.040894 2578 log.go:25] "Validated CRI v1 runtime API" Apr 24 16:38:50.046419 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.046401 2578 log.go:25] "Validated CRI v1 image API" Apr 24 16:38:50.047244 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.047213 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 16:38:50.047764 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.047738 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 24 16:38:50.055042 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.055021 2578 fs.go:135] Filesystem UUIDs: map[2a4b402a-8b85-4ea7-a775-c709e8e878c6:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 db410199-0528-4448-86df-a65d3d7a5d2f:/dev/nvme0n1p3] Apr 24 16:38:50.055133 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.055041 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 
minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 24 16:38:50.060202 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.060092 2578 manager.go:217] Machine: {Timestamp:2026-04-24 16:38:50.058798261 +0000 UTC m=+0.430502479 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3104708 MemoryCapacity:32812179456 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2fb61d08be4755defd2a60a0a9d960 SystemUUID:ec2fb61d-08be-4755-defd-2a60a0a9d960 BootID:2b85707e-5de9-413c-804c-9a7d3bfc4de2 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406089728 Type:vfs Inodes:4005393 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562439168 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:58:61:f7:05:3d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:58:61:f7:05:3d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:9e:47:e6:68:03:63 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812179456 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] 
Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 24 16:38:50.060202 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.060197 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 24 16:38:50.060329 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.060277 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 24 16:38:50.061883 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.061854 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 24 16:38:50.062024 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.061885 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-47.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 16:38:50.062102 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.062036 2578 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 16:38:50.062102 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.062045 2578 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 16:38:50.062102 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.062062 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 16:38:50.062745 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.062734 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 16:38:50.064380 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.064369 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 24 16:38:50.064660 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.064651 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 16:38:50.067052 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.067042 2578 kubelet.go:491] "Attempting to sync node with API server" Apr 24 16:38:50.067090 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.067056 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 16:38:50.067090 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.067068 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 16:38:50.067090 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.067078 2578 kubelet.go:397] "Adding apiserver pod source" Apr 24 16:38:50.067090 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.067086 2578 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 24 16:38:50.068126 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.068114 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 16:38:50.068173 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.068133 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 16:38:50.073686 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.073662 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 16:38:50.075919 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.075889 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 16:38:50.077248 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.077237 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 16:38:50.077314 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.077254 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 16:38:50.077314 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.077260 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 16:38:50.077314 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.077266 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 16:38:50.077314 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.077272 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 16:38:50.077314 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.077277 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 16:38:50.077314 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.077300 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 
16:38:50.077314 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.077306 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 16:38:50.077314 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.077314 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 16:38:50.077522 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.077320 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 16:38:50.077522 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.077332 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 16:38:50.077522 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.077343 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 16:38:50.079481 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.079469 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 16:38:50.079481 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.079481 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 16:38:50.080954 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:50.080923 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 16:38:50.081024 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:50.080977 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-47.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 16:38:50.083199 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.083186 2578 watchdog_linux.go:99] 
"Systemd watchdog is not enabled"
Apr 24 16:38:50.083251 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.083222 2578 server.go:1295] "Started kubelet"
Apr 24 16:38:50.083377 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.083339 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 24 16:38:50.083490 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.083366 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 24 16:38:50.083490 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.083432 2578 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 24 16:38:50.084133 ip-10-0-131-47 systemd[1]: Started Kubernetes Kubelet.
Apr 24 16:38:50.085171 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.084843 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 24 16:38:50.086970 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.086954 2578 server.go:317] "Adding debug handlers to kubelet server"
Apr 24 16:38:50.089920 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.089901 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 24 16:38:50.090018 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.089923 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 24 16:38:50.090491 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.090470 2578 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 24 16:38:50.090577 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.090495 2578 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 24 16:38:50.090626 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.090581 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 24 16:38:50.090671 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.090646 2578 reconstruct.go:97] "Volume reconstruction finished"
Apr 24 16:38:50.090671 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.090654 2578 reconciler.go:26] "Reconciler: start to sync state"
Apr 24 16:38:50.090769 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:50.090730 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-47.ec2.internal\" not found"
Apr 24 16:38:50.091175 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.091154 2578 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-47.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 24 16:38:50.091662 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.091623 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 24 16:38:50.091662 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.091651 2578 factory.go:55] Registering systemd factory
Apr 24 16:38:50.091662 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.091661 2578 factory.go:223] Registration of the systemd container factory successfully
Apr 24 16:38:50.091932 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:50.091000 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-47.ec2.internal.18a95867a12b0625 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-47.ec2.internal,UID:ip-10-0-131-47.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting
kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-131-47.ec2.internal,},FirstTimestamp:2026-04-24 16:38:50.083198501 +0000 UTC m=+0.454902718,LastTimestamp:2026-04-24 16:38:50.083198501 +0000 UTC m=+0.454902718,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-47.ec2.internal,}" Apr 24 16:38:50.092049 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.091933 2578 factory.go:153] Registering CRI-O factory Apr 24 16:38:50.092121 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.092064 2578 factory.go:223] Registration of the crio container factory successfully Apr 24 16:38:50.092121 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.092093 2578 factory.go:103] Registering Raw factory Apr 24 16:38:50.092211 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.092161 2578 manager.go:1196] Started watching for new ooms in manager Apr 24 16:38:50.092913 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.092856 2578 manager.go:319] Starting recovery of all containers Apr 24 16:38:50.093738 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:50.093691 2578 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 16:38:50.094644 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.094624 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wzrd9" Apr 24 16:38:50.096730 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:50.096703 2578 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-47.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 16:38:50.096880 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:50.096811 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 16:38:50.100535 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.100357 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wzrd9" Apr 24 16:38:50.105596 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.105563 2578 manager.go:324] Recovery completed Apr 24 16:38:50.109907 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.109892 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:38:50.112657 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.112641 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:38:50.112752 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.112674 2578 kubelet_node_status.go:736] "Recording event 
message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:38:50.112752 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.112689 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:38:50.113237 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.113221 2578 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 16:38:50.113237 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.113234 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 16:38:50.113343 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.113249 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 24 16:38:50.114748 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:50.114675 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-47.ec2.internal.18a95867a2ec8876 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-47.ec2.internal,UID:ip-10-0-131-47.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-131-47.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-131-47.ec2.internal,},FirstTimestamp:2026-04-24 16:38:50.112657526 +0000 UTC m=+0.484361749,LastTimestamp:2026-04-24 16:38:50.112657526 +0000 UTC m=+0.484361749,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-47.ec2.internal,}" Apr 24 16:38:50.115459 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.115446 2578 policy_none.go:49] "None policy: Start" Apr 24 16:38:50.115459 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.115462 2578 
memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 24 16:38:50.115552 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.115472 2578 state_mem.go:35] "Initializing new in-memory state store"
Apr 24 16:38:50.152936 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.152918 2578 manager.go:341] "Starting Device Plugin manager"
Apr 24 16:38:50.165072 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:50.152960 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 24 16:38:50.165072 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.152970 2578 server.go:85] "Starting device plugin registration server"
Apr 24 16:38:50.165072 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.153269 2578 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 24 16:38:50.165072 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.153338 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 24 16:38:50.165072 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.153432 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 24 16:38:50.165072 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.153522 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 24 16:38:50.165072 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.153531 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 24 16:38:50.165072 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:50.154021 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring."
err="non-existent label \"crio-containers\"" Apr 24 16:38:50.165072 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:50.154056 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-47.ec2.internal\" not found" Apr 24 16:38:50.231153 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.231060 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 16:38:50.232473 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.232456 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 16:38:50.232519 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.232487 2578 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 16:38:50.232519 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.232507 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 24 16:38:50.232519 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.232515 2578 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 16:38:50.232622 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:50.232553 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 16:38:50.235213 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.235183 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 16:38:50.254353 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.254323 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:38:50.255477 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.255431 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:38:50.255585 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.255501 2578 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:38:50.255585 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.255516 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:38:50.255585 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.255545 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-47.ec2.internal" Apr 24 16:38:50.265004 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.264978 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-47.ec2.internal" Apr 24 16:38:50.265004 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:50.265002 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-47.ec2.internal\": node \"ip-10-0-131-47.ec2.internal\" not found" Apr 24 16:38:50.286112 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:50.286086 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-47.ec2.internal\" not found" Apr 24 16:38:50.333158 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.333120 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-47.ec2.internal"] Apr 24 16:38:50.333336 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.333209 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:38:50.334881 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.334858 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:38:50.334994 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.334890 2578 kubelet_node_status.go:736] "Recording event message for 
node" node="ip-10-0-131-47.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:38:50.334994 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.334901 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:38:50.336250 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.336236 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:38:50.336454 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.336439 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal" Apr 24 16:38:50.336492 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.336471 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:38:50.337131 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.337113 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:38:50.337211 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.337136 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:38:50.337211 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.337145 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:38:50.337211 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.337151 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:38:50.337211 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.337174 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:38:50.337211 
ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.337196 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:38:50.338352 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.338338 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-47.ec2.internal" Apr 24 16:38:50.338415 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.338363 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:38:50.339062 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.339047 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:38:50.339141 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.339073 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:38:50.339141 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.339084 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:38:50.356869 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:50.356848 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-47.ec2.internal\" not found" node="ip-10-0-131-47.ec2.internal" Apr 24 16:38:50.360709 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:50.360695 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-47.ec2.internal\" not found" node="ip-10-0-131-47.ec2.internal" Apr 24 16:38:50.387175 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:50.387153 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-47.ec2.internal\" not found" Apr 
24 16:38:50.487617 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:50.487526 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-47.ec2.internal\" not found" Apr 24 16:38:50.491852 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.491834 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8b712f12e316b1a6ded9d349ca82d37a-config\") pod \"kube-apiserver-proxy-ip-10-0-131-47.ec2.internal\" (UID: \"8b712f12e316b1a6ded9d349ca82d37a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-47.ec2.internal" Apr 24 16:38:50.491914 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.491861 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/438108d93a37ab59c6a0c9e57eee327c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal\" (UID: \"438108d93a37ab59c6a0c9e57eee327c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal" Apr 24 16:38:50.491914 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.491879 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/438108d93a37ab59c6a0c9e57eee327c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal\" (UID: \"438108d93a37ab59c6a0c9e57eee327c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal" Apr 24 16:38:50.588292 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:50.588251 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-47.ec2.internal\" not found" Apr 24 16:38:50.592636 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.592616 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/host-path/8b712f12e316b1a6ded9d349ca82d37a-config\") pod \"kube-apiserver-proxy-ip-10-0-131-47.ec2.internal\" (UID: \"8b712f12e316b1a6ded9d349ca82d37a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-47.ec2.internal" Apr 24 16:38:50.592694 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.592660 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/438108d93a37ab59c6a0c9e57eee327c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal\" (UID: \"438108d93a37ab59c6a0c9e57eee327c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal" Apr 24 16:38:50.592694 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.592679 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/438108d93a37ab59c6a0c9e57eee327c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal\" (UID: \"438108d93a37ab59c6a0c9e57eee327c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal" Apr 24 16:38:50.592762 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.592706 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/438108d93a37ab59c6a0c9e57eee327c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal\" (UID: \"438108d93a37ab59c6a0c9e57eee327c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal" Apr 24 16:38:50.592762 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.592723 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8b712f12e316b1a6ded9d349ca82d37a-config\") pod \"kube-apiserver-proxy-ip-10-0-131-47.ec2.internal\" (UID: \"8b712f12e316b1a6ded9d349ca82d37a\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-131-47.ec2.internal"
Apr 24 16:38:50.592818 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.592723 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/438108d93a37ab59c6a0c9e57eee327c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal\" (UID: \"438108d93a37ab59c6a0c9e57eee327c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal"
Apr 24 16:38:50.658833 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.658785 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal"
Apr 24 16:38:50.662428 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.662412 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-47.ec2.internal"
Apr 24 16:38:50.688997 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:50.688945 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-47.ec2.internal\" not found"
Apr 24 16:38:50.789577 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:50.789476 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-47.ec2.internal\" not found"
Apr 24 16:38:50.890018 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:50.889972 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-47.ec2.internal\" not found"
Apr 24 16:38:50.990468 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:50.990417 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-47.ec2.internal\" not found"
Apr 24 16:38:50.990468 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.990459 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to
start using new credentials" Apr 24 16:38:50.991074 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:50.990603 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 16:38:51.090442 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:51.090367 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 16:38:51.090922 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:51.090898 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-47.ec2.internal\" not found" Apr 24 16:38:51.104772 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:51.104732 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 16:33:50 +0000 UTC" deadline="2028-01-11 07:55:43.364128718 +0000 UTC" Apr 24 16:38:51.104772 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:51.104766 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15039h16m52.259366092s" Apr 24 16:38:51.107781 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:51.107761 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 16:38:51.130536 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:51.130505 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-rxln9" Apr 24 16:38:51.137807 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:51.137793 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-rxln9" Apr 
24 16:38:51.191050 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:51.190996 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-47.ec2.internal\" not found" Apr 24 16:38:51.216650 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:51.216619 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b712f12e316b1a6ded9d349ca82d37a.slice/crio-778386dc24ea6c3dcbdd39efc8f5608cc0a6ee0a558944c2ed45e578b9869974 WatchSource:0}: Error finding container 778386dc24ea6c3dcbdd39efc8f5608cc0a6ee0a558944c2ed45e578b9869974: Status 404 returned error can't find the container with id 778386dc24ea6c3dcbdd39efc8f5608cc0a6ee0a558944c2ed45e578b9869974 Apr 24 16:38:51.217075 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:51.217053 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod438108d93a37ab59c6a0c9e57eee327c.slice/crio-0825a2a454ceb838852f94df060e1cbef9a7970bc1b963a86187b41194bbb961 WatchSource:0}: Error finding container 0825a2a454ceb838852f94df060e1cbef9a7970bc1b963a86187b41194bbb961: Status 404 returned error can't find the container with id 0825a2a454ceb838852f94df060e1cbef9a7970bc1b963a86187b41194bbb961 Apr 24 16:38:51.221906 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:51.221890 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 16:38:51.235619 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:51.235580 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal" event={"ID":"438108d93a37ab59c6a0c9e57eee327c","Type":"ContainerStarted","Data":"0825a2a454ceb838852f94df060e1cbef9a7970bc1b963a86187b41194bbb961"} Apr 24 16:38:51.236573 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:51.236542 2578 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-47.ec2.internal" event={"ID":"8b712f12e316b1a6ded9d349ca82d37a","Type":"ContainerStarted","Data":"778386dc24ea6c3dcbdd39efc8f5608cc0a6ee0a558944c2ed45e578b9869974"} Apr 24 16:38:51.291899 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:51.291865 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-47.ec2.internal\" not found" Apr 24 16:38:51.366511 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:51.366427 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 16:38:51.392547 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:51.392510 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-47.ec2.internal\" not found" Apr 24 16:38:51.492856 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:51.492827 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-47.ec2.internal\" not found" Apr 24 16:38:51.493608 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:51.493587 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 16:38:51.571414 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:51.571379 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 16:38:51.591196 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:51.591162 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal" Apr 24 16:38:51.605710 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:51.605684 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 16:38:51.607174 ip-10-0-131-47 kubenswrapper[2578]: I0424 
16:38:51.607157 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-47.ec2.internal" Apr 24 16:38:51.616309 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:51.616272 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 16:38:52.068388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.068349 2578 apiserver.go:52] "Watching apiserver" Apr 24 16:38:52.074168 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.074125 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 16:38:52.076214 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.076178 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-thz9k","openshift-network-diagnostics/network-check-target-hcjpg","openshift-network-operator/iptables-alerter-46rrs","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sxnh6","openshift-image-registry/node-ca-7m4xb","openshift-multus/multus-additional-cni-plugins-hgbpw","openshift-multus/multus-x2tv7","openshift-ovn-kubernetes/ovnkube-node-mgfp8","kube-system/global-pull-secret-syncer-rlzxl","kube-system/konnectivity-agent-dt6kn","kube-system/kube-apiserver-proxy-ip-10-0-131-47.ec2.internal","openshift-cluster-node-tuning-operator/tuned-bhmmj","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal"] Apr 24 16:38:52.078859 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.078835 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-x2tv7" Apr 24 16:38:52.080125 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.079862 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hcjpg" Apr 24 16:38:52.080125 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:52.079936 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hcjpg" podUID="7230245a-1622-4c39-9d99-ab2e06ac0daf" Apr 24 16:38:52.081100 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.081080 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 16:38:52.081244 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.081224 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sxnh6" Apr 24 16:38:52.081572 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.081555 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-tr4zg\"" Apr 24 16:38:52.081741 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.081726 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 16:38:52.081896 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.081881 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 16:38:52.082066 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.082051 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 16:38:52.082718 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.082696 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-46rrs" Apr 24 16:38:52.083464 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.083173 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 16:38:52.084745 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.084185 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 16:38:52.084745 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.084441 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 16:38:52.084745 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.084562 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-xqkrc\"" Apr 24 16:38:52.084745 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.084445 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 16:38:52.084929 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.084909 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-7m4xb" Apr 24 16:38:52.086068 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.085560 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 16:38:52.086068 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.085599 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 16:38:52.086068 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.085609 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-hpbqv\"" Apr 24 16:38:52.086068 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.085901 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 16:38:52.086827 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.086779 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 16:38:52.086827 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.086790 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hgbpw" Apr 24 16:38:52.087225 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.087207 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4h7z6\"" Apr 24 16:38:52.087342 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.087324 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 16:38:52.087562 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.087546 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 16:38:52.088300 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.088134 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-thz9k" Apr 24 16:38:52.088300 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:52.088189 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-thz9k" podUID="0f368d48-c79b-45b5-8879-9dac1c5cfe3f" Apr 24 16:38:52.088478 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.088300 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.090578 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.090216 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 16:38:52.090578 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.090239 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-q4njf\"" Apr 24 16:38:52.090578 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.090401 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 16:38:52.090578 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.090482 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 16:38:52.090818 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.090798 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 16:38:52.090878 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.090827 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 16:38:52.091112 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.091092 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 16:38:52.091112 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.091101 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-dzdl9\"" Apr 24 16:38:52.091260 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.091243 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 16:38:52.091421 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.091406 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 16:38:52.091461 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.091421 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-dt6kn" Apr 24 16:38:52.091678 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.091663 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bhmmj" Apr 24 16:38:52.093102 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.093078 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rlzxl" Apr 24 16:38:52.093205 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:52.093136 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-rlzxl" podUID="81d7a862-3288-4023-b0d6-2464e9278dac" Apr 24 16:38:52.093683 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.093500 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-mtr2z\"" Apr 24 16:38:52.093683 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.093511 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 16:38:52.093683 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.093594 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 16:38:52.093885 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.093830 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 16:38:52.093885 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.093842 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-7t26t\"" Apr 24 16:38:52.093967 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.093948 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 16:38:52.100920 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.100563 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/104dda0f-092c-4fa1-98cb-7e6dcc147db2-etc-tuned\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj" Apr 24 16:38:52.100920 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.100621 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-host-slash\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.100920 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.100661 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/705121d3-75ff-4f72-9362-f2f98bdb4bd4-host-slash\") pod \"iptables-alerter-46rrs\" (UID: \"705121d3-75ff-4f72-9362-f2f98bdb4bd4\") " pod="openshift-network-operator/iptables-alerter-46rrs" Apr 24 16:38:52.100920 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.100689 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4c3c8dac-2a32-4ed7-9c10-c067aed23653-registration-dir\") pod \"aws-ebs-csi-driver-node-sxnh6\" (UID: \"4c3c8dac-2a32-4ed7-9c10-c067aed23653\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sxnh6" Apr 24 16:38:52.100920 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.100721 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-host-cni-bin\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.100920 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.100750 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-ovnkube-config\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.101299 ip-10-0-131-47 
kubenswrapper[2578]: I0424 16:38:52.100794 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-hostroot\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7" Apr 24 16:38:52.101299 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.101051 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f368d48-c79b-45b5-8879-9dac1c5cfe3f-metrics-certs\") pod \"network-metrics-daemon-thz9k\" (UID: \"0f368d48-c79b-45b5-8879-9dac1c5cfe3f\") " pod="openshift-multus/network-metrics-daemon-thz9k" Apr 24 16:38:52.101299 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.101089 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-env-overrides\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.101299 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.101117 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/104dda0f-092c-4fa1-98cb-7e6dcc147db2-host\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj" Apr 24 16:38:52.101299 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.101156 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0c733890-1ac2-464e-9672-65bf90aded78-os-release\") pod \"multus-additional-cni-plugins-hgbpw\" (UID: \"0c733890-1ac2-464e-9672-65bf90aded78\") " 
pod="openshift-multus/multus-additional-cni-plugins-hgbpw" Apr 24 16:38:52.101527 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.101330 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk8k4\" (UniqueName: \"kubernetes.io/projected/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-kube-api-access-zk8k4\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.101527 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.101369 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/4c3c8dac-2a32-4ed7-9c10-c067aed23653-etc-selinux\") pod \"aws-ebs-csi-driver-node-sxnh6\" (UID: \"4c3c8dac-2a32-4ed7-9c10-c067aed23653\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sxnh6" Apr 24 16:38:52.101527 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.101432 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-run-systemd\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.101527 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.101477 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-run-openvswitch\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.101527 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.101511 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-host-run-ovn-kubernetes\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.101747 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.101538 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-host-cni-netd\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.101747 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.101567 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-multus-cni-dir\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7" Apr 24 16:38:52.101747 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.101597 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-host-run-netns\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7" Apr 24 16:38:52.101747 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.101629 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nwz7\" (UniqueName: \"kubernetes.io/projected/e8baf786-1fb8-494a-bdb7-c724c853faa3-kube-api-access-6nwz7\") pod \"node-ca-7m4xb\" (UID: \"e8baf786-1fb8-494a-bdb7-c724c853faa3\") " pod="openshift-image-registry/node-ca-7m4xb" Apr 24 16:38:52.101922 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.101783 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0c733890-1ac2-464e-9672-65bf90aded78-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hgbpw\" (UID: \"0c733890-1ac2-464e-9672-65bf90aded78\") " pod="openshift-multus/multus-additional-cni-plugins-hgbpw" Apr 24 16:38:52.101922 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.101814 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-etc-openvswitch\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.101922 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.101901 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-run-ovn\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.102045 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.101950 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/104dda0f-092c-4fa1-98cb-7e6dcc147db2-var-lib-kubelet\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj" Apr 24 16:38:52.102045 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.101989 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h47hs\" (UniqueName: \"kubernetes.io/projected/4c3c8dac-2a32-4ed7-9c10-c067aed23653-kube-api-access-h47hs\") pod \"aws-ebs-csi-driver-node-sxnh6\" (UID: \"4c3c8dac-2a32-4ed7-9c10-c067aed23653\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sxnh6" Apr 24 16:38:52.102045 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.102008 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/104dda0f-092c-4fa1-98cb-7e6dcc147db2-etc-modprobe-d\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj" Apr 24 16:38:52.102170 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.102057 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-ovnkube-script-lib\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.102170 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.102092 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/4c3c8dac-2a32-4ed7-9c10-c067aed23653-sys-fs\") pod \"aws-ebs-csi-driver-node-sxnh6\" (UID: \"4c3c8dac-2a32-4ed7-9c10-c067aed23653\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sxnh6" Apr 24 16:38:52.102170 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.102122 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/46e943b2-628a-486a-adb0-7bb92be03a03-konnectivity-ca\") pod \"konnectivity-agent-dt6kn\" (UID: \"46e943b2-628a-486a-adb0-7bb92be03a03\") " pod="kube-system/konnectivity-agent-dt6kn" Apr 24 16:38:52.102306 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.102188 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-log-socket\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.102306 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.102222 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-os-release\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7" Apr 24 16:38:52.102425 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.102300 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-host-var-lib-cni-multus\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7" Apr 24 16:38:52.102425 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.102342 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/33870c17-d8aa-426a-9bde-7d0a1e04404a-multus-daemon-config\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7" Apr 24 16:38:52.102425 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.102386 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4c3c8dac-2a32-4ed7-9c10-c067aed23653-socket-dir\") pod \"aws-ebs-csi-driver-node-sxnh6\" (UID: \"4c3c8dac-2a32-4ed7-9c10-c067aed23653\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sxnh6" Apr 24 16:38:52.102549 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.102436 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/4c3c8dac-2a32-4ed7-9c10-c067aed23653-device-dir\") pod \"aws-ebs-csi-driver-node-sxnh6\" (UID: \"4c3c8dac-2a32-4ed7-9c10-c067aed23653\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sxnh6" Apr 24 16:38:52.102549 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.102486 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/104dda0f-092c-4fa1-98cb-7e6dcc147db2-etc-sysctl-d\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj" Apr 24 16:38:52.102549 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.102506 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl9zr\" (UniqueName: \"kubernetes.io/projected/104dda0f-092c-4fa1-98cb-7e6dcc147db2-kube-api-access-dl9zr\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj" Apr 24 16:38:52.102549 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.102544 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2r84\" (UniqueName: \"kubernetes.io/projected/7230245a-1622-4c39-9d99-ab2e06ac0daf-kube-api-access-f2r84\") pod \"network-check-target-hcjpg\" (UID: \"7230245a-1622-4c39-9d99-ab2e06ac0daf\") " pod="openshift-network-diagnostics/network-check-target-hcjpg" Apr 24 16:38:52.102741 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.102582 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-var-lib-openvswitch\") pod \"ovnkube-node-mgfp8\" (UID: 
\"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.102741 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.102607 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-host-var-lib-cni-bin\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7" Apr 24 16:38:52.102741 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.102670 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-etc-kubernetes\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7" Apr 24 16:38:52.102741 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.102707 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/104dda0f-092c-4fa1-98cb-7e6dcc147db2-etc-sysconfig\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj" Apr 24 16:38:52.102920 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.102741 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-node-log\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.103310 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.103020 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/705121d3-75ff-4f72-9362-f2f98bdb4bd4-iptables-alerter-script\") pod \"iptables-alerter-46rrs\" (UID: \"705121d3-75ff-4f72-9362-f2f98bdb4bd4\") " pod="openshift-network-operator/iptables-alerter-46rrs" Apr 24 16:38:52.103310 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.103061 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0c733890-1ac2-464e-9672-65bf90aded78-system-cni-dir\") pod \"multus-additional-cni-plugins-hgbpw\" (UID: \"0c733890-1ac2-464e-9672-65bf90aded78\") " pod="openshift-multus/multus-additional-cni-plugins-hgbpw" Apr 24 16:38:52.103310 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.103087 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0c733890-1ac2-464e-9672-65bf90aded78-cnibin\") pod \"multus-additional-cni-plugins-hgbpw\" (UID: \"0c733890-1ac2-464e-9672-65bf90aded78\") " pod="openshift-multus/multus-additional-cni-plugins-hgbpw" Apr 24 16:38:52.103310 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.103126 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-systemd-units\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.103310 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.103175 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/104dda0f-092c-4fa1-98cb-7e6dcc147db2-etc-kubernetes\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj" Apr 24 16:38:52.103310 ip-10-0-131-47 
kubenswrapper[2578]: I0424 16:38:52.103202 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e8baf786-1fb8-494a-bdb7-c724c853faa3-host\") pod \"node-ca-7m4xb\" (UID: \"e8baf786-1fb8-494a-bdb7-c724c853faa3\") " pod="openshift-image-registry/node-ca-7m4xb" Apr 24 16:38:52.103310 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.103228 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e8baf786-1fb8-494a-bdb7-c724c853faa3-serviceca\") pod \"node-ca-7m4xb\" (UID: \"e8baf786-1fb8-494a-bdb7-c724c853faa3\") " pod="openshift-image-registry/node-ca-7m4xb" Apr 24 16:38:52.103310 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.103260 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0c733890-1ac2-464e-9672-65bf90aded78-cni-binary-copy\") pod \"multus-additional-cni-plugins-hgbpw\" (UID: \"0c733890-1ac2-464e-9672-65bf90aded78\") " pod="openshift-multus/multus-additional-cni-plugins-hgbpw" Apr 24 16:38:52.103703 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.103325 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0c733890-1ac2-464e-9672-65bf90aded78-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hgbpw\" (UID: \"0c733890-1ac2-464e-9672-65bf90aded78\") " pod="openshift-multus/multus-additional-cni-plugins-hgbpw" Apr 24 16:38:52.103703 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.103418 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-host-kubelet\") pod \"ovnkube-node-mgfp8\" (UID: 
\"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.103703 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.103462 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbjlg\" (UniqueName: \"kubernetes.io/projected/0c733890-1ac2-464e-9672-65bf90aded78-kube-api-access-qbjlg\") pod \"multus-additional-cni-plugins-hgbpw\" (UID: \"0c733890-1ac2-464e-9672-65bf90aded78\") " pod="openshift-multus/multus-additional-cni-plugins-hgbpw" Apr 24 16:38:52.103703 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.103500 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-system-cni-dir\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7" Apr 24 16:38:52.103703 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.103533 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-cnibin\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7" Apr 24 16:38:52.103703 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.103567 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-host-run-k8s-cni-cncf-io\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7" Apr 24 16:38:52.103703 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.103598 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-multus-conf-dir\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7" Apr 24 16:38:52.103703 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.103629 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2k88\" (UniqueName: \"kubernetes.io/projected/33870c17-d8aa-426a-9bde-7d0a1e04404a-kube-api-access-m2k88\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7" Apr 24 16:38:52.103703 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.103664 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/104dda0f-092c-4fa1-98cb-7e6dcc147db2-tmp\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj" Apr 24 16:38:52.103703 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.103694 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.104145 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.103751 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-ovn-node-metrics-cert\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.104145 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.103796 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-host-var-lib-kubelet\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7" Apr 24 16:38:52.104145 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.103827 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/104dda0f-092c-4fa1-98cb-7e6dcc147db2-sys\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj" Apr 24 16:38:52.104145 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.103872 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/104dda0f-092c-4fa1-98cb-7e6dcc147db2-lib-modules\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj" Apr 24 16:38:52.104145 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.103906 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x45br\" (UniqueName: \"kubernetes.io/projected/0f368d48-c79b-45b5-8879-9dac1c5cfe3f-kube-api-access-x45br\") pod \"network-metrics-daemon-thz9k\" (UID: \"0f368d48-c79b-45b5-8879-9dac1c5cfe3f\") " pod="openshift-multus/network-metrics-daemon-thz9k" Apr 24 16:38:52.104145 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.103945 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/46e943b2-628a-486a-adb0-7bb92be03a03-agent-certs\") pod \"konnectivity-agent-dt6kn\" (UID: \"46e943b2-628a-486a-adb0-7bb92be03a03\") " pod="kube-system/konnectivity-agent-dt6kn" 
Apr 24 16:38:52.104145 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.103977 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/104dda0f-092c-4fa1-98cb-7e6dcc147db2-etc-sysctl-conf\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj" Apr 24 16:38:52.104145 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.104009 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/33870c17-d8aa-426a-9bde-7d0a1e04404a-cni-binary-copy\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7" Apr 24 16:38:52.104145 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.104038 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-multus-socket-dir-parent\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7" Apr 24 16:38:52.104145 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.104098 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/104dda0f-092c-4fa1-98cb-7e6dcc147db2-run\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj" Apr 24 16:38:52.104635 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.104141 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0c733890-1ac2-464e-9672-65bf90aded78-whereabouts-flatfile-configmap\") pod 
\"multus-additional-cni-plugins-hgbpw\" (UID: \"0c733890-1ac2-464e-9672-65bf90aded78\") " pod="openshift-multus/multus-additional-cni-plugins-hgbpw" Apr 24 16:38:52.104635 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.104259 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dq7f\" (UniqueName: \"kubernetes.io/projected/705121d3-75ff-4f72-9362-f2f98bdb4bd4-kube-api-access-7dq7f\") pod \"iptables-alerter-46rrs\" (UID: \"705121d3-75ff-4f72-9362-f2f98bdb4bd4\") " pod="openshift-network-operator/iptables-alerter-46rrs" Apr 24 16:38:52.104635 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.104367 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4c3c8dac-2a32-4ed7-9c10-c067aed23653-kubelet-dir\") pod \"aws-ebs-csi-driver-node-sxnh6\" (UID: \"4c3c8dac-2a32-4ed7-9c10-c067aed23653\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sxnh6" Apr 24 16:38:52.104635 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.104407 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-host-run-netns\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.104635 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.104444 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-host-run-multus-certs\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7" Apr 24 16:38:52.104635 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.104478 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/104dda0f-092c-4fa1-98cb-7e6dcc147db2-etc-systemd\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj" Apr 24 16:38:52.139017 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.138977 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 16:33:51 +0000 UTC" deadline="2027-10-23 11:00:43.559638617 +0000 UTC" Apr 24 16:38:52.139141 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.139025 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13122h21m51.420617398s" Apr 24 16:38:52.191767 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.191738 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 16:38:52.204738 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.204702 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/4c3c8dac-2a32-4ed7-9c10-c067aed23653-etc-selinux\") pod \"aws-ebs-csi-driver-node-sxnh6\" (UID: \"4c3c8dac-2a32-4ed7-9c10-c067aed23653\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sxnh6" Apr 24 16:38:52.204738 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.204743 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-run-systemd\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.204941 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.204766 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-run-openvswitch\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.204941 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.204791 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-host-run-ovn-kubernetes\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.204941 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.204816 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-host-cni-netd\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.204941 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.204865 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-multus-cni-dir\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7" Apr 24 16:38:52.204941 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.204875 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-host-run-ovn-kubernetes\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.204941 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.204897 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/4c3c8dac-2a32-4ed7-9c10-c067aed23653-etc-selinux\") pod \"aws-ebs-csi-driver-node-sxnh6\" (UID: \"4c3c8dac-2a32-4ed7-9c10-c067aed23653\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sxnh6" Apr 24 16:38:52.204941 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.204913 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-run-systemd\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.204941 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.204911 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-host-run-netns\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7" Apr 24 16:38:52.204941 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.204933 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-host-run-netns\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7" Apr 24 16:38:52.205371 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.204949 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-multus-cni-dir\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7" Apr 24 16:38:52.205371 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.204962 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-run-openvswitch\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.205371 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.204964 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-host-cni-netd\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.205371 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.204992 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6nwz7\" (UniqueName: \"kubernetes.io/projected/e8baf786-1fb8-494a-bdb7-c724c853faa3-kube-api-access-6nwz7\") pod \"node-ca-7m4xb\" (UID: \"e8baf786-1fb8-494a-bdb7-c724c853faa3\") " pod="openshift-image-registry/node-ca-7m4xb" Apr 24 16:38:52.205371 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205038 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0c733890-1ac2-464e-9672-65bf90aded78-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hgbpw\" (UID: \"0c733890-1ac2-464e-9672-65bf90aded78\") " pod="openshift-multus/multus-additional-cni-plugins-hgbpw" Apr 24 16:38:52.205371 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205063 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-etc-openvswitch\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.205371 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205087 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-run-ovn\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.205371 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205121 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/104dda0f-092c-4fa1-98cb-7e6dcc147db2-var-lib-kubelet\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj" Apr 24 16:38:52.205371 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205142 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h47hs\" (UniqueName: \"kubernetes.io/projected/4c3c8dac-2a32-4ed7-9c10-c067aed23653-kube-api-access-h47hs\") pod \"aws-ebs-csi-driver-node-sxnh6\" (UID: \"4c3c8dac-2a32-4ed7-9c10-c067aed23653\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sxnh6" Apr 24 16:38:52.205371 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205159 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/104dda0f-092c-4fa1-98cb-7e6dcc147db2-etc-modprobe-d\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj" Apr 24 16:38:52.205371 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205175 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-ovnkube-script-lib\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.205371 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205175 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-run-ovn\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.205371 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205186 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0c733890-1ac2-464e-9672-65bf90aded78-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hgbpw\" (UID: \"0c733890-1ac2-464e-9672-65bf90aded78\") " pod="openshift-multus/multus-additional-cni-plugins-hgbpw" Apr 24 16:38:52.205371 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205228 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-etc-openvswitch\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.205371 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205233 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/4c3c8dac-2a32-4ed7-9c10-c067aed23653-sys-fs\") pod \"aws-ebs-csi-driver-node-sxnh6\" (UID: \"4c3c8dac-2a32-4ed7-9c10-c067aed23653\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sxnh6" Apr 24 16:38:52.205371 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205262 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/46e943b2-628a-486a-adb0-7bb92be03a03-konnectivity-ca\") pod \"konnectivity-agent-dt6kn\" (UID: \"46e943b2-628a-486a-adb0-7bb92be03a03\") " pod="kube-system/konnectivity-agent-dt6kn" Apr 24 16:38:52.205371 ip-10-0-131-47 kubenswrapper[2578]: I0424 
16:38:52.205298 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/104dda0f-092c-4fa1-98cb-7e6dcc147db2-var-lib-kubelet\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj"
Apr 24 16:38:52.206180 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205309 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-log-socket\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8"
Apr 24 16:38:52.206180 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205333 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-os-release\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7"
Apr 24 16:38:52.206180 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205358 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-host-var-lib-cni-multus\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7"
Apr 24 16:38:52.206180 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205390 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/33870c17-d8aa-426a-9bde-7d0a1e04404a-multus-daemon-config\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7"
Apr 24 16:38:52.206180 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205409 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4c3c8dac-2a32-4ed7-9c10-c067aed23653-socket-dir\") pod \"aws-ebs-csi-driver-node-sxnh6\" (UID: \"4c3c8dac-2a32-4ed7-9c10-c067aed23653\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sxnh6"
Apr 24 16:38:52.206180 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205419 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/104dda0f-092c-4fa1-98cb-7e6dcc147db2-etc-modprobe-d\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj"
Apr 24 16:38:52.206180 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205430 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/4c3c8dac-2a32-4ed7-9c10-c067aed23653-device-dir\") pod \"aws-ebs-csi-driver-node-sxnh6\" (UID: \"4c3c8dac-2a32-4ed7-9c10-c067aed23653\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sxnh6"
Apr 24 16:38:52.206180 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205440 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-log-socket\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8"
Apr 24 16:38:52.206180 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205460 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-host-var-lib-cni-multus\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7"
Apr 24 16:38:52.206180 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205451 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/104dda0f-092c-4fa1-98cb-7e6dcc147db2-etc-sysctl-d\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj"
Apr 24 16:38:52.206180 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205503 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dl9zr\" (UniqueName: \"kubernetes.io/projected/104dda0f-092c-4fa1-98cb-7e6dcc147db2-kube-api-access-dl9zr\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj"
Apr 24 16:38:52.206180 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205530 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f2r84\" (UniqueName: \"kubernetes.io/projected/7230245a-1622-4c39-9d99-ab2e06ac0daf-kube-api-access-f2r84\") pod \"network-check-target-hcjpg\" (UID: \"7230245a-1622-4c39-9d99-ab2e06ac0daf\") " pod="openshift-network-diagnostics/network-check-target-hcjpg"
Apr 24 16:38:52.206180 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205550 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/4c3c8dac-2a32-4ed7-9c10-c067aed23653-sys-fs\") pod \"aws-ebs-csi-driver-node-sxnh6\" (UID: \"4c3c8dac-2a32-4ed7-9c10-c067aed23653\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sxnh6"
Apr 24 16:38:52.206180 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205556 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-var-lib-openvswitch\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8"
Apr 24 16:38:52.206180 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205573 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/104dda0f-092c-4fa1-98cb-7e6dcc147db2-etc-sysctl-d\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj"
Apr 24 16:38:52.206180 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205584 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-host-var-lib-cni-bin\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7"
Apr 24 16:38:52.206180 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205608 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-etc-kubernetes\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7"
Apr 24 16:38:52.206960 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205631 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/104dda0f-092c-4fa1-98cb-7e6dcc147db2-etc-sysconfig\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj"
Apr 24 16:38:52.206960 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205654 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-node-log\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8"
Apr 24 16:38:52.206960 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205678 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/705121d3-75ff-4f72-9362-f2f98bdb4bd4-iptables-alerter-script\") pod \"iptables-alerter-46rrs\" (UID: \"705121d3-75ff-4f72-9362-f2f98bdb4bd4\") " pod="openshift-network-operator/iptables-alerter-46rrs"
Apr 24 16:38:52.206960 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205695 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4c3c8dac-2a32-4ed7-9c10-c067aed23653-socket-dir\") pod \"aws-ebs-csi-driver-node-sxnh6\" (UID: \"4c3c8dac-2a32-4ed7-9c10-c067aed23653\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sxnh6"
Apr 24 16:38:52.206960 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205705 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0c733890-1ac2-464e-9672-65bf90aded78-system-cni-dir\") pod \"multus-additional-cni-plugins-hgbpw\" (UID: \"0c733890-1ac2-464e-9672-65bf90aded78\") " pod="openshift-multus/multus-additional-cni-plugins-hgbpw"
Apr 24 16:38:52.206960 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205741 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0c733890-1ac2-464e-9672-65bf90aded78-cnibin\") pod \"multus-additional-cni-plugins-hgbpw\" (UID: \"0c733890-1ac2-464e-9672-65bf90aded78\") " pod="openshift-multus/multus-additional-cni-plugins-hgbpw"
Apr 24 16:38:52.206960 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205750 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0c733890-1ac2-464e-9672-65bf90aded78-system-cni-dir\") pod \"multus-additional-cni-plugins-hgbpw\" (UID: \"0c733890-1ac2-464e-9672-65bf90aded78\") " pod="openshift-multus/multus-additional-cni-plugins-hgbpw"
Apr 24 16:38:52.206960 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205803 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-os-release\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7"
Apr 24 16:38:52.206960 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205841 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/4c3c8dac-2a32-4ed7-9c10-c067aed23653-device-dir\") pod \"aws-ebs-csi-driver-node-sxnh6\" (UID: \"4c3c8dac-2a32-4ed7-9c10-c067aed23653\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sxnh6"
Apr 24 16:38:52.206960 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205863 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0c733890-1ac2-464e-9672-65bf90aded78-cnibin\") pod \"multus-additional-cni-plugins-hgbpw\" (UID: \"0c733890-1ac2-464e-9672-65bf90aded78\") " pod="openshift-multus/multus-additional-cni-plugins-hgbpw"
Apr 24 16:38:52.206960 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205964 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-ovnkube-script-lib\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8"
Apr 24 16:38:52.206960 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.205974 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-var-lib-openvswitch\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8"
Apr 24 16:38:52.206960 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206017 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-host-var-lib-cni-bin\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7"
Apr 24 16:38:52.206960 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206088 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-systemd-units\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8"
Apr 24 16:38:52.206960 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206124 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-systemd-units\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8"
Apr 24 16:38:52.206960 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206150 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-etc-kubernetes\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7"
Apr 24 16:38:52.206960 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206184 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/104dda0f-092c-4fa1-98cb-7e6dcc147db2-etc-kubernetes\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj"
Apr 24 16:38:52.207710 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206125 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/104dda0f-092c-4fa1-98cb-7e6dcc147db2-etc-kubernetes\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj"
Apr 24 16:38:52.207710 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206184 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/104dda0f-092c-4fa1-98cb-7e6dcc147db2-etc-sysconfig\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj"
Apr 24 16:38:52.207710 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206221 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e8baf786-1fb8-494a-bdb7-c724c853faa3-host\") pod \"node-ca-7m4xb\" (UID: \"e8baf786-1fb8-494a-bdb7-c724c853faa3\") " pod="openshift-image-registry/node-ca-7m4xb"
Apr 24 16:38:52.207710 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206232 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-node-log\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8"
Apr 24 16:38:52.207710 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206257 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/33870c17-d8aa-426a-9bde-7d0a1e04404a-multus-daemon-config\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7"
Apr 24 16:38:52.207710 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206273 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e8baf786-1fb8-494a-bdb7-c724c853faa3-serviceca\") pod \"node-ca-7m4xb\" (UID: \"e8baf786-1fb8-494a-bdb7-c724c853faa3\") " pod="openshift-image-registry/node-ca-7m4xb"
Apr 24 16:38:52.207710 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206319 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0c733890-1ac2-464e-9672-65bf90aded78-cni-binary-copy\") pod \"multus-additional-cni-plugins-hgbpw\" (UID: \"0c733890-1ac2-464e-9672-65bf90aded78\") " pod="openshift-multus/multus-additional-cni-plugins-hgbpw"
Apr 24 16:38:52.207710 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206391 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0c733890-1ac2-464e-9672-65bf90aded78-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hgbpw\" (UID: \"0c733890-1ac2-464e-9672-65bf90aded78\") " pod="openshift-multus/multus-additional-cni-plugins-hgbpw"
Apr 24 16:38:52.207710 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206428 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-host-kubelet\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8"
Apr 24 16:38:52.207710 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206431 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/46e943b2-628a-486a-adb0-7bb92be03a03-konnectivity-ca\") pod \"konnectivity-agent-dt6kn\" (UID: \"46e943b2-628a-486a-adb0-7bb92be03a03\") " pod="kube-system/konnectivity-agent-dt6kn"
Apr 24 16:38:52.207710 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206463 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e8baf786-1fb8-494a-bdb7-c724c853faa3-host\") pod \"node-ca-7m4xb\" (UID: \"e8baf786-1fb8-494a-bdb7-c724c853faa3\") " pod="openshift-image-registry/node-ca-7m4xb"
Apr 24 16:38:52.207710 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206511 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/81d7a862-3288-4023-b0d6-2464e9278dac-kubelet-config\") pod \"global-pull-secret-syncer-rlzxl\" (UID: \"81d7a862-3288-4023-b0d6-2464e9278dac\") " pod="kube-system/global-pull-secret-syncer-rlzxl"
Apr 24 16:38:52.207710 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206542 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbjlg\" (UniqueName: \"kubernetes.io/projected/0c733890-1ac2-464e-9672-65bf90aded78-kube-api-access-qbjlg\") pod \"multus-additional-cni-plugins-hgbpw\" (UID: \"0c733890-1ac2-464e-9672-65bf90aded78\") " pod="openshift-multus/multus-additional-cni-plugins-hgbpw"
Apr 24 16:38:52.207710 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206568 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-system-cni-dir\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7"
Apr 24 16:38:52.207710 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206610 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-cnibin\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7"
Apr 24 16:38:52.207710 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206645 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-host-run-k8s-cni-cncf-io\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7"
Apr 24 16:38:52.207710 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206669 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-multus-conf-dir\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7"
Apr 24 16:38:52.207710 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206686 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2k88\" (UniqueName: \"kubernetes.io/projected/33870c17-d8aa-426a-9bde-7d0a1e04404a-kube-api-access-m2k88\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7"
Apr 24 16:38:52.208458 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206690 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/705121d3-75ff-4f72-9362-f2f98bdb4bd4-iptables-alerter-script\") pod \"iptables-alerter-46rrs\" (UID: \"705121d3-75ff-4f72-9362-f2f98bdb4bd4\") " pod="openshift-network-operator/iptables-alerter-46rrs"
Apr 24 16:38:52.208458 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206701 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/104dda0f-092c-4fa1-98cb-7e6dcc147db2-tmp\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj"
Apr 24 16:38:52.208458 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206739 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8"
Apr 24 16:38:52.208458 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206811 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-ovn-node-metrics-cert\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8"
Apr 24 16:38:52.208458 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206835 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-host-var-lib-kubelet\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7"
Apr 24 16:38:52.208458 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206837 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0c733890-1ac2-464e-9672-65bf90aded78-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hgbpw\" (UID: \"0c733890-1ac2-464e-9672-65bf90aded78\") " pod="openshift-multus/multus-additional-cni-plugins-hgbpw"
Apr 24 16:38:52.208458 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206850 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/104dda0f-092c-4fa1-98cb-7e6dcc147db2-sys\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj"
Apr 24 16:38:52.208458 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206858 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e8baf786-1fb8-494a-bdb7-c724c853faa3-serviceca\") pod \"node-ca-7m4xb\" (UID: \"e8baf786-1fb8-494a-bdb7-c724c853faa3\") " pod="openshift-image-registry/node-ca-7m4xb"
Apr 24 16:38:52.208458 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206866 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/104dda0f-092c-4fa1-98cb-7e6dcc147db2-lib-modules\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj"
Apr 24 16:38:52.208458 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206884 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-cnibin\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7"
Apr 24 16:38:52.208458 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206908 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x45br\" (UniqueName: \"kubernetes.io/projected/0f368d48-c79b-45b5-8879-9dac1c5cfe3f-kube-api-access-x45br\") pod \"network-metrics-daemon-thz9k\" (UID: \"0f368d48-c79b-45b5-8879-9dac1c5cfe3f\") " pod="openshift-multus/network-metrics-daemon-thz9k"
Apr 24 16:38:52.208458 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206938 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/46e943b2-628a-486a-adb0-7bb92be03a03-agent-certs\") pod \"konnectivity-agent-dt6kn\" (UID: \"46e943b2-628a-486a-adb0-7bb92be03a03\") " pod="kube-system/konnectivity-agent-dt6kn"
Apr 24 16:38:52.208458 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206942 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/104dda0f-092c-4fa1-98cb-7e6dcc147db2-lib-modules\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj"
Apr 24 16:38:52.208458 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206912 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-system-cni-dir\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7"
Apr 24 16:38:52.208458 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.206992 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8"
Apr 24 16:38:52.208458 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.207014 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/104dda0f-092c-4fa1-98cb-7e6dcc147db2-etc-sysctl-conf\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj"
Apr 24 16:38:52.208458 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.207031 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/33870c17-d8aa-426a-9bde-7d0a1e04404a-cni-binary-copy\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7"
Apr 24 16:38:52.209180 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.207048 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-multus-socket-dir-parent\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7"
Apr 24 16:38:52.209180 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.207075 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/104dda0f-092c-4fa1-98cb-7e6dcc147db2-run\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj"
Apr 24 16:38:52.209180 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.207109 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0c733890-1ac2-464e-9672-65bf90aded78-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hgbpw\" (UID: \"0c733890-1ac2-464e-9672-65bf90aded78\") " pod="openshift-multus/multus-additional-cni-plugins-hgbpw"
Apr 24 16:38:52.209180 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.207134 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7dq7f\" (UniqueName: \"kubernetes.io/projected/705121d3-75ff-4f72-9362-f2f98bdb4bd4-kube-api-access-7dq7f\") pod \"iptables-alerter-46rrs\" (UID: \"705121d3-75ff-4f72-9362-f2f98bdb4bd4\") " pod="openshift-network-operator/iptables-alerter-46rrs"
Apr 24 16:38:52.209180 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.207160 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/81d7a862-3288-4023-b0d6-2464e9278dac-dbus\") pod \"global-pull-secret-syncer-rlzxl\" (UID: \"81d7a862-3288-4023-b0d6-2464e9278dac\") " pod="kube-system/global-pull-secret-syncer-rlzxl"
Apr 24 16:38:52.209180 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.207168 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/104dda0f-092c-4fa1-98cb-7e6dcc147db2-sys\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj"
Apr 24 16:38:52.209180 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.207181 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/81d7a862-3288-4023-b0d6-2464e9278dac-original-pull-secret\") pod \"global-pull-secret-syncer-rlzxl\" (UID: \"81d7a862-3288-4023-b0d6-2464e9278dac\") " pod="kube-system/global-pull-secret-syncer-rlzxl"
Apr 24 16:38:52.209180 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.207215 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4c3c8dac-2a32-4ed7-9c10-c067aed23653-kubelet-dir\") pod \"aws-ebs-csi-driver-node-sxnh6\" (UID: \"4c3c8dac-2a32-4ed7-9c10-c067aed23653\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sxnh6"
Apr 24 16:38:52.209180 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.207223 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0c733890-1ac2-464e-9672-65bf90aded78-cni-binary-copy\") pod \"multus-additional-cni-plugins-hgbpw\" (UID: \"0c733890-1ac2-464e-9672-65bf90aded78\") " pod="openshift-multus/multus-additional-cni-plugins-hgbpw"
Apr 24 16:38:52.209180 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.207310 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-host-run-netns\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8"
Apr 24 16:38:52.209180 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.207328 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 24 16:38:52.209180 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.207364 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-host-run-multus-certs\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7"
Apr 24 16:38:52.209180 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.207361 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/104dda0f-092c-4fa1-98cb-7e6dcc147db2-run\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj"
Apr 24 16:38:52.209180 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.207394 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4c3c8dac-2a32-4ed7-9c10-c067aed23653-kubelet-dir\") pod \"aws-ebs-csi-driver-node-sxnh6\" (UID: \"4c3c8dac-2a32-4ed7-9c10-c067aed23653\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sxnh6"
Apr 24 16:38:52.209180 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.207419 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-host-run-netns\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8"
Apr 24 16:38:52.209180 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.207588 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/104dda0f-092c-4fa1-98cb-7e6dcc147db2-etc-sysctl-conf\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj"
Apr 24 16:38:52.209180 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.207787 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0c733890-1ac2-464e-9672-65bf90aded78-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hgbpw\" (UID: \"0c733890-1ac2-464e-9672-65bf90aded78\") " pod="openshift-multus/multus-additional-cni-plugins-hgbpw"
Apr 24 16:38:52.209819 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.208034 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-multus-conf-dir\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7"
Apr 24 16:38:52.209819 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.208054 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/33870c17-d8aa-426a-9bde-7d0a1e04404a-cni-binary-copy\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7"
Apr 24 16:38:52.209819 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.208089 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-multus-socket-dir-parent\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7"
Apr 24 16:38:52.209819 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.207341 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-host-run-multus-certs\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7"
Apr 24 16:38:52.209819 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.208107 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-host-kubelet\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8"
Apr 24 16:38:52.209819 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.208135 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/104dda0f-092c-4fa1-98cb-7e6dcc147db2-etc-systemd\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj"
Apr 24 16:38:52.209819 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.208157 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-host-run-k8s-cni-cncf-io\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7"
Apr 24 16:38:52.209819 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.208161 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/104dda0f-092c-4fa1-98cb-7e6dcc147db2-etc-tuned\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj"
Apr 24 16:38:52.209819 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.208198 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-host-slash\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8"
Apr 24 16:38:52.209819 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.208241 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/104dda0f-092c-4fa1-98cb-7e6dcc147db2-etc-systemd\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj"
Apr 24 16:38:52.209819 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.208273 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/705121d3-75ff-4f72-9362-f2f98bdb4bd4-host-slash\") pod \"iptables-alerter-46rrs\" (UID: \"705121d3-75ff-4f72-9362-f2f98bdb4bd4\") " pod="openshift-network-operator/iptables-alerter-46rrs"
Apr 24 16:38:52.209819 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.208318 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4c3c8dac-2a32-4ed7-9c10-c067aed23653-registration-dir\") pod \"aws-ebs-csi-driver-node-sxnh6\" (UID: \"4c3c8dac-2a32-4ed7-9c10-c067aed23653\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sxnh6"
Apr 24 16:38:52.209819 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.208329 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/705121d3-75ff-4f72-9362-f2f98bdb4bd4-host-slash\") pod \"iptables-alerter-46rrs\" (UID: \"705121d3-75ff-4f72-9362-f2f98bdb4bd4\") " pod="openshift-network-operator/iptables-alerter-46rrs"
Apr 24 16:38:52.209819 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.208359 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-host-slash\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8"
Apr 24 16:38:52.209819 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.208412 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-host-cni-bin\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8"
Apr 24 16:38:52.209819 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.208438 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-host-cni-bin\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8"
Apr 24 16:38:52.209819 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.208443 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-ovnkube-config\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8"
Apr 24 16:38:52.209819 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.208445 2578
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-host-var-lib-kubelet\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7" Apr 24 16:38:52.210366 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.208478 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-hostroot\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7" Apr 24 16:38:52.210366 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.208475 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4c3c8dac-2a32-4ed7-9c10-c067aed23653-registration-dir\") pod \"aws-ebs-csi-driver-node-sxnh6\" (UID: \"4c3c8dac-2a32-4ed7-9c10-c067aed23653\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sxnh6" Apr 24 16:38:52.210366 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.208515 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f368d48-c79b-45b5-8879-9dac1c5cfe3f-metrics-certs\") pod \"network-metrics-daemon-thz9k\" (UID: \"0f368d48-c79b-45b5-8879-9dac1c5cfe3f\") " pod="openshift-multus/network-metrics-daemon-thz9k" Apr 24 16:38:52.210366 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.208588 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-env-overrides\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.210366 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.208614 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/104dda0f-092c-4fa1-98cb-7e6dcc147db2-host\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj" Apr 24 16:38:52.210366 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.208648 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/33870c17-d8aa-426a-9bde-7d0a1e04404a-hostroot\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7" Apr 24 16:38:52.210366 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.208692 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0c733890-1ac2-464e-9672-65bf90aded78-os-release\") pod \"multus-additional-cni-plugins-hgbpw\" (UID: \"0c733890-1ac2-464e-9672-65bf90aded78\") " pod="openshift-multus/multus-additional-cni-plugins-hgbpw" Apr 24 16:38:52.210366 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.208936 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0c733890-1ac2-464e-9672-65bf90aded78-os-release\") pod \"multus-additional-cni-plugins-hgbpw\" (UID: \"0c733890-1ac2-464e-9672-65bf90aded78\") " pod="openshift-multus/multus-additional-cni-plugins-hgbpw" Apr 24 16:38:52.210366 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.208913 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zk8k4\" (UniqueName: \"kubernetes.io/projected/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-kube-api-access-zk8k4\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.210366 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.208971 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-ovnkube-config\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.210366 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:52.209034 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:38:52.210366 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.209046 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/104dda0f-092c-4fa1-98cb-7e6dcc147db2-host\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj" Apr 24 16:38:52.210366 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:52.209166 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f368d48-c79b-45b5-8879-9dac1c5cfe3f-metrics-certs podName:0f368d48-c79b-45b5-8879-9dac1c5cfe3f nodeName:}" failed. No retries permitted until 2026-04-24 16:38:52.70912399 +0000 UTC m=+3.080828220 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f368d48-c79b-45b5-8879-9dac1c5cfe3f-metrics-certs") pod "network-metrics-daemon-thz9k" (UID: "0f368d48-c79b-45b5-8879-9dac1c5cfe3f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:38:52.210366 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.209161 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-env-overrides\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.210951 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.210758 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/104dda0f-092c-4fa1-98cb-7e6dcc147db2-tmp\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj" Apr 24 16:38:52.210951 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.210929 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/104dda0f-092c-4fa1-98cb-7e6dcc147db2-etc-tuned\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " pod="openshift-cluster-node-tuning-operator/tuned-bhmmj" Apr 24 16:38:52.211744 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.211509 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-ovn-node-metrics-cert\") pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.211744 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:52.211618 2578 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:38:52.211744 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:52.211638 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:38:52.211744 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:52.211654 2578 projected.go:194] Error preparing data for projected volume kube-api-access-f2r84 for pod openshift-network-diagnostics/network-check-target-hcjpg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:38:52.211744 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:52.211707 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7230245a-1622-4c39-9d99-ab2e06ac0daf-kube-api-access-f2r84 podName:7230245a-1622-4c39-9d99-ab2e06ac0daf nodeName:}" failed. No retries permitted until 2026-04-24 16:38:52.71168878 +0000 UTC m=+3.083393001 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-f2r84" (UniqueName: "kubernetes.io/projected/7230245a-1622-4c39-9d99-ab2e06ac0daf-kube-api-access-f2r84") pod "network-check-target-hcjpg" (UID: "7230245a-1622-4c39-9d99-ab2e06ac0daf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:38:52.212113 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.211950 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/46e943b2-628a-486a-adb0-7bb92be03a03-agent-certs\") pod \"konnectivity-agent-dt6kn\" (UID: \"46e943b2-628a-486a-adb0-7bb92be03a03\") " pod="kube-system/konnectivity-agent-dt6kn" Apr 24 16:38:52.213354 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.213335 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nwz7\" (UniqueName: \"kubernetes.io/projected/e8baf786-1fb8-494a-bdb7-c724c853faa3-kube-api-access-6nwz7\") pod \"node-ca-7m4xb\" (UID: \"e8baf786-1fb8-494a-bdb7-c724c853faa3\") " pod="openshift-image-registry/node-ca-7m4xb" Apr 24 16:38:52.213501 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.213351 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h47hs\" (UniqueName: \"kubernetes.io/projected/4c3c8dac-2a32-4ed7-9c10-c067aed23653-kube-api-access-h47hs\") pod \"aws-ebs-csi-driver-node-sxnh6\" (UID: \"4c3c8dac-2a32-4ed7-9c10-c067aed23653\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sxnh6" Apr 24 16:38:52.214215 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.214136 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl9zr\" (UniqueName: \"kubernetes.io/projected/104dda0f-092c-4fa1-98cb-7e6dcc147db2-kube-api-access-dl9zr\") pod \"tuned-bhmmj\" (UID: \"104dda0f-092c-4fa1-98cb-7e6dcc147db2\") " 
pod="openshift-cluster-node-tuning-operator/tuned-bhmmj" Apr 24 16:38:52.215571 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.215525 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbjlg\" (UniqueName: \"kubernetes.io/projected/0c733890-1ac2-464e-9672-65bf90aded78-kube-api-access-qbjlg\") pod \"multus-additional-cni-plugins-hgbpw\" (UID: \"0c733890-1ac2-464e-9672-65bf90aded78\") " pod="openshift-multus/multus-additional-cni-plugins-hgbpw" Apr 24 16:38:52.215672 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.215585 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x45br\" (UniqueName: \"kubernetes.io/projected/0f368d48-c79b-45b5-8879-9dac1c5cfe3f-kube-api-access-x45br\") pod \"network-metrics-daemon-thz9k\" (UID: \"0f368d48-c79b-45b5-8879-9dac1c5cfe3f\") " pod="openshift-multus/network-metrics-daemon-thz9k" Apr 24 16:38:52.215994 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.215932 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2k88\" (UniqueName: \"kubernetes.io/projected/33870c17-d8aa-426a-9bde-7d0a1e04404a-kube-api-access-m2k88\") pod \"multus-x2tv7\" (UID: \"33870c17-d8aa-426a-9bde-7d0a1e04404a\") " pod="openshift-multus/multus-x2tv7" Apr 24 16:38:52.216062 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.216020 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dq7f\" (UniqueName: \"kubernetes.io/projected/705121d3-75ff-4f72-9362-f2f98bdb4bd4-kube-api-access-7dq7f\") pod \"iptables-alerter-46rrs\" (UID: \"705121d3-75ff-4f72-9362-f2f98bdb4bd4\") " pod="openshift-network-operator/iptables-alerter-46rrs" Apr 24 16:38:52.217378 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.217360 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk8k4\" (UniqueName: \"kubernetes.io/projected/76a7c3cf-c152-4bdc-8d94-50d4af52aeee-kube-api-access-zk8k4\") 
pod \"ovnkube-node-mgfp8\" (UID: \"76a7c3cf-c152-4bdc-8d94-50d4af52aeee\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.310165 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.310130 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/81d7a862-3288-4023-b0d6-2464e9278dac-dbus\") pod \"global-pull-secret-syncer-rlzxl\" (UID: \"81d7a862-3288-4023-b0d6-2464e9278dac\") " pod="kube-system/global-pull-secret-syncer-rlzxl" Apr 24 16:38:52.310331 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.310177 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/81d7a862-3288-4023-b0d6-2464e9278dac-original-pull-secret\") pod \"global-pull-secret-syncer-rlzxl\" (UID: \"81d7a862-3288-4023-b0d6-2464e9278dac\") " pod="kube-system/global-pull-secret-syncer-rlzxl" Apr 24 16:38:52.310331 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.310260 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/81d7a862-3288-4023-b0d6-2464e9278dac-kubelet-config\") pod \"global-pull-secret-syncer-rlzxl\" (UID: \"81d7a862-3288-4023-b0d6-2464e9278dac\") " pod="kube-system/global-pull-secret-syncer-rlzxl" Apr 24 16:38:52.310418 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.310328 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/81d7a862-3288-4023-b0d6-2464e9278dac-dbus\") pod \"global-pull-secret-syncer-rlzxl\" (UID: \"81d7a862-3288-4023-b0d6-2464e9278dac\") " pod="kube-system/global-pull-secret-syncer-rlzxl" Apr 24 16:38:52.310418 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.310348 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/81d7a862-3288-4023-b0d6-2464e9278dac-kubelet-config\") pod \"global-pull-secret-syncer-rlzxl\" (UID: \"81d7a862-3288-4023-b0d6-2464e9278dac\") " pod="kube-system/global-pull-secret-syncer-rlzxl" Apr 24 16:38:52.310418 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:52.310370 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 16:38:52.310545 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:52.310441 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81d7a862-3288-4023-b0d6-2464e9278dac-original-pull-secret podName:81d7a862-3288-4023-b0d6-2464e9278dac nodeName:}" failed. No retries permitted until 2026-04-24 16:38:52.810425627 +0000 UTC m=+3.182129836 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/81d7a862-3288-4023-b0d6-2464e9278dac-original-pull-secret") pod "global-pull-secret-syncer-rlzxl" (UID: "81d7a862-3288-4023-b0d6-2464e9278dac") : object "kube-system"/"original-pull-secret" not registered Apr 24 16:38:52.394421 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.394228 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-x2tv7" Apr 24 16:38:52.402137 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.402106 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sxnh6" Apr 24 16:38:52.412844 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.412820 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-46rrs" Apr 24 16:38:52.418431 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.418410 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-7m4xb" Apr 24 16:38:52.424978 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.424958 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hgbpw" Apr 24 16:38:52.433590 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.433561 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:38:52.440113 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.440093 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-dt6kn" Apr 24 16:38:52.447676 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.447657 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bhmmj" Apr 24 16:38:52.712301 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.712247 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f368d48-c79b-45b5-8879-9dac1c5cfe3f-metrics-certs\") pod \"network-metrics-daemon-thz9k\" (UID: \"0f368d48-c79b-45b5-8879-9dac1c5cfe3f\") " pod="openshift-multus/network-metrics-daemon-thz9k" Apr 24 16:38:52.712490 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.712331 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f2r84\" (UniqueName: \"kubernetes.io/projected/7230245a-1622-4c39-9d99-ab2e06ac0daf-kube-api-access-f2r84\") pod \"network-check-target-hcjpg\" (UID: \"7230245a-1622-4c39-9d99-ab2e06ac0daf\") " pod="openshift-network-diagnostics/network-check-target-hcjpg" Apr 24 16:38:52.712490 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:52.712401 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 
16:38:52.712490 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:52.712462 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f368d48-c79b-45b5-8879-9dac1c5cfe3f-metrics-certs podName:0f368d48-c79b-45b5-8879-9dac1c5cfe3f nodeName:}" failed. No retries permitted until 2026-04-24 16:38:53.712446819 +0000 UTC m=+4.084151026 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f368d48-c79b-45b5-8879-9dac1c5cfe3f-metrics-certs") pod "network-metrics-daemon-thz9k" (UID: "0f368d48-c79b-45b5-8879-9dac1c5cfe3f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:38:52.712490 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:52.712475 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:38:52.712670 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:52.712494 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:38:52.712670 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:52.712507 2578 projected.go:194] Error preparing data for projected volume kube-api-access-f2r84 for pod openshift-network-diagnostics/network-check-target-hcjpg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:38:52.712670 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:52.712558 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7230245a-1622-4c39-9d99-ab2e06ac0daf-kube-api-access-f2r84 podName:7230245a-1622-4c39-9d99-ab2e06ac0daf nodeName:}" failed. No retries permitted until 2026-04-24 16:38:53.712541152 +0000 UTC m=+4.084245370 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-f2r84" (UniqueName: "kubernetes.io/projected/7230245a-1622-4c39-9d99-ab2e06ac0daf-kube-api-access-f2r84") pod "network-check-target-hcjpg" (UID: "7230245a-1622-4c39-9d99-ab2e06ac0daf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:38:52.813560 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:52.813524 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/81d7a862-3288-4023-b0d6-2464e9278dac-original-pull-secret\") pod \"global-pull-secret-syncer-rlzxl\" (UID: \"81d7a862-3288-4023-b0d6-2464e9278dac\") " pod="kube-system/global-pull-secret-syncer-rlzxl" Apr 24 16:38:52.813735 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:52.813649 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 16:38:52.813735 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:52.813712 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81d7a862-3288-4023-b0d6-2464e9278dac-original-pull-secret podName:81d7a862-3288-4023-b0d6-2464e9278dac nodeName:}" failed. No retries permitted until 2026-04-24 16:38:53.81369906 +0000 UTC m=+4.185403264 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/81d7a862-3288-4023-b0d6-2464e9278dac-original-pull-secret") pod "global-pull-secret-syncer-rlzxl" (UID: "81d7a862-3288-4023-b0d6-2464e9278dac") : object "kube-system"/"original-pull-secret" not registered Apr 24 16:38:52.923813 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:52.923788 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c733890_1ac2_464e_9672_65bf90aded78.slice/crio-d9df9fa6091b791cd1e0d83f5333982f19e4db833279f08c19937ed1c8ae397f WatchSource:0}: Error finding container d9df9fa6091b791cd1e0d83f5333982f19e4db833279f08c19937ed1c8ae397f: Status 404 returned error can't find the container with id d9df9fa6091b791cd1e0d83f5333982f19e4db833279f08c19937ed1c8ae397f Apr 24 16:38:52.924889 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:52.924845 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c3c8dac_2a32_4ed7_9c10_c067aed23653.slice/crio-b784bfd113644e033d02d3d994640bfd237d8433b4895529cb927303144ed3a3 WatchSource:0}: Error finding container b784bfd113644e033d02d3d994640bfd237d8433b4895529cb927303144ed3a3: Status 404 returned error can't find the container with id b784bfd113644e033d02d3d994640bfd237d8433b4895529cb927303144ed3a3 Apr 24 16:38:52.929123 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:52.929098 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33870c17_d8aa_426a_9bde_7d0a1e04404a.slice/crio-76b6434d27df52b6e7e51cd6f562293dd65baaa978c9ca8e936bcadc7853228e WatchSource:0}: Error finding container 76b6434d27df52b6e7e51cd6f562293dd65baaa978c9ca8e936bcadc7853228e: Status 404 returned error can't find the container with id 76b6434d27df52b6e7e51cd6f562293dd65baaa978c9ca8e936bcadc7853228e Apr 24 16:38:52.930155 
ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:52.930083 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46e943b2_628a_486a_adb0_7bb92be03a03.slice/crio-131f7b3f380885b1158758f6b91dade0eb38e605470b1b33b50f0da7ba8e9579 WatchSource:0}: Error finding container 131f7b3f380885b1158758f6b91dade0eb38e605470b1b33b50f0da7ba8e9579: Status 404 returned error can't find the container with id 131f7b3f380885b1158758f6b91dade0eb38e605470b1b33b50f0da7ba8e9579 Apr 24 16:38:52.931972 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:52.931859 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod104dda0f_092c_4fa1_98cb_7e6dcc147db2.slice/crio-be6ef23c5c2fa363cdd82e5488bdb8fd8dfcae8630834ba8e5eed716ada0937e WatchSource:0}: Error finding container be6ef23c5c2fa363cdd82e5488bdb8fd8dfcae8630834ba8e5eed716ada0937e: Status 404 returned error can't find the container with id be6ef23c5c2fa363cdd82e5488bdb8fd8dfcae8630834ba8e5eed716ada0937e Apr 24 16:38:52.932864 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:52.932334 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod705121d3_75ff_4f72_9362_f2f98bdb4bd4.slice/crio-eb9e2a1361937a9da4313bc9a44a8d0612fad85c47228c3743f2e99b39a91a5e WatchSource:0}: Error finding container eb9e2a1361937a9da4313bc9a44a8d0612fad85c47228c3743f2e99b39a91a5e: Status 404 returned error can't find the container with id eb9e2a1361937a9da4313bc9a44a8d0612fad85c47228c3743f2e99b39a91a5e Apr 24 16:38:52.933968 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:52.933945 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8baf786_1fb8_494a_bdb7_c724c853faa3.slice/crio-4a5c573a3c449991b27c699f56ab91b548e5cfd22ec0e45b14506c8010186d95 WatchSource:0}: Error 
finding container 4a5c573a3c449991b27c699f56ab91b548e5cfd22ec0e45b14506c8010186d95: Status 404 returned error can't find the container with id 4a5c573a3c449991b27c699f56ab91b548e5cfd22ec0e45b14506c8010186d95 Apr 24 16:38:52.934569 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:38:52.934494 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76a7c3cf_c152_4bdc_8d94_50d4af52aeee.slice/crio-00d65eb978707114b006224124aeca50d62af0d5caa606aa7d98c232973bfe43 WatchSource:0}: Error finding container 00d65eb978707114b006224124aeca50d62af0d5caa606aa7d98c232973bfe43: Status 404 returned error can't find the container with id 00d65eb978707114b006224124aeca50d62af0d5caa606aa7d98c232973bfe43 Apr 24 16:38:53.140187 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:53.139989 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 16:33:51 +0000 UTC" deadline="2028-02-03 05:36:39.579331138 +0000 UTC" Apr 24 16:38:53.140187 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:53.140181 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15588h57m46.439153336s" Apr 24 16:38:53.232757 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:53.232689 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rlzxl" Apr 24 16:38:53.232879 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:53.232799 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-rlzxl" podUID="81d7a862-3288-4023-b0d6-2464e9278dac" Apr 24 16:38:53.243597 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:53.243566 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x2tv7" event={"ID":"33870c17-d8aa-426a-9bde-7d0a1e04404a","Type":"ContainerStarted","Data":"76b6434d27df52b6e7e51cd6f562293dd65baaa978c9ca8e936bcadc7853228e"} Apr 24 16:38:53.244743 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:53.244714 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sxnh6" event={"ID":"4c3c8dac-2a32-4ed7-9c10-c067aed23653","Type":"ContainerStarted","Data":"b784bfd113644e033d02d3d994640bfd237d8433b4895529cb927303144ed3a3"} Apr 24 16:38:53.246226 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:53.246205 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-47.ec2.internal" event={"ID":"8b712f12e316b1a6ded9d349ca82d37a","Type":"ContainerStarted","Data":"71b5772b49c518fe28b3314bf363c75b7c4880bb56801e558717a7ffe0e1841f"} Apr 24 16:38:53.247260 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:53.247241 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" event={"ID":"76a7c3cf-c152-4bdc-8d94-50d4af52aeee","Type":"ContainerStarted","Data":"00d65eb978707114b006224124aeca50d62af0d5caa606aa7d98c232973bfe43"} Apr 24 16:38:53.248165 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:53.248144 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7m4xb" event={"ID":"e8baf786-1fb8-494a-bdb7-c724c853faa3","Type":"ContainerStarted","Data":"4a5c573a3c449991b27c699f56ab91b548e5cfd22ec0e45b14506c8010186d95"} Apr 24 16:38:53.249015 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:53.248995 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bhmmj" 
event={"ID":"104dda0f-092c-4fa1-98cb-7e6dcc147db2","Type":"ContainerStarted","Data":"be6ef23c5c2fa363cdd82e5488bdb8fd8dfcae8630834ba8e5eed716ada0937e"} Apr 24 16:38:53.249955 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:53.249937 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-dt6kn" event={"ID":"46e943b2-628a-486a-adb0-7bb92be03a03","Type":"ContainerStarted","Data":"131f7b3f380885b1158758f6b91dade0eb38e605470b1b33b50f0da7ba8e9579"} Apr 24 16:38:53.250834 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:53.250816 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hgbpw" event={"ID":"0c733890-1ac2-464e-9672-65bf90aded78","Type":"ContainerStarted","Data":"d9df9fa6091b791cd1e0d83f5333982f19e4db833279f08c19937ed1c8ae397f"} Apr 24 16:38:53.253806 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:53.253785 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-46rrs" event={"ID":"705121d3-75ff-4f72-9362-f2f98bdb4bd4","Type":"ContainerStarted","Data":"eb9e2a1361937a9da4313bc9a44a8d0612fad85c47228c3743f2e99b39a91a5e"} Apr 24 16:38:53.260777 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:53.260731 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-47.ec2.internal" podStartSLOduration=2.2607222350000002 podStartE2EDuration="2.260722235s" podCreationTimestamp="2026-04-24 16:38:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:38:53.260617221 +0000 UTC m=+3.632321449" watchObservedRunningTime="2026-04-24 16:38:53.260722235 +0000 UTC m=+3.632426460" Apr 24 16:38:53.722732 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:53.722704 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/0f368d48-c79b-45b5-8879-9dac1c5cfe3f-metrics-certs\") pod \"network-metrics-daemon-thz9k\" (UID: \"0f368d48-c79b-45b5-8879-9dac1c5cfe3f\") " pod="openshift-multus/network-metrics-daemon-thz9k" Apr 24 16:38:53.722838 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:53.722771 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f2r84\" (UniqueName: \"kubernetes.io/projected/7230245a-1622-4c39-9d99-ab2e06ac0daf-kube-api-access-f2r84\") pod \"network-check-target-hcjpg\" (UID: \"7230245a-1622-4c39-9d99-ab2e06ac0daf\") " pod="openshift-network-diagnostics/network-check-target-hcjpg" Apr 24 16:38:53.722954 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:53.722937 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:38:53.723028 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:53.722956 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:38:53.723028 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:53.722969 2578 projected.go:194] Error preparing data for projected volume kube-api-access-f2r84 for pod openshift-network-diagnostics/network-check-target-hcjpg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:38:53.723028 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:53.723026 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7230245a-1622-4c39-9d99-ab2e06ac0daf-kube-api-access-f2r84 podName:7230245a-1622-4c39-9d99-ab2e06ac0daf nodeName:}" failed. No retries permitted until 2026-04-24 16:38:55.723005914 +0000 UTC m=+6.094710133 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-f2r84" (UniqueName: "kubernetes.io/projected/7230245a-1622-4c39-9d99-ab2e06ac0daf-kube-api-access-f2r84") pod "network-check-target-hcjpg" (UID: "7230245a-1622-4c39-9d99-ab2e06ac0daf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:38:53.723201 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:53.723105 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:38:53.723201 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:53.723154 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f368d48-c79b-45b5-8879-9dac1c5cfe3f-metrics-certs podName:0f368d48-c79b-45b5-8879-9dac1c5cfe3f nodeName:}" failed. No retries permitted until 2026-04-24 16:38:55.723139888 +0000 UTC m=+6.094844112 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f368d48-c79b-45b5-8879-9dac1c5cfe3f-metrics-certs") pod "network-metrics-daemon-thz9k" (UID: "0f368d48-c79b-45b5-8879-9dac1c5cfe3f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:38:53.823197 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:53.823160 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/81d7a862-3288-4023-b0d6-2464e9278dac-original-pull-secret\") pod \"global-pull-secret-syncer-rlzxl\" (UID: \"81d7a862-3288-4023-b0d6-2464e9278dac\") " pod="kube-system/global-pull-secret-syncer-rlzxl" Apr 24 16:38:53.823386 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:53.823323 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 16:38:53.823442 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:53.823388 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81d7a862-3288-4023-b0d6-2464e9278dac-original-pull-secret podName:81d7a862-3288-4023-b0d6-2464e9278dac nodeName:}" failed. No retries permitted until 2026-04-24 16:38:55.823368905 +0000 UTC m=+6.195073124 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/81d7a862-3288-4023-b0d6-2464e9278dac-original-pull-secret") pod "global-pull-secret-syncer-rlzxl" (UID: "81d7a862-3288-4023-b0d6-2464e9278dac") : object "kube-system"/"original-pull-secret" not registered Apr 24 16:38:54.238500 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:54.238464 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hcjpg" Apr 24 16:38:54.238972 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:54.238513 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-thz9k" Apr 24 16:38:54.238972 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:54.238638 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hcjpg" podUID="7230245a-1622-4c39-9d99-ab2e06ac0daf" Apr 24 16:38:54.238972 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:54.238767 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-thz9k" podUID="0f368d48-c79b-45b5-8879-9dac1c5cfe3f" Apr 24 16:38:54.265123 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:54.265087 2578 generic.go:358] "Generic (PLEG): container finished" podID="438108d93a37ab59c6a0c9e57eee327c" containerID="b6680e683a2eb98db9a9f0e8ce3db477bdfc68c3363cb0a0110c998efbfbf7b4" exitCode=0 Apr 24 16:38:54.266028 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:54.266002 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal" event={"ID":"438108d93a37ab59c6a0c9e57eee327c","Type":"ContainerDied","Data":"b6680e683a2eb98db9a9f0e8ce3db477bdfc68c3363cb0a0110c998efbfbf7b4"} Apr 24 16:38:55.233035 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:55.232998 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rlzxl" Apr 24 16:38:55.233242 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:55.233128 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rlzxl" podUID="81d7a862-3288-4023-b0d6-2464e9278dac" Apr 24 16:38:55.279092 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:55.279053 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal" event={"ID":"438108d93a37ab59c6a0c9e57eee327c","Type":"ContainerStarted","Data":"9cbff7953deeeed71a319a2e1331b631ab74daea74d679346999ffa53c7a1f3a"} Apr 24 16:38:55.292217 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:55.292163 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-47.ec2.internal" podStartSLOduration=4.29214303 podStartE2EDuration="4.29214303s" podCreationTimestamp="2026-04-24 16:38:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:38:55.29190871 +0000 UTC m=+5.663612938" watchObservedRunningTime="2026-04-24 16:38:55.29214303 +0000 UTC m=+5.663847257" Apr 24 16:38:55.739054 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:55.738084 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f2r84\" (UniqueName: \"kubernetes.io/projected/7230245a-1622-4c39-9d99-ab2e06ac0daf-kube-api-access-f2r84\") pod \"network-check-target-hcjpg\" (UID: \"7230245a-1622-4c39-9d99-ab2e06ac0daf\") " pod="openshift-network-diagnostics/network-check-target-hcjpg" Apr 24 
16:38:55.739054 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:55.738200 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f368d48-c79b-45b5-8879-9dac1c5cfe3f-metrics-certs\") pod \"network-metrics-daemon-thz9k\" (UID: \"0f368d48-c79b-45b5-8879-9dac1c5cfe3f\") " pod="openshift-multus/network-metrics-daemon-thz9k" Apr 24 16:38:55.739054 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:55.738337 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:38:55.739054 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:55.738397 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f368d48-c79b-45b5-8879-9dac1c5cfe3f-metrics-certs podName:0f368d48-c79b-45b5-8879-9dac1c5cfe3f nodeName:}" failed. No retries permitted until 2026-04-24 16:38:59.738378913 +0000 UTC m=+10.110083124 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f368d48-c79b-45b5-8879-9dac1c5cfe3f-metrics-certs") pod "network-metrics-daemon-thz9k" (UID: "0f368d48-c79b-45b5-8879-9dac1c5cfe3f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:38:55.739054 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:55.738925 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:38:55.739054 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:55.738947 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:38:55.739054 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:55.738961 2578 projected.go:194] Error preparing data for projected volume kube-api-access-f2r84 for pod openshift-network-diagnostics/network-check-target-hcjpg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:38:55.739054 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:55.739019 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7230245a-1622-4c39-9d99-ab2e06ac0daf-kube-api-access-f2r84 podName:7230245a-1622-4c39-9d99-ab2e06ac0daf nodeName:}" failed. No retries permitted until 2026-04-24 16:38:59.738993208 +0000 UTC m=+10.110697432 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-f2r84" (UniqueName: "kubernetes.io/projected/7230245a-1622-4c39-9d99-ab2e06ac0daf-kube-api-access-f2r84") pod "network-check-target-hcjpg" (UID: "7230245a-1622-4c39-9d99-ab2e06ac0daf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:38:55.839101 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:55.839061 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/81d7a862-3288-4023-b0d6-2464e9278dac-original-pull-secret\") pod \"global-pull-secret-syncer-rlzxl\" (UID: \"81d7a862-3288-4023-b0d6-2464e9278dac\") " pod="kube-system/global-pull-secret-syncer-rlzxl" Apr 24 16:38:55.839312 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:55.839229 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 16:38:55.839387 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:55.839312 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81d7a862-3288-4023-b0d6-2464e9278dac-original-pull-secret podName:81d7a862-3288-4023-b0d6-2464e9278dac nodeName:}" failed. No retries permitted until 2026-04-24 16:38:59.839275036 +0000 UTC m=+10.210979256 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/81d7a862-3288-4023-b0d6-2464e9278dac-original-pull-secret") pod "global-pull-secret-syncer-rlzxl" (UID: "81d7a862-3288-4023-b0d6-2464e9278dac") : object "kube-system"/"original-pull-secret" not registered Apr 24 16:38:56.233685 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:56.233652 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hcjpg" Apr 24 16:38:56.233880 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:56.233782 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hcjpg" podUID="7230245a-1622-4c39-9d99-ab2e06ac0daf" Apr 24 16:38:56.234222 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:56.234182 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-thz9k" Apr 24 16:38:56.234329 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:56.234310 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-thz9k" podUID="0f368d48-c79b-45b5-8879-9dac1c5cfe3f" Apr 24 16:38:57.233235 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:57.233199 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rlzxl" Apr 24 16:38:57.233659 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:57.233387 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-rlzxl" podUID="81d7a862-3288-4023-b0d6-2464e9278dac" Apr 24 16:38:58.234520 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:58.234434 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-thz9k" Apr 24 16:38:58.234520 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:58.234466 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hcjpg" Apr 24 16:38:58.234908 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:58.234590 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-thz9k" podUID="0f368d48-c79b-45b5-8879-9dac1c5cfe3f" Apr 24 16:38:58.235036 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:58.235013 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hcjpg" podUID="7230245a-1622-4c39-9d99-ab2e06ac0daf" Apr 24 16:38:59.234102 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:59.233569 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rlzxl" Apr 24 16:38:59.234102 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:59.233721 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rlzxl" podUID="81d7a862-3288-4023-b0d6-2464e9278dac" Apr 24 16:38:59.773372 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:59.773083 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f368d48-c79b-45b5-8879-9dac1c5cfe3f-metrics-certs\") pod \"network-metrics-daemon-thz9k\" (UID: \"0f368d48-c79b-45b5-8879-9dac1c5cfe3f\") " pod="openshift-multus/network-metrics-daemon-thz9k" Apr 24 16:38:59.773372 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:59.773146 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f2r84\" (UniqueName: \"kubernetes.io/projected/7230245a-1622-4c39-9d99-ab2e06ac0daf-kube-api-access-f2r84\") pod \"network-check-target-hcjpg\" (UID: \"7230245a-1622-4c39-9d99-ab2e06ac0daf\") " pod="openshift-network-diagnostics/network-check-target-hcjpg" Apr 24 16:38:59.773372 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:59.773264 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:38:59.773372 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:59.773264 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:38:59.773372 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:59.773297 2578 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:38:59.773372 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:59.773315 2578 projected.go:194] Error preparing data for projected volume kube-api-access-f2r84 for pod openshift-network-diagnostics/network-check-target-hcjpg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:38:59.773372 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:59.773360 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f368d48-c79b-45b5-8879-9dac1c5cfe3f-metrics-certs podName:0f368d48-c79b-45b5-8879-9dac1c5cfe3f nodeName:}" failed. No retries permitted until 2026-04-24 16:39:07.773339107 +0000 UTC m=+18.145043320 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f368d48-c79b-45b5-8879-9dac1c5cfe3f-metrics-certs") pod "network-metrics-daemon-thz9k" (UID: "0f368d48-c79b-45b5-8879-9dac1c5cfe3f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:38:59.773372 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:59.773383 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7230245a-1622-4c39-9d99-ab2e06ac0daf-kube-api-access-f2r84 podName:7230245a-1622-4c39-9d99-ab2e06ac0daf nodeName:}" failed. No retries permitted until 2026-04-24 16:39:07.773372198 +0000 UTC m=+18.145076404 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-f2r84" (UniqueName: "kubernetes.io/projected/7230245a-1622-4c39-9d99-ab2e06ac0daf-kube-api-access-f2r84") pod "network-check-target-hcjpg" (UID: "7230245a-1622-4c39-9d99-ab2e06ac0daf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:38:59.873959 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:38:59.873921 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/81d7a862-3288-4023-b0d6-2464e9278dac-original-pull-secret\") pod \"global-pull-secret-syncer-rlzxl\" (UID: \"81d7a862-3288-4023-b0d6-2464e9278dac\") " pod="kube-system/global-pull-secret-syncer-rlzxl" Apr 24 16:38:59.874131 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:59.874074 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 16:38:59.874131 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:38:59.874130 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81d7a862-3288-4023-b0d6-2464e9278dac-original-pull-secret podName:81d7a862-3288-4023-b0d6-2464e9278dac nodeName:}" failed. No retries permitted until 2026-04-24 16:39:07.874117253 +0000 UTC m=+18.245821457 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/81d7a862-3288-4023-b0d6-2464e9278dac-original-pull-secret") pod "global-pull-secret-syncer-rlzxl" (UID: "81d7a862-3288-4023-b0d6-2464e9278dac") : object "kube-system"/"original-pull-secret" not registered Apr 24 16:39:00.234397 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:00.233960 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-thz9k" Apr 24 16:39:00.234397 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:00.234044 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-thz9k" podUID="0f368d48-c79b-45b5-8879-9dac1c5cfe3f" Apr 24 16:39:00.234616 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:00.234435 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hcjpg" Apr 24 16:39:00.234616 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:00.234505 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hcjpg" podUID="7230245a-1622-4c39-9d99-ab2e06ac0daf" Apr 24 16:39:01.232987 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:01.232950 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rlzxl" Apr 24 16:39:01.233470 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:01.233076 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-rlzxl" podUID="81d7a862-3288-4023-b0d6-2464e9278dac" Apr 24 16:39:02.233552 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:02.233520 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-thz9k" Apr 24 16:39:02.233953 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:02.233571 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hcjpg" Apr 24 16:39:02.233953 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:02.233693 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-thz9k" podUID="0f368d48-c79b-45b5-8879-9dac1c5cfe3f" Apr 24 16:39:02.233953 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:02.233799 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hcjpg" podUID="7230245a-1622-4c39-9d99-ab2e06ac0daf" Apr 24 16:39:03.233244 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:03.233205 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rlzxl" Apr 24 16:39:03.233442 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:03.233356 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rlzxl" podUID="81d7a862-3288-4023-b0d6-2464e9278dac" Apr 24 16:39:04.233387 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:04.233347 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hcjpg" Apr 24 16:39:04.233387 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:04.233391 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-thz9k" Apr 24 16:39:04.233859 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:04.233489 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hcjpg" podUID="7230245a-1622-4c39-9d99-ab2e06ac0daf" Apr 24 16:39:04.233859 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:04.233613 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-thz9k" podUID="0f368d48-c79b-45b5-8879-9dac1c5cfe3f" Apr 24 16:39:05.232859 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:05.232823 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rlzxl" Apr 24 16:39:05.233045 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:05.232955 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rlzxl" podUID="81d7a862-3288-4023-b0d6-2464e9278dac" Apr 24 16:39:06.232760 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:06.232721 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-thz9k" Apr 24 16:39:06.233181 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:06.232772 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hcjpg" Apr 24 16:39:06.233181 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:06.232836 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-thz9k" podUID="0f368d48-c79b-45b5-8879-9dac1c5cfe3f" Apr 24 16:39:06.233181 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:06.233008 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hcjpg" podUID="7230245a-1622-4c39-9d99-ab2e06ac0daf" Apr 24 16:39:07.233625 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:07.233097 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rlzxl" Apr 24 16:39:07.233625 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:07.233226 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-rlzxl" podUID="81d7a862-3288-4023-b0d6-2464e9278dac" Apr 24 16:39:07.834034 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:07.833997 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f2r84\" (UniqueName: \"kubernetes.io/projected/7230245a-1622-4c39-9d99-ab2e06ac0daf-kube-api-access-f2r84\") pod \"network-check-target-hcjpg\" (UID: \"7230245a-1622-4c39-9d99-ab2e06ac0daf\") " pod="openshift-network-diagnostics/network-check-target-hcjpg" Apr 24 16:39:07.834238 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:07.834078 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f368d48-c79b-45b5-8879-9dac1c5cfe3f-metrics-certs\") pod \"network-metrics-daemon-thz9k\" (UID: \"0f368d48-c79b-45b5-8879-9dac1c5cfe3f\") " pod="openshift-multus/network-metrics-daemon-thz9k" Apr 24 16:39:07.834238 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:07.834187 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:07.834238 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:07.834196 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:39:07.834238 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:07.834223 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:39:07.834238 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:07.834237 2578 projected.go:194] Error preparing data for projected volume kube-api-access-f2r84 for pod openshift-network-diagnostics/network-check-target-hcjpg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:07.834492 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:07.834255 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f368d48-c79b-45b5-8879-9dac1c5cfe3f-metrics-certs podName:0f368d48-c79b-45b5-8879-9dac1c5cfe3f nodeName:}" failed. No retries permitted until 2026-04-24 16:39:23.834238901 +0000 UTC m=+34.205943105 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f368d48-c79b-45b5-8879-9dac1c5cfe3f-metrics-certs") pod "network-metrics-daemon-thz9k" (UID: "0f368d48-c79b-45b5-8879-9dac1c5cfe3f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:07.834492 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:07.834311 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7230245a-1622-4c39-9d99-ab2e06ac0daf-kube-api-access-f2r84 podName:7230245a-1622-4c39-9d99-ab2e06ac0daf nodeName:}" failed. No retries permitted until 2026-04-24 16:39:23.834272647 +0000 UTC m=+34.205976856 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-f2r84" (UniqueName: "kubernetes.io/projected/7230245a-1622-4c39-9d99-ab2e06ac0daf-kube-api-access-f2r84") pod "network-check-target-hcjpg" (UID: "7230245a-1622-4c39-9d99-ab2e06ac0daf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:07.935021 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:07.934984 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/81d7a862-3288-4023-b0d6-2464e9278dac-original-pull-secret\") pod \"global-pull-secret-syncer-rlzxl\" (UID: \"81d7a862-3288-4023-b0d6-2464e9278dac\") " pod="kube-system/global-pull-secret-syncer-rlzxl" Apr 24 16:39:07.935195 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:07.935118 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 16:39:07.935195 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:07.935184 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81d7a862-3288-4023-b0d6-2464e9278dac-original-pull-secret podName:81d7a862-3288-4023-b0d6-2464e9278dac nodeName:}" failed. No retries permitted until 2026-04-24 16:39:23.935164189 +0000 UTC m=+34.306868406 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/81d7a862-3288-4023-b0d6-2464e9278dac-original-pull-secret") pod "global-pull-secret-syncer-rlzxl" (UID: "81d7a862-3288-4023-b0d6-2464e9278dac") : object "kube-system"/"original-pull-secret" not registered Apr 24 16:39:08.233483 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:08.233444 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hcjpg" Apr 24 16:39:08.233649 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:08.233496 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-thz9k" Apr 24 16:39:08.233649 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:08.233607 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hcjpg" podUID="7230245a-1622-4c39-9d99-ab2e06ac0daf" Apr 24 16:39:08.233953 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:08.233738 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-thz9k" podUID="0f368d48-c79b-45b5-8879-9dac1c5cfe3f" Apr 24 16:39:09.232832 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:09.232796 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rlzxl" Apr 24 16:39:09.233078 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:09.232918 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-rlzxl" podUID="81d7a862-3288-4023-b0d6-2464e9278dac" Apr 24 16:39:10.233940 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:10.233767 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-thz9k" Apr 24 16:39:10.234419 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:10.234030 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-thz9k" podUID="0f368d48-c79b-45b5-8879-9dac1c5cfe3f" Apr 24 16:39:10.235656 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:10.235620 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hcjpg" Apr 24 16:39:10.235760 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:10.235736 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hcjpg" podUID="7230245a-1622-4c39-9d99-ab2e06ac0daf" Apr 24 16:39:10.306145 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:10.306120 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sxnh6" event={"ID":"4c3c8dac-2a32-4ed7-9c10-c067aed23653","Type":"ContainerStarted","Data":"a0463a3daa3dbe02105cc7a62acd4900bc429319ad305d6bb8147ec05bf98dbb"} Apr 24 16:39:10.307654 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:10.307633 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7m4xb" event={"ID":"e8baf786-1fb8-494a-bdb7-c724c853faa3","Type":"ContainerStarted","Data":"d4010a1a2623871f6f5b4c81d4ce5722f7aba37b2d9a0e279af1ce741409c52d"} Apr 24 16:39:10.308851 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:10.308819 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bhmmj" event={"ID":"104dda0f-092c-4fa1-98cb-7e6dcc147db2","Type":"ContainerStarted","Data":"2fd20d5b7439c49c997c34c8aa6ebc0654811ebad18ee431c9b6c383030b8ca7"} Apr 24 16:39:10.310452 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:10.310428 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-dt6kn" event={"ID":"46e943b2-628a-486a-adb0-7bb92be03a03","Type":"ContainerStarted","Data":"498194e6b161dcf7e006c4cb02b2b1c93f493c94880857833639e56e05363700"} Apr 24 16:39:10.311522 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:10.311505 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hgbpw" event={"ID":"0c733890-1ac2-464e-9672-65bf90aded78","Type":"ContainerStarted","Data":"ebd44a7d6724d3d7571d520cfedbe85c133207103ce62aaaa69599774510b3ab"} Apr 24 16:39:10.312967 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:10.312907 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x2tv7" 
event={"ID":"33870c17-d8aa-426a-9bde-7d0a1e04404a","Type":"ContainerStarted","Data":"038a1783a92ec40d50914ffc296cae97bf7f35795ab0b62d45256e1066c8d737"} Apr 24 16:39:10.350714 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:10.350664 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7m4xb" podStartSLOduration=3.255672324 podStartE2EDuration="20.350648298s" podCreationTimestamp="2026-04-24 16:38:50 +0000 UTC" firstStartedPulling="2026-04-24 16:38:52.935918744 +0000 UTC m=+3.307622953" lastFinishedPulling="2026-04-24 16:39:10.030894716 +0000 UTC m=+20.402598927" observedRunningTime="2026-04-24 16:39:10.338002621 +0000 UTC m=+20.709706847" watchObservedRunningTime="2026-04-24 16:39:10.350648298 +0000 UTC m=+20.722352525" Apr 24 16:39:10.351121 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:10.351086 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-bhmmj" podStartSLOduration=3.253454572 podStartE2EDuration="20.351077854s" podCreationTimestamp="2026-04-24 16:38:50 +0000 UTC" firstStartedPulling="2026-04-24 16:38:52.933266742 +0000 UTC m=+3.304970948" lastFinishedPulling="2026-04-24 16:39:10.030890011 +0000 UTC m=+20.402594230" observedRunningTime="2026-04-24 16:39:10.350745193 +0000 UTC m=+20.722449422" watchObservedRunningTime="2026-04-24 16:39:10.351077854 +0000 UTC m=+20.722782080" Apr 24 16:39:10.366519 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:10.366483 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-x2tv7" podStartSLOduration=3.2160410170000002 podStartE2EDuration="20.366470002s" podCreationTimestamp="2026-04-24 16:38:50 +0000 UTC" firstStartedPulling="2026-04-24 16:38:52.932053586 +0000 UTC m=+3.303757791" lastFinishedPulling="2026-04-24 16:39:10.082482555 +0000 UTC m=+20.454186776" observedRunningTime="2026-04-24 16:39:10.365152789 +0000 UTC m=+20.736857016" 
watchObservedRunningTime="2026-04-24 16:39:10.366470002 +0000 UTC m=+20.738174228" Apr 24 16:39:11.233767 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:11.233600 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rlzxl" Apr 24 16:39:11.233866 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:11.233848 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rlzxl" podUID="81d7a862-3288-4023-b0d6-2464e9278dac" Apr 24 16:39:11.316363 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:11.316333 2578 generic.go:358] "Generic (PLEG): container finished" podID="0c733890-1ac2-464e-9672-65bf90aded78" containerID="ebd44a7d6724d3d7571d520cfedbe85c133207103ce62aaaa69599774510b3ab" exitCode=0 Apr 24 16:39:11.317011 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:11.316413 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hgbpw" event={"ID":"0c733890-1ac2-464e-9672-65bf90aded78","Type":"ContainerDied","Data":"ebd44a7d6724d3d7571d520cfedbe85c133207103ce62aaaa69599774510b3ab"} Apr 24 16:39:11.319067 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:11.319014 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgfp8_76a7c3cf-c152-4bdc-8d94-50d4af52aeee/ovn-acl-logging/0.log" Apr 24 16:39:11.319354 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:11.319321 2578 generic.go:358] "Generic (PLEG): container finished" podID="76a7c3cf-c152-4bdc-8d94-50d4af52aeee" containerID="d5a70a923e3a62a250ee9b8483babe1cbe2725292d732d701960b1066b34893d" exitCode=1 Apr 24 16:39:11.319439 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:11.319417 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" event={"ID":"76a7c3cf-c152-4bdc-8d94-50d4af52aeee","Type":"ContainerStarted","Data":"b589e9074e3c831de6710a9722a2c8e202bb7e4950ec80fd3e98eb93f2e28c44"} Apr 24 16:39:11.319557 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:11.319449 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" event={"ID":"76a7c3cf-c152-4bdc-8d94-50d4af52aeee","Type":"ContainerStarted","Data":"fe4b41078b2088451df1170f1a895b558aea2a46853e6bab135b74f3fd34b743"} Apr 24 16:39:11.319557 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:11.319463 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" event={"ID":"76a7c3cf-c152-4bdc-8d94-50d4af52aeee","Type":"ContainerStarted","Data":"767c3db5cb67c3a3ccd27174307259fa9b4d0321fe40b9239aab7fcfe8d12315"} Apr 24 16:39:11.319557 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:11.319475 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" event={"ID":"76a7c3cf-c152-4bdc-8d94-50d4af52aeee","Type":"ContainerStarted","Data":"accec82f3085bac379d13523d43729f45c5001c4c118d8b00a216567c2ab40e0"} Apr 24 16:39:11.319557 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:11.319486 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" event={"ID":"76a7c3cf-c152-4bdc-8d94-50d4af52aeee","Type":"ContainerDied","Data":"d5a70a923e3a62a250ee9b8483babe1cbe2725292d732d701960b1066b34893d"} Apr 24 16:39:11.319557 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:11.319501 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" event={"ID":"76a7c3cf-c152-4bdc-8d94-50d4af52aeee","Type":"ContainerStarted","Data":"48673b493362719bda31169c756c2be6e8d8eb8d435a55fdddd8a7fbecb4a0e4"} Apr 24 16:39:11.350717 ip-10-0-131-47 kubenswrapper[2578]: 
I0424 16:39:11.350670 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-dt6kn" podStartSLOduration=4.24323586 podStartE2EDuration="21.350655844s" podCreationTimestamp="2026-04-24 16:38:50 +0000 UTC" firstStartedPulling="2026-04-24 16:38:52.932878062 +0000 UTC m=+3.304582269" lastFinishedPulling="2026-04-24 16:39:10.040298033 +0000 UTC m=+20.412002253" observedRunningTime="2026-04-24 16:39:11.350542136 +0000 UTC m=+21.722246362" watchObservedRunningTime="2026-04-24 16:39:11.350655844 +0000 UTC m=+21.722360070" Apr 24 16:39:11.605344 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:11.605322 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 16:39:12.164996 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:12.164803 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T16:39:11.605338952Z","UUID":"ab11bbf4-29b1-4c0c-b6b8-52cdcd7a55c0","Handler":null,"Name":"","Endpoint":""} Apr 24 16:39:12.167726 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:12.167698 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 16:39:12.167726 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:12.167732 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 16:39:12.233596 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:12.233563 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hcjpg" Apr 24 16:39:12.233596 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:12.233563 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-thz9k" Apr 24 16:39:12.233838 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:12.233733 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-thz9k" podUID="0f368d48-c79b-45b5-8879-9dac1c5cfe3f" Apr 24 16:39:12.233882 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:12.233838 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hcjpg" podUID="7230245a-1622-4c39-9d99-ab2e06ac0daf" Apr 24 16:39:12.320080 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:12.320039 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-dt6kn" Apr 24 16:39:12.321009 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:12.320987 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-dt6kn" Apr 24 16:39:12.322820 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:12.322785 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-46rrs" event={"ID":"705121d3-75ff-4f72-9362-f2f98bdb4bd4","Type":"ContainerStarted","Data":"8c4c53f0bdbd51fd1010dc3151452c2c924132137a76ea1b1c66cb9ed804690a"} Apr 24 16:39:12.324966 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:12.324730 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sxnh6" event={"ID":"4c3c8dac-2a32-4ed7-9c10-c067aed23653","Type":"ContainerStarted","Data":"67d826f9580acd3596294759d90ae6cb37215779c7c7a5966c58049d751054c9"} Apr 24 16:39:12.348074 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:12.348022 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-46rrs" podStartSLOduration=5.242613303 podStartE2EDuration="22.348007636s" podCreationTimestamp="2026-04-24 16:38:50 +0000 UTC" firstStartedPulling="2026-04-24 16:38:52.934925875 +0000 UTC m=+3.306630087" lastFinishedPulling="2026-04-24 16:39:10.040320204 +0000 UTC m=+20.412024420" observedRunningTime="2026-04-24 16:39:12.347631651 +0000 UTC m=+22.719335877" watchObservedRunningTime="2026-04-24 16:39:12.348007636 +0000 UTC m=+22.719711861" Apr 24 16:39:13.233232 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:13.233192 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rlzxl" Apr 24 16:39:13.233435 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:13.233327 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rlzxl" podUID="81d7a862-3288-4023-b0d6-2464e9278dac" Apr 24 16:39:13.329409 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:13.329359 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sxnh6" event={"ID":"4c3c8dac-2a32-4ed7-9c10-c067aed23653","Type":"ContainerStarted","Data":"3e258ebfbfccd7f82fadc91013df9f4a2108be3ba56602377c1224135ad7322a"} Apr 24 16:39:13.329782 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:13.329447 2578 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 16:39:13.343398 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:13.343331 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sxnh6" podStartSLOduration=3.500081878 podStartE2EDuration="23.343311389s" podCreationTimestamp="2026-04-24 16:38:50 +0000 UTC" firstStartedPulling="2026-04-24 16:38:52.927234161 +0000 UTC m=+3.298938376" lastFinishedPulling="2026-04-24 16:39:12.770463671 +0000 UTC m=+23.142167887" observedRunningTime="2026-04-24 16:39:13.342991028 +0000 UTC m=+23.714695254" watchObservedRunningTime="2026-04-24 16:39:13.343311389 +0000 UTC m=+23.715015613" Apr 24 16:39:14.233071 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:14.232899 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hcjpg" Apr 24 16:39:14.233254 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:14.233152 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hcjpg" podUID="7230245a-1622-4c39-9d99-ab2e06ac0daf" Apr 24 16:39:14.233362 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:14.232905 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-thz9k" Apr 24 16:39:14.233500 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:14.233479 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-thz9k" podUID="0f368d48-c79b-45b5-8879-9dac1c5cfe3f" Apr 24 16:39:14.333811 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:14.333785 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgfp8_76a7c3cf-c152-4bdc-8d94-50d4af52aeee/ovn-acl-logging/0.log" Apr 24 16:39:14.334200 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:14.334137 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" event={"ID":"76a7c3cf-c152-4bdc-8d94-50d4af52aeee","Type":"ContainerStarted","Data":"fa486dc335dca7897b4266b3b46c1345a1b7710f7cad1ecb91f0f9ecd0ebf1fe"} Apr 24 16:39:15.232859 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:15.232822 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rlzxl" Apr 24 16:39:15.233081 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:15.233035 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rlzxl" podUID="81d7a862-3288-4023-b0d6-2464e9278dac" Apr 24 16:39:16.233153 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:16.232985 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-thz9k" Apr 24 16:39:16.234007 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:16.232986 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hcjpg" Apr 24 16:39:16.234007 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:16.233224 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-thz9k" podUID="0f368d48-c79b-45b5-8879-9dac1c5cfe3f" Apr 24 16:39:16.234007 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:16.233319 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hcjpg" podUID="7230245a-1622-4c39-9d99-ab2e06ac0daf" Apr 24 16:39:16.339898 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:16.339869 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgfp8_76a7c3cf-c152-4bdc-8d94-50d4af52aeee/ovn-acl-logging/0.log" Apr 24 16:39:16.340320 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:16.340271 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" event={"ID":"76a7c3cf-c152-4bdc-8d94-50d4af52aeee","Type":"ContainerStarted","Data":"70056f70424829c5b7dde79b1588c70957b41cfe0589397927b60fdbecc17121"} Apr 24 16:39:16.340644 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:16.340618 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:39:16.340770 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:16.340754 2578 scope.go:117] "RemoveContainer" containerID="d5a70a923e3a62a250ee9b8483babe1cbe2725292d732d701960b1066b34893d" Apr 24 16:39:16.342184 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:16.342162 2578 generic.go:358] "Generic (PLEG): container finished" podID="0c733890-1ac2-464e-9672-65bf90aded78" containerID="fdd7ef2da8560a87fdf92be766cb6a9287ded926a01103ed77bab0f50d9152c3" exitCode=0 Apr 24 16:39:16.342271 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:16.342208 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hgbpw" event={"ID":"0c733890-1ac2-464e-9672-65bf90aded78","Type":"ContainerDied","Data":"fdd7ef2da8560a87fdf92be766cb6a9287ded926a01103ed77bab0f50d9152c3"} Apr 24 16:39:16.357441 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:16.357424 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:39:17.233512 ip-10-0-131-47 kubenswrapper[2578]: I0424 
16:39:17.233485 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rlzxl" Apr 24 16:39:17.233857 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:17.233590 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rlzxl" podUID="81d7a862-3288-4023-b0d6-2464e9278dac" Apr 24 16:39:17.346750 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:17.346675 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgfp8_76a7c3cf-c152-4bdc-8d94-50d4af52aeee/ovn-acl-logging/0.log" Apr 24 16:39:17.347028 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:17.347002 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" event={"ID":"76a7c3cf-c152-4bdc-8d94-50d4af52aeee","Type":"ContainerStarted","Data":"726564c4d203d8cbd60523d97783ab878bccf49a9a468d93459df1ed08fce695"} Apr 24 16:39:17.347150 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:17.347133 2578 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 16:39:17.347425 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:17.347404 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:39:17.349125 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:17.349106 2578 generic.go:358] "Generic (PLEG): container finished" podID="0c733890-1ac2-464e-9672-65bf90aded78" containerID="8a980481e9e42ae1d02ce58c93dcdea5ddcf08563b304027b28dabafb64eb28d" exitCode=0 Apr 24 16:39:17.349201 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:17.349136 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-hgbpw" event={"ID":"0c733890-1ac2-464e-9672-65bf90aded78","Type":"ContainerDied","Data":"8a980481e9e42ae1d02ce58c93dcdea5ddcf08563b304027b28dabafb64eb28d"} Apr 24 16:39:17.363023 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:17.363003 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:39:17.381111 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:17.381072 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" podStartSLOduration=10.015713388 podStartE2EDuration="27.381061496s" podCreationTimestamp="2026-04-24 16:38:50 +0000 UTC" firstStartedPulling="2026-04-24 16:38:52.936747316 +0000 UTC m=+3.308451526" lastFinishedPulling="2026-04-24 16:39:10.302095425 +0000 UTC m=+20.673799634" observedRunningTime="2026-04-24 16:39:17.377019392 +0000 UTC m=+27.748723618" watchObservedRunningTime="2026-04-24 16:39:17.381061496 +0000 UTC m=+27.752765722" Apr 24 16:39:17.461968 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:17.461939 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rlzxl"] Apr 24 16:39:17.462127 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:17.462061 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rlzxl" Apr 24 16:39:17.462167 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:17.462144 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-rlzxl" podUID="81d7a862-3288-4023-b0d6-2464e9278dac" Apr 24 16:39:17.463776 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:17.463755 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-thz9k"] Apr 24 16:39:17.463867 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:17.463858 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-thz9k" Apr 24 16:39:17.463950 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:17.463931 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-thz9k" podUID="0f368d48-c79b-45b5-8879-9dac1c5cfe3f" Apr 24 16:39:17.474724 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:17.474702 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hcjpg"] Apr 24 16:39:17.474860 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:17.474782 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hcjpg" Apr 24 16:39:17.474921 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:17.474841 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hcjpg" podUID="7230245a-1622-4c39-9d99-ab2e06ac0daf" Apr 24 16:39:18.353546 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:18.353456 2578 generic.go:358] "Generic (PLEG): container finished" podID="0c733890-1ac2-464e-9672-65bf90aded78" containerID="72d83c2b3891eef53c93ea8edb6a671b1cce6ce13c92f798319a937f32ba2d07" exitCode=0 Apr 24 16:39:18.353871 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:18.353545 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hgbpw" event={"ID":"0c733890-1ac2-464e-9672-65bf90aded78","Type":"ContainerDied","Data":"72d83c2b3891eef53c93ea8edb6a671b1cce6ce13c92f798319a937f32ba2d07"} Apr 24 16:39:18.353871 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:18.353788 2578 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 16:39:19.232688 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:19.232652 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hcjpg" Apr 24 16:39:19.232870 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:19.232776 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-thz9k" Apr 24 16:39:19.232870 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:19.232808 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rlzxl" Apr 24 16:39:19.232870 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:19.232774 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hcjpg" podUID="7230245a-1622-4c39-9d99-ab2e06ac0daf" Apr 24 16:39:19.233017 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:19.232900 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-thz9k" podUID="0f368d48-c79b-45b5-8879-9dac1c5cfe3f" Apr 24 16:39:19.233017 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:19.232987 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rlzxl" podUID="81d7a862-3288-4023-b0d6-2464e9278dac" Apr 24 16:39:19.356656 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:19.356631 2578 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 16:39:20.435749 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:20.435712 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-dt6kn" Apr 24 16:39:20.436243 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:20.435863 2578 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 16:39:20.436803 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:20.436775 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-dt6kn" Apr 24 16:39:20.445245 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:20.445214 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 
16:39:20.445498 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:20.445483 2578 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 16:39:20.457621 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:20.457570 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" podUID="76a7c3cf-c152-4bdc-8d94-50d4af52aeee" containerName="ovnkube-controller" probeResult="failure" output="" Apr 24 16:39:20.467354 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:20.467324 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" podUID="76a7c3cf-c152-4bdc-8d94-50d4af52aeee" containerName="ovnkube-controller" probeResult="failure" output="" Apr 24 16:39:21.233774 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:21.233734 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hcjpg" Apr 24 16:39:21.233961 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:21.233848 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rlzxl" Apr 24 16:39:21.233961 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:21.233853 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hcjpg" podUID="7230245a-1622-4c39-9d99-ab2e06ac0daf" Apr 24 16:39:21.233961 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:21.233878 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-thz9k" Apr 24 16:39:21.234125 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:21.233965 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rlzxl" podUID="81d7a862-3288-4023-b0d6-2464e9278dac" Apr 24 16:39:21.234125 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:21.234070 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-thz9k" podUID="0f368d48-c79b-45b5-8879-9dac1c5cfe3f" Apr 24 16:39:23.233498 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:23.233456 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-thz9k" Apr 24 16:39:23.233498 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:23.233494 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hcjpg" Apr 24 16:39:23.234242 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:23.233456 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rlzxl" Apr 24 16:39:23.234242 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:23.233598 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-thz9k" podUID="0f368d48-c79b-45b5-8879-9dac1c5cfe3f" Apr 24 16:39:23.234242 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:23.233696 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rlzxl" podUID="81d7a862-3288-4023-b0d6-2464e9278dac" Apr 24 16:39:23.234242 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:23.233803 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hcjpg" podUID="7230245a-1622-4c39-9d99-ab2e06ac0daf" Apr 24 16:39:23.856050 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:23.856010 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f368d48-c79b-45b5-8879-9dac1c5cfe3f-metrics-certs\") pod \"network-metrics-daemon-thz9k\" (UID: \"0f368d48-c79b-45b5-8879-9dac1c5cfe3f\") " pod="openshift-multus/network-metrics-daemon-thz9k" Apr 24 16:39:23.856252 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:23.856070 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f2r84\" (UniqueName: \"kubernetes.io/projected/7230245a-1622-4c39-9d99-ab2e06ac0daf-kube-api-access-f2r84\") pod \"network-check-target-hcjpg\" (UID: \"7230245a-1622-4c39-9d99-ab2e06ac0daf\") " pod="openshift-network-diagnostics/network-check-target-hcjpg" Apr 24 16:39:23.856252 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:23.856175 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:23.856252 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:23.856202 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:39:23.856252 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:23.856220 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:39:23.856252 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:23.856232 2578 projected.go:194] Error preparing data for projected volume kube-api-access-f2r84 for pod openshift-network-diagnostics/network-check-target-hcjpg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:23.856252 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:23.856253 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f368d48-c79b-45b5-8879-9dac1c5cfe3f-metrics-certs podName:0f368d48-c79b-45b5-8879-9dac1c5cfe3f nodeName:}" failed. No retries permitted until 2026-04-24 16:39:55.856236113 +0000 UTC m=+66.227940316 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f368d48-c79b-45b5-8879-9dac1c5cfe3f-metrics-certs") pod "network-metrics-daemon-thz9k" (UID: "0f368d48-c79b-45b5-8879-9dac1c5cfe3f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:23.856517 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:23.856278 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7230245a-1622-4c39-9d99-ab2e06ac0daf-kube-api-access-f2r84 podName:7230245a-1622-4c39-9d99-ab2e06ac0daf nodeName:}" failed. No retries permitted until 2026-04-24 16:39:55.856264545 +0000 UTC m=+66.227968766 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-f2r84" (UniqueName: "kubernetes.io/projected/7230245a-1622-4c39-9d99-ab2e06ac0daf-kube-api-access-f2r84") pod "network-check-target-hcjpg" (UID: "7230245a-1622-4c39-9d99-ab2e06ac0daf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:23.956956 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:23.956919 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/81d7a862-3288-4023-b0d6-2464e9278dac-original-pull-secret\") pod \"global-pull-secret-syncer-rlzxl\" (UID: \"81d7a862-3288-4023-b0d6-2464e9278dac\") " pod="kube-system/global-pull-secret-syncer-rlzxl" Apr 24 16:39:23.957085 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:23.957063 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 16:39:23.957146 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:23.957135 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81d7a862-3288-4023-b0d6-2464e9278dac-original-pull-secret podName:81d7a862-3288-4023-b0d6-2464e9278dac nodeName:}" failed. No retries permitted until 2026-04-24 16:39:55.957118048 +0000 UTC m=+66.328822258 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/81d7a862-3288-4023-b0d6-2464e9278dac-original-pull-secret") pod "global-pull-secret-syncer-rlzxl" (UID: "81d7a862-3288-4023-b0d6-2464e9278dac") : object "kube-system"/"original-pull-secret" not registered Apr 24 16:39:23.995207 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:23.995178 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-47.ec2.internal" event="NodeReady" Apr 24 16:39:23.995375 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:23.995363 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 16:39:24.031734 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.031695 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-8j4f4"] Apr 24 16:39:24.050811 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.050764 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-759b975967-pgkrt"] Apr 24 16:39:24.051004 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.050918 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-8j4f4" Apr 24 16:39:24.053623 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.053597 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 24 16:39:24.053623 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.053613 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 24 16:39:24.053913 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.053898 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 16:39:24.054985 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.054964 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-h6t9d\"" Apr 24 16:39:24.060405 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.060150 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 24 16:39:24.065116 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.065091 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 16:39:24.069741 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.069714 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-n6v4x"] Apr 24 16:39:24.069906 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.069886 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-759b975967-pgkrt" Apr 24 16:39:24.072228 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.072210 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 16:39:24.072703 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.072689 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-ff9nx\"" Apr 24 16:39:24.073208 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.073193 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 16:39:24.073323 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.073211 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 16:39:24.082519 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.082487 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 16:39:24.090232 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.090205 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5958786cb4-48wlg"] Apr 24 16:39:24.090411 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.090392 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n6v4x" Apr 24 16:39:24.094550 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.094523 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 24 16:39:24.095029 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.095003 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-4njbg\"" Apr 24 16:39:24.095149 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.095060 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 16:39:24.095404 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.095251 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 16:39:24.095404 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.095389 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 24 16:39:24.103134 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.103112 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82qpb"] Apr 24 16:39:24.103273 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.103256 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5958786cb4-48wlg" Apr 24 16:39:24.105578 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.105560 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 24 16:39:24.105660 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.105560 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 24 16:39:24.105938 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.105925 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 24 16:39:24.106114 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.106078 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 24 16:39:24.106192 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.106166 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-556xq\"" Apr 24 16:39:24.106231 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.106173 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 24 16:39:24.106527 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.106502 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 24 16:39:24.123659 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.123626 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kzb6p"] Apr 24 16:39:24.123793 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.123776 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82qpb" Apr 24 16:39:24.126667 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.126646 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 24 16:39:24.126913 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.126899 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-fws7t\"" Apr 24 16:39:24.126966 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.126915 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 24 16:39:24.127436 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.127421 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 24 16:39:24.143336 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.143299 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-f2l2f"] Apr 24 16:39:24.143459 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.143378 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kzb6p" Apr 24 16:39:24.145848 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.145827 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 24 16:39:24.145848 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.145838 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 24 16:39:24.146043 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.145931 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-x4w9h\"" Apr 24 16:39:24.158608 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.158586 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea323d90-8b5e-41a9-9633-538a3fdcead6-trusted-ca\") pod \"image-registry-759b975967-pgkrt\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") " pod="openshift-image-registry/image-registry-759b975967-pgkrt" Apr 24 16:39:24.158696 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.158628 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0f4702a-d6d6-410a-9800-fb13b913d223-serving-cert\") pod \"insights-operator-585dfdc468-8j4f4\" (UID: \"f0f4702a-d6d6-410a-9800-fb13b913d223\") " pod="openshift-insights/insights-operator-585dfdc468-8j4f4" Apr 24 16:39:24.158696 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.158661 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ea323d90-8b5e-41a9-9633-538a3fdcead6-ca-trust-extracted\") pod 
\"image-registry-759b975967-pgkrt\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") " pod="openshift-image-registry/image-registry-759b975967-pgkrt" Apr 24 16:39:24.158696 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.158687 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-registry-tls\") pod \"image-registry-759b975967-pgkrt\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") " pod="openshift-image-registry/image-registry-759b975967-pgkrt" Apr 24 16:39:24.158857 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.158738 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ea323d90-8b5e-41a9-9633-538a3fdcead6-registry-certificates\") pod \"image-registry-759b975967-pgkrt\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") " pod="openshift-image-registry/image-registry-759b975967-pgkrt" Apr 24 16:39:24.158857 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.158771 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f0f4702a-d6d6-410a-9800-fb13b913d223-tmp\") pod \"insights-operator-585dfdc468-8j4f4\" (UID: \"f0f4702a-d6d6-410a-9800-fb13b913d223\") " pod="openshift-insights/insights-operator-585dfdc468-8j4f4" Apr 24 16:39:24.158857 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.158802 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ea323d90-8b5e-41a9-9633-538a3fdcead6-image-registry-private-configuration\") pod \"image-registry-759b975967-pgkrt\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") " pod="openshift-image-registry/image-registry-759b975967-pgkrt" Apr 24 16:39:24.158857 
ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.158836 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ea323d90-8b5e-41a9-9633-538a3fdcead6-installation-pull-secrets\") pod \"image-registry-759b975967-pgkrt\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") " pod="openshift-image-registry/image-registry-759b975967-pgkrt"
Apr 24 16:39:24.159014 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.158872 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccmg4\" (UniqueName: \"kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-kube-api-access-ccmg4\") pod \"image-registry-759b975967-pgkrt\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") " pod="openshift-image-registry/image-registry-759b975967-pgkrt"
Apr 24 16:39:24.159014 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.158935 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-bound-sa-token\") pod \"image-registry-759b975967-pgkrt\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") " pod="openshift-image-registry/image-registry-759b975967-pgkrt"
Apr 24 16:39:24.159014 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.158960 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcszp\" (UniqueName: \"kubernetes.io/projected/f0f4702a-d6d6-410a-9800-fb13b913d223-kube-api-access-jcszp\") pod \"insights-operator-585dfdc468-8j4f4\" (UID: \"f0f4702a-d6d6-410a-9800-fb13b913d223\") " pod="openshift-insights/insights-operator-585dfdc468-8j4f4"
Apr 24 16:39:24.159014 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.159006 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0f4702a-d6d6-410a-9800-fb13b913d223-service-ca-bundle\") pod \"insights-operator-585dfdc468-8j4f4\" (UID: \"f0f4702a-d6d6-410a-9800-fb13b913d223\") " pod="openshift-insights/insights-operator-585dfdc468-8j4f4"
Apr 24 16:39:24.159161 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.159027 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/f0f4702a-d6d6-410a-9800-fb13b913d223-snapshots\") pod \"insights-operator-585dfdc468-8j4f4\" (UID: \"f0f4702a-d6d6-410a-9800-fb13b913d223\") " pod="openshift-insights/insights-operator-585dfdc468-8j4f4"
Apr 24 16:39:24.159161 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.159042 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0f4702a-d6d6-410a-9800-fb13b913d223-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-8j4f4\" (UID: \"f0f4702a-d6d6-410a-9800-fb13b913d223\") " pod="openshift-insights/insights-operator-585dfdc468-8j4f4"
Apr 24 16:39:24.160887 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.160871 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-8j4f4"]
Apr 24 16:39:24.160941 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.160898 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-n6v4x"]
Apr 24 16:39:24.160941 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.160909 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kzb6p"]
Apr 24 16:39:24.160941 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.160918 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5958786cb4-48wlg"]
Apr 24 16:39:24.160941 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.160925 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-f2l2f"]
Apr 24 16:39:24.160941 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.160934 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnwr7"]
Apr 24 16:39:24.160941 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.160923 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-f2l2f"
Apr 24 16:39:24.163488 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.163464 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 24 16:39:24.163961 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.163929 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 24 16:39:24.164215 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.164144 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 24 16:39:24.164399 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.164385 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-hw6hb\""
Apr 24 16:39:24.164754 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.164732 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 24 16:39:24.179429 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.179401 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-czwgm"]
Apr 24 16:39:24.179626 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.179609 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnwr7"
Apr 24 16:39:24.182667 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.182511 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 24 16:39:24.182667 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.182636 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 24 16:39:24.183065 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.182756 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 24 16:39:24.183065 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.182759 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 24 16:39:24.183065 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.182815 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 24 16:39:24.183426 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.183404 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-f8zf4\""
Apr 24 16:39:24.206613 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.206594 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-xx69l"]
Apr 24 16:39:24.206750 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.206734 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-czwgm"
Apr 24 16:39:24.209506 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.209486 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 24 16:39:24.209585 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.209534 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 24 16:39:24.209830 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.209814 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 24 16:39:24.209870 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.209828 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-52tsx\""
Apr 24 16:39:24.210026 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.210015 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 24 16:39:24.230754 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.230665 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-kcgx6"]
Apr 24 16:39:24.230929 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.230809 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-xx69l"
Apr 24 16:39:24.233348 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.233327 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 16:39:24.233428 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.233406 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-6dhrh\""
Apr 24 16:39:24.233502 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.233327 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 16:39:24.255405 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.255363 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-759b975967-pgkrt"]
Apr 24 16:39:24.255405 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.255392 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82qpb"]
Apr 24 16:39:24.255405 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.255406 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnwr7"]
Apr 24 16:39:24.255838 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.255419 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kcgx6"]
Apr 24 16:39:24.255838 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.255431 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-czwgm"]
Apr 24 16:39:24.256668 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.256645 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kcgx6"
Apr 24 16:39:24.259051 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.259028 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 24 16:39:24.259170 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.259051 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-zcktm\""
Apr 24 16:39:24.259236 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.259218 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 24 16:39:24.259347 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.259327 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 24 16:39:24.259527 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.259500 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0f4702a-d6d6-410a-9800-fb13b913d223-service-ca-bundle\") pod \"insights-operator-585dfdc468-8j4f4\" (UID: \"f0f4702a-d6d6-410a-9800-fb13b913d223\") " pod="openshift-insights/insights-operator-585dfdc468-8j4f4"
Apr 24 16:39:24.259599 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.259557 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/45369a43-7ed3-4f16-a1dc-7f1a61e06fc4-trusted-ca\") pod \"console-operator-9d4b6777b-f2l2f\" (UID: \"45369a43-7ed3-4f16-a1dc-7f1a61e06fc4\") " pod="openshift-console-operator/console-operator-9d4b6777b-f2l2f"
Apr 24 16:39:24.259599 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.259586 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea323d90-8b5e-41a9-9633-538a3fdcead6-trusted-ca\") pod \"image-registry-759b975967-pgkrt\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") " pod="openshift-image-registry/image-registry-759b975967-pgkrt"
Apr 24 16:39:24.259704 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.259614 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/32ea204a-32cf-4de1-bd0e-675e568756f9-default-certificate\") pod \"router-default-5958786cb4-48wlg\" (UID: \"32ea204a-32cf-4de1-bd0e-675e568756f9\") " pod="openshift-ingress/router-default-5958786cb4-48wlg"
Apr 24 16:39:24.259704 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.259653 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ea323d90-8b5e-41a9-9633-538a3fdcead6-ca-trust-extracted\") pod \"image-registry-759b975967-pgkrt\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") " pod="openshift-image-registry/image-registry-759b975967-pgkrt"
Apr 24 16:39:24.259704 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.259678 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/32ea204a-32cf-4de1-bd0e-675e568756f9-stats-auth\") pod \"router-default-5958786cb4-48wlg\" (UID: \"32ea204a-32cf-4de1-bd0e-675e568756f9\") " pod="openshift-ingress/router-default-5958786cb4-48wlg"
Apr 24 16:39:24.259856 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.259707 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-registry-tls\") pod \"image-registry-759b975967-pgkrt\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") " pod="openshift-image-registry/image-registry-759b975967-pgkrt"
Apr 24 16:39:24.259856 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.259739 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ea323d90-8b5e-41a9-9633-538a3fdcead6-image-registry-private-configuration\") pod \"image-registry-759b975967-pgkrt\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") " pod="openshift-image-registry/image-registry-759b975967-pgkrt"
Apr 24 16:39:24.259856 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.259776 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/4075e057-13f0-412d-96df-8e124be59b52-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-n6v4x\" (UID: \"4075e057-13f0-412d-96df-8e124be59b52\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n6v4x"
Apr 24 16:39:24.259856 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.259803 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32ea204a-32cf-4de1-bd0e-675e568756f9-service-ca-bundle\") pod \"router-default-5958786cb4-48wlg\" (UID: \"32ea204a-32cf-4de1-bd0e-675e568756f9\") " pod="openshift-ingress/router-default-5958786cb4-48wlg"
Apr 24 16:39:24.259856 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.259831 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ea323d90-8b5e-41a9-9633-538a3fdcead6-installation-pull-secrets\") pod \"image-registry-759b975967-pgkrt\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") " pod="openshift-image-registry/image-registry-759b975967-pgkrt"
Apr 24 16:39:24.260088 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.259863 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ccmg4\" (UniqueName: \"kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-kube-api-access-ccmg4\") pod \"image-registry-759b975967-pgkrt\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") " pod="openshift-image-registry/image-registry-759b975967-pgkrt"
Apr 24 16:39:24.260088 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.259872 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-xx69l"]
Apr 24 16:39:24.260088 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.259896 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4075e057-13f0-412d-96df-8e124be59b52-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-n6v4x\" (UID: \"4075e057-13f0-412d-96df-8e124be59b52\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n6v4x"
Apr 24 16:39:24.260088 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.259907 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-dzj8d"]
Apr 24 16:39:24.260088 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.259929 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klxgt\" (UniqueName: \"kubernetes.io/projected/32ea204a-32cf-4de1-bd0e-675e568756f9-kube-api-access-klxgt\") pod \"router-default-5958786cb4-48wlg\" (UID: \"32ea204a-32cf-4de1-bd0e-675e568756f9\") " pod="openshift-ingress/router-default-5958786cb4-48wlg"
Apr 24 16:39:24.260088 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.259992 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/335d3a61-4224-4da9-adb2-5f83cb395511-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-jnwr7\" (UID: \"335d3a61-4224-4da9-adb2-5f83cb395511\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnwr7"
Apr 24 16:39:24.260088 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.260022 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-bound-sa-token\") pod \"image-registry-759b975967-pgkrt\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") " pod="openshift-image-registry/image-registry-759b975967-pgkrt"
Apr 24 16:39:24.260088 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.260050 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4nd4\" (UniqueName: \"kubernetes.io/projected/ee65ef05-2b40-4b45-a683-47cafa91b43c-kube-api-access-s4nd4\") pod \"volume-data-source-validator-7c6cbb6c87-kzb6p\" (UID: \"ee65ef05-2b40-4b45-a683-47cafa91b43c\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kzb6p"
Apr 24 16:39:24.260088 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.260062 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0f4702a-d6d6-410a-9800-fb13b913d223-service-ca-bundle\") pod \"insights-operator-585dfdc468-8j4f4\" (UID: \"f0f4702a-d6d6-410a-9800-fb13b913d223\") " pod="openshift-insights/insights-operator-585dfdc468-8j4f4"
Apr 24 16:39:24.260498 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.260089 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmwsh\" (UniqueName: \"kubernetes.io/projected/335d3a61-4224-4da9-adb2-5f83cb395511-kube-api-access-lmwsh\") pod \"kube-storage-version-migrator-operator-6769c5d45-jnwr7\" (UID: \"335d3a61-4224-4da9-adb2-5f83cb395511\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnwr7"
Apr 24 16:39:24.260498 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.260134 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/f0f4702a-d6d6-410a-9800-fb13b913d223-snapshots\") pod \"insights-operator-585dfdc468-8j4f4\" (UID: \"f0f4702a-d6d6-410a-9800-fb13b913d223\") " pod="openshift-insights/insights-operator-585dfdc468-8j4f4"
Apr 24 16:39:24.260498 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.260162 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0f4702a-d6d6-410a-9800-fb13b913d223-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-8j4f4\" (UID: \"f0f4702a-d6d6-410a-9800-fb13b913d223\") " pod="openshift-insights/insights-operator-585dfdc468-8j4f4"
Apr 24 16:39:24.260498 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.260190 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbxgv\" (UniqueName: \"kubernetes.io/projected/546b86b2-63c7-46c8-b64c-048aaf992dca-kube-api-access-vbxgv\") pod \"cluster-samples-operator-6dc5bdb6b4-82qpb\" (UID: \"546b86b2-63c7-46c8-b64c-048aaf992dca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82qpb"
Apr 24 16:39:24.260498 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.260238 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45369a43-7ed3-4f16-a1dc-7f1a61e06fc4-serving-cert\") pod \"console-operator-9d4b6777b-f2l2f\" (UID: \"45369a43-7ed3-4f16-a1dc-7f1a61e06fc4\") " pod="openshift-console-operator/console-operator-9d4b6777b-f2l2f"
Apr 24 16:39:24.260498 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.260267 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0f4702a-d6d6-410a-9800-fb13b913d223-serving-cert\") pod \"insights-operator-585dfdc468-8j4f4\" (UID: \"f0f4702a-d6d6-410a-9800-fb13b913d223\") " pod="openshift-insights/insights-operator-585dfdc468-8j4f4"
Apr 24 16:39:24.260498 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.260320 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32ea204a-32cf-4de1-bd0e-675e568756f9-metrics-certs\") pod \"router-default-5958786cb4-48wlg\" (UID: \"32ea204a-32cf-4de1-bd0e-675e568756f9\") " pod="openshift-ingress/router-default-5958786cb4-48wlg"
Apr 24 16:39:24.260498 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.260346 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/546b86b2-63c7-46c8-b64c-048aaf992dca-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-82qpb\" (UID: \"546b86b2-63c7-46c8-b64c-048aaf992dca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82qpb"
Apr 24 16:39:24.260498 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.260393 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ea323d90-8b5e-41a9-9633-538a3fdcead6-registry-certificates\") pod \"image-registry-759b975967-pgkrt\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") " pod="openshift-image-registry/image-registry-759b975967-pgkrt"
Apr 24 16:39:24.260498 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.260417 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f0f4702a-d6d6-410a-9800-fb13b913d223-tmp\") pod \"insights-operator-585dfdc468-8j4f4\" (UID: \"f0f4702a-d6d6-410a-9800-fb13b913d223\") " pod="openshift-insights/insights-operator-585dfdc468-8j4f4"
Apr 24 16:39:24.260498 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.260442 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s46vw\" (UniqueName: \"kubernetes.io/projected/45369a43-7ed3-4f16-a1dc-7f1a61e06fc4-kube-api-access-s46vw\") pod \"console-operator-9d4b6777b-f2l2f\" (UID: \"45369a43-7ed3-4f16-a1dc-7f1a61e06fc4\") " pod="openshift-console-operator/console-operator-9d4b6777b-f2l2f"
Apr 24 16:39:24.260498 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.260467 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/335d3a61-4224-4da9-adb2-5f83cb395511-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-jnwr7\" (UID: \"335d3a61-4224-4da9-adb2-5f83cb395511\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnwr7"
Apr 24 16:39:24.260498 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.260493 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvbh6\" (UniqueName: \"kubernetes.io/projected/4075e057-13f0-412d-96df-8e124be59b52-kube-api-access-mvbh6\") pod \"cluster-monitoring-operator-75587bd455-n6v4x\" (UID: \"4075e057-13f0-412d-96df-8e124be59b52\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n6v4x"
Apr 24 16:39:24.261109 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.260518 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45369a43-7ed3-4f16-a1dc-7f1a61e06fc4-config\") pod \"console-operator-9d4b6777b-f2l2f\" (UID: \"45369a43-7ed3-4f16-a1dc-7f1a61e06fc4\") " pod="openshift-console-operator/console-operator-9d4b6777b-f2l2f"
Apr 24 16:39:24.261109 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.260594 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jcszp\" (UniqueName: \"kubernetes.io/projected/f0f4702a-d6d6-410a-9800-fb13b913d223-kube-api-access-jcszp\") pod \"insights-operator-585dfdc468-8j4f4\" (UID: \"f0f4702a-d6d6-410a-9800-fb13b913d223\") " pod="openshift-insights/insights-operator-585dfdc468-8j4f4"
Apr 24 16:39:24.261109 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.260634 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea323d90-8b5e-41a9-9633-538a3fdcead6-trusted-ca\") pod \"image-registry-759b975967-pgkrt\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") " pod="openshift-image-registry/image-registry-759b975967-pgkrt"
Apr 24 16:39:24.261109 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.260960 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ea323d90-8b5e-41a9-9633-538a3fdcead6-ca-trust-extracted\") pod \"image-registry-759b975967-pgkrt\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") " pod="openshift-image-registry/image-registry-759b975967-pgkrt"
Apr 24 16:39:24.261109 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:24.260974 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 16:39:24.261109 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:24.260991 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-759b975967-pgkrt: secret "image-registry-tls" not found
Apr 24 16:39:24.261109 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:24.261067 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-registry-tls podName:ea323d90-8b5e-41a9-9633-538a3fdcead6 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:24.761026226 +0000 UTC m=+35.132730431 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-registry-tls") pod "image-registry-759b975967-pgkrt" (UID: "ea323d90-8b5e-41a9-9633-538a3fdcead6") : secret "image-registry-tls" not found
Apr 24 16:39:24.261593 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.261143 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f0f4702a-d6d6-410a-9800-fb13b913d223-tmp\") pod \"insights-operator-585dfdc468-8j4f4\" (UID: \"f0f4702a-d6d6-410a-9800-fb13b913d223\") " pod="openshift-insights/insights-operator-585dfdc468-8j4f4"
Apr 24 16:39:24.261593 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.261498 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/f0f4702a-d6d6-410a-9800-fb13b913d223-snapshots\") pod \"insights-operator-585dfdc468-8j4f4\" (UID: \"f0f4702a-d6d6-410a-9800-fb13b913d223\") " pod="openshift-insights/insights-operator-585dfdc468-8j4f4"
Apr 24 16:39:24.262212 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.262172 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0f4702a-d6d6-410a-9800-fb13b913d223-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-8j4f4\" (UID: \"f0f4702a-d6d6-410a-9800-fb13b913d223\") " pod="openshift-insights/insights-operator-585dfdc468-8j4f4"
Apr 24 16:39:24.262605 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.262569 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ea323d90-8b5e-41a9-9633-538a3fdcead6-registry-certificates\") pod \"image-registry-759b975967-pgkrt\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") " pod="openshift-image-registry/image-registry-759b975967-pgkrt"
Apr 24 16:39:24.265438 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.265396 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ea323d90-8b5e-41a9-9633-538a3fdcead6-image-registry-private-configuration\") pod \"image-registry-759b975967-pgkrt\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") " pod="openshift-image-registry/image-registry-759b975967-pgkrt"
Apr 24 16:39:24.265655 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.265636 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0f4702a-d6d6-410a-9800-fb13b913d223-serving-cert\") pod \"insights-operator-585dfdc468-8j4f4\" (UID: \"f0f4702a-d6d6-410a-9800-fb13b913d223\") " pod="openshift-insights/insights-operator-585dfdc468-8j4f4"
Apr 24 16:39:24.267384 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.267362 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ea323d90-8b5e-41a9-9633-538a3fdcead6-installation-pull-secrets\") pod \"image-registry-759b975967-pgkrt\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") " pod="openshift-image-registry/image-registry-759b975967-pgkrt"
Apr 24 16:39:24.269921 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.269900 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcszp\" (UniqueName: \"kubernetes.io/projected/f0f4702a-d6d6-410a-9800-fb13b913d223-kube-api-access-jcszp\") pod \"insights-operator-585dfdc468-8j4f4\" (UID: \"f0f4702a-d6d6-410a-9800-fb13b913d223\") " pod="openshift-insights/insights-operator-585dfdc468-8j4f4"
Apr 24 16:39:24.270677 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.270415 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccmg4\" (UniqueName: \"kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-kube-api-access-ccmg4\") pod \"image-registry-759b975967-pgkrt\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") " pod="openshift-image-registry/image-registry-759b975967-pgkrt"
Apr 24 16:39:24.270677 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.270630 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-bound-sa-token\") pod \"image-registry-759b975967-pgkrt\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") " pod="openshift-image-registry/image-registry-759b975967-pgkrt"
Apr 24 16:39:24.280089 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.280069 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dzj8d"]
Apr 24 16:39:24.280252 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.280234 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dzj8d"
Apr 24 16:39:24.282580 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.282561 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 24 16:39:24.282685 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.282563 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 24 16:39:24.282685 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.282625 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-x6pvx\""
Apr 24 16:39:24.282685 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.282665 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 24 16:39:24.282992 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.282962 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 24 16:39:24.361048 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.361024 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s46vw\" (UniqueName: \"kubernetes.io/projected/45369a43-7ed3-4f16-a1dc-7f1a61e06fc4-kube-api-access-s46vw\") pod \"console-operator-9d4b6777b-f2l2f\" (UID: \"45369a43-7ed3-4f16-a1dc-7f1a61e06fc4\") " pod="openshift-console-operator/console-operator-9d4b6777b-f2l2f"
Apr 24 16:39:24.361137 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.361058 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/335d3a61-4224-4da9-adb2-5f83cb395511-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-jnwr7\" (UID: \"335d3a61-4224-4da9-adb2-5f83cb395511\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnwr7"
Apr 24 16:39:24.361137 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.361078 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mvbh6\" (UniqueName: \"kubernetes.io/projected/4075e057-13f0-412d-96df-8e124be59b52-kube-api-access-mvbh6\") pod \"cluster-monitoring-operator-75587bd455-n6v4x\" (UID: \"4075e057-13f0-412d-96df-8e124be59b52\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n6v4x"
Apr 24 16:39:24.361137 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.361096 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45369a43-7ed3-4f16-a1dc-7f1a61e06fc4-config\") pod \"console-operator-9d4b6777b-f2l2f\" (UID: \"45369a43-7ed3-4f16-a1dc-7f1a61e06fc4\") " pod="openshift-console-operator/console-operator-9d4b6777b-f2l2f"
Apr 24 16:39:24.361318 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.361145 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eefe78cf-7d49-4547-bded-f34c94ebc29b-tmp-dir\") pod \"dns-default-dzj8d\" (UID: \"eefe78cf-7d49-4547-bded-f34c94ebc29b\") " pod="openshift-dns/dns-default-dzj8d"
Apr 24 16:39:24.361318 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.361181 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/45369a43-7ed3-4f16-a1dc-7f1a61e06fc4-trusted-ca\") pod \"console-operator-9d4b6777b-f2l2f\" (UID: \"45369a43-7ed3-4f16-a1dc-7f1a61e06fc4\") " pod="openshift-console-operator/console-operator-9d4b6777b-f2l2f"
Apr 24 16:39:24.361318 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.361211 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/32ea204a-32cf-4de1-bd0e-675e568756f9-default-certificate\") pod \"router-default-5958786cb4-48wlg\" (UID: \"32ea204a-32cf-4de1-bd0e-675e568756f9\") " pod="openshift-ingress/router-default-5958786cb4-48wlg"
Apr 24 16:39:24.361318 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.361239 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0f570bd-6250-4e0c-a370-c143d42e56f0-serving-cert\") pod \"service-ca-operator-d6fc45fc5-czwgm\" (UID: \"d0f570bd-6250-4e0c-a370-c143d42e56f0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-czwgm"
Apr 24 16:39:24.361318 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.361266 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pck6\" (UniqueName: \"kubernetes.io/projected/9ae2e73c-423b-4876-a52d-cd4111ca0013-kube-api-access-6pck6\") pod \"ingress-canary-kcgx6\" (UID: \"9ae2e73c-423b-4876-a52d-cd4111ca0013\") " pod="openshift-ingress-canary/ingress-canary-kcgx6"
Apr 24 16:39:24.361318 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.361311 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eefe78cf-7d49-4547-bded-f34c94ebc29b-config-volume\") pod \"dns-default-dzj8d\" (UID: \"eefe78cf-7d49-4547-bded-f34c94ebc29b\") " pod="openshift-dns/dns-default-dzj8d"
Apr 24 16:39:24.361603 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.361342 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/32ea204a-32cf-4de1-bd0e-675e568756f9-stats-auth\") pod \"router-default-5958786cb4-48wlg\" (UID: \"32ea204a-32cf-4de1-bd0e-675e568756f9\") " pod="openshift-ingress/router-default-5958786cb4-48wlg"
Apr 24 16:39:24.361603 ip-10-0-131-47
kubenswrapper[2578]: I0424 16:39:24.361369 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmfv4\" (UniqueName: \"kubernetes.io/projected/0897170b-76cd-4278-802c-03e1d1747af3-kube-api-access-lmfv4\") pod \"network-check-source-8894fc9bd-xx69l\" (UID: \"0897170b-76cd-4278-802c-03e1d1747af3\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-xx69l" Apr 24 16:39:24.361603 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.361416 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ae2e73c-423b-4876-a52d-cd4111ca0013-cert\") pod \"ingress-canary-kcgx6\" (UID: \"9ae2e73c-423b-4876-a52d-cd4111ca0013\") " pod="openshift-ingress-canary/ingress-canary-kcgx6" Apr 24 16:39:24.361603 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.361480 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/4075e057-13f0-412d-96df-8e124be59b52-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-n6v4x\" (UID: \"4075e057-13f0-412d-96df-8e124be59b52\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n6v4x" Apr 24 16:39:24.361603 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.361513 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32ea204a-32cf-4de1-bd0e-675e568756f9-service-ca-bundle\") pod \"router-default-5958786cb4-48wlg\" (UID: \"32ea204a-32cf-4de1-bd0e-675e568756f9\") " pod="openshift-ingress/router-default-5958786cb4-48wlg" Apr 24 16:39:24.361603 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.361560 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/4075e057-13f0-412d-96df-8e124be59b52-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-n6v4x\" (UID: \"4075e057-13f0-412d-96df-8e124be59b52\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n6v4x" Apr 24 16:39:24.361603 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.361592 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-klxgt\" (UniqueName: \"kubernetes.io/projected/32ea204a-32cf-4de1-bd0e-675e568756f9-kube-api-access-klxgt\") pod \"router-default-5958786cb4-48wlg\" (UID: \"32ea204a-32cf-4de1-bd0e-675e568756f9\") " pod="openshift-ingress/router-default-5958786cb4-48wlg" Apr 24 16:39:24.361920 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.361616 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/335d3a61-4224-4da9-adb2-5f83cb395511-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-jnwr7\" (UID: \"335d3a61-4224-4da9-adb2-5f83cb395511\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnwr7" Apr 24 16:39:24.361920 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.361658 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtxfs\" (UniqueName: \"kubernetes.io/projected/d0f570bd-6250-4e0c-a370-c143d42e56f0-kube-api-access-mtxfs\") pod \"service-ca-operator-d6fc45fc5-czwgm\" (UID: \"d0f570bd-6250-4e0c-a370-c143d42e56f0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-czwgm" Apr 24 16:39:24.361920 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.361689 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s4nd4\" (UniqueName: \"kubernetes.io/projected/ee65ef05-2b40-4b45-a683-47cafa91b43c-kube-api-access-s4nd4\") pod 
\"volume-data-source-validator-7c6cbb6c87-kzb6p\" (UID: \"ee65ef05-2b40-4b45-a683-47cafa91b43c\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kzb6p" Apr 24 16:39:24.361920 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.361716 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0f570bd-6250-4e0c-a370-c143d42e56f0-config\") pod \"service-ca-operator-d6fc45fc5-czwgm\" (UID: \"d0f570bd-6250-4e0c-a370-c143d42e56f0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-czwgm" Apr 24 16:39:24.361920 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.361737 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/335d3a61-4224-4da9-adb2-5f83cb395511-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-jnwr7\" (UID: \"335d3a61-4224-4da9-adb2-5f83cb395511\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnwr7" Apr 24 16:39:24.361920 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.361743 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eefe78cf-7d49-4547-bded-f34c94ebc29b-metrics-tls\") pod \"dns-default-dzj8d\" (UID: \"eefe78cf-7d49-4547-bded-f34c94ebc29b\") " pod="openshift-dns/dns-default-dzj8d" Apr 24 16:39:24.361920 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.361767 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7ccj\" (UniqueName: \"kubernetes.io/projected/eefe78cf-7d49-4547-bded-f34c94ebc29b-kube-api-access-v7ccj\") pod \"dns-default-dzj8d\" (UID: \"eefe78cf-7d49-4547-bded-f34c94ebc29b\") " pod="openshift-dns/dns-default-dzj8d" Apr 24 16:39:24.361920 ip-10-0-131-47 kubenswrapper[2578]: I0424 
16:39:24.361810 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lmwsh\" (UniqueName: \"kubernetes.io/projected/335d3a61-4224-4da9-adb2-5f83cb395511-kube-api-access-lmwsh\") pod \"kube-storage-version-migrator-operator-6769c5d45-jnwr7\" (UID: \"335d3a61-4224-4da9-adb2-5f83cb395511\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnwr7" Apr 24 16:39:24.362308 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.361940 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vbxgv\" (UniqueName: \"kubernetes.io/projected/546b86b2-63c7-46c8-b64c-048aaf992dca-kube-api-access-vbxgv\") pod \"cluster-samples-operator-6dc5bdb6b4-82qpb\" (UID: \"546b86b2-63c7-46c8-b64c-048aaf992dca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82qpb" Apr 24 16:39:24.362308 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.362070 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45369a43-7ed3-4f16-a1dc-7f1a61e06fc4-serving-cert\") pod \"console-operator-9d4b6777b-f2l2f\" (UID: \"45369a43-7ed3-4f16-a1dc-7f1a61e06fc4\") " pod="openshift-console-operator/console-operator-9d4b6777b-f2l2f" Apr 24 16:39:24.362308 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.362104 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32ea204a-32cf-4de1-bd0e-675e568756f9-metrics-certs\") pod \"router-default-5958786cb4-48wlg\" (UID: \"32ea204a-32cf-4de1-bd0e-675e568756f9\") " pod="openshift-ingress/router-default-5958786cb4-48wlg" Apr 24 16:39:24.362308 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.362131 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/546b86b2-63c7-46c8-b64c-048aaf992dca-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-82qpb\" (UID: \"546b86b2-63c7-46c8-b64c-048aaf992dca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82qpb" Apr 24 16:39:24.362512 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:24.362316 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 16:39:24.362512 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:24.362387 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/546b86b2-63c7-46c8-b64c-048aaf992dca-samples-operator-tls podName:546b86b2-63c7-46c8-b64c-048aaf992dca nodeName:}" failed. No retries permitted until 2026-04-24 16:39:24.862367989 +0000 UTC m=+35.234072198 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/546b86b2-63c7-46c8-b64c-048aaf992dca-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-82qpb" (UID: "546b86b2-63c7-46c8-b64c-048aaf992dca") : secret "samples-operator-tls" not found Apr 24 16:39:24.362612 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.362538 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45369a43-7ed3-4f16-a1dc-7f1a61e06fc4-config\") pod \"console-operator-9d4b6777b-f2l2f\" (UID: \"45369a43-7ed3-4f16-a1dc-7f1a61e06fc4\") " pod="openshift-console-operator/console-operator-9d4b6777b-f2l2f" Apr 24 16:39:24.362683 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:24.362664 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 16:39:24.362744 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:24.362735 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/32ea204a-32cf-4de1-bd0e-675e568756f9-metrics-certs podName:32ea204a-32cf-4de1-bd0e-675e568756f9 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:24.862717045 +0000 UTC m=+35.234421262 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/32ea204a-32cf-4de1-bd0e-675e568756f9-metrics-certs") pod "router-default-5958786cb4-48wlg" (UID: "32ea204a-32cf-4de1-bd0e-675e568756f9") : secret "router-metrics-certs-default" not found Apr 24 16:39:24.362927 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.362904 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/4075e057-13f0-412d-96df-8e124be59b52-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-n6v4x\" (UID: \"4075e057-13f0-412d-96df-8e124be59b52\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n6v4x" Apr 24 16:39:24.363060 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:24.363041 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/32ea204a-32cf-4de1-bd0e-675e568756f9-service-ca-bundle podName:32ea204a-32cf-4de1-bd0e-675e568756f9 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:24.863024354 +0000 UTC m=+35.234728559 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/32ea204a-32cf-4de1-bd0e-675e568756f9-service-ca-bundle") pod "router-default-5958786cb4-48wlg" (UID: "32ea204a-32cf-4de1-bd0e-675e568756f9") : configmap references non-existent config key: service-ca.crt Apr 24 16:39:24.363205 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:24.363185 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 16:39:24.363348 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:24.363242 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4075e057-13f0-412d-96df-8e124be59b52-cluster-monitoring-operator-tls podName:4075e057-13f0-412d-96df-8e124be59b52 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:24.863228199 +0000 UTC m=+35.234932405 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4075e057-13f0-412d-96df-8e124be59b52-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-n6v4x" (UID: "4075e057-13f0-412d-96df-8e124be59b52") : secret "cluster-monitoring-operator-tls" not found Apr 24 16:39:24.363348 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.363311 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-8j4f4" Apr 24 16:39:24.364897 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.364439 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/335d3a61-4224-4da9-adb2-5f83cb395511-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-jnwr7\" (UID: \"335d3a61-4224-4da9-adb2-5f83cb395511\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnwr7" Apr 24 16:39:24.364897 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.364714 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/32ea204a-32cf-4de1-bd0e-675e568756f9-default-certificate\") pod \"router-default-5958786cb4-48wlg\" (UID: \"32ea204a-32cf-4de1-bd0e-675e568756f9\") " pod="openshift-ingress/router-default-5958786cb4-48wlg" Apr 24 16:39:24.365741 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.365716 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45369a43-7ed3-4f16-a1dc-7f1a61e06fc4-serving-cert\") pod \"console-operator-9d4b6777b-f2l2f\" (UID: \"45369a43-7ed3-4f16-a1dc-7f1a61e06fc4\") " pod="openshift-console-operator/console-operator-9d4b6777b-f2l2f" Apr 24 16:39:24.365848 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.365744 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/32ea204a-32cf-4de1-bd0e-675e568756f9-stats-auth\") pod \"router-default-5958786cb4-48wlg\" (UID: \"32ea204a-32cf-4de1-bd0e-675e568756f9\") " pod="openshift-ingress/router-default-5958786cb4-48wlg" Apr 24 16:39:24.372380 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.371259 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvbh6\" (UniqueName: 
\"kubernetes.io/projected/4075e057-13f0-412d-96df-8e124be59b52-kube-api-access-mvbh6\") pod \"cluster-monitoring-operator-75587bd455-n6v4x\" (UID: \"4075e057-13f0-412d-96df-8e124be59b52\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n6v4x" Apr 24 16:39:24.372380 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.371653 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4nd4\" (UniqueName: \"kubernetes.io/projected/ee65ef05-2b40-4b45-a683-47cafa91b43c-kube-api-access-s4nd4\") pod \"volume-data-source-validator-7c6cbb6c87-kzb6p\" (UID: \"ee65ef05-2b40-4b45-a683-47cafa91b43c\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kzb6p" Apr 24 16:39:24.372380 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.372005 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-klxgt\" (UniqueName: \"kubernetes.io/projected/32ea204a-32cf-4de1-bd0e-675e568756f9-kube-api-access-klxgt\") pod \"router-default-5958786cb4-48wlg\" (UID: \"32ea204a-32cf-4de1-bd0e-675e568756f9\") " pod="openshift-ingress/router-default-5958786cb4-48wlg" Apr 24 16:39:24.373872 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.373211 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hgbpw" event={"ID":"0c733890-1ac2-464e-9672-65bf90aded78","Type":"ContainerStarted","Data":"04efb0f4a6edcd8be1e04e481f56b996ecd01d4677b4c48ba7d7691a48a9ffbb"} Apr 24 16:39:24.373872 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.373828 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/45369a43-7ed3-4f16-a1dc-7f1a61e06fc4-trusted-ca\") pod \"console-operator-9d4b6777b-f2l2f\" (UID: \"45369a43-7ed3-4f16-a1dc-7f1a61e06fc4\") " pod="openshift-console-operator/console-operator-9d4b6777b-f2l2f" Apr 24 16:39:24.374323 ip-10-0-131-47 kubenswrapper[2578]: I0424 
16:39:24.374302 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbxgv\" (UniqueName: \"kubernetes.io/projected/546b86b2-63c7-46c8-b64c-048aaf992dca-kube-api-access-vbxgv\") pod \"cluster-samples-operator-6dc5bdb6b4-82qpb\" (UID: \"546b86b2-63c7-46c8-b64c-048aaf992dca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82qpb" Apr 24 16:39:24.375313 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.375272 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmwsh\" (UniqueName: \"kubernetes.io/projected/335d3a61-4224-4da9-adb2-5f83cb395511-kube-api-access-lmwsh\") pod \"kube-storage-version-migrator-operator-6769c5d45-jnwr7\" (UID: \"335d3a61-4224-4da9-adb2-5f83cb395511\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnwr7" Apr 24 16:39:24.382039 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.382013 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s46vw\" (UniqueName: \"kubernetes.io/projected/45369a43-7ed3-4f16-a1dc-7f1a61e06fc4-kube-api-access-s46vw\") pod \"console-operator-9d4b6777b-f2l2f\" (UID: \"45369a43-7ed3-4f16-a1dc-7f1a61e06fc4\") " pod="openshift-console-operator/console-operator-9d4b6777b-f2l2f" Apr 24 16:39:24.452421 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.452381 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kzb6p" Apr 24 16:39:24.463781 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.463585 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eefe78cf-7d49-4547-bded-f34c94ebc29b-tmp-dir\") pod \"dns-default-dzj8d\" (UID: \"eefe78cf-7d49-4547-bded-f34c94ebc29b\") " pod="openshift-dns/dns-default-dzj8d" Apr 24 16:39:24.463910 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.463826 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0f570bd-6250-4e0c-a370-c143d42e56f0-serving-cert\") pod \"service-ca-operator-d6fc45fc5-czwgm\" (UID: \"d0f570bd-6250-4e0c-a370-c143d42e56f0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-czwgm" Apr 24 16:39:24.463910 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.463863 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6pck6\" (UniqueName: \"kubernetes.io/projected/9ae2e73c-423b-4876-a52d-cd4111ca0013-kube-api-access-6pck6\") pod \"ingress-canary-kcgx6\" (UID: \"9ae2e73c-423b-4876-a52d-cd4111ca0013\") " pod="openshift-ingress-canary/ingress-canary-kcgx6" Apr 24 16:39:24.463910 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.463887 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eefe78cf-7d49-4547-bded-f34c94ebc29b-config-volume\") pod \"dns-default-dzj8d\" (UID: \"eefe78cf-7d49-4547-bded-f34c94ebc29b\") " pod="openshift-dns/dns-default-dzj8d" Apr 24 16:39:24.464064 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.463924 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lmfv4\" (UniqueName: 
\"kubernetes.io/projected/0897170b-76cd-4278-802c-03e1d1747af3-kube-api-access-lmfv4\") pod \"network-check-source-8894fc9bd-xx69l\" (UID: \"0897170b-76cd-4278-802c-03e1d1747af3\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-xx69l" Apr 24 16:39:24.464064 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.463986 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ae2e73c-423b-4876-a52d-cd4111ca0013-cert\") pod \"ingress-canary-kcgx6\" (UID: \"9ae2e73c-423b-4876-a52d-cd4111ca0013\") " pod="openshift-ingress-canary/ingress-canary-kcgx6" Apr 24 16:39:24.464159 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.464061 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mtxfs\" (UniqueName: \"kubernetes.io/projected/d0f570bd-6250-4e0c-a370-c143d42e56f0-kube-api-access-mtxfs\") pod \"service-ca-operator-d6fc45fc5-czwgm\" (UID: \"d0f570bd-6250-4e0c-a370-c143d42e56f0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-czwgm" Apr 24 16:39:24.464159 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.464093 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0f570bd-6250-4e0c-a370-c143d42e56f0-config\") pod \"service-ca-operator-d6fc45fc5-czwgm\" (UID: \"d0f570bd-6250-4e0c-a370-c143d42e56f0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-czwgm" Apr 24 16:39:24.464159 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.464117 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eefe78cf-7d49-4547-bded-f34c94ebc29b-metrics-tls\") pod \"dns-default-dzj8d\" (UID: \"eefe78cf-7d49-4547-bded-f34c94ebc29b\") " pod="openshift-dns/dns-default-dzj8d" Apr 24 16:39:24.464159 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.464142 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v7ccj\" (UniqueName: \"kubernetes.io/projected/eefe78cf-7d49-4547-bded-f34c94ebc29b-kube-api-access-v7ccj\") pod \"dns-default-dzj8d\" (UID: \"eefe78cf-7d49-4547-bded-f34c94ebc29b\") " pod="openshift-dns/dns-default-dzj8d" Apr 24 16:39:24.468607 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:24.466301 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:39:24.468607 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:24.466383 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ae2e73c-423b-4876-a52d-cd4111ca0013-cert podName:9ae2e73c-423b-4876-a52d-cd4111ca0013 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:24.966353985 +0000 UTC m=+35.338058203 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ae2e73c-423b-4876-a52d-cd4111ca0013-cert") pod "ingress-canary-kcgx6" (UID: "9ae2e73c-423b-4876-a52d-cd4111ca0013") : secret "canary-serving-cert" not found Apr 24 16:39:24.468607 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:24.466305 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:39:24.468607 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:24.466686 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eefe78cf-7d49-4547-bded-f34c94ebc29b-metrics-tls podName:eefe78cf-7d49-4547-bded-f34c94ebc29b nodeName:}" failed. No retries permitted until 2026-04-24 16:39:24.966662949 +0000 UTC m=+35.338367166 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/eefe78cf-7d49-4547-bded-f34c94ebc29b-metrics-tls") pod "dns-default-dzj8d" (UID: "eefe78cf-7d49-4547-bded-f34c94ebc29b") : secret "dns-default-metrics-tls" not found Apr 24 16:39:24.468607 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.467145 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0f570bd-6250-4e0c-a370-c143d42e56f0-config\") pod \"service-ca-operator-d6fc45fc5-czwgm\" (UID: \"d0f570bd-6250-4e0c-a370-c143d42e56f0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-czwgm" Apr 24 16:39:24.469000 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.468890 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0f570bd-6250-4e0c-a370-c143d42e56f0-serving-cert\") pod \"service-ca-operator-d6fc45fc5-czwgm\" (UID: \"d0f570bd-6250-4e0c-a370-c143d42e56f0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-czwgm" Apr 24 16:39:24.472859 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.471828 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-f2l2f" Apr 24 16:39:24.472859 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.472325 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eefe78cf-7d49-4547-bded-f34c94ebc29b-tmp-dir\") pod \"dns-default-dzj8d\" (UID: \"eefe78cf-7d49-4547-bded-f34c94ebc29b\") " pod="openshift-dns/dns-default-dzj8d" Apr 24 16:39:24.475331 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.474507 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eefe78cf-7d49-4547-bded-f34c94ebc29b-config-volume\") pod \"dns-default-dzj8d\" (UID: \"eefe78cf-7d49-4547-bded-f34c94ebc29b\") " pod="openshift-dns/dns-default-dzj8d" Apr 24 16:39:24.478754 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.478726 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtxfs\" (UniqueName: \"kubernetes.io/projected/d0f570bd-6250-4e0c-a370-c143d42e56f0-kube-api-access-mtxfs\") pod \"service-ca-operator-d6fc45fc5-czwgm\" (UID: \"d0f570bd-6250-4e0c-a370-c143d42e56f0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-czwgm" Apr 24 16:39:24.484481 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.484459 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pck6\" (UniqueName: \"kubernetes.io/projected/9ae2e73c-423b-4876-a52d-cd4111ca0013-kube-api-access-6pck6\") pod \"ingress-canary-kcgx6\" (UID: \"9ae2e73c-423b-4876-a52d-cd4111ca0013\") " pod="openshift-ingress-canary/ingress-canary-kcgx6" Apr 24 16:39:24.485699 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.485658 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7ccj\" (UniqueName: \"kubernetes.io/projected/eefe78cf-7d49-4547-bded-f34c94ebc29b-kube-api-access-v7ccj\") pod 
\"dns-default-dzj8d\" (UID: \"eefe78cf-7d49-4547-bded-f34c94ebc29b\") " pod="openshift-dns/dns-default-dzj8d" Apr 24 16:39:24.489751 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.489726 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnwr7" Apr 24 16:39:24.492044 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.492019 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmfv4\" (UniqueName: \"kubernetes.io/projected/0897170b-76cd-4278-802c-03e1d1747af3-kube-api-access-lmfv4\") pod \"network-check-source-8894fc9bd-xx69l\" (UID: \"0897170b-76cd-4278-802c-03e1d1747af3\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-xx69l" Apr 24 16:39:24.516015 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.515480 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-czwgm" Apr 24 16:39:24.525265 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.522678 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-9nlzl"] Apr 24 16:39:24.536777 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.534857 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9nlzl" Apr 24 16:39:24.539804 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.539548 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-vbkkw\"" Apr 24 16:39:24.545227 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.544146 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-xx69l" Apr 24 16:39:24.569146 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.568937 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-8j4f4"] Apr 24 16:39:24.654327 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.654305 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kzb6p"] Apr 24 16:39:24.659395 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.659371 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-f2l2f"] Apr 24 16:39:24.667572 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.667548 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2c5b2afa-97e0-4381-b83f-848951dec5c9-tmp-dir\") pod \"node-resolver-9nlzl\" (UID: \"2c5b2afa-97e0-4381-b83f-848951dec5c9\") " pod="openshift-dns/node-resolver-9nlzl" Apr 24 16:39:24.667665 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.667589 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2c5b2afa-97e0-4381-b83f-848951dec5c9-hosts-file\") pod \"node-resolver-9nlzl\" (UID: \"2c5b2afa-97e0-4381-b83f-848951dec5c9\") " pod="openshift-dns/node-resolver-9nlzl" Apr 24 16:39:24.667698 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.667670 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7g97\" (UniqueName: \"kubernetes.io/projected/2c5b2afa-97e0-4381-b83f-848951dec5c9-kube-api-access-p7g97\") pod \"node-resolver-9nlzl\" (UID: \"2c5b2afa-97e0-4381-b83f-848951dec5c9\") " pod="openshift-dns/node-resolver-9nlzl" Apr 24 16:39:24.680359 ip-10-0-131-47 
kubenswrapper[2578]: I0424 16:39:24.680332 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnwr7"] Apr 24 16:39:24.685243 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:39:24.685200 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45369a43_7ed3_4f16_a1dc_7f1a61e06fc4.slice/crio-4061e9c64f32b284e61a3b3b108bad4b23dbff8567d4886054b01b13f995d489 WatchSource:0}: Error finding container 4061e9c64f32b284e61a3b3b108bad4b23dbff8567d4886054b01b13f995d489: Status 404 returned error can't find the container with id 4061e9c64f32b284e61a3b3b108bad4b23dbff8567d4886054b01b13f995d489 Apr 24 16:39:24.689593 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:39:24.689401 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee65ef05_2b40_4b45_a683_47cafa91b43c.slice/crio-cd48f33de8d5729102e4570d5c0e7366957565a56ae991ccc79fbcce45d83062 WatchSource:0}: Error finding container cd48f33de8d5729102e4570d5c0e7366957565a56ae991ccc79fbcce45d83062: Status 404 returned error can't find the container with id cd48f33de8d5729102e4570d5c0e7366957565a56ae991ccc79fbcce45d83062 Apr 24 16:39:24.689849 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:39:24.689824 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0f4702a_d6d6_410a_9800_fb13b913d223.slice/crio-1a30e7b09a3d8cd67d7cdbdb849068f689fe030335b650f249f4aab11b6337d2 WatchSource:0}: Error finding container 1a30e7b09a3d8cd67d7cdbdb849068f689fe030335b650f249f4aab11b6337d2: Status 404 returned error can't find the container with id 1a30e7b09a3d8cd67d7cdbdb849068f689fe030335b650f249f4aab11b6337d2 Apr 24 16:39:24.690478 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:39:24.690161 2578 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod335d3a61_4224_4da9_adb2_5f83cb395511.slice/crio-7b7009dad83f1837d45513e67fa39d8a7e7a3aee06cf7dfa9b1394ea829d8497 WatchSource:0}: Error finding container 7b7009dad83f1837d45513e67fa39d8a7e7a3aee06cf7dfa9b1394ea829d8497: Status 404 returned error can't find the container with id 7b7009dad83f1837d45513e67fa39d8a7e7a3aee06cf7dfa9b1394ea829d8497 Apr 24 16:39:24.768863 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.768832 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2c5b2afa-97e0-4381-b83f-848951dec5c9-hosts-file\") pod \"node-resolver-9nlzl\" (UID: \"2c5b2afa-97e0-4381-b83f-848951dec5c9\") " pod="openshift-dns/node-resolver-9nlzl" Apr 24 16:39:24.768991 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.768885 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-registry-tls\") pod \"image-registry-759b975967-pgkrt\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") " pod="openshift-image-registry/image-registry-759b975967-pgkrt" Apr 24 16:39:24.768991 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.768922 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p7g97\" (UniqueName: \"kubernetes.io/projected/2c5b2afa-97e0-4381-b83f-848951dec5c9-kube-api-access-p7g97\") pod \"node-resolver-9nlzl\" (UID: \"2c5b2afa-97e0-4381-b83f-848951dec5c9\") " pod="openshift-dns/node-resolver-9nlzl" Apr 24 16:39:24.768991 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.768974 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2c5b2afa-97e0-4381-b83f-848951dec5c9-hosts-file\") pod \"node-resolver-9nlzl\" (UID: \"2c5b2afa-97e0-4381-b83f-848951dec5c9\") " 
pod="openshift-dns/node-resolver-9nlzl" Apr 24 16:39:24.769132 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:24.769083 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 16:39:24.769132 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:24.769103 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-759b975967-pgkrt: secret "image-registry-tls" not found Apr 24 16:39:24.769132 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.769111 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2c5b2afa-97e0-4381-b83f-848951dec5c9-tmp-dir\") pod \"node-resolver-9nlzl\" (UID: \"2c5b2afa-97e0-4381-b83f-848951dec5c9\") " pod="openshift-dns/node-resolver-9nlzl" Apr 24 16:39:24.769264 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:24.769168 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-registry-tls podName:ea323d90-8b5e-41a9-9633-538a3fdcead6 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:25.769146124 +0000 UTC m=+36.140850335 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-registry-tls") pod "image-registry-759b975967-pgkrt" (UID: "ea323d90-8b5e-41a9-9633-538a3fdcead6") : secret "image-registry-tls" not found Apr 24 16:39:24.769480 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.769461 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2c5b2afa-97e0-4381-b83f-848951dec5c9-tmp-dir\") pod \"node-resolver-9nlzl\" (UID: \"2c5b2afa-97e0-4381-b83f-848951dec5c9\") " pod="openshift-dns/node-resolver-9nlzl" Apr 24 16:39:24.781054 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.781029 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7g97\" (UniqueName: \"kubernetes.io/projected/2c5b2afa-97e0-4381-b83f-848951dec5c9-kube-api-access-p7g97\") pod \"node-resolver-9nlzl\" (UID: \"2c5b2afa-97e0-4381-b83f-848951dec5c9\") " pod="openshift-dns/node-resolver-9nlzl" Apr 24 16:39:24.824339 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.824303 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-xx69l"] Apr 24 16:39:24.825443 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.825352 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-czwgm"] Apr 24 16:39:24.827430 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:39:24.827405 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0f570bd_6250_4e0c_a370_c143d42e56f0.slice/crio-dbfa8d64b70bd1decf82a04501faca77bb39334f2264b173e387f2ef83b876bd WatchSource:0}: Error finding container dbfa8d64b70bd1decf82a04501faca77bb39334f2264b173e387f2ef83b876bd: Status 404 returned error can't find the container with id 
dbfa8d64b70bd1decf82a04501faca77bb39334f2264b173e387f2ef83b876bd Apr 24 16:39:24.828349 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:39:24.828324 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0897170b_76cd_4278_802c_03e1d1747af3.slice/crio-abdb7e3232c55434e911162749bee04cd169c0f6a300463077fcb5f228d4c617 WatchSource:0}: Error finding container abdb7e3232c55434e911162749bee04cd169c0f6a300463077fcb5f228d4c617: Status 404 returned error can't find the container with id abdb7e3232c55434e911162749bee04cd169c0f6a300463077fcb5f228d4c617 Apr 24 16:39:24.870336 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.870248 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32ea204a-32cf-4de1-bd0e-675e568756f9-metrics-certs\") pod \"router-default-5958786cb4-48wlg\" (UID: \"32ea204a-32cf-4de1-bd0e-675e568756f9\") " pod="openshift-ingress/router-default-5958786cb4-48wlg" Apr 24 16:39:24.870336 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.870323 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/546b86b2-63c7-46c8-b64c-048aaf992dca-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-82qpb\" (UID: \"546b86b2-63c7-46c8-b64c-048aaf992dca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82qpb" Apr 24 16:39:24.870482 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:24.870410 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 16:39:24.870482 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.870451 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/32ea204a-32cf-4de1-bd0e-675e568756f9-service-ca-bundle\") pod \"router-default-5958786cb4-48wlg\" (UID: \"32ea204a-32cf-4de1-bd0e-675e568756f9\") " pod="openshift-ingress/router-default-5958786cb4-48wlg" Apr 24 16:39:24.870482 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:24.870469 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 16:39:24.870561 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:24.870470 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32ea204a-32cf-4de1-bd0e-675e568756f9-metrics-certs podName:32ea204a-32cf-4de1-bd0e-675e568756f9 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:25.870451659 +0000 UTC m=+36.242155867 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/32ea204a-32cf-4de1-bd0e-675e568756f9-metrics-certs") pod "router-default-5958786cb4-48wlg" (UID: "32ea204a-32cf-4de1-bd0e-675e568756f9") : secret "router-metrics-certs-default" not found Apr 24 16:39:24.870561 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:24.870540 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/546b86b2-63c7-46c8-b64c-048aaf992dca-samples-operator-tls podName:546b86b2-63c7-46c8-b64c-048aaf992dca nodeName:}" failed. No retries permitted until 2026-04-24 16:39:25.870520767 +0000 UTC m=+36.242224987 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/546b86b2-63c7-46c8-b64c-048aaf992dca-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-82qpb" (UID: "546b86b2-63c7-46c8-b64c-048aaf992dca") : secret "samples-operator-tls" not found Apr 24 16:39:24.870629 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:24.870564 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/32ea204a-32cf-4de1-bd0e-675e568756f9-service-ca-bundle podName:32ea204a-32cf-4de1-bd0e-675e568756f9 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:25.870554218 +0000 UTC m=+36.242258436 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/32ea204a-32cf-4de1-bd0e-675e568756f9-service-ca-bundle") pod "router-default-5958786cb4-48wlg" (UID: "32ea204a-32cf-4de1-bd0e-675e568756f9") : configmap references non-existent config key: service-ca.crt Apr 24 16:39:24.870629 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.870616 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4075e057-13f0-412d-96df-8e124be59b52-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-n6v4x\" (UID: \"4075e057-13f0-412d-96df-8e124be59b52\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n6v4x" Apr 24 16:39:24.870739 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:24.870725 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 16:39:24.870772 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:24.870767 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4075e057-13f0-412d-96df-8e124be59b52-cluster-monitoring-operator-tls podName:4075e057-13f0-412d-96df-8e124be59b52 nodeName:}" 
failed. No retries permitted until 2026-04-24 16:39:25.870758726 +0000 UTC m=+36.242462930 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4075e057-13f0-412d-96df-8e124be59b52-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-n6v4x" (UID: "4075e057-13f0-412d-96df-8e124be59b52") : secret "cluster-monitoring-operator-tls" not found Apr 24 16:39:24.886999 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.886975 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9nlzl" Apr 24 16:39:24.894517 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:39:24.894493 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c5b2afa_97e0_4381_b83f_848951dec5c9.slice/crio-1f3174243ac552cac98251256d6330ec20a60a1e93afc4d6860d04c15970316b WatchSource:0}: Error finding container 1f3174243ac552cac98251256d6330ec20a60a1e93afc4d6860d04c15970316b: Status 404 returned error can't find the container with id 1f3174243ac552cac98251256d6330ec20a60a1e93afc4d6860d04c15970316b Apr 24 16:39:24.972082 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.972052 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eefe78cf-7d49-4547-bded-f34c94ebc29b-metrics-tls\") pod \"dns-default-dzj8d\" (UID: \"eefe78cf-7d49-4547-bded-f34c94ebc29b\") " pod="openshift-dns/dns-default-dzj8d" Apr 24 16:39:24.972223 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:24.972207 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ae2e73c-423b-4876-a52d-cd4111ca0013-cert\") pod \"ingress-canary-kcgx6\" (UID: \"9ae2e73c-423b-4876-a52d-cd4111ca0013\") " pod="openshift-ingress-canary/ingress-canary-kcgx6" Apr 24 16:39:24.972309 
ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:24.972221 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:39:24.972309 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:24.972302 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eefe78cf-7d49-4547-bded-f34c94ebc29b-metrics-tls podName:eefe78cf-7d49-4547-bded-f34c94ebc29b nodeName:}" failed. No retries permitted until 2026-04-24 16:39:25.972266319 +0000 UTC m=+36.343970548 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/eefe78cf-7d49-4547-bded-f34c94ebc29b-metrics-tls") pod "dns-default-dzj8d" (UID: "eefe78cf-7d49-4547-bded-f34c94ebc29b") : secret "dns-default-metrics-tls" not found Apr 24 16:39:24.972390 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:24.972339 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:39:24.972390 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:24.972373 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ae2e73c-423b-4876-a52d-cd4111ca0013-cert podName:9ae2e73c-423b-4876-a52d-cd4111ca0013 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:25.972362746 +0000 UTC m=+36.344066963 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ae2e73c-423b-4876-a52d-cd4111ca0013-cert") pod "ingress-canary-kcgx6" (UID: "9ae2e73c-423b-4876-a52d-cd4111ca0013") : secret "canary-serving-cert" not found Apr 24 16:39:25.232852 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:25.232813 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rlzxl" Apr 24 16:39:25.233029 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:25.232852 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hcjpg" Apr 24 16:39:25.233029 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:25.232929 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-thz9k" Apr 24 16:39:25.236999 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:25.236975 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 16:39:25.237244 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:25.237219 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fcvt5\"" Apr 24 16:39:25.238220 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:25.237772 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-j4znz\"" Apr 24 16:39:25.238220 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:25.237979 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 16:39:25.378738 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:25.378665 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-xx69l" event={"ID":"0897170b-76cd-4278-802c-03e1d1747af3","Type":"ContainerStarted","Data":"abdb7e3232c55434e911162749bee04cd169c0f6a300463077fcb5f228d4c617"} Apr 24 16:39:25.384147 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:25.384108 2578 generic.go:358] "Generic (PLEG): container finished" podID="0c733890-1ac2-464e-9672-65bf90aded78" containerID="04efb0f4a6edcd8be1e04e481f56b996ecd01d4677b4c48ba7d7691a48a9ffbb" exitCode=0 Apr 24 16:39:25.384308 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:25.384182 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hgbpw" 
event={"ID":"0c733890-1ac2-464e-9672-65bf90aded78","Type":"ContainerDied","Data":"04efb0f4a6edcd8be1e04e481f56b996ecd01d4677b4c48ba7d7691a48a9ffbb"} Apr 24 16:39:25.386760 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:25.386699 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-f2l2f" event={"ID":"45369a43-7ed3-4f16-a1dc-7f1a61e06fc4","Type":"ContainerStarted","Data":"4061e9c64f32b284e61a3b3b108bad4b23dbff8567d4886054b01b13f995d489"} Apr 24 16:39:25.394704 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:25.394666 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kzb6p" event={"ID":"ee65ef05-2b40-4b45-a683-47cafa91b43c","Type":"ContainerStarted","Data":"cd48f33de8d5729102e4570d5c0e7366957565a56ae991ccc79fbcce45d83062"} Apr 24 16:39:25.396122 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:25.396076 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnwr7" event={"ID":"335d3a61-4224-4da9-adb2-5f83cb395511","Type":"ContainerStarted","Data":"7b7009dad83f1837d45513e67fa39d8a7e7a3aee06cf7dfa9b1394ea829d8497"} Apr 24 16:39:25.398452 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:25.397906 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9nlzl" event={"ID":"2c5b2afa-97e0-4381-b83f-848951dec5c9","Type":"ContainerStarted","Data":"994a85d81ca0ab5ac0d326a2781c9ae471c5c6c722f42bfb074345d833c988f5"} Apr 24 16:39:25.398452 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:25.397936 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9nlzl" event={"ID":"2c5b2afa-97e0-4381-b83f-848951dec5c9","Type":"ContainerStarted","Data":"1f3174243ac552cac98251256d6330ec20a60a1e93afc4d6860d04c15970316b"} Apr 24 16:39:25.400454 ip-10-0-131-47 kubenswrapper[2578]: I0424 
16:39:25.400414 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-czwgm" event={"ID":"d0f570bd-6250-4e0c-a370-c143d42e56f0","Type":"ContainerStarted","Data":"dbfa8d64b70bd1decf82a04501faca77bb39334f2264b173e387f2ef83b876bd"} Apr 24 16:39:25.401574 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:25.401499 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-8j4f4" event={"ID":"f0f4702a-d6d6-410a-9800-fb13b913d223","Type":"ContainerStarted","Data":"1a30e7b09a3d8cd67d7cdbdb849068f689fe030335b650f249f4aab11b6337d2"} Apr 24 16:39:25.784963 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:25.784129 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-registry-tls\") pod \"image-registry-759b975967-pgkrt\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") " pod="openshift-image-registry/image-registry-759b975967-pgkrt" Apr 24 16:39:25.784963 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:25.784329 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 16:39:25.784963 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:25.784349 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-759b975967-pgkrt: secret "image-registry-tls" not found Apr 24 16:39:25.784963 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:25.784422 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-registry-tls podName:ea323d90-8b5e-41a9-9633-538a3fdcead6 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:27.784402219 +0000 UTC m=+38.156106437 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-registry-tls") pod "image-registry-759b975967-pgkrt" (UID: "ea323d90-8b5e-41a9-9633-538a3fdcead6") : secret "image-registry-tls" not found Apr 24 16:39:25.886625 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:25.885586 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4075e057-13f0-412d-96df-8e124be59b52-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-n6v4x\" (UID: \"4075e057-13f0-412d-96df-8e124be59b52\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n6v4x" Apr 24 16:39:25.886625 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:25.885676 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32ea204a-32cf-4de1-bd0e-675e568756f9-metrics-certs\") pod \"router-default-5958786cb4-48wlg\" (UID: \"32ea204a-32cf-4de1-bd0e-675e568756f9\") " pod="openshift-ingress/router-default-5958786cb4-48wlg" Apr 24 16:39:25.886625 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:25.885694 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/546b86b2-63c7-46c8-b64c-048aaf992dca-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-82qpb\" (UID: \"546b86b2-63c7-46c8-b64c-048aaf992dca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82qpb" Apr 24 16:39:25.886625 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:25.885772 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32ea204a-32cf-4de1-bd0e-675e568756f9-service-ca-bundle\") pod \"router-default-5958786cb4-48wlg\" (UID: 
\"32ea204a-32cf-4de1-bd0e-675e568756f9\") " pod="openshift-ingress/router-default-5958786cb4-48wlg" Apr 24 16:39:25.886625 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:25.885890 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/32ea204a-32cf-4de1-bd0e-675e568756f9-service-ca-bundle podName:32ea204a-32cf-4de1-bd0e-675e568756f9 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:27.885877543 +0000 UTC m=+38.257581747 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/32ea204a-32cf-4de1-bd0e-675e568756f9-service-ca-bundle") pod "router-default-5958786cb4-48wlg" (UID: "32ea204a-32cf-4de1-bd0e-675e568756f9") : configmap references non-existent config key: service-ca.crt Apr 24 16:39:25.886625 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:25.886274 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 16:39:25.886625 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:25.886354 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4075e057-13f0-412d-96df-8e124be59b52-cluster-monitoring-operator-tls podName:4075e057-13f0-412d-96df-8e124be59b52 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:27.886338237 +0000 UTC m=+38.258042444 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4075e057-13f0-412d-96df-8e124be59b52-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-n6v4x" (UID: "4075e057-13f0-412d-96df-8e124be59b52") : secret "cluster-monitoring-operator-tls" not found Apr 24 16:39:25.886625 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:25.886410 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 16:39:25.886625 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:25.886440 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32ea204a-32cf-4de1-bd0e-675e568756f9-metrics-certs podName:32ea204a-32cf-4de1-bd0e-675e568756f9 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:27.886430959 +0000 UTC m=+38.258135166 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/32ea204a-32cf-4de1-bd0e-675e568756f9-metrics-certs") pod "router-default-5958786cb4-48wlg" (UID: "32ea204a-32cf-4de1-bd0e-675e568756f9") : secret "router-metrics-certs-default" not found Apr 24 16:39:25.886625 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:25.886491 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 16:39:25.886625 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:25.886520 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/546b86b2-63c7-46c8-b64c-048aaf992dca-samples-operator-tls podName:546b86b2-63c7-46c8-b64c-048aaf992dca nodeName:}" failed. No retries permitted until 2026-04-24 16:39:27.886511891 +0000 UTC m=+38.258216096 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/546b86b2-63c7-46c8-b64c-048aaf992dca-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-82qpb" (UID: "546b86b2-63c7-46c8-b64c-048aaf992dca") : secret "samples-operator-tls" not found
Apr 24 16:39:25.987766 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:25.987201 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ae2e73c-423b-4876-a52d-cd4111ca0013-cert\") pod \"ingress-canary-kcgx6\" (UID: \"9ae2e73c-423b-4876-a52d-cd4111ca0013\") " pod="openshift-ingress-canary/ingress-canary-kcgx6"
Apr 24 16:39:25.987766 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:25.987258 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eefe78cf-7d49-4547-bded-f34c94ebc29b-metrics-tls\") pod \"dns-default-dzj8d\" (UID: \"eefe78cf-7d49-4547-bded-f34c94ebc29b\") " pod="openshift-dns/dns-default-dzj8d"
Apr 24 16:39:25.987766 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:25.987407 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:39:25.987766 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:25.987443 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:39:25.987766 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:25.987485 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ae2e73c-423b-4876-a52d-cd4111ca0013-cert podName:9ae2e73c-423b-4876-a52d-cd4111ca0013 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:27.987464021 +0000 UTC m=+38.359168229 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ae2e73c-423b-4876-a52d-cd4111ca0013-cert") pod "ingress-canary-kcgx6" (UID: "9ae2e73c-423b-4876-a52d-cd4111ca0013") : secret "canary-serving-cert" not found
Apr 24 16:39:25.987766 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:25.987505 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eefe78cf-7d49-4547-bded-f34c94ebc29b-metrics-tls podName:eefe78cf-7d49-4547-bded-f34c94ebc29b nodeName:}" failed. No retries permitted until 2026-04-24 16:39:27.987495931 +0000 UTC m=+38.359200135 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/eefe78cf-7d49-4547-bded-f34c94ebc29b-metrics-tls") pod "dns-default-dzj8d" (UID: "eefe78cf-7d49-4547-bded-f34c94ebc29b") : secret "dns-default-metrics-tls" not found
Apr 24 16:39:26.408247 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:26.408207 2578 generic.go:358] "Generic (PLEG): container finished" podID="0c733890-1ac2-464e-9672-65bf90aded78" containerID="a2bd7a4612b791ac018c005b692b98c5cd741a722d2be189c10e907f0bcd4537" exitCode=0
Apr 24 16:39:26.408802 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:26.408260 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hgbpw" event={"ID":"0c733890-1ac2-464e-9672-65bf90aded78","Type":"ContainerDied","Data":"a2bd7a4612b791ac018c005b692b98c5cd741a722d2be189c10e907f0bcd4537"}
Apr 24 16:39:26.431762 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:26.430484 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9nlzl" podStartSLOduration=2.430466151 podStartE2EDuration="2.430466151s" podCreationTimestamp="2026-04-24 16:39:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:39:25.442815461 +0000 UTC m=+35.814519688" watchObservedRunningTime="2026-04-24 16:39:26.430466151 +0000 UTC m=+36.802170377"
Apr 24 16:39:27.806522 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:27.806425 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-registry-tls\") pod \"image-registry-759b975967-pgkrt\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") " pod="openshift-image-registry/image-registry-759b975967-pgkrt"
Apr 24 16:39:27.806908 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:27.806610 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 16:39:27.806908 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:27.806635 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-759b975967-pgkrt: secret "image-registry-tls" not found
Apr 24 16:39:27.806908 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:27.806690 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-registry-tls podName:ea323d90-8b5e-41a9-9633-538a3fdcead6 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:31.806676369 +0000 UTC m=+42.178380582 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-registry-tls") pod "image-registry-759b975967-pgkrt" (UID: "ea323d90-8b5e-41a9-9633-538a3fdcead6") : secret "image-registry-tls" not found
Apr 24 16:39:27.907775 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:27.907732 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32ea204a-32cf-4de1-bd0e-675e568756f9-service-ca-bundle\") pod \"router-default-5958786cb4-48wlg\" (UID: \"32ea204a-32cf-4de1-bd0e-675e568756f9\") " pod="openshift-ingress/router-default-5958786cb4-48wlg"
Apr 24 16:39:27.907943 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:27.907798 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4075e057-13f0-412d-96df-8e124be59b52-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-n6v4x\" (UID: \"4075e057-13f0-412d-96df-8e124be59b52\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n6v4x"
Apr 24 16:39:27.907943 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:27.907877 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32ea204a-32cf-4de1-bd0e-675e568756f9-metrics-certs\") pod \"router-default-5958786cb4-48wlg\" (UID: \"32ea204a-32cf-4de1-bd0e-675e568756f9\") " pod="openshift-ingress/router-default-5958786cb4-48wlg"
Apr 24 16:39:27.907943 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:27.907907 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/546b86b2-63c7-46c8-b64c-048aaf992dca-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-82qpb\" (UID: \"546b86b2-63c7-46c8-b64c-048aaf992dca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82qpb"
Apr 24 16:39:27.907943 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:27.907924 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/32ea204a-32cf-4de1-bd0e-675e568756f9-service-ca-bundle podName:32ea204a-32cf-4de1-bd0e-675e568756f9 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:31.907906743 +0000 UTC m=+42.279610970 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/32ea204a-32cf-4de1-bd0e-675e568756f9-service-ca-bundle") pod "router-default-5958786cb4-48wlg" (UID: "32ea204a-32cf-4de1-bd0e-675e568756f9") : configmap references non-existent config key: service-ca.crt
Apr 24 16:39:27.908148 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:27.907980 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 16:39:27.908148 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:27.907998 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 16:39:27.908148 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:27.908039 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4075e057-13f0-412d-96df-8e124be59b52-cluster-monitoring-operator-tls podName:4075e057-13f0-412d-96df-8e124be59b52 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:31.908026235 +0000 UTC m=+42.279730441 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4075e057-13f0-412d-96df-8e124be59b52-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-n6v4x" (UID: "4075e057-13f0-412d-96df-8e124be59b52") : secret "cluster-monitoring-operator-tls" not found
Apr 24 16:39:27.908148 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:27.908041 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 16:39:27.908148 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:27.908065 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/546b86b2-63c7-46c8-b64c-048aaf992dca-samples-operator-tls podName:546b86b2-63c7-46c8-b64c-048aaf992dca nodeName:}" failed. No retries permitted until 2026-04-24 16:39:31.908058455 +0000 UTC m=+42.279762659 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/546b86b2-63c7-46c8-b64c-048aaf992dca-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-82qpb" (UID: "546b86b2-63c7-46c8-b64c-048aaf992dca") : secret "samples-operator-tls" not found
Apr 24 16:39:27.908148 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:27.908078 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32ea204a-32cf-4de1-bd0e-675e568756f9-metrics-certs podName:32ea204a-32cf-4de1-bd0e-675e568756f9 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:31.908072072 +0000 UTC m=+42.279776275 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/32ea204a-32cf-4de1-bd0e-675e568756f9-metrics-certs") pod "router-default-5958786cb4-48wlg" (UID: "32ea204a-32cf-4de1-bd0e-675e568756f9") : secret "router-metrics-certs-default" not found
Apr 24 16:39:28.009130 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:28.009091 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ae2e73c-423b-4876-a52d-cd4111ca0013-cert\") pod \"ingress-canary-kcgx6\" (UID: \"9ae2e73c-423b-4876-a52d-cd4111ca0013\") " pod="openshift-ingress-canary/ingress-canary-kcgx6"
Apr 24 16:39:28.009370 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:28.009188 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eefe78cf-7d49-4547-bded-f34c94ebc29b-metrics-tls\") pod \"dns-default-dzj8d\" (UID: \"eefe78cf-7d49-4547-bded-f34c94ebc29b\") " pod="openshift-dns/dns-default-dzj8d"
Apr 24 16:39:28.009370 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:28.009307 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:39:28.009501 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:28.009366 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:39:28.009501 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:28.009379 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ae2e73c-423b-4876-a52d-cd4111ca0013-cert podName:9ae2e73c-423b-4876-a52d-cd4111ca0013 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:32.009363384 +0000 UTC m=+42.381067589 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ae2e73c-423b-4876-a52d-cd4111ca0013-cert") pod "ingress-canary-kcgx6" (UID: "9ae2e73c-423b-4876-a52d-cd4111ca0013") : secret "canary-serving-cert" not found
Apr 24 16:39:28.009501 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:28.009427 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eefe78cf-7d49-4547-bded-f34c94ebc29b-metrics-tls podName:eefe78cf-7d49-4547-bded-f34c94ebc29b nodeName:}" failed. No retries permitted until 2026-04-24 16:39:32.009409526 +0000 UTC m=+42.381113751 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/eefe78cf-7d49-4547-bded-f34c94ebc29b-metrics-tls") pod "dns-default-dzj8d" (UID: "eefe78cf-7d49-4547-bded-f34c94ebc29b") : secret "dns-default-metrics-tls" not found
Apr 24 16:39:31.421596 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:31.421519 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnwr7" event={"ID":"335d3a61-4224-4da9-adb2-5f83cb395511","Type":"ContainerStarted","Data":"fe0293bceebe0a222fbb18ddfc955268a20badd24f70dc1457dca366df3d6fef"}
Apr 24 16:39:31.423119 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:31.423093 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-czwgm" event={"ID":"d0f570bd-6250-4e0c-a370-c143d42e56f0","Type":"ContainerStarted","Data":"907f1bb9647c208fdabc1aa309fec57cb616e49df41c235587f54a9f03233cac"}
Apr 24 16:39:31.424648 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:31.424626 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-8j4f4" event={"ID":"f0f4702a-d6d6-410a-9800-fb13b913d223","Type":"ContainerStarted","Data":"fbcc82c6689dd8db3292e4465b7cee512d84c3a7b8c2ff38a33ed76c1bc0dd65"}
Apr 24 16:39:31.427023 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:31.427000 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-xx69l" event={"ID":"0897170b-76cd-4278-802c-03e1d1747af3","Type":"ContainerStarted","Data":"806c4be736502f9f1f3da51b93b77e90398af02ead9b656d9c3130d8657a8a53"}
Apr 24 16:39:31.429468 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:31.429447 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hgbpw" event={"ID":"0c733890-1ac2-464e-9672-65bf90aded78","Type":"ContainerStarted","Data":"2eaa546f2f889760a1e3a61b022ac9e57a7c43c6cec49f079fd09c45cd494207"}
Apr 24 16:39:31.431106 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:31.431078 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-f2l2f_45369a43-7ed3-4f16-a1dc-7f1a61e06fc4/console-operator/0.log"
Apr 24 16:39:31.431200 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:31.431121 2578 generic.go:358] "Generic (PLEG): container finished" podID="45369a43-7ed3-4f16-a1dc-7f1a61e06fc4" containerID="c7e7279567cd0be4d9fabf464e7920cb8a1da53d2450826c6515bd04d3c75dd2" exitCode=255
Apr 24 16:39:31.431468 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:31.431334 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-f2l2f" event={"ID":"45369a43-7ed3-4f16-a1dc-7f1a61e06fc4","Type":"ContainerDied","Data":"c7e7279567cd0be4d9fabf464e7920cb8a1da53d2450826c6515bd04d3c75dd2"}
Apr 24 16:39:31.431550 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:31.431476 2578 scope.go:117] "RemoveContainer" containerID="c7e7279567cd0be4d9fabf464e7920cb8a1da53d2450826c6515bd04d3c75dd2"
Apr 24 16:39:31.434309 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:31.433869 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kzb6p" event={"ID":"ee65ef05-2b40-4b45-a683-47cafa91b43c","Type":"ContainerStarted","Data":"edbf766586dce3dc1204eed2b6dd6b0f80dd68ed7541473678c60476e16654b5"}
Apr 24 16:39:31.439109 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:31.439047 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnwr7" podStartSLOduration=24.18087606 podStartE2EDuration="30.439032832s" podCreationTimestamp="2026-04-24 16:39:01 +0000 UTC" firstStartedPulling="2026-04-24 16:39:24.694173988 +0000 UTC m=+35.065878199" lastFinishedPulling="2026-04-24 16:39:30.952330763 +0000 UTC m=+41.324034971" observedRunningTime="2026-04-24 16:39:31.437848843 +0000 UTC m=+41.809553075" watchObservedRunningTime="2026-04-24 16:39:31.439032832 +0000 UTC m=+41.810737059"
Apr 24 16:39:31.473643 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:31.473577 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-czwgm" podStartSLOduration=22.350676356 podStartE2EDuration="28.473559662s" podCreationTimestamp="2026-04-24 16:39:03 +0000 UTC" firstStartedPulling="2026-04-24 16:39:24.829500993 +0000 UTC m=+35.201205209" lastFinishedPulling="2026-04-24 16:39:30.952384299 +0000 UTC m=+41.324088515" observedRunningTime="2026-04-24 16:39:31.472781959 +0000 UTC m=+41.844486186" watchObservedRunningTime="2026-04-24 16:39:31.473559662 +0000 UTC m=+41.845263889"
Apr 24 16:39:31.495380 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:31.494998 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-xx69l" podStartSLOduration=21.091952858 podStartE2EDuration="27.494981991s" podCreationTimestamp="2026-04-24 16:39:04 +0000 UTC" firstStartedPulling="2026-04-24 16:39:24.830420009 +0000 UTC m=+35.202124228" lastFinishedPulling="2026-04-24 16:39:31.233449142 +0000 UTC m=+41.605153361" observedRunningTime="2026-04-24 16:39:31.494433538 +0000 UTC m=+41.866137765" watchObservedRunningTime="2026-04-24 16:39:31.494981991 +0000 UTC m=+41.866686218"
Apr 24 16:39:31.519636 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:31.519586 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-hgbpw" podStartSLOduration=10.247413455 podStartE2EDuration="41.519568087s" podCreationTimestamp="2026-04-24 16:38:50 +0000 UTC" firstStartedPulling="2026-04-24 16:38:52.92630489 +0000 UTC m=+3.298009109" lastFinishedPulling="2026-04-24 16:39:24.198459537 +0000 UTC m=+34.570163741" observedRunningTime="2026-04-24 16:39:31.518248416 +0000 UTC m=+41.889952643" watchObservedRunningTime="2026-04-24 16:39:31.519568087 +0000 UTC m=+41.891272316"
Apr 24 16:39:31.536998 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:31.536944 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-8j4f4" podStartSLOduration=31.266475627 podStartE2EDuration="37.536926143s" podCreationTimestamp="2026-04-24 16:38:54 +0000 UTC" firstStartedPulling="2026-04-24 16:39:24.695164628 +0000 UTC m=+35.066868837" lastFinishedPulling="2026-04-24 16:39:30.965615138 +0000 UTC m=+41.337319353" observedRunningTime="2026-04-24 16:39:31.534924955 +0000 UTC m=+41.906629183" watchObservedRunningTime="2026-04-24 16:39:31.536926143 +0000 UTC m=+41.908630368"
Apr 24 16:39:31.560131 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:31.559971 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-kzb6p" podStartSLOduration=29.303482836 podStartE2EDuration="35.559955229s" podCreationTimestamp="2026-04-24 16:38:56 +0000 UTC" firstStartedPulling="2026-04-24 16:39:24.694233408 +0000 UTC m=+35.065937612" lastFinishedPulling="2026-04-24 16:39:30.950705784 +0000 UTC m=+41.322410005" observedRunningTime="2026-04-24 16:39:31.558936761 +0000 UTC m=+41.930640988" watchObservedRunningTime="2026-04-24 16:39:31.559955229 +0000 UTC m=+41.931659432"
Apr 24 16:39:31.840687 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:31.840592 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-registry-tls\") pod \"image-registry-759b975967-pgkrt\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") " pod="openshift-image-registry/image-registry-759b975967-pgkrt"
Apr 24 16:39:31.840846 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:31.840750 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 16:39:31.840846 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:31.840767 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-759b975967-pgkrt: secret "image-registry-tls" not found
Apr 24 16:39:31.840846 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:31.840824 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-registry-tls podName:ea323d90-8b5e-41a9-9633-538a3fdcead6 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:39.840808189 +0000 UTC m=+50.212512418 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-registry-tls") pod "image-registry-759b975967-pgkrt" (UID: "ea323d90-8b5e-41a9-9633-538a3fdcead6") : secret "image-registry-tls" not found
Apr 24 16:39:31.941200 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:31.941160 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32ea204a-32cf-4de1-bd0e-675e568756f9-service-ca-bundle\") pod \"router-default-5958786cb4-48wlg\" (UID: \"32ea204a-32cf-4de1-bd0e-675e568756f9\") " pod="openshift-ingress/router-default-5958786cb4-48wlg"
Apr 24 16:39:31.941419 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:31.941213 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4075e057-13f0-412d-96df-8e124be59b52-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-n6v4x\" (UID: \"4075e057-13f0-412d-96df-8e124be59b52\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n6v4x"
Apr 24 16:39:31.941419 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:31.941278 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32ea204a-32cf-4de1-bd0e-675e568756f9-metrics-certs\") pod \"router-default-5958786cb4-48wlg\" (UID: \"32ea204a-32cf-4de1-bd0e-675e568756f9\") " pod="openshift-ingress/router-default-5958786cb4-48wlg"
Apr 24 16:39:31.941419 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:31.941345 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/32ea204a-32cf-4de1-bd0e-675e568756f9-service-ca-bundle podName:32ea204a-32cf-4de1-bd0e-675e568756f9 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:39.941325711 +0000 UTC m=+50.313029939 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/32ea204a-32cf-4de1-bd0e-675e568756f9-service-ca-bundle") pod "router-default-5958786cb4-48wlg" (UID: "32ea204a-32cf-4de1-bd0e-675e568756f9") : configmap references non-existent config key: service-ca.crt
Apr 24 16:39:31.941419 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:31.941382 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 16:39:31.941644 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:31.941411 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 16:39:31.941644 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:31.941430 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32ea204a-32cf-4de1-bd0e-675e568756f9-metrics-certs podName:32ea204a-32cf-4de1-bd0e-675e568756f9 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:39.941418466 +0000 UTC m=+50.313122686 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/32ea204a-32cf-4de1-bd0e-675e568756f9-metrics-certs") pod "router-default-5958786cb4-48wlg" (UID: "32ea204a-32cf-4de1-bd0e-675e568756f9") : secret "router-metrics-certs-default" not found
Apr 24 16:39:31.941644 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:31.941456 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/546b86b2-63c7-46c8-b64c-048aaf992dca-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-82qpb\" (UID: \"546b86b2-63c7-46c8-b64c-048aaf992dca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82qpb"
Apr 24 16:39:31.941644 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:31.941482 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4075e057-13f0-412d-96df-8e124be59b52-cluster-monitoring-operator-tls podName:4075e057-13f0-412d-96df-8e124be59b52 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:39.941461534 +0000 UTC m=+50.313165749 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4075e057-13f0-412d-96df-8e124be59b52-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-n6v4x" (UID: "4075e057-13f0-412d-96df-8e124be59b52") : secret "cluster-monitoring-operator-tls" not found
Apr 24 16:39:31.941644 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:31.941514 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 16:39:31.941644 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:31.941544 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/546b86b2-63c7-46c8-b64c-048aaf992dca-samples-operator-tls podName:546b86b2-63c7-46c8-b64c-048aaf992dca nodeName:}" failed. No retries permitted until 2026-04-24 16:39:39.941533327 +0000 UTC m=+50.313237531 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/546b86b2-63c7-46c8-b64c-048aaf992dca-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-82qpb" (UID: "546b86b2-63c7-46c8-b64c-048aaf992dca") : secret "samples-operator-tls" not found
Apr 24 16:39:32.042511 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:32.042471 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ae2e73c-423b-4876-a52d-cd4111ca0013-cert\") pod \"ingress-canary-kcgx6\" (UID: \"9ae2e73c-423b-4876-a52d-cd4111ca0013\") " pod="openshift-ingress-canary/ingress-canary-kcgx6"
Apr 24 16:39:32.042703 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:32.042534 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eefe78cf-7d49-4547-bded-f34c94ebc29b-metrics-tls\") pod \"dns-default-dzj8d\" (UID: \"eefe78cf-7d49-4547-bded-f34c94ebc29b\") " pod="openshift-dns/dns-default-dzj8d"
Apr 24 16:39:32.042703 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:32.042643 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:39:32.042703 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:32.042649 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:39:32.042703 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:32.042698 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eefe78cf-7d49-4547-bded-f34c94ebc29b-metrics-tls podName:eefe78cf-7d49-4547-bded-f34c94ebc29b nodeName:}" failed. No retries permitted until 2026-04-24 16:39:40.042682957 +0000 UTC m=+50.414387164 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/eefe78cf-7d49-4547-bded-f34c94ebc29b-metrics-tls") pod "dns-default-dzj8d" (UID: "eefe78cf-7d49-4547-bded-f34c94ebc29b") : secret "dns-default-metrics-tls" not found
Apr 24 16:39:32.042897 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:32.042713 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ae2e73c-423b-4876-a52d-cd4111ca0013-cert podName:9ae2e73c-423b-4876-a52d-cd4111ca0013 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:40.042706519 +0000 UTC m=+50.414410722 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ae2e73c-423b-4876-a52d-cd4111ca0013-cert") pod "ingress-canary-kcgx6" (UID: "9ae2e73c-423b-4876-a52d-cd4111ca0013") : secret "canary-serving-cert" not found
Apr 24 16:39:32.438779 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:32.438755 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-f2l2f_45369a43-7ed3-4f16-a1dc-7f1a61e06fc4/console-operator/1.log"
Apr 24 16:39:32.439241 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:32.439221 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-f2l2f_45369a43-7ed3-4f16-a1dc-7f1a61e06fc4/console-operator/0.log"
Apr 24 16:39:32.439330 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:32.439268 2578 generic.go:358] "Generic (PLEG): container finished" podID="45369a43-7ed3-4f16-a1dc-7f1a61e06fc4" containerID="84f072d383ec8a0424417ed1d7236e4c7c86d02d81c4f9ebc1e42588fe0f8ef5" exitCode=255
Apr 24 16:39:32.439387 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:32.439354 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-f2l2f" event={"ID":"45369a43-7ed3-4f16-a1dc-7f1a61e06fc4","Type":"ContainerDied","Data":"84f072d383ec8a0424417ed1d7236e4c7c86d02d81c4f9ebc1e42588fe0f8ef5"}
Apr 24 16:39:32.439435 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:32.439397 2578 scope.go:117] "RemoveContainer" containerID="c7e7279567cd0be4d9fabf464e7920cb8a1da53d2450826c6515bd04d3c75dd2"
Apr 24 16:39:32.440121 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:32.439659 2578 scope.go:117] "RemoveContainer" containerID="84f072d383ec8a0424417ed1d7236e4c7c86d02d81c4f9ebc1e42588fe0f8ef5"
Apr 24 16:39:32.440121 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:32.439911 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-f2l2f_openshift-console-operator(45369a43-7ed3-4f16-a1dc-7f1a61e06fc4)\"" pod="openshift-console-operator/console-operator-9d4b6777b-f2l2f" podUID="45369a43-7ed3-4f16-a1dc-7f1a61e06fc4"
Apr 24 16:39:33.444712 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:33.444676 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-f2l2f_45369a43-7ed3-4f16-a1dc-7f1a61e06fc4/console-operator/1.log"
Apr 24 16:39:33.445140 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:33.445070 2578 scope.go:117] "RemoveContainer" containerID="84f072d383ec8a0424417ed1d7236e4c7c86d02d81c4f9ebc1e42588fe0f8ef5"
Apr 24 16:39:33.445249 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:33.445230 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-f2l2f_openshift-console-operator(45369a43-7ed3-4f16-a1dc-7f1a61e06fc4)\"" pod="openshift-console-operator/console-operator-9d4b6777b-f2l2f" podUID="45369a43-7ed3-4f16-a1dc-7f1a61e06fc4"
Apr 24 16:39:34.472431 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:34.472389 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-f2l2f"
Apr 24 16:39:34.472887 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:34.472445 2578 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-f2l2f"
Apr 24 16:39:34.472887 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:34.472883 2578 scope.go:117] "RemoveContainer" containerID="84f072d383ec8a0424417ed1d7236e4c7c86d02d81c4f9ebc1e42588fe0f8ef5"
Apr 24 16:39:34.473106 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:34.473084 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-f2l2f_openshift-console-operator(45369a43-7ed3-4f16-a1dc-7f1a61e06fc4)\"" pod="openshift-console-operator/console-operator-9d4b6777b-f2l2f" podUID="45369a43-7ed3-4f16-a1dc-7f1a61e06fc4"
Apr 24 16:39:34.635216 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:34.635185 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9nlzl_2c5b2afa-97e0-4381-b83f-848951dec5c9/dns-node-resolver/0.log"
Apr 24 16:39:35.635030 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:35.635000 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7m4xb_e8baf786-1fb8-494a-bdb7-c724c853faa3/node-ca/0.log"
Apr 24 16:39:39.909708 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:39.909668 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-registry-tls\") pod \"image-registry-759b975967-pgkrt\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") " pod="openshift-image-registry/image-registry-759b975967-pgkrt"
Apr 24 16:39:39.910148 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:39.909823 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 16:39:39.910148 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:39.909842 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-759b975967-pgkrt: secret "image-registry-tls" not found
Apr 24 16:39:39.910148 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:39.909906 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-registry-tls podName:ea323d90-8b5e-41a9-9633-538a3fdcead6 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:55.909890022 +0000 UTC m=+66.281594226 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-registry-tls") pod "image-registry-759b975967-pgkrt" (UID: "ea323d90-8b5e-41a9-9633-538a3fdcead6") : secret "image-registry-tls" not found
Apr 24 16:39:40.011071 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:40.011035 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32ea204a-32cf-4de1-bd0e-675e568756f9-service-ca-bundle\") pod \"router-default-5958786cb4-48wlg\" (UID: \"32ea204a-32cf-4de1-bd0e-675e568756f9\") " pod="openshift-ingress/router-default-5958786cb4-48wlg"
Apr 24 16:39:40.011266 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:40.011076 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4075e057-13f0-412d-96df-8e124be59b52-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-n6v4x\" (UID: \"4075e057-13f0-412d-96df-8e124be59b52\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n6v4x"
Apr 24 16:39:40.011266 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:40.011170 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 16:39:40.011266 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:40.011224 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/32ea204a-32cf-4de1-bd0e-675e568756f9-service-ca-bundle podName:32ea204a-32cf-4de1-bd0e-675e568756f9 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:56.011199969 +0000 UTC m=+66.382904174 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/32ea204a-32cf-4de1-bd0e-675e568756f9-service-ca-bundle") pod "router-default-5958786cb4-48wlg" (UID: "32ea204a-32cf-4de1-bd0e-675e568756f9") : configmap references non-existent config key: service-ca.crt
Apr 24 16:39:40.011266 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:40.011252 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4075e057-13f0-412d-96df-8e124be59b52-cluster-monitoring-operator-tls podName:4075e057-13f0-412d-96df-8e124be59b52 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:56.011239925 +0000 UTC m=+66.382944130 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4075e057-13f0-412d-96df-8e124be59b52-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-n6v4x" (UID: "4075e057-13f0-412d-96df-8e124be59b52") : secret "cluster-monitoring-operator-tls" not found
Apr 24 16:39:40.011532 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:40.011361 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32ea204a-32cf-4de1-bd0e-675e568756f9-metrics-certs\") pod \"router-default-5958786cb4-48wlg\" (UID: \"32ea204a-32cf-4de1-bd0e-675e568756f9\") " pod="openshift-ingress/router-default-5958786cb4-48wlg"
Apr 24 16:39:40.011532 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:40.011395 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/546b86b2-63c7-46c8-b64c-048aaf992dca-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-82qpb\" (UID: \"546b86b2-63c7-46c8-b64c-048aaf992dca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82qpb"
Apr 24 16:39:40.011532 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:40.011471 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 16:39:40.011532 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:40.011527 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32ea204a-32cf-4de1-bd0e-675e568756f9-metrics-certs podName:32ea204a-32cf-4de1-bd0e-675e568756f9 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:56.01151133 +0000 UTC m=+66.383215538 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/32ea204a-32cf-4de1-bd0e-675e568756f9-metrics-certs") pod "router-default-5958786cb4-48wlg" (UID: "32ea204a-32cf-4de1-bd0e-675e568756f9") : secret "router-metrics-certs-default" not found Apr 24 16:39:40.014863 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:40.014833 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/546b86b2-63c7-46c8-b64c-048aaf992dca-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-82qpb\" (UID: \"546b86b2-63c7-46c8-b64c-048aaf992dca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82qpb" Apr 24 16:39:40.031552 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:40.031531 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82qpb" Apr 24 16:39:40.112864 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:40.112827 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ae2e73c-423b-4876-a52d-cd4111ca0013-cert\") pod \"ingress-canary-kcgx6\" (UID: \"9ae2e73c-423b-4876-a52d-cd4111ca0013\") " pod="openshift-ingress-canary/ingress-canary-kcgx6" Apr 24 16:39:40.112979 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:40.112920 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eefe78cf-7d49-4547-bded-f34c94ebc29b-metrics-tls\") pod \"dns-default-dzj8d\" (UID: \"eefe78cf-7d49-4547-bded-f34c94ebc29b\") " pod="openshift-dns/dns-default-dzj8d" Apr 24 16:39:40.112979 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:40.112934 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:39:40.113045 ip-10-0-131-47 
kubenswrapper[2578]: E0424 16:39:40.112995 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ae2e73c-423b-4876-a52d-cd4111ca0013-cert podName:9ae2e73c-423b-4876-a52d-cd4111ca0013 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:56.112977116 +0000 UTC m=+66.484681342 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ae2e73c-423b-4876-a52d-cd4111ca0013-cert") pod "ingress-canary-kcgx6" (UID: "9ae2e73c-423b-4876-a52d-cd4111ca0013") : secret "canary-serving-cert" not found Apr 24 16:39:40.113045 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:40.113029 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:39:40.113117 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:39:40.113066 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eefe78cf-7d49-4547-bded-f34c94ebc29b-metrics-tls podName:eefe78cf-7d49-4547-bded-f34c94ebc29b nodeName:}" failed. No retries permitted until 2026-04-24 16:39:56.113054409 +0000 UTC m=+66.484758613 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/eefe78cf-7d49-4547-bded-f34c94ebc29b-metrics-tls") pod "dns-default-dzj8d" (UID: "eefe78cf-7d49-4547-bded-f34c94ebc29b") : secret "dns-default-metrics-tls" not found Apr 24 16:39:40.150984 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:40.149971 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82qpb"] Apr 24 16:39:40.464125 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:40.464044 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82qpb" event={"ID":"546b86b2-63c7-46c8-b64c-048aaf992dca","Type":"ContainerStarted","Data":"d3cb6ff479244c0edca2ed2d85ca59627ba520ba987a8b3bbe99382664ff8ed9"} Apr 24 16:39:43.472632 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:43.472595 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82qpb" event={"ID":"546b86b2-63c7-46c8-b64c-048aaf992dca","Type":"ContainerStarted","Data":"6b6143758332a9c85986862387648619fe7ec583cc953041aa27c6fe1e46e179"} Apr 24 16:39:43.472632 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:43.472636 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82qpb" event={"ID":"546b86b2-63c7-46c8-b64c-048aaf992dca","Type":"ContainerStarted","Data":"f5d88ce7b4b6f9e541a712dce5106ecc2431dc8eb55f0ca09c765e8d8b6d5654"} Apr 24 16:39:43.488817 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:43.488767 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-82qpb" podStartSLOduration=45.078935999 podStartE2EDuration="47.488751588s" podCreationTimestamp="2026-04-24 16:38:56 +0000 UTC" firstStartedPulling="2026-04-24 
16:39:40.239508799 +0000 UTC m=+50.611213003" lastFinishedPulling="2026-04-24 16:39:42.649324386 +0000 UTC m=+53.021028592" observedRunningTime="2026-04-24 16:39:43.488394187 +0000 UTC m=+53.860098413" watchObservedRunningTime="2026-04-24 16:39:43.488751588 +0000 UTC m=+53.860455814" Apr 24 16:39:47.233894 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:47.233864 2578 scope.go:117] "RemoveContainer" containerID="84f072d383ec8a0424417ed1d7236e4c7c86d02d81c4f9ebc1e42588fe0f8ef5" Apr 24 16:39:47.484921 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:47.484839 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-f2l2f_45369a43-7ed3-4f16-a1dc-7f1a61e06fc4/console-operator/1.log" Apr 24 16:39:47.485066 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:47.484924 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-f2l2f" event={"ID":"45369a43-7ed3-4f16-a1dc-7f1a61e06fc4","Type":"ContainerStarted","Data":"e2a1f775ea3143ad11b731a6bd3d41c643c51991f1fbdc914c53f21a7e778187"} Apr 24 16:39:47.485257 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:47.485239 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-f2l2f" Apr 24 16:39:47.503616 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:47.503562 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-f2l2f" podStartSLOduration=44.242351721 podStartE2EDuration="50.503549839s" podCreationTimestamp="2026-04-24 16:38:57 +0000 UTC" firstStartedPulling="2026-04-24 16:39:24.691566125 +0000 UTC m=+35.063270335" lastFinishedPulling="2026-04-24 16:39:30.952764234 +0000 UTC m=+41.324468453" observedRunningTime="2026-04-24 16:39:47.502600886 +0000 UTC m=+57.874305135" watchObservedRunningTime="2026-04-24 16:39:47.503549839 +0000 UTC m=+57.875254065" Apr 24 
16:39:47.524167 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:47.524145 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-f2l2f" Apr 24 16:39:50.468564 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:50.468534 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mgfp8" Apr 24 16:39:55.672385 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:55.672338 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-cxjs9"] Apr 24 16:39:55.678369 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:55.678345 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-cxjs9" Apr 24 16:39:55.681792 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:55.681771 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-r6qpw\"" Apr 24 16:39:55.682862 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:55.682847 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 16:39:55.682941 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:55.682853 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 16:39:55.691702 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:55.691678 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-cxjs9"] Apr 24 16:39:55.753462 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:55.753428 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/88ca7df2-32d7-469f-9396-016d2ca3b6f3-data-volume\") pod 
\"insights-runtime-extractor-cxjs9\" (UID: \"88ca7df2-32d7-469f-9396-016d2ca3b6f3\") " pod="openshift-insights/insights-runtime-extractor-cxjs9" Apr 24 16:39:55.753772 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:55.753752 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/88ca7df2-32d7-469f-9396-016d2ca3b6f3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cxjs9\" (UID: \"88ca7df2-32d7-469f-9396-016d2ca3b6f3\") " pod="openshift-insights/insights-runtime-extractor-cxjs9" Apr 24 16:39:55.753967 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:55.753950 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/88ca7df2-32d7-469f-9396-016d2ca3b6f3-crio-socket\") pod \"insights-runtime-extractor-cxjs9\" (UID: \"88ca7df2-32d7-469f-9396-016d2ca3b6f3\") " pod="openshift-insights/insights-runtime-extractor-cxjs9" Apr 24 16:39:55.754266 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:55.754247 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/88ca7df2-32d7-469f-9396-016d2ca3b6f3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cxjs9\" (UID: \"88ca7df2-32d7-469f-9396-016d2ca3b6f3\") " pod="openshift-insights/insights-runtime-extractor-cxjs9" Apr 24 16:39:55.754478 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:55.754463 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckjrm\" (UniqueName: \"kubernetes.io/projected/88ca7df2-32d7-469f-9396-016d2ca3b6f3-kube-api-access-ckjrm\") pod \"insights-runtime-extractor-cxjs9\" (UID: \"88ca7df2-32d7-469f-9396-016d2ca3b6f3\") " pod="openshift-insights/insights-runtime-extractor-cxjs9" Apr 24 16:39:55.855160 ip-10-0-131-47 
kubenswrapper[2578]: I0424 16:39:55.855120 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/88ca7df2-32d7-469f-9396-016d2ca3b6f3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cxjs9\" (UID: \"88ca7df2-32d7-469f-9396-016d2ca3b6f3\") " pod="openshift-insights/insights-runtime-extractor-cxjs9" Apr 24 16:39:55.855160 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:55.855157 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ckjrm\" (UniqueName: \"kubernetes.io/projected/88ca7df2-32d7-469f-9396-016d2ca3b6f3-kube-api-access-ckjrm\") pod \"insights-runtime-extractor-cxjs9\" (UID: \"88ca7df2-32d7-469f-9396-016d2ca3b6f3\") " pod="openshift-insights/insights-runtime-extractor-cxjs9" Apr 24 16:39:55.855383 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:55.855330 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/88ca7df2-32d7-469f-9396-016d2ca3b6f3-data-volume\") pod \"insights-runtime-extractor-cxjs9\" (UID: \"88ca7df2-32d7-469f-9396-016d2ca3b6f3\") " pod="openshift-insights/insights-runtime-extractor-cxjs9" Apr 24 16:39:55.855427 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:55.855385 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/88ca7df2-32d7-469f-9396-016d2ca3b6f3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cxjs9\" (UID: \"88ca7df2-32d7-469f-9396-016d2ca3b6f3\") " pod="openshift-insights/insights-runtime-extractor-cxjs9" Apr 24 16:39:55.855464 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:55.855440 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/88ca7df2-32d7-469f-9396-016d2ca3b6f3-crio-socket\") pod 
\"insights-runtime-extractor-cxjs9\" (UID: \"88ca7df2-32d7-469f-9396-016d2ca3b6f3\") " pod="openshift-insights/insights-runtime-extractor-cxjs9" Apr 24 16:39:55.855566 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:55.855553 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/88ca7df2-32d7-469f-9396-016d2ca3b6f3-crio-socket\") pod \"insights-runtime-extractor-cxjs9\" (UID: \"88ca7df2-32d7-469f-9396-016d2ca3b6f3\") " pod="openshift-insights/insights-runtime-extractor-cxjs9" Apr 24 16:39:55.855666 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:55.855646 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/88ca7df2-32d7-469f-9396-016d2ca3b6f3-data-volume\") pod \"insights-runtime-extractor-cxjs9\" (UID: \"88ca7df2-32d7-469f-9396-016d2ca3b6f3\") " pod="openshift-insights/insights-runtime-extractor-cxjs9" Apr 24 16:39:55.855765 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:55.855751 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/88ca7df2-32d7-469f-9396-016d2ca3b6f3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-cxjs9\" (UID: \"88ca7df2-32d7-469f-9396-016d2ca3b6f3\") " pod="openshift-insights/insights-runtime-extractor-cxjs9" Apr 24 16:39:55.857781 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:55.857765 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/88ca7df2-32d7-469f-9396-016d2ca3b6f3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-cxjs9\" (UID: \"88ca7df2-32d7-469f-9396-016d2ca3b6f3\") " pod="openshift-insights/insights-runtime-extractor-cxjs9" Apr 24 16:39:55.873105 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:55.873075 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ckjrm\" (UniqueName: \"kubernetes.io/projected/88ca7df2-32d7-469f-9396-016d2ca3b6f3-kube-api-access-ckjrm\") pod \"insights-runtime-extractor-cxjs9\" (UID: \"88ca7df2-32d7-469f-9396-016d2ca3b6f3\") " pod="openshift-insights/insights-runtime-extractor-cxjs9" Apr 24 16:39:55.956171 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:55.956083 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f368d48-c79b-45b5-8879-9dac1c5cfe3f-metrics-certs\") pod \"network-metrics-daemon-thz9k\" (UID: \"0f368d48-c79b-45b5-8879-9dac1c5cfe3f\") " pod="openshift-multus/network-metrics-daemon-thz9k" Apr 24 16:39:55.956171 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:55.956128 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f2r84\" (UniqueName: \"kubernetes.io/projected/7230245a-1622-4c39-9d99-ab2e06ac0daf-kube-api-access-f2r84\") pod \"network-check-target-hcjpg\" (UID: \"7230245a-1622-4c39-9d99-ab2e06ac0daf\") " pod="openshift-network-diagnostics/network-check-target-hcjpg" Apr 24 16:39:55.956171 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:55.956167 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-registry-tls\") pod \"image-registry-759b975967-pgkrt\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") " pod="openshift-image-registry/image-registry-759b975967-pgkrt" Apr 24 16:39:55.958566 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:55.958536 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 16:39:55.958901 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:55.958877 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-registry-tls\") pod \"image-registry-759b975967-pgkrt\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") " pod="openshift-image-registry/image-registry-759b975967-pgkrt" Apr 24 16:39:55.959173 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:55.959152 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2r84\" (UniqueName: \"kubernetes.io/projected/7230245a-1622-4c39-9d99-ab2e06ac0daf-kube-api-access-f2r84\") pod \"network-check-target-hcjpg\" (UID: \"7230245a-1622-4c39-9d99-ab2e06ac0daf\") " pod="openshift-network-diagnostics/network-check-target-hcjpg" Apr 24 16:39:55.968234 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:55.968213 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f368d48-c79b-45b5-8879-9dac1c5cfe3f-metrics-certs\") pod \"network-metrics-daemon-thz9k\" (UID: \"0f368d48-c79b-45b5-8879-9dac1c5cfe3f\") " pod="openshift-multus/network-metrics-daemon-thz9k" Apr 24 16:39:55.988775 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:55.988752 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-cxjs9" Apr 24 16:39:56.057690 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.057575 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/81d7a862-3288-4023-b0d6-2464e9278dac-original-pull-secret\") pod \"global-pull-secret-syncer-rlzxl\" (UID: \"81d7a862-3288-4023-b0d6-2464e9278dac\") " pod="kube-system/global-pull-secret-syncer-rlzxl" Apr 24 16:39:56.057690 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.057635 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32ea204a-32cf-4de1-bd0e-675e568756f9-metrics-certs\") pod \"router-default-5958786cb4-48wlg\" (UID: \"32ea204a-32cf-4de1-bd0e-675e568756f9\") " pod="openshift-ingress/router-default-5958786cb4-48wlg" Apr 24 16:39:56.057867 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.057739 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32ea204a-32cf-4de1-bd0e-675e568756f9-service-ca-bundle\") pod \"router-default-5958786cb4-48wlg\" (UID: \"32ea204a-32cf-4de1-bd0e-675e568756f9\") " pod="openshift-ingress/router-default-5958786cb4-48wlg" Apr 24 16:39:56.057965 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.057946 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4075e057-13f0-412d-96df-8e124be59b52-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-n6v4x\" (UID: \"4075e057-13f0-412d-96df-8e124be59b52\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n6v4x" Apr 24 16:39:56.058494 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.058475 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32ea204a-32cf-4de1-bd0e-675e568756f9-service-ca-bundle\") pod \"router-default-5958786cb4-48wlg\" (UID: \"32ea204a-32cf-4de1-bd0e-675e568756f9\") " pod="openshift-ingress/router-default-5958786cb4-48wlg" Apr 24 16:39:56.060253 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.060231 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 16:39:56.060772 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.060429 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4075e057-13f0-412d-96df-8e124be59b52-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-n6v4x\" (UID: \"4075e057-13f0-412d-96df-8e124be59b52\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n6v4x" Apr 24 16:39:56.060772 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.060728 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32ea204a-32cf-4de1-bd0e-675e568756f9-metrics-certs\") pod \"router-default-5958786cb4-48wlg\" (UID: \"32ea204a-32cf-4de1-bd0e-675e568756f9\") " pod="openshift-ingress/router-default-5958786cb4-48wlg" Apr 24 16:39:56.070779 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.070757 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/81d7a862-3288-4023-b0d6-2464e9278dac-original-pull-secret\") pod \"global-pull-secret-syncer-rlzxl\" (UID: \"81d7a862-3288-4023-b0d6-2464e9278dac\") " pod="kube-system/global-pull-secret-syncer-rlzxl" Apr 24 16:39:56.111807 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.111775 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-cxjs9"] Apr 24 16:39:56.114572 ip-10-0-131-47 
kubenswrapper[2578]: W0424 16:39:56.114542 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88ca7df2_32d7_469f_9396_016d2ca3b6f3.slice/crio-65208bb8b4107c2eaf0464d910c6e1c4a4329c97c3ac05b6c368154633b26832 WatchSource:0}: Error finding container 65208bb8b4107c2eaf0464d910c6e1c4a4329c97c3ac05b6c368154633b26832: Status 404 returned error can't find the container with id 65208bb8b4107c2eaf0464d910c6e1c4a4329c97c3ac05b6c368154633b26832 Apr 24 16:39:56.158562 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.158537 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eefe78cf-7d49-4547-bded-f34c94ebc29b-metrics-tls\") pod \"dns-default-dzj8d\" (UID: \"eefe78cf-7d49-4547-bded-f34c94ebc29b\") " pod="openshift-dns/dns-default-dzj8d" Apr 24 16:39:56.158670 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.158594 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ae2e73c-423b-4876-a52d-cd4111ca0013-cert\") pod \"ingress-canary-kcgx6\" (UID: \"9ae2e73c-423b-4876-a52d-cd4111ca0013\") " pod="openshift-ingress-canary/ingress-canary-kcgx6" Apr 24 16:39:56.160838 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.160807 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eefe78cf-7d49-4547-bded-f34c94ebc29b-metrics-tls\") pod \"dns-default-dzj8d\" (UID: \"eefe78cf-7d49-4547-bded-f34c94ebc29b\") " pod="openshift-dns/dns-default-dzj8d" Apr 24 16:39:56.160956 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.160842 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ae2e73c-423b-4876-a52d-cd4111ca0013-cert\") pod \"ingress-canary-kcgx6\" (UID: \"9ae2e73c-423b-4876-a52d-cd4111ca0013\") " 
pod="openshift-ingress-canary/ingress-canary-kcgx6" Apr 24 16:39:56.185941 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.185902 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rlzxl" Apr 24 16:39:56.186078 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.186006 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-ff9nx\"" Apr 24 16:39:56.194457 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.194317 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-759b975967-pgkrt" Apr 24 16:39:56.199017 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.198519 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fcvt5\"" Apr 24 16:39:56.202741 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.202720 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-4njbg\"" Apr 24 16:39:56.203571 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.203549 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-j4znz\"" Apr 24 16:39:56.204584 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.204564 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hcjpg" Apr 24 16:39:56.210842 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.210753 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n6v4x" Apr 24 16:39:56.212473 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.212395 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-thz9k" Apr 24 16:39:56.213645 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.213623 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-556xq\"" Apr 24 16:39:56.221934 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.221868 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5958786cb4-48wlg" Apr 24 16:39:56.397571 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:39:56.397511 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81d7a862_3288_4023_b0d6_2464e9278dac.slice/crio-1d84376031b828e6a74ba30fd32a3abda39f58ccb78fbf67f13a0051c3e407b8 WatchSource:0}: Error finding container 1d84376031b828e6a74ba30fd32a3abda39f58ccb78fbf67f13a0051c3e407b8: Status 404 returned error can't find the container with id 1d84376031b828e6a74ba30fd32a3abda39f58ccb78fbf67f13a0051c3e407b8 Apr 24 16:39:56.401191 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.401168 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-zcktm\"" Apr 24 16:39:56.401711 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.401684 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-x6pvx\"" Apr 24 16:39:56.403047 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.403027 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rlzxl"] Apr 24 16:39:56.406356 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.405829 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dzj8d" Apr 24 16:39:56.409227 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.409023 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-kcgx6" Apr 24 16:39:56.428122 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.428092 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-759b975967-pgkrt"] Apr 24 16:39:56.431540 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:39:56.431477 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea323d90_8b5e_41a9_9633_538a3fdcead6.slice/crio-3cb4aafdd4e2d12394c2f67822e1765c01cda4f3e79617d036ba6397710e44a9 WatchSource:0}: Error finding container 3cb4aafdd4e2d12394c2f67822e1765c01cda4f3e79617d036ba6397710e44a9: Status 404 returned error can't find the container with id 3cb4aafdd4e2d12394c2f67822e1765c01cda4f3e79617d036ba6397710e44a9 Apr 24 16:39:56.440130 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.440067 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-n6v4x"] Apr 24 16:39:56.451897 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:39:56.451859 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4075e057_13f0_412d_96df_8e124be59b52.slice/crio-180b43b096c773651fa1110fa2766024b26d9c7e66bd15dbd40bd0bf55660d9a WatchSource:0}: Error finding container 180b43b096c773651fa1110fa2766024b26d9c7e66bd15dbd40bd0bf55660d9a: Status 404 returned error can't find the container with id 180b43b096c773651fa1110fa2766024b26d9c7e66bd15dbd40bd0bf55660d9a Apr 24 16:39:56.470894 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.470795 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5958786cb4-48wlg"] Apr 24 16:39:56.486632 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.486600 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-thz9k"] Apr 24 
16:39:56.493901 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:39:56.493827 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f368d48_c79b_45b5_8879_9dac1c5cfe3f.slice/crio-0aed2e944cfd81920886e4f957652d4ee4755b83a04f1ba9d9e8837e637c2b02 WatchSource:0}: Error finding container 0aed2e944cfd81920886e4f957652d4ee4755b83a04f1ba9d9e8837e637c2b02: Status 404 returned error can't find the container with id 0aed2e944cfd81920886e4f957652d4ee4755b83a04f1ba9d9e8837e637c2b02 Apr 24 16:39:56.510688 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.510655 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cxjs9" event={"ID":"88ca7df2-32d7-469f-9396-016d2ca3b6f3","Type":"ContainerStarted","Data":"ed57a3bff17505f428d28d8d7be905317bd058d7573d25ec0e748cdb60799780"} Apr 24 16:39:56.510800 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.510692 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cxjs9" event={"ID":"88ca7df2-32d7-469f-9396-016d2ca3b6f3","Type":"ContainerStarted","Data":"65208bb8b4107c2eaf0464d910c6e1c4a4329c97c3ac05b6c368154633b26832"} Apr 24 16:39:56.512252 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.512197 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5958786cb4-48wlg" event={"ID":"32ea204a-32cf-4de1-bd0e-675e568756f9","Type":"ContainerStarted","Data":"7b413ef59f7e404e2eb738a4c55f02ed7b4feeafa123100e17870f803838aeb1"} Apr 24 16:39:56.513859 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.513819 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-759b975967-pgkrt" event={"ID":"ea323d90-8b5e-41a9-9633-538a3fdcead6","Type":"ContainerStarted","Data":"3cb4aafdd4e2d12394c2f67822e1765c01cda4f3e79617d036ba6397710e44a9"} Apr 24 16:39:56.516899 ip-10-0-131-47 kubenswrapper[2578]: 
I0424 16:39:56.516656 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-thz9k" event={"ID":"0f368d48-c79b-45b5-8879-9dac1c5cfe3f","Type":"ContainerStarted","Data":"0aed2e944cfd81920886e4f957652d4ee4755b83a04f1ba9d9e8837e637c2b02"} Apr 24 16:39:56.519873 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.519842 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n6v4x" event={"ID":"4075e057-13f0-412d-96df-8e124be59b52","Type":"ContainerStarted","Data":"180b43b096c773651fa1110fa2766024b26d9c7e66bd15dbd40bd0bf55660d9a"} Apr 24 16:39:56.521176 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.521107 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rlzxl" event={"ID":"81d7a862-3288-4023-b0d6-2464e9278dac","Type":"ContainerStarted","Data":"1d84376031b828e6a74ba30fd32a3abda39f58ccb78fbf67f13a0051c3e407b8"} Apr 24 16:39:56.569744 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.569690 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dzj8d"] Apr 24 16:39:56.573769 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:39:56.573740 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeefe78cf_7d49_4547_bded_f34c94ebc29b.slice/crio-e1c6a4c0f7091ea179b94d5a05af0f7d45b384bba16f0c6adf8252032ce3681e WatchSource:0}: Error finding container e1c6a4c0f7091ea179b94d5a05af0f7d45b384bba16f0c6adf8252032ce3681e: Status 404 returned error can't find the container with id e1c6a4c0f7091ea179b94d5a05af0f7d45b384bba16f0c6adf8252032ce3681e Apr 24 16:39:56.588770 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.588625 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-kcgx6"] Apr 24 16:39:56.591367 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:39:56.591343 2578 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ae2e73c_423b_4876_a52d_cd4111ca0013.slice/crio-55c4367a96d46344cc9d7b2c0d1bc0d646f2a3a1b615e44ccc04fefd361e3709 WatchSource:0}: Error finding container 55c4367a96d46344cc9d7b2c0d1bc0d646f2a3a1b615e44ccc04fefd361e3709: Status 404 returned error can't find the container with id 55c4367a96d46344cc9d7b2c0d1bc0d646f2a3a1b615e44ccc04fefd361e3709 Apr 24 16:39:56.632667 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:56.632637 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hcjpg"] Apr 24 16:39:56.636388 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:39:56.636347 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7230245a_1622_4c39_9d99_ab2e06ac0daf.slice/crio-8b3132f0080ad70474d4452d79e74cedef85687427699a4914ff8294b87d8476 WatchSource:0}: Error finding container 8b3132f0080ad70474d4452d79e74cedef85687427699a4914ff8294b87d8476: Status 404 returned error can't find the container with id 8b3132f0080ad70474d4452d79e74cedef85687427699a4914ff8294b87d8476 Apr 24 16:39:57.529082 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:57.529011 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-759b975967-pgkrt" event={"ID":"ea323d90-8b5e-41a9-9633-538a3fdcead6","Type":"ContainerStarted","Data":"3ac73e203bf8ad1c862d9928acc5ba0b2a50a5216cc11af601cca91b50bc7596"} Apr 24 16:39:57.529919 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:57.529890 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-759b975967-pgkrt" Apr 24 16:39:57.532690 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:57.532663 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dzj8d" 
event={"ID":"eefe78cf-7d49-4547-bded-f34c94ebc29b","Type":"ContainerStarted","Data":"e1c6a4c0f7091ea179b94d5a05af0f7d45b384bba16f0c6adf8252032ce3681e"} Apr 24 16:39:57.535500 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:57.535446 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kcgx6" event={"ID":"9ae2e73c-423b-4876-a52d-cd4111ca0013","Type":"ContainerStarted","Data":"55c4367a96d46344cc9d7b2c0d1bc0d646f2a3a1b615e44ccc04fefd361e3709"} Apr 24 16:39:57.537808 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:57.537783 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cxjs9" event={"ID":"88ca7df2-32d7-469f-9396-016d2ca3b6f3","Type":"ContainerStarted","Data":"dfbbf2caf176f0054374af111c67afa24bc2ed0d4d22b2389ff7c43b14403826"} Apr 24 16:39:57.540696 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:57.540668 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hcjpg" event={"ID":"7230245a-1622-4c39-9d99-ab2e06ac0daf","Type":"ContainerStarted","Data":"3515a11119d0b2ead4f7461125ed3132fbf3e28200c4aeeae2222a13ce637af4"} Apr 24 16:39:57.540794 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:57.540705 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hcjpg" event={"ID":"7230245a-1622-4c39-9d99-ab2e06ac0daf","Type":"ContainerStarted","Data":"8b3132f0080ad70474d4452d79e74cedef85687427699a4914ff8294b87d8476"} Apr 24 16:39:57.541297 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:57.541237 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-hcjpg" Apr 24 16:39:57.545701 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:57.545174 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5958786cb4-48wlg" 
event={"ID":"32ea204a-32cf-4de1-bd0e-675e568756f9","Type":"ContainerStarted","Data":"178553f107aa17662b4a9402e5e10f9c99dc4f54e27437abf3ca5969b8bc91de"} Apr 24 16:39:57.561070 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:57.560044 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-759b975967-pgkrt" podStartSLOduration=67.560029023 podStartE2EDuration="1m7.560029023s" podCreationTimestamp="2026-04-24 16:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:39:57.55976702 +0000 UTC m=+67.931471249" watchObservedRunningTime="2026-04-24 16:39:57.560029023 +0000 UTC m=+67.931733251" Apr 24 16:39:57.578440 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:57.577330 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-hcjpg" podStartSLOduration=67.577313562 podStartE2EDuration="1m7.577313562s" podCreationTimestamp="2026-04-24 16:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:39:57.576528312 +0000 UTC m=+67.948232531" watchObservedRunningTime="2026-04-24 16:39:57.577313562 +0000 UTC m=+67.949017789" Apr 24 16:39:58.222901 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:58.222862 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5958786cb4-48wlg" Apr 24 16:39:58.225790 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:58.225763 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5958786cb4-48wlg" Apr 24 16:39:58.247904 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:58.247758 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5958786cb4-48wlg" 
podStartSLOduration=64.247742805 podStartE2EDuration="1m4.247742805s" podCreationTimestamp="2026-04-24 16:38:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:39:57.601649504 +0000 UTC m=+67.973353730" watchObservedRunningTime="2026-04-24 16:39:58.247742805 +0000 UTC m=+68.619447035" Apr 24 16:39:58.549645 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:58.549535 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-5958786cb4-48wlg" Apr 24 16:39:58.551034 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:39:58.551007 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5958786cb4-48wlg" Apr 24 16:40:02.565647 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:02.565613 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-thz9k" event={"ID":"0f368d48-c79b-45b5-8879-9dac1c5cfe3f","Type":"ContainerStarted","Data":"f3f964e4a3db73c088cf5bc4cb1a118186efce90f7f4a11dbad99e77207006ab"} Apr 24 16:40:02.566045 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:02.565656 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-thz9k" event={"ID":"0f368d48-c79b-45b5-8879-9dac1c5cfe3f","Type":"ContainerStarted","Data":"c32752057c87311a2d1de52fa1c06859d761df4b0abe572dee6a49ca528b0d81"} Apr 24 16:40:02.570473 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:02.568095 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n6v4x" event={"ID":"4075e057-13f0-412d-96df-8e124be59b52","Type":"ContainerStarted","Data":"6094d6f1caf205dbea6688195c2b209ff55d92695b09a32eac12e7584c1d45bd"} Apr 24 16:40:02.571596 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:02.571547 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/global-pull-secret-syncer-rlzxl" event={"ID":"81d7a862-3288-4023-b0d6-2464e9278dac","Type":"ContainerStarted","Data":"0afb86f71d52185202874b6c1925f71b898b41ce126dc06d5f206b5e459e1b28"} Apr 24 16:40:02.577655 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:02.577632 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-kcgx6" event={"ID":"9ae2e73c-423b-4876-a52d-cd4111ca0013","Type":"ContainerStarted","Data":"774b831fa55354d0df00ff0b69340a4d15fc0c0515e4a314664a7aef69fb1a38"} Apr 24 16:40:02.579031 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:02.579013 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-cxjs9" event={"ID":"88ca7df2-32d7-469f-9396-016d2ca3b6f3","Type":"ContainerStarted","Data":"168daff4dead01ff42550c3c0e1d25061ab693805050a3df14479935dbfa52a0"} Apr 24 16:40:02.580563 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:02.580544 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dzj8d" event={"ID":"eefe78cf-7d49-4547-bded-f34c94ebc29b","Type":"ContainerStarted","Data":"8bf72485c746284e218550a93de17833f7a8a23a34fd215f9b54439e6ccbfa2f"} Apr 24 16:40:02.591036 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:02.590998 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-thz9k" podStartSLOduration=67.0223188 podStartE2EDuration="1m12.590987258s" podCreationTimestamp="2026-04-24 16:38:50 +0000 UTC" firstStartedPulling="2026-04-24 16:39:56.496514653 +0000 UTC m=+66.868218861" lastFinishedPulling="2026-04-24 16:40:02.0651831 +0000 UTC m=+72.436887319" observedRunningTime="2026-04-24 16:40:02.589517166 +0000 UTC m=+72.961221389" watchObservedRunningTime="2026-04-24 16:40:02.590987258 +0000 UTC m=+72.962691484" Apr 24 16:40:02.608271 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:02.608216 2578 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-ingress-canary/ingress-canary-kcgx6" podStartSLOduration=33.136382267 podStartE2EDuration="38.608199813s" podCreationTimestamp="2026-04-24 16:39:24 +0000 UTC" firstStartedPulling="2026-04-24 16:39:56.593971222 +0000 UTC m=+66.965675619" lastFinishedPulling="2026-04-24 16:40:02.065788962 +0000 UTC m=+72.437493165" observedRunningTime="2026-04-24 16:40:02.606930137 +0000 UTC m=+72.978634390" watchObservedRunningTime="2026-04-24 16:40:02.608199813 +0000 UTC m=+72.979904040" Apr 24 16:40:02.651185 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:02.650613 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-rlzxl" podStartSLOduration=65.985401746 podStartE2EDuration="1m11.650594512s" podCreationTimestamp="2026-04-24 16:38:51 +0000 UTC" firstStartedPulling="2026-04-24 16:39:56.400548252 +0000 UTC m=+66.772252471" lastFinishedPulling="2026-04-24 16:40:02.065741017 +0000 UTC m=+72.437445237" observedRunningTime="2026-04-24 16:40:02.632063418 +0000 UTC m=+73.003767643" watchObservedRunningTime="2026-04-24 16:40:02.650594512 +0000 UTC m=+73.022298741" Apr 24 16:40:02.651185 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:02.650956 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-n6v4x" podStartSLOduration=63.042693264 podStartE2EDuration="1m8.650943912s" podCreationTimestamp="2026-04-24 16:38:54 +0000 UTC" firstStartedPulling="2026-04-24 16:39:56.457273314 +0000 UTC m=+66.828977530" lastFinishedPulling="2026-04-24 16:40:02.065523959 +0000 UTC m=+72.437228178" observedRunningTime="2026-04-24 16:40:02.65093221 +0000 UTC m=+73.022636436" watchObservedRunningTime="2026-04-24 16:40:02.650943912 +0000 UTC m=+73.022648138" Apr 24 16:40:02.676325 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:02.676249 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-insights/insights-runtime-extractor-cxjs9" podStartSLOduration=1.779889754 podStartE2EDuration="7.676232888s" podCreationTimestamp="2026-04-24 16:39:55 +0000 UTC" firstStartedPulling="2026-04-24 16:39:56.169113905 +0000 UTC m=+66.540818109" lastFinishedPulling="2026-04-24 16:40:02.065457034 +0000 UTC m=+72.437161243" observedRunningTime="2026-04-24 16:40:02.675600881 +0000 UTC m=+73.047305107" watchObservedRunningTime="2026-04-24 16:40:02.676232888 +0000 UTC m=+73.047937114" Apr 24 16:40:03.584972 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:03.584921 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dzj8d" event={"ID":"eefe78cf-7d49-4547-bded-f34c94ebc29b","Type":"ContainerStarted","Data":"2dd9005bec1f81eb859b09e027c807651158f7bf8cf28ceb9888f00ed983df85"} Apr 24 16:40:03.585762 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:03.585745 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-dzj8d" Apr 24 16:40:03.604842 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:03.604794 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-dzj8d" podStartSLOduration=34.109581068 podStartE2EDuration="39.604781421s" podCreationTimestamp="2026-04-24 16:39:24 +0000 UTC" firstStartedPulling="2026-04-24 16:39:56.575698217 +0000 UTC m=+66.947402421" lastFinishedPulling="2026-04-24 16:40:02.070898566 +0000 UTC m=+72.442602774" observedRunningTime="2026-04-24 16:40:03.603577364 +0000 UTC m=+73.975281590" watchObservedRunningTime="2026-04-24 16:40:03.604781421 +0000 UTC m=+73.976485647" Apr 24 16:40:14.249833 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:14.249715 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-dpwmn"] Apr 24 16:40:14.254397 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:14.254375 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-dpwmn" Apr 24 16:40:14.256505 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:14.256483 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 16:40:14.256617 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:14.256483 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 16:40:14.256728 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:14.256708 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-n9mn8\"" Apr 24 16:40:14.256782 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:14.256730 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 16:40:14.257272 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:14.257257 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 16:40:14.412968 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:14.412929 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtcjr\" (UniqueName: \"kubernetes.io/projected/f9ce762e-9cdb-4a93-8198-3c06cb490198-kube-api-access-qtcjr\") pod \"node-exporter-dpwmn\" (UID: \"f9ce762e-9cdb-4a93-8198-3c06cb490198\") " pod="openshift-monitoring/node-exporter-dpwmn" Apr 24 16:40:14.412968 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:14.412974 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f9ce762e-9cdb-4a93-8198-3c06cb490198-node-exporter-tls\") pod \"node-exporter-dpwmn\" (UID: \"f9ce762e-9cdb-4a93-8198-3c06cb490198\") " 
pod="openshift-monitoring/node-exporter-dpwmn" Apr 24 16:40:14.413199 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:14.412997 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f9ce762e-9cdb-4a93-8198-3c06cb490198-node-exporter-textfile\") pod \"node-exporter-dpwmn\" (UID: \"f9ce762e-9cdb-4a93-8198-3c06cb490198\") " pod="openshift-monitoring/node-exporter-dpwmn" Apr 24 16:40:14.413199 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:14.413076 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f9ce762e-9cdb-4a93-8198-3c06cb490198-node-exporter-accelerators-collector-config\") pod \"node-exporter-dpwmn\" (UID: \"f9ce762e-9cdb-4a93-8198-3c06cb490198\") " pod="openshift-monitoring/node-exporter-dpwmn" Apr 24 16:40:14.413199 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:14.413137 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f9ce762e-9cdb-4a93-8198-3c06cb490198-sys\") pod \"node-exporter-dpwmn\" (UID: \"f9ce762e-9cdb-4a93-8198-3c06cb490198\") " pod="openshift-monitoring/node-exporter-dpwmn" Apr 24 16:40:14.413199 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:14.413169 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f9ce762e-9cdb-4a93-8198-3c06cb490198-node-exporter-wtmp\") pod \"node-exporter-dpwmn\" (UID: \"f9ce762e-9cdb-4a93-8198-3c06cb490198\") " pod="openshift-monitoring/node-exporter-dpwmn" Apr 24 16:40:14.413199 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:14.413188 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f9ce762e-9cdb-4a93-8198-3c06cb490198-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dpwmn\" (UID: \"f9ce762e-9cdb-4a93-8198-3c06cb490198\") " pod="openshift-monitoring/node-exporter-dpwmn" Apr 24 16:40:14.413418 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:14.413245 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f9ce762e-9cdb-4a93-8198-3c06cb490198-root\") pod \"node-exporter-dpwmn\" (UID: \"f9ce762e-9cdb-4a93-8198-3c06cb490198\") " pod="openshift-monitoring/node-exporter-dpwmn" Apr 24 16:40:14.413418 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:14.413271 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f9ce762e-9cdb-4a93-8198-3c06cb490198-metrics-client-ca\") pod \"node-exporter-dpwmn\" (UID: \"f9ce762e-9cdb-4a93-8198-3c06cb490198\") " pod="openshift-monitoring/node-exporter-dpwmn" Apr 24 16:40:14.513741 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:14.513654 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f9ce762e-9cdb-4a93-8198-3c06cb490198-node-exporter-wtmp\") pod \"node-exporter-dpwmn\" (UID: \"f9ce762e-9cdb-4a93-8198-3c06cb490198\") " pod="openshift-monitoring/node-exporter-dpwmn" Apr 24 16:40:14.513741 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:14.513695 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f9ce762e-9cdb-4a93-8198-3c06cb490198-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dpwmn\" (UID: \"f9ce762e-9cdb-4a93-8198-3c06cb490198\") " pod="openshift-monitoring/node-exporter-dpwmn" Apr 24 16:40:14.513741 ip-10-0-131-47 
kubenswrapper[2578]: I0424 16:40:14.513734 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f9ce762e-9cdb-4a93-8198-3c06cb490198-root\") pod \"node-exporter-dpwmn\" (UID: \"f9ce762e-9cdb-4a93-8198-3c06cb490198\") " pod="openshift-monitoring/node-exporter-dpwmn" Apr 24 16:40:14.514011 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:14.513784 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f9ce762e-9cdb-4a93-8198-3c06cb490198-root\") pod \"node-exporter-dpwmn\" (UID: \"f9ce762e-9cdb-4a93-8198-3c06cb490198\") " pod="openshift-monitoring/node-exporter-dpwmn" Apr 24 16:40:14.514011 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:14.513834 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f9ce762e-9cdb-4a93-8198-3c06cb490198-node-exporter-wtmp\") pod \"node-exporter-dpwmn\" (UID: \"f9ce762e-9cdb-4a93-8198-3c06cb490198\") " pod="openshift-monitoring/node-exporter-dpwmn" Apr 24 16:40:14.514011 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:14.513862 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f9ce762e-9cdb-4a93-8198-3c06cb490198-metrics-client-ca\") pod \"node-exporter-dpwmn\" (UID: \"f9ce762e-9cdb-4a93-8198-3c06cb490198\") " pod="openshift-monitoring/node-exporter-dpwmn" Apr 24 16:40:14.514011 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:14.513935 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtcjr\" (UniqueName: \"kubernetes.io/projected/f9ce762e-9cdb-4a93-8198-3c06cb490198-kube-api-access-qtcjr\") pod \"node-exporter-dpwmn\" (UID: \"f9ce762e-9cdb-4a93-8198-3c06cb490198\") " pod="openshift-monitoring/node-exporter-dpwmn" Apr 24 16:40:14.514011 ip-10-0-131-47 kubenswrapper[2578]: I0424 
16:40:14.513955 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f9ce762e-9cdb-4a93-8198-3c06cb490198-node-exporter-tls\") pod \"node-exporter-dpwmn\" (UID: \"f9ce762e-9cdb-4a93-8198-3c06cb490198\") " pod="openshift-monitoring/node-exporter-dpwmn" Apr 24 16:40:14.514011 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:14.513975 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f9ce762e-9cdb-4a93-8198-3c06cb490198-node-exporter-textfile\") pod \"node-exporter-dpwmn\" (UID: \"f9ce762e-9cdb-4a93-8198-3c06cb490198\") " pod="openshift-monitoring/node-exporter-dpwmn" Apr 24 16:40:14.514011 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:14.513993 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f9ce762e-9cdb-4a93-8198-3c06cb490198-node-exporter-accelerators-collector-config\") pod \"node-exporter-dpwmn\" (UID: \"f9ce762e-9cdb-4a93-8198-3c06cb490198\") " pod="openshift-monitoring/node-exporter-dpwmn" Apr 24 16:40:14.514388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:14.514025 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f9ce762e-9cdb-4a93-8198-3c06cb490198-sys\") pod \"node-exporter-dpwmn\" (UID: \"f9ce762e-9cdb-4a93-8198-3c06cb490198\") " pod="openshift-monitoring/node-exporter-dpwmn" Apr 24 16:40:14.514388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:14.514092 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f9ce762e-9cdb-4a93-8198-3c06cb490198-sys\") pod \"node-exporter-dpwmn\" (UID: \"f9ce762e-9cdb-4a93-8198-3c06cb490198\") " pod="openshift-monitoring/node-exporter-dpwmn" Apr 24 16:40:14.514388 ip-10-0-131-47 
kubenswrapper[2578]: E0424 16:40:14.514181 2578 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 24 16:40:14.514388 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:40:14.514271 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9ce762e-9cdb-4a93-8198-3c06cb490198-node-exporter-tls podName:f9ce762e-9cdb-4a93-8198-3c06cb490198 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:15.014248435 +0000 UTC m=+85.385952640 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/f9ce762e-9cdb-4a93-8198-3c06cb490198-node-exporter-tls") pod "node-exporter-dpwmn" (UID: "f9ce762e-9cdb-4a93-8198-3c06cb490198") : secret "node-exporter-tls" not found Apr 24 16:40:14.514388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:14.514328 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f9ce762e-9cdb-4a93-8198-3c06cb490198-node-exporter-textfile\") pod \"node-exporter-dpwmn\" (UID: \"f9ce762e-9cdb-4a93-8198-3c06cb490198\") " pod="openshift-monitoring/node-exporter-dpwmn" Apr 24 16:40:14.514824 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:14.514452 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f9ce762e-9cdb-4a93-8198-3c06cb490198-metrics-client-ca\") pod \"node-exporter-dpwmn\" (UID: \"f9ce762e-9cdb-4a93-8198-3c06cb490198\") " pod="openshift-monitoring/node-exporter-dpwmn" Apr 24 16:40:14.514824 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:14.514606 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f9ce762e-9cdb-4a93-8198-3c06cb490198-node-exporter-accelerators-collector-config\") pod \"node-exporter-dpwmn\" (UID: 
\"f9ce762e-9cdb-4a93-8198-3c06cb490198\") " pod="openshift-monitoring/node-exporter-dpwmn"
Apr 24 16:40:14.516352 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:14.516332 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f9ce762e-9cdb-4a93-8198-3c06cb490198-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-dpwmn\" (UID: \"f9ce762e-9cdb-4a93-8198-3c06cb490198\") " pod="openshift-monitoring/node-exporter-dpwmn"
Apr 24 16:40:14.525349 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:14.525322 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtcjr\" (UniqueName: \"kubernetes.io/projected/f9ce762e-9cdb-4a93-8198-3c06cb490198-kube-api-access-qtcjr\") pod \"node-exporter-dpwmn\" (UID: \"f9ce762e-9cdb-4a93-8198-3c06cb490198\") " pod="openshift-monitoring/node-exporter-dpwmn"
Apr 24 16:40:14.595517 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:14.595481 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-dzj8d"
Apr 24 16:40:15.017762 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:15.017727 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f9ce762e-9cdb-4a93-8198-3c06cb490198-node-exporter-tls\") pod \"node-exporter-dpwmn\" (UID: \"f9ce762e-9cdb-4a93-8198-3c06cb490198\") " pod="openshift-monitoring/node-exporter-dpwmn"
Apr 24 16:40:15.017952 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:40:15.017893 2578 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 24 16:40:15.018015 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:40:15.017977 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9ce762e-9cdb-4a93-8198-3c06cb490198-node-exporter-tls podName:f9ce762e-9cdb-4a93-8198-3c06cb490198 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:16.017952906 +0000 UTC m=+86.389657125 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/f9ce762e-9cdb-4a93-8198-3c06cb490198-node-exporter-tls") pod "node-exporter-dpwmn" (UID: "f9ce762e-9cdb-4a93-8198-3c06cb490198") : secret "node-exporter-tls" not found
Apr 24 16:40:16.026892 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:16.026862 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f9ce762e-9cdb-4a93-8198-3c06cb490198-node-exporter-tls\") pod \"node-exporter-dpwmn\" (UID: \"f9ce762e-9cdb-4a93-8198-3c06cb490198\") " pod="openshift-monitoring/node-exporter-dpwmn"
Apr 24 16:40:16.029185 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:16.029161 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f9ce762e-9cdb-4a93-8198-3c06cb490198-node-exporter-tls\") pod \"node-exporter-dpwmn\" (UID: \"f9ce762e-9cdb-4a93-8198-3c06cb490198\") " pod="openshift-monitoring/node-exporter-dpwmn"
Apr 24 16:40:16.064188 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:16.064160 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-dpwmn"
Apr 24 16:40:16.076986 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:40:16.076955 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9ce762e_9cdb_4a93_8198_3c06cb490198.slice/crio-7ffe5ca01d015b1c6f48b2830836527e0308c95623e3553e08e8431607feaf3f WatchSource:0}: Error finding container 7ffe5ca01d015b1c6f48b2830836527e0308c95623e3553e08e8431607feaf3f: Status 404 returned error can't find the container with id 7ffe5ca01d015b1c6f48b2830836527e0308c95623e3553e08e8431607feaf3f
Apr 24 16:40:16.199022 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:16.198979 2578 patch_prober.go:28] interesting pod/image-registry-759b975967-pgkrt container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 24 16:40:16.199188 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:16.199052 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-759b975967-pgkrt" podUID="ea323d90-8b5e-41a9-9633-538a3fdcead6" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 16:40:16.622038 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:16.622004 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dpwmn" event={"ID":"f9ce762e-9cdb-4a93-8198-3c06cb490198","Type":"ContainerStarted","Data":"7ffe5ca01d015b1c6f48b2830836527e0308c95623e3553e08e8431607feaf3f"}
Apr 24 16:40:17.626411 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:17.626373 2578 generic.go:358] "Generic (PLEG): container finished" podID="f9ce762e-9cdb-4a93-8198-3c06cb490198" containerID="630f0e4068b460582e50da242e13e427622332d1e837fcd073ec2d08b24ca233" exitCode=0
Apr 24 16:40:17.626815 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:17.626421 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dpwmn" event={"ID":"f9ce762e-9cdb-4a93-8198-3c06cb490198","Type":"ContainerDied","Data":"630f0e4068b460582e50da242e13e427622332d1e837fcd073ec2d08b24ca233"}
Apr 24 16:40:18.630792 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:18.630757 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dpwmn" event={"ID":"f9ce762e-9cdb-4a93-8198-3c06cb490198","Type":"ContainerStarted","Data":"79a9e7fec5d9db1ab97941d9c77ae67462b4993ee6d898ae97f2f7d5ab4027d9"}
Apr 24 16:40:18.630792 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:18.630791 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-dpwmn" event={"ID":"f9ce762e-9cdb-4a93-8198-3c06cb490198","Type":"ContainerStarted","Data":"f600e312c0492b10fc01c832266d94ea9ced814b3973de9b922c94f26d784178"}
Apr 24 16:40:18.650961 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:18.650913 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-dpwmn" podStartSLOduration=3.9327514900000002 podStartE2EDuration="4.650898611s" podCreationTimestamp="2026-04-24 16:40:14 +0000 UTC" firstStartedPulling="2026-04-24 16:40:16.078582159 +0000 UTC m=+86.450286363" lastFinishedPulling="2026-04-24 16:40:16.796729269 +0000 UTC m=+87.168433484" observedRunningTime="2026-04-24 16:40:18.650123641 +0000 UTC m=+89.021827868" watchObservedRunningTime="2026-04-24 16:40:18.650898611 +0000 UTC m=+89.022602837"
Apr 24 16:40:19.026236 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:19.026194 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-759b975967-pgkrt"]
Apr 24 16:40:19.030676 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:19.030651 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-759b975967-pgkrt"
Apr 24 16:40:29.554273 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:29.554236 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-hcjpg"
Apr 24 16:40:44.045770 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.045704 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-759b975967-pgkrt" podUID="ea323d90-8b5e-41a9-9633-538a3fdcead6" containerName="registry" containerID="cri-o://3ac73e203bf8ad1c862d9928acc5ba0b2a50a5216cc11af601cca91b50bc7596" gracePeriod=30
Apr 24 16:40:44.284359 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.284336 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-759b975967-pgkrt"
Apr 24 16:40:44.320125 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.320048 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-registry-tls\") pod \"ea323d90-8b5e-41a9-9633-538a3fdcead6\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") "
Apr 24 16:40:44.320125 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.320082 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ea323d90-8b5e-41a9-9633-538a3fdcead6-image-registry-private-configuration\") pod \"ea323d90-8b5e-41a9-9633-538a3fdcead6\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") "
Apr 24 16:40:44.320125 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.320110 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ea323d90-8b5e-41a9-9633-538a3fdcead6-installation-pull-secrets\") pod \"ea323d90-8b5e-41a9-9633-538a3fdcead6\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") "
Apr 24 16:40:44.320400 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.320136 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-bound-sa-token\") pod \"ea323d90-8b5e-41a9-9633-538a3fdcead6\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") "
Apr 24 16:40:44.320400 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.320172 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea323d90-8b5e-41a9-9633-538a3fdcead6-trusted-ca\") pod \"ea323d90-8b5e-41a9-9633-538a3fdcead6\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") "
Apr 24 16:40:44.320400 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.320239 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ea323d90-8b5e-41a9-9633-538a3fdcead6-ca-trust-extracted\") pod \"ea323d90-8b5e-41a9-9633-538a3fdcead6\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") "
Apr 24 16:40:44.320400 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.320301 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ea323d90-8b5e-41a9-9633-538a3fdcead6-registry-certificates\") pod \"ea323d90-8b5e-41a9-9633-538a3fdcead6\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") "
Apr 24 16:40:44.320760 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.320726 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea323d90-8b5e-41a9-9633-538a3fdcead6-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "ea323d90-8b5e-41a9-9633-538a3fdcead6" (UID: "ea323d90-8b5e-41a9-9633-538a3fdcead6"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 16:40:44.321276 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.321242 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea323d90-8b5e-41a9-9633-538a3fdcead6-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "ea323d90-8b5e-41a9-9633-538a3fdcead6" (UID: "ea323d90-8b5e-41a9-9633-538a3fdcead6"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 16:40:44.322660 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.322637 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccmg4\" (UniqueName: \"kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-kube-api-access-ccmg4\") pod \"ea323d90-8b5e-41a9-9633-538a3fdcead6\" (UID: \"ea323d90-8b5e-41a9-9633-538a3fdcead6\") "
Apr 24 16:40:44.322874 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.322844 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea323d90-8b5e-41a9-9633-538a3fdcead6-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "ea323d90-8b5e-41a9-9633-538a3fdcead6" (UID: "ea323d90-8b5e-41a9-9633-538a3fdcead6"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 16:40:44.323167 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.322939 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea323d90-8b5e-41a9-9633-538a3fdcead6-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "ea323d90-8b5e-41a9-9633-538a3fdcead6" (UID: "ea323d90-8b5e-41a9-9633-538a3fdcead6"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 16:40:44.323167 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.323057 2578 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ea323d90-8b5e-41a9-9633-538a3fdcead6-registry-certificates\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\""
Apr 24 16:40:44.323167 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.323078 2578 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ea323d90-8b5e-41a9-9633-538a3fdcead6-image-registry-private-configuration\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\""
Apr 24 16:40:44.323167 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.323094 2578 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ea323d90-8b5e-41a9-9633-538a3fdcead6-installation-pull-secrets\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\""
Apr 24 16:40:44.323167 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.323109 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea323d90-8b5e-41a9-9633-538a3fdcead6-trusted-ca\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\""
Apr 24 16:40:44.323509 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.323320 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "ea323d90-8b5e-41a9-9633-538a3fdcead6" (UID: "ea323d90-8b5e-41a9-9633-538a3fdcead6"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 16:40:44.323509 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.323418 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "ea323d90-8b5e-41a9-9633-538a3fdcead6" (UID: "ea323d90-8b5e-41a9-9633-538a3fdcead6"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 16:40:44.325337 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.325306 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-kube-api-access-ccmg4" (OuterVolumeSpecName: "kube-api-access-ccmg4") pod "ea323d90-8b5e-41a9-9633-538a3fdcead6" (UID: "ea323d90-8b5e-41a9-9633-538a3fdcead6"). InnerVolumeSpecName "kube-api-access-ccmg4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 16:40:44.330035 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.330010 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea323d90-8b5e-41a9-9633-538a3fdcead6-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "ea323d90-8b5e-41a9-9633-538a3fdcead6" (UID: "ea323d90-8b5e-41a9-9633-538a3fdcead6"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 16:40:44.424434 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.424399 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ccmg4\" (UniqueName: \"kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-kube-api-access-ccmg4\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\""
Apr 24 16:40:44.424434 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.424429 2578 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-registry-tls\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\""
Apr 24 16:40:44.424434 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.424439 2578 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ea323d90-8b5e-41a9-9633-538a3fdcead6-bound-sa-token\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\""
Apr 24 16:40:44.424649 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.424447 2578 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ea323d90-8b5e-41a9-9633-538a3fdcead6-ca-trust-extracted\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\""
Apr 24 16:40:44.703761 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.703726 2578 generic.go:358] "Generic (PLEG): container finished" podID="ea323d90-8b5e-41a9-9633-538a3fdcead6" containerID="3ac73e203bf8ad1c862d9928acc5ba0b2a50a5216cc11af601cca91b50bc7596" exitCode=0
Apr 24 16:40:44.703933 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.703782 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-759b975967-pgkrt"
Apr 24 16:40:44.703933 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.703813 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-759b975967-pgkrt" event={"ID":"ea323d90-8b5e-41a9-9633-538a3fdcead6","Type":"ContainerDied","Data":"3ac73e203bf8ad1c862d9928acc5ba0b2a50a5216cc11af601cca91b50bc7596"}
Apr 24 16:40:44.703933 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.703854 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-759b975967-pgkrt" event={"ID":"ea323d90-8b5e-41a9-9633-538a3fdcead6","Type":"ContainerDied","Data":"3cb4aafdd4e2d12394c2f67822e1765c01cda4f3e79617d036ba6397710e44a9"}
Apr 24 16:40:44.703933 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.703871 2578 scope.go:117] "RemoveContainer" containerID="3ac73e203bf8ad1c862d9928acc5ba0b2a50a5216cc11af601cca91b50bc7596"
Apr 24 16:40:44.712722 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.712707 2578 scope.go:117] "RemoveContainer" containerID="3ac73e203bf8ad1c862d9928acc5ba0b2a50a5216cc11af601cca91b50bc7596"
Apr 24 16:40:44.713004 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:40:44.712985 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ac73e203bf8ad1c862d9928acc5ba0b2a50a5216cc11af601cca91b50bc7596\": container with ID starting with 3ac73e203bf8ad1c862d9928acc5ba0b2a50a5216cc11af601cca91b50bc7596 not found: ID does not exist" containerID="3ac73e203bf8ad1c862d9928acc5ba0b2a50a5216cc11af601cca91b50bc7596"
Apr 24 16:40:44.713047 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.713012 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ac73e203bf8ad1c862d9928acc5ba0b2a50a5216cc11af601cca91b50bc7596"} err="failed to get container status \"3ac73e203bf8ad1c862d9928acc5ba0b2a50a5216cc11af601cca91b50bc7596\": rpc error: code = NotFound desc = could not find container \"3ac73e203bf8ad1c862d9928acc5ba0b2a50a5216cc11af601cca91b50bc7596\": container with ID starting with 3ac73e203bf8ad1c862d9928acc5ba0b2a50a5216cc11af601cca91b50bc7596 not found: ID does not exist"
Apr 24 16:40:44.724628 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.724605 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-759b975967-pgkrt"]
Apr 24 16:40:44.727986 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:44.727965 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-759b975967-pgkrt"]
Apr 24 16:40:46.236938 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:46.236898 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea323d90-8b5e-41a9-9633-538a3fdcead6" path="/var/lib/kubelet/pods/ea323d90-8b5e-41a9-9633-538a3fdcead6/volumes"
Apr 24 16:40:51.726419 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:51.726389 2578 generic.go:358] "Generic (PLEG): container finished" podID="335d3a61-4224-4da9-adb2-5f83cb395511" containerID="fe0293bceebe0a222fbb18ddfc955268a20badd24f70dc1457dca366df3d6fef" exitCode=0
Apr 24 16:40:51.726707 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:51.726460 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnwr7" event={"ID":"335d3a61-4224-4da9-adb2-5f83cb395511","Type":"ContainerDied","Data":"fe0293bceebe0a222fbb18ddfc955268a20badd24f70dc1457dca366df3d6fef"}
Apr 24 16:40:51.726772 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:51.726757 2578 scope.go:117] "RemoveContainer" containerID="fe0293bceebe0a222fbb18ddfc955268a20badd24f70dc1457dca366df3d6fef"
Apr 24 16:40:52.730542 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:40:52.730501 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jnwr7" event={"ID":"335d3a61-4224-4da9-adb2-5f83cb395511","Type":"ContainerStarted","Data":"c6bafa5a93e5d5c901853ef152c39c86eb9695f2d6992871a83888d36e6eda4e"}
Apr 24 16:41:01.756483 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:41:01.756447 2578 generic.go:358] "Generic (PLEG): container finished" podID="d0f570bd-6250-4e0c-a370-c143d42e56f0" containerID="907f1bb9647c208fdabc1aa309fec57cb616e49df41c235587f54a9f03233cac" exitCode=0
Apr 24 16:41:01.756872 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:41:01.756524 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-czwgm" event={"ID":"d0f570bd-6250-4e0c-a370-c143d42e56f0","Type":"ContainerDied","Data":"907f1bb9647c208fdabc1aa309fec57cb616e49df41c235587f54a9f03233cac"}
Apr 24 16:41:01.756872 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:41:01.756829 2578 scope.go:117] "RemoveContainer" containerID="907f1bb9647c208fdabc1aa309fec57cb616e49df41c235587f54a9f03233cac"
Apr 24 16:41:02.761219 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:41:02.761177 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-czwgm" event={"ID":"d0f570bd-6250-4e0c-a370-c143d42e56f0","Type":"ContainerStarted","Data":"3d102ce1c93bf2d956fb4a2b9c5aa962fc1e58cd54ee8295bb70dbfd96591610"}
Apr 24 16:41:02.762511 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:41:02.762489 2578 generic.go:358] "Generic (PLEG): container finished" podID="f0f4702a-d6d6-410a-9800-fb13b913d223" containerID="fbcc82c6689dd8db3292e4465b7cee512d84c3a7b8c2ff38a33ed76c1bc0dd65" exitCode=0
Apr 24 16:41:02.762610 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:41:02.762566 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-8j4f4" event={"ID":"f0f4702a-d6d6-410a-9800-fb13b913d223","Type":"ContainerDied","Data":"fbcc82c6689dd8db3292e4465b7cee512d84c3a7b8c2ff38a33ed76c1bc0dd65"}
Apr 24 16:41:02.762835 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:41:02.762822 2578 scope.go:117] "RemoveContainer" containerID="fbcc82c6689dd8db3292e4465b7cee512d84c3a7b8c2ff38a33ed76c1bc0dd65"
Apr 24 16:41:03.767160 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:41:03.767125 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-8j4f4" event={"ID":"f0f4702a-d6d6-410a-9800-fb13b913d223","Type":"ContainerStarted","Data":"04d75b7661b74d7399f53fe5464176bc7cfe76cd098d784c017c7ed6b1580456"}
Apr 24 16:42:15.762533 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:15.762485 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4hjf"]
Apr 24 16:42:15.763067 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:15.762873 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea323d90-8b5e-41a9-9633-538a3fdcead6" containerName="registry"
Apr 24 16:42:15.763067 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:15.762893 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea323d90-8b5e-41a9-9633-538a3fdcead6" containerName="registry"
Apr 24 16:42:15.763067 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:15.762967 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ea323d90-8b5e-41a9-9633-538a3fdcead6" containerName="registry"
Apr 24 16:42:15.765918 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:15.765904 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4hjf"
Apr 24 16:42:15.768355 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:15.768325 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 24 16:42:15.768485 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:15.768411 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-z4sd5\""
Apr 24 16:42:15.769021 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:15.769007 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 24 16:42:15.774766 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:15.774743 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4hjf"]
Apr 24 16:42:15.803303 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:15.803251 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b6ec65d-498d-4e11-8200-cab7bd4092e2-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4hjf\" (UID: \"2b6ec65d-498d-4e11-8200-cab7bd4092e2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4hjf"
Apr 24 16:42:15.803446 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:15.803355 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b6ec65d-498d-4e11-8200-cab7bd4092e2-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4hjf\" (UID: \"2b6ec65d-498d-4e11-8200-cab7bd4092e2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4hjf"
Apr 24 16:42:15.803446 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:15.803391 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtrdz\" (UniqueName: \"kubernetes.io/projected/2b6ec65d-498d-4e11-8200-cab7bd4092e2-kube-api-access-rtrdz\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4hjf\" (UID: \"2b6ec65d-498d-4e11-8200-cab7bd4092e2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4hjf"
Apr 24 16:42:15.904292 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:15.904254 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b6ec65d-498d-4e11-8200-cab7bd4092e2-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4hjf\" (UID: \"2b6ec65d-498d-4e11-8200-cab7bd4092e2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4hjf"
Apr 24 16:42:15.904444 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:15.904328 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtrdz\" (UniqueName: \"kubernetes.io/projected/2b6ec65d-498d-4e11-8200-cab7bd4092e2-kube-api-access-rtrdz\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4hjf\" (UID: \"2b6ec65d-498d-4e11-8200-cab7bd4092e2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4hjf"
Apr 24 16:42:15.904444 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:15.904374 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b6ec65d-498d-4e11-8200-cab7bd4092e2-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4hjf\" (UID: \"2b6ec65d-498d-4e11-8200-cab7bd4092e2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4hjf"
Apr 24 16:42:15.904667 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:15.904650 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b6ec65d-498d-4e11-8200-cab7bd4092e2-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4hjf\" (UID: \"2b6ec65d-498d-4e11-8200-cab7bd4092e2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4hjf"
Apr 24 16:42:15.904725 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:15.904694 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b6ec65d-498d-4e11-8200-cab7bd4092e2-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4hjf\" (UID: \"2b6ec65d-498d-4e11-8200-cab7bd4092e2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4hjf"
Apr 24 16:42:15.913524 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:15.913499 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtrdz\" (UniqueName: \"kubernetes.io/projected/2b6ec65d-498d-4e11-8200-cab7bd4092e2-kube-api-access-rtrdz\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4hjf\" (UID: \"2b6ec65d-498d-4e11-8200-cab7bd4092e2\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4hjf"
Apr 24 16:42:16.075487 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:16.075398 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4hjf"
Apr 24 16:42:16.196336 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:16.196132 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4hjf"]
Apr 24 16:42:16.199066 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:42:16.199033 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b6ec65d_498d_4e11_8200_cab7bd4092e2.slice/crio-078c5f8b2bb5f81eb6182b0ea9e6a717c0096890f74e066458b3fc35179d6daa WatchSource:0}: Error finding container 078c5f8b2bb5f81eb6182b0ea9e6a717c0096890f74e066458b3fc35179d6daa: Status 404 returned error can't find the container with id 078c5f8b2bb5f81eb6182b0ea9e6a717c0096890f74e066458b3fc35179d6daa
Apr 24 16:42:16.980115 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:16.980074 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4hjf" event={"ID":"2b6ec65d-498d-4e11-8200-cab7bd4092e2","Type":"ContainerStarted","Data":"078c5f8b2bb5f81eb6182b0ea9e6a717c0096890f74e066458b3fc35179d6daa"}
Apr 24 16:42:20.797397 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:20.797360 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-587b79bf5-jk4v9"]
Apr 24 16:42:20.801666 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:20.801645 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-587b79bf5-jk4v9"
Apr 24 16:42:20.804243 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:20.804221 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 24 16:42:20.804390 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:20.804230 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-gqpvr\""
Apr 24 16:42:20.805168 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:20.805137 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 24 16:42:20.805302 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:20.805215 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 24 16:42:20.805388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:20.805218 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 24 16:42:20.810933 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:20.810885 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-587b79bf5-jk4v9"]
Apr 24 16:42:20.840787 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:20.840604 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/51ed9176-938e-4fec-a0d7-29b14f7ceab7-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-587b79bf5-jk4v9\" (UID: \"51ed9176-938e-4fec-a0d7-29b14f7ceab7\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-587b79bf5-jk4v9"
Apr 24 16:42:20.840787 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:20.840658 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdljq\" (UniqueName: \"kubernetes.io/projected/51ed9176-938e-4fec-a0d7-29b14f7ceab7-kube-api-access-jdljq\") pod \"managed-serviceaccount-addon-agent-587b79bf5-jk4v9\" (UID: \"51ed9176-938e-4fec-a0d7-29b14f7ceab7\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-587b79bf5-jk4v9"
Apr 24 16:42:20.915841 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:20.915808 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6865b6457d-bh7vn"]
Apr 24 16:42:20.919305 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:20.919262 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6865b6457d-bh7vn"
Apr 24 16:42:20.921801 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:20.921619 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 24 16:42:20.921801 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:20.921667 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 24 16:42:20.921801 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:20.921670 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 24 16:42:20.921801 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:20.921619 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 24 16:42:20.932439 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:20.932413 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6865b6457d-bh7vn"]
Apr 24 16:42:20.941806 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:20.941780 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/51ed9176-938e-4fec-a0d7-29b14f7ceab7-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-587b79bf5-jk4v9\" (UID: \"51ed9176-938e-4fec-a0d7-29b14f7ceab7\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-587b79bf5-jk4v9"
Apr 24 16:42:20.941941 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:20.941820 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdljq\" (UniqueName: \"kubernetes.io/projected/51ed9176-938e-4fec-a0d7-29b14f7ceab7-kube-api-access-jdljq\") pod \"managed-serviceaccount-addon-agent-587b79bf5-jk4v9\" (UID: \"51ed9176-938e-4fec-a0d7-29b14f7ceab7\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-587b79bf5-jk4v9"
Apr 24 16:42:20.945248 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:20.945221 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/51ed9176-938e-4fec-a0d7-29b14f7ceab7-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-587b79bf5-jk4v9\" (UID: \"51ed9176-938e-4fec-a0d7-29b14f7ceab7\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-587b79bf5-jk4v9"
Apr 24 16:42:20.950369 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:20.950346 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdljq\" (UniqueName: \"kubernetes.io/projected/51ed9176-938e-4fec-a0d7-29b14f7ceab7-kube-api-access-jdljq\") pod \"managed-serviceaccount-addon-agent-587b79bf5-jk4v9\" (UID: \"51ed9176-938e-4fec-a0d7-29b14f7ceab7\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-587b79bf5-jk4v9"
Apr 24 16:42:21.043192 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:21.043157 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/fd17e30d-8cf3-42ae-9802-439073672acf-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6865b6457d-bh7vn\" (UID: \"fd17e30d-8cf3-42ae-9802-439073672acf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6865b6457d-bh7vn"
Apr 24 16:42:21.043388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:21.043218 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9htg6\" (UniqueName: \"kubernetes.io/projected/fd17e30d-8cf3-42ae-9802-439073672acf-kube-api-access-9htg6\") pod \"cluster-proxy-proxy-agent-6865b6457d-bh7vn\" (UID: \"fd17e30d-8cf3-42ae-9802-439073672acf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6865b6457d-bh7vn"
Apr 24 16:42:21.043388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:21.043245 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/fd17e30d-8cf3-42ae-9802-439073672acf-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6865b6457d-bh7vn\" (UID: \"fd17e30d-8cf3-42ae-9802-439073672acf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6865b6457d-bh7vn"
Apr 24 16:42:21.043388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:21.043308 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName:
\"kubernetes.io/secret/fd17e30d-8cf3-42ae-9802-439073672acf-hub\") pod \"cluster-proxy-proxy-agent-6865b6457d-bh7vn\" (UID: \"fd17e30d-8cf3-42ae-9802-439073672acf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6865b6457d-bh7vn" Apr 24 16:42:21.043388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:21.043338 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/fd17e30d-8cf3-42ae-9802-439073672acf-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6865b6457d-bh7vn\" (UID: \"fd17e30d-8cf3-42ae-9802-439073672acf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6865b6457d-bh7vn" Apr 24 16:42:21.043388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:21.043360 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/fd17e30d-8cf3-42ae-9802-439073672acf-ca\") pod \"cluster-proxy-proxy-agent-6865b6457d-bh7vn\" (UID: \"fd17e30d-8cf3-42ae-9802-439073672acf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6865b6457d-bh7vn" Apr 24 16:42:21.124914 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:21.124879 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-587b79bf5-jk4v9" Apr 24 16:42:21.144741 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:21.144712 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/fd17e30d-8cf3-42ae-9802-439073672acf-hub\") pod \"cluster-proxy-proxy-agent-6865b6457d-bh7vn\" (UID: \"fd17e30d-8cf3-42ae-9802-439073672acf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6865b6457d-bh7vn" Apr 24 16:42:21.144856 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:21.144766 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/fd17e30d-8cf3-42ae-9802-439073672acf-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6865b6457d-bh7vn\" (UID: \"fd17e30d-8cf3-42ae-9802-439073672acf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6865b6457d-bh7vn" Apr 24 16:42:21.144856 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:21.144801 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/fd17e30d-8cf3-42ae-9802-439073672acf-ca\") pod \"cluster-proxy-proxy-agent-6865b6457d-bh7vn\" (UID: \"fd17e30d-8cf3-42ae-9802-439073672acf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6865b6457d-bh7vn" Apr 24 16:42:21.144856 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:21.144834 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/fd17e30d-8cf3-42ae-9802-439073672acf-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6865b6457d-bh7vn\" (UID: \"fd17e30d-8cf3-42ae-9802-439073672acf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6865b6457d-bh7vn" Apr 24 16:42:21.145012 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:21.144901 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9htg6\" (UniqueName: \"kubernetes.io/projected/fd17e30d-8cf3-42ae-9802-439073672acf-kube-api-access-9htg6\") pod \"cluster-proxy-proxy-agent-6865b6457d-bh7vn\" (UID: \"fd17e30d-8cf3-42ae-9802-439073672acf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6865b6457d-bh7vn" Apr 24 16:42:21.145012 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:21.144935 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/fd17e30d-8cf3-42ae-9802-439073672acf-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6865b6457d-bh7vn\" (UID: \"fd17e30d-8cf3-42ae-9802-439073672acf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6865b6457d-bh7vn" Apr 24 16:42:21.146088 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:21.146029 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/fd17e30d-8cf3-42ae-9802-439073672acf-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6865b6457d-bh7vn\" (UID: \"fd17e30d-8cf3-42ae-9802-439073672acf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6865b6457d-bh7vn" Apr 24 16:42:21.147998 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:21.147941 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/fd17e30d-8cf3-42ae-9802-439073672acf-hub\") pod \"cluster-proxy-proxy-agent-6865b6457d-bh7vn\" (UID: \"fd17e30d-8cf3-42ae-9802-439073672acf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6865b6457d-bh7vn" Apr 24 16:42:21.148419 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:21.148393 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: 
\"kubernetes.io/secret/fd17e30d-8cf3-42ae-9802-439073672acf-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6865b6457d-bh7vn\" (UID: \"fd17e30d-8cf3-42ae-9802-439073672acf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6865b6457d-bh7vn" Apr 24 16:42:21.148516 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:21.148483 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/fd17e30d-8cf3-42ae-9802-439073672acf-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6865b6457d-bh7vn\" (UID: \"fd17e30d-8cf3-42ae-9802-439073672acf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6865b6457d-bh7vn" Apr 24 16:42:21.148744 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:21.148720 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/fd17e30d-8cf3-42ae-9802-439073672acf-ca\") pod \"cluster-proxy-proxy-agent-6865b6457d-bh7vn\" (UID: \"fd17e30d-8cf3-42ae-9802-439073672acf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6865b6457d-bh7vn" Apr 24 16:42:21.153545 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:21.153516 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9htg6\" (UniqueName: \"kubernetes.io/projected/fd17e30d-8cf3-42ae-9802-439073672acf-kube-api-access-9htg6\") pod \"cluster-proxy-proxy-agent-6865b6457d-bh7vn\" (UID: \"fd17e30d-8cf3-42ae-9802-439073672acf\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6865b6457d-bh7vn" Apr 24 16:42:21.230325 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:21.230243 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6865b6457d-bh7vn" Apr 24 16:42:21.251323 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:21.251270 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-587b79bf5-jk4v9"] Apr 24 16:42:21.254030 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:42:21.253977 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51ed9176_938e_4fec_a0d7_29b14f7ceab7.slice/crio-7e48746c3ec41249895fd382b5f88709fbf50ee5240c6725e3f2a590bf0452b5 WatchSource:0}: Error finding container 7e48746c3ec41249895fd382b5f88709fbf50ee5240c6725e3f2a590bf0452b5: Status 404 returned error can't find the container with id 7e48746c3ec41249895fd382b5f88709fbf50ee5240c6725e3f2a590bf0452b5 Apr 24 16:42:21.356026 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:21.355997 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6865b6457d-bh7vn"] Apr 24 16:42:21.358509 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:42:21.358482 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd17e30d_8cf3_42ae_9802_439073672acf.slice/crio-81b7f638ff515b6835b101f01710c71f6a42d2074164a450217bd8fcb4ff3174 WatchSource:0}: Error finding container 81b7f638ff515b6835b101f01710c71f6a42d2074164a450217bd8fcb4ff3174: Status 404 returned error can't find the container with id 81b7f638ff515b6835b101f01710c71f6a42d2074164a450217bd8fcb4ff3174 Apr 24 16:42:22.001938 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:22.001891 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6865b6457d-bh7vn" 
event={"ID":"fd17e30d-8cf3-42ae-9802-439073672acf","Type":"ContainerStarted","Data":"81b7f638ff515b6835b101f01710c71f6a42d2074164a450217bd8fcb4ff3174"} Apr 24 16:42:22.004372 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:22.004330 2578 generic.go:358] "Generic (PLEG): container finished" podID="2b6ec65d-498d-4e11-8200-cab7bd4092e2" containerID="104f8738356b3b0ea7acb3dabd0205c98adfe2d2ac8e7913d723ed0b930c4068" exitCode=0 Apr 24 16:42:22.004508 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:22.004423 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4hjf" event={"ID":"2b6ec65d-498d-4e11-8200-cab7bd4092e2","Type":"ContainerDied","Data":"104f8738356b3b0ea7acb3dabd0205c98adfe2d2ac8e7913d723ed0b930c4068"} Apr 24 16:42:22.006236 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:22.006143 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-587b79bf5-jk4v9" event={"ID":"51ed9176-938e-4fec-a0d7-29b14f7ceab7","Type":"ContainerStarted","Data":"7e48746c3ec41249895fd382b5f88709fbf50ee5240c6725e3f2a590bf0452b5"} Apr 24 16:42:26.022434 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:26.022401 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6865b6457d-bh7vn" event={"ID":"fd17e30d-8cf3-42ae-9802-439073672acf","Type":"ContainerStarted","Data":"c73e60cf82a37b8bb12e694e0b556b7e20afd196aa817afc7c3abb01a014b638"} Apr 24 16:42:26.023933 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:26.023908 2578 generic.go:358] "Generic (PLEG): container finished" podID="2b6ec65d-498d-4e11-8200-cab7bd4092e2" containerID="668c0e118874b205d595d6ea66da26e55526a188b390610498666ac13465864a" exitCode=0 Apr 24 16:42:26.024052 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:26.023982 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4hjf" event={"ID":"2b6ec65d-498d-4e11-8200-cab7bd4092e2","Type":"ContainerDied","Data":"668c0e118874b205d595d6ea66da26e55526a188b390610498666ac13465864a"} Apr 24 16:42:26.025214 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:26.025150 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-587b79bf5-jk4v9" event={"ID":"51ed9176-938e-4fec-a0d7-29b14f7ceab7","Type":"ContainerStarted","Data":"443a1942d6ef1bf2399bada8b046dfc01c95648fb57345a0affb2f5c5825fb50"} Apr 24 16:42:26.082932 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:26.082884 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-587b79bf5-jk4v9" podStartSLOduration=1.828658199 podStartE2EDuration="6.08287057s" podCreationTimestamp="2026-04-24 16:42:20 +0000 UTC" firstStartedPulling="2026-04-24 16:42:21.256099831 +0000 UTC m=+211.627804038" lastFinishedPulling="2026-04-24 16:42:25.510312196 +0000 UTC m=+215.882016409" observedRunningTime="2026-04-24 16:42:26.081683814 +0000 UTC m=+216.453388041" watchObservedRunningTime="2026-04-24 16:42:26.08287057 +0000 UTC m=+216.454574796" Apr 24 16:42:28.034707 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:28.034671 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6865b6457d-bh7vn" event={"ID":"fd17e30d-8cf3-42ae-9802-439073672acf","Type":"ContainerStarted","Data":"41e1be0660e487ac1869ffa461bf594eee087a408f233f43ff41e21bc21825ea"} Apr 24 16:42:28.034707 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:28.034713 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6865b6457d-bh7vn" 
event={"ID":"fd17e30d-8cf3-42ae-9802-439073672acf","Type":"ContainerStarted","Data":"f88282e91edc5bc61141cff4b97249b5ee7153f4a051382773356ba54aa74dbd"} Apr 24 16:42:28.054566 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:28.054500 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6865b6457d-bh7vn" podStartSLOduration=1.770259141 podStartE2EDuration="8.054484779s" podCreationTimestamp="2026-04-24 16:42:20 +0000 UTC" firstStartedPulling="2026-04-24 16:42:21.360690013 +0000 UTC m=+211.732394220" lastFinishedPulling="2026-04-24 16:42:27.64491564 +0000 UTC m=+218.016619858" observedRunningTime="2026-04-24 16:42:28.053696484 +0000 UTC m=+218.425400711" watchObservedRunningTime="2026-04-24 16:42:28.054484779 +0000 UTC m=+218.426188999" Apr 24 16:42:33.052976 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:33.052935 2578 generic.go:358] "Generic (PLEG): container finished" podID="2b6ec65d-498d-4e11-8200-cab7bd4092e2" containerID="067c84983bbef099cc14049afddf2deee3f86d14cd2fe9e06d89cb13c72e9262" exitCode=0 Apr 24 16:42:33.053441 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:33.052983 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4hjf" event={"ID":"2b6ec65d-498d-4e11-8200-cab7bd4092e2","Type":"ContainerDied","Data":"067c84983bbef099cc14049afddf2deee3f86d14cd2fe9e06d89cb13c72e9262"} Apr 24 16:42:34.178741 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:34.178719 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4hjf" Apr 24 16:42:34.261824 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:34.261796 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtrdz\" (UniqueName: \"kubernetes.io/projected/2b6ec65d-498d-4e11-8200-cab7bd4092e2-kube-api-access-rtrdz\") pod \"2b6ec65d-498d-4e11-8200-cab7bd4092e2\" (UID: \"2b6ec65d-498d-4e11-8200-cab7bd4092e2\") " Apr 24 16:42:34.261984 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:34.261856 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b6ec65d-498d-4e11-8200-cab7bd4092e2-bundle\") pod \"2b6ec65d-498d-4e11-8200-cab7bd4092e2\" (UID: \"2b6ec65d-498d-4e11-8200-cab7bd4092e2\") " Apr 24 16:42:34.261984 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:34.261893 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b6ec65d-498d-4e11-8200-cab7bd4092e2-util\") pod \"2b6ec65d-498d-4e11-8200-cab7bd4092e2\" (UID: \"2b6ec65d-498d-4e11-8200-cab7bd4092e2\") " Apr 24 16:42:34.262397 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:34.262372 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b6ec65d-498d-4e11-8200-cab7bd4092e2-bundle" (OuterVolumeSpecName: "bundle") pod "2b6ec65d-498d-4e11-8200-cab7bd4092e2" (UID: "2b6ec65d-498d-4e11-8200-cab7bd4092e2"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:42:34.264159 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:34.264135 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b6ec65d-498d-4e11-8200-cab7bd4092e2-kube-api-access-rtrdz" (OuterVolumeSpecName: "kube-api-access-rtrdz") pod "2b6ec65d-498d-4e11-8200-cab7bd4092e2" (UID: "2b6ec65d-498d-4e11-8200-cab7bd4092e2"). InnerVolumeSpecName "kube-api-access-rtrdz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:42:34.267014 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:34.266988 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b6ec65d-498d-4e11-8200-cab7bd4092e2-util" (OuterVolumeSpecName: "util") pod "2b6ec65d-498d-4e11-8200-cab7bd4092e2" (UID: "2b6ec65d-498d-4e11-8200-cab7bd4092e2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:42:34.363402 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:34.363319 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rtrdz\" (UniqueName: \"kubernetes.io/projected/2b6ec65d-498d-4e11-8200-cab7bd4092e2-kube-api-access-rtrdz\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\"" Apr 24 16:42:34.363402 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:34.363354 2578 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2b6ec65d-498d-4e11-8200-cab7bd4092e2-bundle\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\"" Apr 24 16:42:34.363402 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:34.363363 2578 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2b6ec65d-498d-4e11-8200-cab7bd4092e2-util\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\"" Apr 24 16:42:35.059314 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:35.059262 2578 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4hjf" event={"ID":"2b6ec65d-498d-4e11-8200-cab7bd4092e2","Type":"ContainerDied","Data":"078c5f8b2bb5f81eb6182b0ea9e6a717c0096890f74e066458b3fc35179d6daa"} Apr 24 16:42:35.059314 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:35.059312 2578 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="078c5f8b2bb5f81eb6182b0ea9e6a717c0096890f74e066458b3fc35179d6daa" Apr 24 16:42:35.059517 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:42:35.059337 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cr4hjf" Apr 24 16:43:50.110134 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:43:50.110109 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-f2l2f_45369a43-7ed3-4f16-a1dc-7f1a61e06fc4/console-operator/1.log" Apr 24 16:43:50.110134 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:43:50.110118 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-f2l2f_45369a43-7ed3-4f16-a1dc-7f1a61e06fc4/console-operator/1.log" Apr 24 16:43:50.113660 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:43:50.113635 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgfp8_76a7c3cf-c152-4bdc-8d94-50d4af52aeee/ovn-acl-logging/0.log" Apr 24 16:43:50.113783 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:43:50.113636 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgfp8_76a7c3cf-c152-4bdc-8d94-50d4af52aeee/ovn-acl-logging/0.log" Apr 24 16:43:50.124771 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:43:50.124744 2578 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 16:45:38.327127 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:45:38.327097 
2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9"] Apr 24 16:45:38.327590 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:45:38.327400 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b6ec65d-498d-4e11-8200-cab7bd4092e2" containerName="util" Apr 24 16:45:38.327590 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:45:38.327411 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b6ec65d-498d-4e11-8200-cab7bd4092e2" containerName="util" Apr 24 16:45:38.327590 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:45:38.327418 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b6ec65d-498d-4e11-8200-cab7bd4092e2" containerName="pull" Apr 24 16:45:38.327590 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:45:38.327424 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b6ec65d-498d-4e11-8200-cab7bd4092e2" containerName="pull" Apr 24 16:45:38.327590 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:45:38.327445 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b6ec65d-498d-4e11-8200-cab7bd4092e2" containerName="extract" Apr 24 16:45:38.327590 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:45:38.327451 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b6ec65d-498d-4e11-8200-cab7bd4092e2" containerName="extract" Apr 24 16:45:38.327590 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:45:38.327495 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b6ec65d-498d-4e11-8200-cab7bd4092e2" containerName="extract" Apr 24 16:45:38.332895 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:45:38.331205 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" Apr 24 16:45:38.335109 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:45:38.335086 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-gd6p9\"" Apr 24 16:45:38.335238 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:45:38.335145 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 16:45:38.335238 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:45:38.335155 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-kube-rbac-proxy-sar-config\"" Apr 24 16:45:38.335440 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:45:38.335426 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-predictor-serving-cert\"" Apr 24 16:45:38.335489 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:45:38.335442 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 16:45:38.342592 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:45:38.342572 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9"] Apr 24 16:45:38.344693 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:45:38.344660 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ca4f3ee8-0572-4b8c-9308-badb39f9cea2-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-jjzm9\" (UID: \"ca4f3ee8-0572-4b8c-9308-badb39f9cea2\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" Apr 24 16:45:38.344771 ip-10-0-131-47 kubenswrapper[2578]: I0424 
16:45:38.344729 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca4f3ee8-0572-4b8c-9308-badb39f9cea2-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-jjzm9\" (UID: \"ca4f3ee8-0572-4b8c-9308-badb39f9cea2\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" Apr 24 16:45:38.344808 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:45:38.344779 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca4f3ee8-0572-4b8c-9308-badb39f9cea2-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-jjzm9\" (UID: \"ca4f3ee8-0572-4b8c-9308-badb39f9cea2\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" Apr 24 16:45:38.344882 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:45:38.344862 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l82x\" (UniqueName: \"kubernetes.io/projected/ca4f3ee8-0572-4b8c-9308-badb39f9cea2-kube-api-access-4l82x\") pod \"isvc-xgboost-graph-predictor-669d8d6456-jjzm9\" (UID: \"ca4f3ee8-0572-4b8c-9308-badb39f9cea2\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" Apr 24 16:45:38.445886 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:45:38.445844 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ca4f3ee8-0572-4b8c-9308-badb39f9cea2-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-jjzm9\" (UID: \"ca4f3ee8-0572-4b8c-9308-badb39f9cea2\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" Apr 24 16:45:38.446059 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:45:38.445895 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca4f3ee8-0572-4b8c-9308-badb39f9cea2-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-jjzm9\" (UID: \"ca4f3ee8-0572-4b8c-9308-badb39f9cea2\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" Apr 24 16:45:38.446059 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:45:38.445922 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca4f3ee8-0572-4b8c-9308-badb39f9cea2-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-jjzm9\" (UID: \"ca4f3ee8-0572-4b8c-9308-badb39f9cea2\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" Apr 24 16:45:38.446059 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:45:38.445955 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4l82x\" (UniqueName: \"kubernetes.io/projected/ca4f3ee8-0572-4b8c-9308-badb39f9cea2-kube-api-access-4l82x\") pod \"isvc-xgboost-graph-predictor-669d8d6456-jjzm9\" (UID: \"ca4f3ee8-0572-4b8c-9308-badb39f9cea2\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" Apr 24 16:45:38.446178 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:45:38.446086 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-graph-predictor-serving-cert: secret "isvc-xgboost-graph-predictor-serving-cert" not found Apr 24 16:45:38.446178 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:45:38.446148 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca4f3ee8-0572-4b8c-9308-badb39f9cea2-proxy-tls podName:ca4f3ee8-0572-4b8c-9308-badb39f9cea2 nodeName:}" failed. No retries permitted until 2026-04-24 16:45:38.946132782 +0000 UTC m=+409.317836986 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/ca4f3ee8-0572-4b8c-9308-badb39f9cea2-proxy-tls") pod "isvc-xgboost-graph-predictor-669d8d6456-jjzm9" (UID: "ca4f3ee8-0572-4b8c-9308-badb39f9cea2") : secret "isvc-xgboost-graph-predictor-serving-cert" not found Apr 24 16:45:38.446347 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:45:38.446331 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca4f3ee8-0572-4b8c-9308-badb39f9cea2-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-jjzm9\" (UID: \"ca4f3ee8-0572-4b8c-9308-badb39f9cea2\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" Apr 24 16:45:38.446674 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:45:38.446656 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ca4f3ee8-0572-4b8c-9308-badb39f9cea2-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-jjzm9\" (UID: \"ca4f3ee8-0572-4b8c-9308-badb39f9cea2\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" Apr 24 16:45:38.456997 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:45:38.456966 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l82x\" (UniqueName: \"kubernetes.io/projected/ca4f3ee8-0572-4b8c-9308-badb39f9cea2-kube-api-access-4l82x\") pod \"isvc-xgboost-graph-predictor-669d8d6456-jjzm9\" (UID: \"ca4f3ee8-0572-4b8c-9308-badb39f9cea2\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" Apr 24 16:45:38.948720 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:45:38.948687 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca4f3ee8-0572-4b8c-9308-badb39f9cea2-proxy-tls\") pod 
\"isvc-xgboost-graph-predictor-669d8d6456-jjzm9\" (UID: \"ca4f3ee8-0572-4b8c-9308-badb39f9cea2\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" Apr 24 16:45:38.951241 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:45:38.951221 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca4f3ee8-0572-4b8c-9308-badb39f9cea2-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-jjzm9\" (UID: \"ca4f3ee8-0572-4b8c-9308-badb39f9cea2\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" Apr 24 16:45:39.243360 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:45:39.243248 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" Apr 24 16:45:39.367591 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:45:39.367565 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9"] Apr 24 16:45:39.370054 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:45:39.370025 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca4f3ee8_0572_4b8c_9308_badb39f9cea2.slice/crio-c5042c6dc44b5826a558597ae89d7730cc63209c11eb1fc88c9037aac947034b WatchSource:0}: Error finding container c5042c6dc44b5826a558597ae89d7730cc63209c11eb1fc88c9037aac947034b: Status 404 returned error can't find the container with id c5042c6dc44b5826a558597ae89d7730cc63209c11eb1fc88c9037aac947034b Apr 24 16:45:39.371970 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:45:39.371955 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 16:45:39.585610 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:45:39.585518 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" 
event={"ID":"ca4f3ee8-0572-4b8c-9308-badb39f9cea2","Type":"ContainerStarted","Data":"c5042c6dc44b5826a558597ae89d7730cc63209c11eb1fc88c9037aac947034b"} Apr 24 16:45:43.598672 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:45:43.598630 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" event={"ID":"ca4f3ee8-0572-4b8c-9308-badb39f9cea2","Type":"ContainerStarted","Data":"c918962c24a7e1db7f6fd57a68d491d377cc932fcfddf1dc639f59fb8270db26"} Apr 24 16:45:47.611526 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:45:47.611487 2578 generic.go:358] "Generic (PLEG): container finished" podID="ca4f3ee8-0572-4b8c-9308-badb39f9cea2" containerID="c918962c24a7e1db7f6fd57a68d491d377cc932fcfddf1dc639f59fb8270db26" exitCode=0 Apr 24 16:45:47.611900 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:45:47.611560 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" event={"ID":"ca4f3ee8-0572-4b8c-9308-badb39f9cea2","Type":"ContainerDied","Data":"c918962c24a7e1db7f6fd57a68d491d377cc932fcfddf1dc639f59fb8270db26"} Apr 24 16:46:06.670703 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:46:06.670644 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" event={"ID":"ca4f3ee8-0572-4b8c-9308-badb39f9cea2","Type":"ContainerStarted","Data":"5207d7bdda27207ac2178852cf70fb075d306556dae3d8bec5584300541f80a1"} Apr 24 16:46:08.683066 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:46:08.683028 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" event={"ID":"ca4f3ee8-0572-4b8c-9308-badb39f9cea2","Type":"ContainerStarted","Data":"d283ec59c28b01e5cd9f53628e1decd0b3100cd1b01f1aaf3a11ef00d057ba95"} Apr 24 16:46:08.683458 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:46:08.683195 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" Apr 24 16:46:09.686140 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:46:09.686111 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" Apr 24 16:46:09.687232 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:46:09.687204 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" podUID="ca4f3ee8-0572-4b8c-9308-badb39f9cea2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 24 16:46:10.689268 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:46:10.689222 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" podUID="ca4f3ee8-0572-4b8c-9308-badb39f9cea2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 24 16:46:15.694128 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:46:15.694099 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" Apr 24 16:46:15.694748 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:46:15.694718 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" podUID="ca4f3ee8-0572-4b8c-9308-badb39f9cea2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 24 16:46:15.726090 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:46:15.726042 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" podStartSLOduration=8.618700591 podStartE2EDuration="37.726027422s" podCreationTimestamp="2026-04-24 
16:45:38 +0000 UTC" firstStartedPulling="2026-04-24 16:45:39.372079603 +0000 UTC m=+409.743783807" lastFinishedPulling="2026-04-24 16:46:08.479406431 +0000 UTC m=+438.851110638" observedRunningTime="2026-04-24 16:46:08.71215738 +0000 UTC m=+439.083861606" watchObservedRunningTime="2026-04-24 16:46:15.726027422 +0000 UTC m=+446.097731647" Apr 24 16:46:25.695090 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:46:25.695046 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" podUID="ca4f3ee8-0572-4b8c-9308-badb39f9cea2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 24 16:46:35.695488 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:46:35.695444 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" podUID="ca4f3ee8-0572-4b8c-9308-badb39f9cea2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 24 16:46:45.695529 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:46:45.695489 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" podUID="ca4f3ee8-0572-4b8c-9308-badb39f9cea2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 24 16:46:55.695341 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:46:55.695273 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" podUID="ca4f3ee8-0572-4b8c-9308-badb39f9cea2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:8080: connect: connection refused" Apr 24 16:46:57.723175 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:46:57.723129 2578 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/switch-graph-32209-6559f64bb7-gspth"] Apr 24 16:46:57.727452 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:46:57.727429 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-32209-6559f64bb7-gspth" Apr 24 16:46:57.731125 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:46:57.731101 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-32209-kube-rbac-proxy-sar-config\"" Apr 24 16:46:57.731788 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:46:57.731769 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-32209-serving-cert\"" Apr 24 16:46:57.735484 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:46:57.735462 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-32209-6559f64bb7-gspth"] Apr 24 16:46:57.762630 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:46:57.762599 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09fa39fb-000b-4fe7-9efa-229106525764-openshift-service-ca-bundle\") pod \"switch-graph-32209-6559f64bb7-gspth\" (UID: \"09fa39fb-000b-4fe7-9efa-229106525764\") " pod="kserve-ci-e2e-test/switch-graph-32209-6559f64bb7-gspth" Apr 24 16:46:57.762779 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:46:57.762649 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/09fa39fb-000b-4fe7-9efa-229106525764-proxy-tls\") pod \"switch-graph-32209-6559f64bb7-gspth\" (UID: \"09fa39fb-000b-4fe7-9efa-229106525764\") " pod="kserve-ci-e2e-test/switch-graph-32209-6559f64bb7-gspth" Apr 24 16:46:57.864001 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:46:57.863955 2578 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09fa39fb-000b-4fe7-9efa-229106525764-openshift-service-ca-bundle\") pod \"switch-graph-32209-6559f64bb7-gspth\" (UID: \"09fa39fb-000b-4fe7-9efa-229106525764\") " pod="kserve-ci-e2e-test/switch-graph-32209-6559f64bb7-gspth" Apr 24 16:46:57.864174 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:46:57.864042 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/09fa39fb-000b-4fe7-9efa-229106525764-proxy-tls\") pod \"switch-graph-32209-6559f64bb7-gspth\" (UID: \"09fa39fb-000b-4fe7-9efa-229106525764\") " pod="kserve-ci-e2e-test/switch-graph-32209-6559f64bb7-gspth" Apr 24 16:46:57.864174 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:46:57.864162 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-32209-serving-cert: secret "switch-graph-32209-serving-cert" not found Apr 24 16:46:57.864271 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:46:57.864257 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09fa39fb-000b-4fe7-9efa-229106525764-proxy-tls podName:09fa39fb-000b-4fe7-9efa-229106525764 nodeName:}" failed. No retries permitted until 2026-04-24 16:46:58.364231184 +0000 UTC m=+488.735935406 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/09fa39fb-000b-4fe7-9efa-229106525764-proxy-tls") pod "switch-graph-32209-6559f64bb7-gspth" (UID: "09fa39fb-000b-4fe7-9efa-229106525764") : secret "switch-graph-32209-serving-cert" not found Apr 24 16:46:57.864736 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:46:57.864710 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09fa39fb-000b-4fe7-9efa-229106525764-openshift-service-ca-bundle\") pod \"switch-graph-32209-6559f64bb7-gspth\" (UID: \"09fa39fb-000b-4fe7-9efa-229106525764\") " pod="kserve-ci-e2e-test/switch-graph-32209-6559f64bb7-gspth" Apr 24 16:46:58.368869 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:46:58.368810 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/09fa39fb-000b-4fe7-9efa-229106525764-proxy-tls\") pod \"switch-graph-32209-6559f64bb7-gspth\" (UID: \"09fa39fb-000b-4fe7-9efa-229106525764\") " pod="kserve-ci-e2e-test/switch-graph-32209-6559f64bb7-gspth" Apr 24 16:46:58.371548 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:46:58.371512 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/09fa39fb-000b-4fe7-9efa-229106525764-proxy-tls\") pod \"switch-graph-32209-6559f64bb7-gspth\" (UID: \"09fa39fb-000b-4fe7-9efa-229106525764\") " pod="kserve-ci-e2e-test/switch-graph-32209-6559f64bb7-gspth" Apr 24 16:46:58.637501 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:46:58.637395 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-32209-6559f64bb7-gspth" Apr 24 16:46:58.763767 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:46:58.763739 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-32209-6559f64bb7-gspth"] Apr 24 16:46:58.766770 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:46:58.766730 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09fa39fb_000b_4fe7_9efa_229106525764.slice/crio-63f2e8578e893fdb6addd8e7505cf03accb245ebfe254febccbaf8090fceb2c5 WatchSource:0}: Error finding container 63f2e8578e893fdb6addd8e7505cf03accb245ebfe254febccbaf8090fceb2c5: Status 404 returned error can't find the container with id 63f2e8578e893fdb6addd8e7505cf03accb245ebfe254febccbaf8090fceb2c5 Apr 24 16:46:58.824905 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:46:58.824866 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-32209-6559f64bb7-gspth" event={"ID":"09fa39fb-000b-4fe7-9efa-229106525764","Type":"ContainerStarted","Data":"63f2e8578e893fdb6addd8e7505cf03accb245ebfe254febccbaf8090fceb2c5"} Apr 24 16:47:01.835491 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:01.835459 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-32209-6559f64bb7-gspth" event={"ID":"09fa39fb-000b-4fe7-9efa-229106525764","Type":"ContainerStarted","Data":"b603434578986c884c26904ca071d7a18caa7452bfeea1e285016ff85da30d02"} Apr 24 16:47:01.835898 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:01.835577 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-32209-6559f64bb7-gspth" Apr 24 16:47:01.855015 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:01.854964 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-32209-6559f64bb7-gspth" 
podStartSLOduration=2.565834571 podStartE2EDuration="4.854949764s" podCreationTimestamp="2026-04-24 16:46:57 +0000 UTC" firstStartedPulling="2026-04-24 16:46:58.768520978 +0000 UTC m=+489.140225186" lastFinishedPulling="2026-04-24 16:47:01.057636173 +0000 UTC m=+491.429340379" observedRunningTime="2026-04-24 16:47:01.853839719 +0000 UTC m=+492.225543944" watchObservedRunningTime="2026-04-24 16:47:01.854949764 +0000 UTC m=+492.226653989" Apr 24 16:47:05.696092 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:05.696060 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" Apr 24 16:47:07.844049 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:07.844016 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-32209-6559f64bb7-gspth" Apr 24 16:47:11.925929 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:11.925892 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-32209-6559f64bb7-gspth"] Apr 24 16:47:11.926355 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:11.926096 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-32209-6559f64bb7-gspth" podUID="09fa39fb-000b-4fe7-9efa-229106525764" containerName="switch-graph-32209" containerID="cri-o://b603434578986c884c26904ca071d7a18caa7452bfeea1e285016ff85da30d02" gracePeriod=30 Apr 24 16:47:12.842200 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:12.842158 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-32209-6559f64bb7-gspth" podUID="09fa39fb-000b-4fe7-9efa-229106525764" containerName="switch-graph-32209" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:47:17.842197 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:17.842161 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/switch-graph-32209-6559f64bb7-gspth" podUID="09fa39fb-000b-4fe7-9efa-229106525764" containerName="switch-graph-32209" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:47:22.842005 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:22.841958 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-32209-6559f64bb7-gspth" podUID="09fa39fb-000b-4fe7-9efa-229106525764" containerName="switch-graph-32209" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:47:22.842402 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:22.842073 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-32209-6559f64bb7-gspth" Apr 24 16:47:27.841906 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:27.841860 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-32209-6559f64bb7-gspth" podUID="09fa39fb-000b-4fe7-9efa-229106525764" containerName="switch-graph-32209" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:47:32.842445 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:32.842401 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-32209-6559f64bb7-gspth" podUID="09fa39fb-000b-4fe7-9efa-229106525764" containerName="switch-graph-32209" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:47:37.775704 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:37.775623 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-787f8b7fc6-767sd"] Apr 24 16:47:37.780115 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:37.780095 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-787f8b7fc6-767sd" Apr 24 16:47:37.783471 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:37.783451 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-serving-cert\"" Apr 24 16:47:37.783576 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:37.783454 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-kube-rbac-proxy-sar-config\"" Apr 24 16:47:37.788264 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:37.788244 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-787f8b7fc6-767sd"] Apr 24 16:47:37.841787 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:37.841746 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-32209-6559f64bb7-gspth" podUID="09fa39fb-000b-4fe7-9efa-229106525764" containerName="switch-graph-32209" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:47:37.883108 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:37.883072 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8140754c-4236-412b-97ff-b990cd982c2f-proxy-tls\") pod \"model-chainer-787f8b7fc6-767sd\" (UID: \"8140754c-4236-412b-97ff-b990cd982c2f\") " pod="kserve-ci-e2e-test/model-chainer-787f8b7fc6-767sd" Apr 24 16:47:37.883328 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:37.883149 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8140754c-4236-412b-97ff-b990cd982c2f-openshift-service-ca-bundle\") pod \"model-chainer-787f8b7fc6-767sd\" (UID: \"8140754c-4236-412b-97ff-b990cd982c2f\") " pod="kserve-ci-e2e-test/model-chainer-787f8b7fc6-767sd" Apr 24 16:47:37.984083 ip-10-0-131-47 
kubenswrapper[2578]: I0424 16:47:37.984042 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8140754c-4236-412b-97ff-b990cd982c2f-openshift-service-ca-bundle\") pod \"model-chainer-787f8b7fc6-767sd\" (UID: \"8140754c-4236-412b-97ff-b990cd982c2f\") " pod="kserve-ci-e2e-test/model-chainer-787f8b7fc6-767sd" Apr 24 16:47:37.984262 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:37.984106 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8140754c-4236-412b-97ff-b990cd982c2f-proxy-tls\") pod \"model-chainer-787f8b7fc6-767sd\" (UID: \"8140754c-4236-412b-97ff-b990cd982c2f\") " pod="kserve-ci-e2e-test/model-chainer-787f8b7fc6-767sd" Apr 24 16:47:37.984262 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:47:37.984214 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-serving-cert: secret "model-chainer-serving-cert" not found Apr 24 16:47:37.984574 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:47:37.984275 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8140754c-4236-412b-97ff-b990cd982c2f-proxy-tls podName:8140754c-4236-412b-97ff-b990cd982c2f nodeName:}" failed. No retries permitted until 2026-04-24 16:47:38.484258186 +0000 UTC m=+528.855962391 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/8140754c-4236-412b-97ff-b990cd982c2f-proxy-tls") pod "model-chainer-787f8b7fc6-767sd" (UID: "8140754c-4236-412b-97ff-b990cd982c2f") : secret "model-chainer-serving-cert" not found Apr 24 16:47:37.984737 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:37.984714 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8140754c-4236-412b-97ff-b990cd982c2f-openshift-service-ca-bundle\") pod \"model-chainer-787f8b7fc6-767sd\" (UID: \"8140754c-4236-412b-97ff-b990cd982c2f\") " pod="kserve-ci-e2e-test/model-chainer-787f8b7fc6-767sd" Apr 24 16:47:38.489430 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:38.489393 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8140754c-4236-412b-97ff-b990cd982c2f-proxy-tls\") pod \"model-chainer-787f8b7fc6-767sd\" (UID: \"8140754c-4236-412b-97ff-b990cd982c2f\") " pod="kserve-ci-e2e-test/model-chainer-787f8b7fc6-767sd" Apr 24 16:47:38.491941 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:38.491921 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8140754c-4236-412b-97ff-b990cd982c2f-proxy-tls\") pod \"model-chainer-787f8b7fc6-767sd\" (UID: \"8140754c-4236-412b-97ff-b990cd982c2f\") " pod="kserve-ci-e2e-test/model-chainer-787f8b7fc6-767sd" Apr 24 16:47:38.690797 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:38.690757 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-787f8b7fc6-767sd" Apr 24 16:47:38.813696 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:38.813664 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-787f8b7fc6-767sd"] Apr 24 16:47:38.817244 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:47:38.817209 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8140754c_4236_412b_97ff_b990cd982c2f.slice/crio-bb4bbf2d9cd627ce6f72f36d557e7f8ef7a8b1fd8ee1858716d2ab20fed67f2f WatchSource:0}: Error finding container bb4bbf2d9cd627ce6f72f36d557e7f8ef7a8b1fd8ee1858716d2ab20fed67f2f: Status 404 returned error can't find the container with id bb4bbf2d9cd627ce6f72f36d557e7f8ef7a8b1fd8ee1858716d2ab20fed67f2f Apr 24 16:47:38.942719 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:38.942677 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-787f8b7fc6-767sd" event={"ID":"8140754c-4236-412b-97ff-b990cd982c2f","Type":"ContainerStarted","Data":"d5ca8f4ec29a9586cd7c54db70d975649d576dd47cabcef027c03c77f1e7d663"} Apr 24 16:47:38.942719 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:38.942721 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-787f8b7fc6-767sd" event={"ID":"8140754c-4236-412b-97ff-b990cd982c2f","Type":"ContainerStarted","Data":"bb4bbf2d9cd627ce6f72f36d557e7f8ef7a8b1fd8ee1858716d2ab20fed67f2f"} Apr 24 16:47:38.942934 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:38.942811 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-787f8b7fc6-767sd" Apr 24 16:47:38.960940 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:38.960883 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-787f8b7fc6-767sd" podStartSLOduration=1.960864028 
podStartE2EDuration="1.960864028s" podCreationTimestamp="2026-04-24 16:47:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:47:38.959731356 +0000 UTC m=+529.331435582" watchObservedRunningTime="2026-04-24 16:47:38.960864028 +0000 UTC m=+529.332568256" Apr 24 16:47:41.954980 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:41.954871 2578 generic.go:358] "Generic (PLEG): container finished" podID="09fa39fb-000b-4fe7-9efa-229106525764" containerID="b603434578986c884c26904ca071d7a18caa7452bfeea1e285016ff85da30d02" exitCode=0 Apr 24 16:47:41.954980 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:41.954960 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-32209-6559f64bb7-gspth" event={"ID":"09fa39fb-000b-4fe7-9efa-229106525764","Type":"ContainerDied","Data":"b603434578986c884c26904ca071d7a18caa7452bfeea1e285016ff85da30d02"} Apr 24 16:47:42.070294 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:42.070266 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-32209-6559f64bb7-gspth" Apr 24 16:47:42.220270 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:42.220173 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/09fa39fb-000b-4fe7-9efa-229106525764-proxy-tls\") pod \"09fa39fb-000b-4fe7-9efa-229106525764\" (UID: \"09fa39fb-000b-4fe7-9efa-229106525764\") " Apr 24 16:47:42.220270 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:42.220240 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09fa39fb-000b-4fe7-9efa-229106525764-openshift-service-ca-bundle\") pod \"09fa39fb-000b-4fe7-9efa-229106525764\" (UID: \"09fa39fb-000b-4fe7-9efa-229106525764\") " Apr 24 16:47:42.220617 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:42.220585 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09fa39fb-000b-4fe7-9efa-229106525764-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "09fa39fb-000b-4fe7-9efa-229106525764" (UID: "09fa39fb-000b-4fe7-9efa-229106525764"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:47:42.222576 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:42.222548 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09fa39fb-000b-4fe7-9efa-229106525764-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "09fa39fb-000b-4fe7-9efa-229106525764" (UID: "09fa39fb-000b-4fe7-9efa-229106525764"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:47:42.321342 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:42.321305 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09fa39fb-000b-4fe7-9efa-229106525764-openshift-service-ca-bundle\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\"" Apr 24 16:47:42.321342 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:42.321334 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/09fa39fb-000b-4fe7-9efa-229106525764-proxy-tls\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\"" Apr 24 16:47:42.959404 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:42.959367 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-32209-6559f64bb7-gspth" event={"ID":"09fa39fb-000b-4fe7-9efa-229106525764","Type":"ContainerDied","Data":"63f2e8578e893fdb6addd8e7505cf03accb245ebfe254febccbaf8090fceb2c5"} Apr 24 16:47:42.959404 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:42.959397 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-32209-6559f64bb7-gspth" Apr 24 16:47:42.959922 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:42.959422 2578 scope.go:117] "RemoveContainer" containerID="b603434578986c884c26904ca071d7a18caa7452bfeea1e285016ff85da30d02" Apr 24 16:47:42.977425 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:42.977402 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-32209-6559f64bb7-gspth"] Apr 24 16:47:42.982433 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:42.982410 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-32209-6559f64bb7-gspth"] Apr 24 16:47:44.236706 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:44.236672 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09fa39fb-000b-4fe7-9efa-229106525764" path="/var/lib/kubelet/pods/09fa39fb-000b-4fe7-9efa-229106525764/volumes" Apr 24 16:47:44.951691 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:44.951660 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-787f8b7fc6-767sd" Apr 24 16:47:47.849145 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:47.849116 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-787f8b7fc6-767sd"] Apr 24 16:47:47.849565 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:47.849327 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-787f8b7fc6-767sd" podUID="8140754c-4236-412b-97ff-b990cd982c2f" containerName="model-chainer" containerID="cri-o://d5ca8f4ec29a9586cd7c54db70d975649d576dd47cabcef027c03c77f1e7d663" gracePeriod=30 Apr 24 16:47:48.100010 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:48.099921 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9"] Apr 24 16:47:48.100252 ip-10-0-131-47 
kubenswrapper[2578]: I0424 16:47:48.100225 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" podUID="ca4f3ee8-0572-4b8c-9308-badb39f9cea2" containerName="kserve-container" containerID="cri-o://5207d7bdda27207ac2178852cf70fb075d306556dae3d8bec5584300541f80a1" gracePeriod=30 Apr 24 16:47:48.100340 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:48.100247 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" podUID="ca4f3ee8-0572-4b8c-9308-badb39f9cea2" containerName="kube-rbac-proxy" containerID="cri-o://d283ec59c28b01e5cd9f53628e1decd0b3100cd1b01f1aaf3a11ef00d057ba95" gracePeriod=30 Apr 24 16:47:48.979839 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:48.979803 2578 generic.go:358] "Generic (PLEG): container finished" podID="ca4f3ee8-0572-4b8c-9308-badb39f9cea2" containerID="d283ec59c28b01e5cd9f53628e1decd0b3100cd1b01f1aaf3a11ef00d057ba95" exitCode=2 Apr 24 16:47:48.980204 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:48.979863 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" event={"ID":"ca4f3ee8-0572-4b8c-9308-badb39f9cea2","Type":"ContainerDied","Data":"d283ec59c28b01e5cd9f53628e1decd0b3100cd1b01f1aaf3a11ef00d057ba95"} Apr 24 16:47:49.949893 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:49.949851 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-787f8b7fc6-767sd" podUID="8140754c-4236-412b-97ff-b990cd982c2f" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:47:50.689847 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:50.689803 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" 
podUID="ca4f3ee8-0572-4b8c-9308-badb39f9cea2" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.22:8643/healthz\": dial tcp 10.132.0.22:8643: connect: connection refused" Apr 24 16:47:51.743019 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:51.742995 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" Apr 24 16:47:51.896957 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:51.896870 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ca4f3ee8-0572-4b8c-9308-badb39f9cea2-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"ca4f3ee8-0572-4b8c-9308-badb39f9cea2\" (UID: \"ca4f3ee8-0572-4b8c-9308-badb39f9cea2\") " Apr 24 16:47:51.896957 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:51.896945 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4l82x\" (UniqueName: \"kubernetes.io/projected/ca4f3ee8-0572-4b8c-9308-badb39f9cea2-kube-api-access-4l82x\") pod \"ca4f3ee8-0572-4b8c-9308-badb39f9cea2\" (UID: \"ca4f3ee8-0572-4b8c-9308-badb39f9cea2\") " Apr 24 16:47:51.897155 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:51.897054 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca4f3ee8-0572-4b8c-9308-badb39f9cea2-kserve-provision-location\") pod \"ca4f3ee8-0572-4b8c-9308-badb39f9cea2\" (UID: \"ca4f3ee8-0572-4b8c-9308-badb39f9cea2\") " Apr 24 16:47:51.897155 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:51.897091 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca4f3ee8-0572-4b8c-9308-badb39f9cea2-proxy-tls\") pod \"ca4f3ee8-0572-4b8c-9308-badb39f9cea2\" (UID: 
\"ca4f3ee8-0572-4b8c-9308-badb39f9cea2\") " Apr 24 16:47:51.897402 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:51.897373 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca4f3ee8-0572-4b8c-9308-badb39f9cea2-isvc-xgboost-graph-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-kube-rbac-proxy-sar-config") pod "ca4f3ee8-0572-4b8c-9308-badb39f9cea2" (UID: "ca4f3ee8-0572-4b8c-9308-badb39f9cea2"). InnerVolumeSpecName "isvc-xgboost-graph-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:47:51.897402 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:51.897393 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca4f3ee8-0572-4b8c-9308-badb39f9cea2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ca4f3ee8-0572-4b8c-9308-badb39f9cea2" (UID: "ca4f3ee8-0572-4b8c-9308-badb39f9cea2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:47:51.899262 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:51.899231 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca4f3ee8-0572-4b8c-9308-badb39f9cea2-kube-api-access-4l82x" (OuterVolumeSpecName: "kube-api-access-4l82x") pod "ca4f3ee8-0572-4b8c-9308-badb39f9cea2" (UID: "ca4f3ee8-0572-4b8c-9308-badb39f9cea2"). InnerVolumeSpecName "kube-api-access-4l82x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:47:51.899377 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:51.899271 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca4f3ee8-0572-4b8c-9308-badb39f9cea2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ca4f3ee8-0572-4b8c-9308-badb39f9cea2" (UID: "ca4f3ee8-0572-4b8c-9308-badb39f9cea2"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:47:51.990548 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:51.990508 2578 generic.go:358] "Generic (PLEG): container finished" podID="ca4f3ee8-0572-4b8c-9308-badb39f9cea2" containerID="5207d7bdda27207ac2178852cf70fb075d306556dae3d8bec5584300541f80a1" exitCode=0 Apr 24 16:47:51.990709 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:51.990571 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" event={"ID":"ca4f3ee8-0572-4b8c-9308-badb39f9cea2","Type":"ContainerDied","Data":"5207d7bdda27207ac2178852cf70fb075d306556dae3d8bec5584300541f80a1"} Apr 24 16:47:51.990709 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:51.990602 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" Apr 24 16:47:51.990709 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:51.990609 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9" event={"ID":"ca4f3ee8-0572-4b8c-9308-badb39f9cea2","Type":"ContainerDied","Data":"c5042c6dc44b5826a558597ae89d7730cc63209c11eb1fc88c9037aac947034b"} Apr 24 16:47:51.990709 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:51.990630 2578 scope.go:117] "RemoveContainer" containerID="d283ec59c28b01e5cd9f53628e1decd0b3100cd1b01f1aaf3a11ef00d057ba95" Apr 24 16:47:51.997977 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:51.997951 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4l82x\" (UniqueName: \"kubernetes.io/projected/ca4f3ee8-0572-4b8c-9308-badb39f9cea2-kube-api-access-4l82x\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\"" Apr 24 16:47:51.997977 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:51.997977 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/ca4f3ee8-0572-4b8c-9308-badb39f9cea2-kserve-provision-location\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\"" Apr 24 16:47:51.998137 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:51.997988 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ca4f3ee8-0572-4b8c-9308-badb39f9cea2-proxy-tls\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\"" Apr 24 16:47:51.998137 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:51.998000 2578 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ca4f3ee8-0572-4b8c-9308-badb39f9cea2-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\"" Apr 24 16:47:51.998881 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:51.998862 2578 scope.go:117] "RemoveContainer" containerID="5207d7bdda27207ac2178852cf70fb075d306556dae3d8bec5584300541f80a1" Apr 24 16:47:52.007491 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:52.007469 2578 scope.go:117] "RemoveContainer" containerID="c918962c24a7e1db7f6fd57a68d491d377cc932fcfddf1dc639f59fb8270db26" Apr 24 16:47:52.011226 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:52.011203 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9"] Apr 24 16:47:52.013068 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:52.013049 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-jjzm9"] Apr 24 16:47:52.014994 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:52.014977 2578 scope.go:117] "RemoveContainer" containerID="d283ec59c28b01e5cd9f53628e1decd0b3100cd1b01f1aaf3a11ef00d057ba95" Apr 24 16:47:52.015275 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:47:52.015257 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"d283ec59c28b01e5cd9f53628e1decd0b3100cd1b01f1aaf3a11ef00d057ba95\": container with ID starting with d283ec59c28b01e5cd9f53628e1decd0b3100cd1b01f1aaf3a11ef00d057ba95 not found: ID does not exist" containerID="d283ec59c28b01e5cd9f53628e1decd0b3100cd1b01f1aaf3a11ef00d057ba95" Apr 24 16:47:52.015345 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:52.015308 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d283ec59c28b01e5cd9f53628e1decd0b3100cd1b01f1aaf3a11ef00d057ba95"} err="failed to get container status \"d283ec59c28b01e5cd9f53628e1decd0b3100cd1b01f1aaf3a11ef00d057ba95\": rpc error: code = NotFound desc = could not find container \"d283ec59c28b01e5cd9f53628e1decd0b3100cd1b01f1aaf3a11ef00d057ba95\": container with ID starting with d283ec59c28b01e5cd9f53628e1decd0b3100cd1b01f1aaf3a11ef00d057ba95 not found: ID does not exist" Apr 24 16:47:52.015345 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:52.015330 2578 scope.go:117] "RemoveContainer" containerID="5207d7bdda27207ac2178852cf70fb075d306556dae3d8bec5584300541f80a1" Apr 24 16:47:52.015556 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:47:52.015537 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5207d7bdda27207ac2178852cf70fb075d306556dae3d8bec5584300541f80a1\": container with ID starting with 5207d7bdda27207ac2178852cf70fb075d306556dae3d8bec5584300541f80a1 not found: ID does not exist" containerID="5207d7bdda27207ac2178852cf70fb075d306556dae3d8bec5584300541f80a1" Apr 24 16:47:52.015595 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:52.015565 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5207d7bdda27207ac2178852cf70fb075d306556dae3d8bec5584300541f80a1"} err="failed to get container status \"5207d7bdda27207ac2178852cf70fb075d306556dae3d8bec5584300541f80a1\": rpc error: code = NotFound desc = could not 
find container \"5207d7bdda27207ac2178852cf70fb075d306556dae3d8bec5584300541f80a1\": container with ID starting with 5207d7bdda27207ac2178852cf70fb075d306556dae3d8bec5584300541f80a1 not found: ID does not exist" Apr 24 16:47:52.015595 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:52.015581 2578 scope.go:117] "RemoveContainer" containerID="c918962c24a7e1db7f6fd57a68d491d377cc932fcfddf1dc639f59fb8270db26" Apr 24 16:47:52.015784 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:47:52.015768 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c918962c24a7e1db7f6fd57a68d491d377cc932fcfddf1dc639f59fb8270db26\": container with ID starting with c918962c24a7e1db7f6fd57a68d491d377cc932fcfddf1dc639f59fb8270db26 not found: ID does not exist" containerID="c918962c24a7e1db7f6fd57a68d491d377cc932fcfddf1dc639f59fb8270db26" Apr 24 16:47:52.015827 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:52.015789 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c918962c24a7e1db7f6fd57a68d491d377cc932fcfddf1dc639f59fb8270db26"} err="failed to get container status \"c918962c24a7e1db7f6fd57a68d491d377cc932fcfddf1dc639f59fb8270db26\": rpc error: code = NotFound desc = could not find container \"c918962c24a7e1db7f6fd57a68d491d377cc932fcfddf1dc639f59fb8270db26\": container with ID starting with c918962c24a7e1db7f6fd57a68d491d377cc932fcfddf1dc639f59fb8270db26 not found: ID does not exist" Apr 24 16:47:52.238131 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:52.238102 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca4f3ee8-0572-4b8c-9308-badb39f9cea2" path="/var/lib/kubelet/pods/ca4f3ee8-0572-4b8c-9308-badb39f9cea2/volumes" Apr 24 16:47:54.949842 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:54.949801 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-787f8b7fc6-767sd" 
podUID="8140754c-4236-412b-97ff-b990cd982c2f" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:47:59.949374 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:59.949327 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-787f8b7fc6-767sd" podUID="8140754c-4236-412b-97ff-b990cd982c2f" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:47:59.949743 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:47:59.949439 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-787f8b7fc6-767sd" Apr 24 16:48:04.950083 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:04.950046 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-787f8b7fc6-767sd" podUID="8140754c-4236-412b-97ff-b990cd982c2f" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:48:06.315630 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:48:06.315590 2578 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/46063f9b93ee88a9a41a856c1fefdf8e4190575edc3c1b356a0d27288760849c/diff" to get inode usage: stat /var/lib/containers/storage/overlay/46063f9b93ee88a9a41a856c1fefdf8e4190575edc3c1b356a0d27288760849c/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/kserve-ci-e2e-test_isvc-xgboost-graph-predictor-669d8d6456-jjzm9_ca4f3ee8-0572-4b8c-9308-badb39f9cea2/kserve-container/0.log" to get inode usage: stat /var/log/pods/kserve-ci-e2e-test_isvc-xgboost-graph-predictor-669d8d6456-jjzm9_ca4f3ee8-0572-4b8c-9308-badb39f9cea2/kserve-container/0.log: no such file or directory Apr 24 16:48:09.950093 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:09.950052 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/model-chainer-787f8b7fc6-767sd" podUID="8140754c-4236-412b-97ff-b990cd982c2f" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:48:12.238059 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:12.238029 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-2c4dd-5f99fd554-5kkch"] Apr 24 16:48:12.238425 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:12.238302 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca4f3ee8-0572-4b8c-9308-badb39f9cea2" containerName="storage-initializer" Apr 24 16:48:12.238425 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:12.238313 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca4f3ee8-0572-4b8c-9308-badb39f9cea2" containerName="storage-initializer" Apr 24 16:48:12.238425 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:12.238324 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca4f3ee8-0572-4b8c-9308-badb39f9cea2" containerName="kube-rbac-proxy" Apr 24 16:48:12.238425 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:12.238330 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca4f3ee8-0572-4b8c-9308-badb39f9cea2" containerName="kube-rbac-proxy" Apr 24 16:48:12.238425 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:12.238337 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09fa39fb-000b-4fe7-9efa-229106525764" containerName="switch-graph-32209" Apr 24 16:48:12.238425 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:12.238342 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="09fa39fb-000b-4fe7-9efa-229106525764" containerName="switch-graph-32209" Apr 24 16:48:12.238425 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:12.238352 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca4f3ee8-0572-4b8c-9308-badb39f9cea2" containerName="kserve-container" Apr 24 
16:48:12.238425 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:12.238358 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca4f3ee8-0572-4b8c-9308-badb39f9cea2" containerName="kserve-container" Apr 24 16:48:12.238425 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:12.238408 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ca4f3ee8-0572-4b8c-9308-badb39f9cea2" containerName="kube-rbac-proxy" Apr 24 16:48:12.238425 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:12.238416 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ca4f3ee8-0572-4b8c-9308-badb39f9cea2" containerName="kserve-container" Apr 24 16:48:12.238425 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:12.238424 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="09fa39fb-000b-4fe7-9efa-229106525764" containerName="switch-graph-32209" Apr 24 16:48:12.241202 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:12.241185 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-2c4dd-5f99fd554-5kkch" Apr 24 16:48:12.247531 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:12.247509 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-2c4dd-kube-rbac-proxy-sar-config\"" Apr 24 16:48:12.247607 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:12.247534 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-2c4dd-serving-cert\"" Apr 24 16:48:12.262302 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:12.262260 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-2c4dd-5f99fd554-5kkch"] Apr 24 16:48:12.351871 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:12.351817 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7-openshift-service-ca-bundle\") pod \"switch-graph-2c4dd-5f99fd554-5kkch\" (UID: \"b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7\") " pod="kserve-ci-e2e-test/switch-graph-2c4dd-5f99fd554-5kkch" Apr 24 16:48:12.352044 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:12.351892 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7-proxy-tls\") pod \"switch-graph-2c4dd-5f99fd554-5kkch\" (UID: \"b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7\") " pod="kserve-ci-e2e-test/switch-graph-2c4dd-5f99fd554-5kkch" Apr 24 16:48:12.452324 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:12.452275 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7-openshift-service-ca-bundle\") pod \"switch-graph-2c4dd-5f99fd554-5kkch\" (UID: 
\"b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7\") " pod="kserve-ci-e2e-test/switch-graph-2c4dd-5f99fd554-5kkch" Apr 24 16:48:12.452506 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:12.452352 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7-proxy-tls\") pod \"switch-graph-2c4dd-5f99fd554-5kkch\" (UID: \"b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7\") " pod="kserve-ci-e2e-test/switch-graph-2c4dd-5f99fd554-5kkch" Apr 24 16:48:12.452940 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:12.452912 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7-openshift-service-ca-bundle\") pod \"switch-graph-2c4dd-5f99fd554-5kkch\" (UID: \"b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7\") " pod="kserve-ci-e2e-test/switch-graph-2c4dd-5f99fd554-5kkch" Apr 24 16:48:12.455004 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:12.454977 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7-proxy-tls\") pod \"switch-graph-2c4dd-5f99fd554-5kkch\" (UID: \"b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7\") " pod="kserve-ci-e2e-test/switch-graph-2c4dd-5f99fd554-5kkch" Apr 24 16:48:12.551101 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:12.551016 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-2c4dd-5f99fd554-5kkch" Apr 24 16:48:12.676635 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:12.674779 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-2c4dd-5f99fd554-5kkch"] Apr 24 16:48:13.053063 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:13.053029 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-2c4dd-5f99fd554-5kkch" event={"ID":"b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7","Type":"ContainerStarted","Data":"3a353963fd4e7fe62b53ec5356f84ca0ebff7269b75b087a3fdd95f55ded2eca"} Apr 24 16:48:13.053063 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:13.053068 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-2c4dd-5f99fd554-5kkch" event={"ID":"b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7","Type":"ContainerStarted","Data":"3f2e9932280b902fda4e31d2e4baf7e0864fed373c9a0f97813a5d0d46680e91"} Apr 24 16:48:13.053263 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:13.053101 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-2c4dd-5f99fd554-5kkch" Apr 24 16:48:13.073336 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:13.073177 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-2c4dd-5f99fd554-5kkch" podStartSLOduration=1.073162201 podStartE2EDuration="1.073162201s" podCreationTimestamp="2026-04-24 16:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:48:13.071949738 +0000 UTC m=+563.443653964" watchObservedRunningTime="2026-04-24 16:48:13.073162201 +0000 UTC m=+563.444866427" Apr 24 16:48:14.949716 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:14.949669 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-787f8b7fc6-767sd" 
podUID="8140754c-4236-412b-97ff-b990cd982c2f" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:48:17.872457 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:48:17.872414 2578 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8140754c_4236_412b_97ff_b990cd982c2f.slice/crio-bb4bbf2d9cd627ce6f72f36d557e7f8ef7a8b1fd8ee1858716d2ab20fed67f2f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8140754c_4236_412b_97ff_b990cd982c2f.slice/crio-d5ca8f4ec29a9586cd7c54db70d975649d576dd47cabcef027c03c77f1e7d663.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8140754c_4236_412b_97ff_b990cd982c2f.slice/crio-conmon-d5ca8f4ec29a9586cd7c54db70d975649d576dd47cabcef027c03c77f1e7d663.scope\": RecentStats: unable to find data in memory cache]" Apr 24 16:48:17.872770 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:48:17.872592 2578 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8140754c_4236_412b_97ff_b990cd982c2f.slice/crio-bb4bbf2d9cd627ce6f72f36d557e7f8ef7a8b1fd8ee1858716d2ab20fed67f2f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8140754c_4236_412b_97ff_b990cd982c2f.slice/crio-conmon-d5ca8f4ec29a9586cd7c54db70d975649d576dd47cabcef027c03c77f1e7d663.scope\": RecentStats: unable to find data in memory cache]" Apr 24 16:48:17.872770 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:48:17.872592 2578 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8140754c_4236_412b_97ff_b990cd982c2f.slice/crio-bb4bbf2d9cd627ce6f72f36d557e7f8ef7a8b1fd8ee1858716d2ab20fed67f2f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8140754c_4236_412b_97ff_b990cd982c2f.slice/crio-conmon-d5ca8f4ec29a9586cd7c54db70d975649d576dd47cabcef027c03c77f1e7d663.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8140754c_4236_412b_97ff_b990cd982c2f.slice/crio-d5ca8f4ec29a9586cd7c54db70d975649d576dd47cabcef027c03c77f1e7d663.scope\": RecentStats: unable to find data in memory cache]" Apr 24 16:48:17.992534 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:17.992511 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-787f8b7fc6-767sd" Apr 24 16:48:18.068716 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:18.068672 2578 generic.go:358] "Generic (PLEG): container finished" podID="8140754c-4236-412b-97ff-b990cd982c2f" containerID="d5ca8f4ec29a9586cd7c54db70d975649d576dd47cabcef027c03c77f1e7d663" exitCode=0 Apr 24 16:48:18.068869 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:18.068724 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-787f8b7fc6-767sd" event={"ID":"8140754c-4236-412b-97ff-b990cd982c2f","Type":"ContainerDied","Data":"d5ca8f4ec29a9586cd7c54db70d975649d576dd47cabcef027c03c77f1e7d663"} Apr 24 16:48:18.068869 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:18.068738 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-787f8b7fc6-767sd" Apr 24 16:48:18.068869 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:18.068752 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-787f8b7fc6-767sd" event={"ID":"8140754c-4236-412b-97ff-b990cd982c2f","Type":"ContainerDied","Data":"bb4bbf2d9cd627ce6f72f36d557e7f8ef7a8b1fd8ee1858716d2ab20fed67f2f"} Apr 24 16:48:18.068869 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:18.068770 2578 scope.go:117] "RemoveContainer" containerID="d5ca8f4ec29a9586cd7c54db70d975649d576dd47cabcef027c03c77f1e7d663" Apr 24 16:48:18.076403 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:18.076377 2578 scope.go:117] "RemoveContainer" containerID="d5ca8f4ec29a9586cd7c54db70d975649d576dd47cabcef027c03c77f1e7d663" Apr 24 16:48:18.076654 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:48:18.076636 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5ca8f4ec29a9586cd7c54db70d975649d576dd47cabcef027c03c77f1e7d663\": container with ID starting with d5ca8f4ec29a9586cd7c54db70d975649d576dd47cabcef027c03c77f1e7d663 not found: ID does not exist" containerID="d5ca8f4ec29a9586cd7c54db70d975649d576dd47cabcef027c03c77f1e7d663" Apr 24 16:48:18.076721 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:18.076666 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5ca8f4ec29a9586cd7c54db70d975649d576dd47cabcef027c03c77f1e7d663"} err="failed to get container status \"d5ca8f4ec29a9586cd7c54db70d975649d576dd47cabcef027c03c77f1e7d663\": rpc error: code = NotFound desc = could not find container \"d5ca8f4ec29a9586cd7c54db70d975649d576dd47cabcef027c03c77f1e7d663\": container with ID starting with d5ca8f4ec29a9586cd7c54db70d975649d576dd47cabcef027c03c77f1e7d663 not found: ID does not exist" Apr 24 16:48:18.090939 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:18.090920 2578 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8140754c-4236-412b-97ff-b990cd982c2f-openshift-service-ca-bundle\") pod \"8140754c-4236-412b-97ff-b990cd982c2f\" (UID: \"8140754c-4236-412b-97ff-b990cd982c2f\") " Apr 24 16:48:18.091045 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:18.090976 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8140754c-4236-412b-97ff-b990cd982c2f-proxy-tls\") pod \"8140754c-4236-412b-97ff-b990cd982c2f\" (UID: \"8140754c-4236-412b-97ff-b990cd982c2f\") " Apr 24 16:48:18.091330 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:18.091269 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8140754c-4236-412b-97ff-b990cd982c2f-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "8140754c-4236-412b-97ff-b990cd982c2f" (UID: "8140754c-4236-412b-97ff-b990cd982c2f"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:48:18.093108 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:18.093088 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8140754c-4236-412b-97ff-b990cd982c2f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8140754c-4236-412b-97ff-b990cd982c2f" (UID: "8140754c-4236-412b-97ff-b990cd982c2f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 16:48:18.191685 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:18.191649 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8140754c-4236-412b-97ff-b990cd982c2f-openshift-service-ca-bundle\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\""
Apr 24 16:48:18.191685 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:18.191681 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8140754c-4236-412b-97ff-b990cd982c2f-proxy-tls\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\""
Apr 24 16:48:18.384085 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:18.384051 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-787f8b7fc6-767sd"]
Apr 24 16:48:18.388277 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:18.388251 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-787f8b7fc6-767sd"]
Apr 24 16:48:19.062423 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:19.062391 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-2c4dd-5f99fd554-5kkch"
Apr 24 16:48:20.239146 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:20.239100 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8140754c-4236-412b-97ff-b990cd982c2f" path="/var/lib/kubelet/pods/8140754c-4236-412b-97ff-b990cd982c2f/volumes"
Apr 24 16:48:48.057378 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:48.057346 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-30dd6-744fc64586-ptbhn"]
Apr 24 16:48:48.057836 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:48.057644 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8140754c-4236-412b-97ff-b990cd982c2f" containerName="model-chainer"
Apr 24 16:48:48.057836 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:48.057654 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="8140754c-4236-412b-97ff-b990cd982c2f" containerName="model-chainer"
Apr 24 16:48:48.057836 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:48.057712 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="8140754c-4236-412b-97ff-b990cd982c2f" containerName="model-chainer"
Apr 24 16:48:48.060679 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:48.060662 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-30dd6-744fc64586-ptbhn"
Apr 24 16:48:48.064189 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:48.064166 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-30dd6-serving-cert\""
Apr 24 16:48:48.064312 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:48.064202 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-30dd6-kube-rbac-proxy-sar-config\""
Apr 24 16:48:48.072746 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:48.072723 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-30dd6-744fc64586-ptbhn"]
Apr 24 16:48:48.226720 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:48.226678 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf9447c6-99a6-4598-a2cf-4f2419257de2-proxy-tls\") pod \"sequence-graph-30dd6-744fc64586-ptbhn\" (UID: \"cf9447c6-99a6-4598-a2cf-4f2419257de2\") " pod="kserve-ci-e2e-test/sequence-graph-30dd6-744fc64586-ptbhn"
Apr 24 16:48:48.226892 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:48.226734 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf9447c6-99a6-4598-a2cf-4f2419257de2-openshift-service-ca-bundle\") pod \"sequence-graph-30dd6-744fc64586-ptbhn\" (UID: \"cf9447c6-99a6-4598-a2cf-4f2419257de2\") " pod="kserve-ci-e2e-test/sequence-graph-30dd6-744fc64586-ptbhn"
Apr 24 16:48:48.327726 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:48.327637 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf9447c6-99a6-4598-a2cf-4f2419257de2-proxy-tls\") pod \"sequence-graph-30dd6-744fc64586-ptbhn\" (UID: \"cf9447c6-99a6-4598-a2cf-4f2419257de2\") " pod="kserve-ci-e2e-test/sequence-graph-30dd6-744fc64586-ptbhn"
Apr 24 16:48:48.327726 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:48.327686 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf9447c6-99a6-4598-a2cf-4f2419257de2-openshift-service-ca-bundle\") pod \"sequence-graph-30dd6-744fc64586-ptbhn\" (UID: \"cf9447c6-99a6-4598-a2cf-4f2419257de2\") " pod="kserve-ci-e2e-test/sequence-graph-30dd6-744fc64586-ptbhn"
Apr 24 16:48:48.327908 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:48:48.327797 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-30dd6-serving-cert: secret "sequence-graph-30dd6-serving-cert" not found
Apr 24 16:48:48.327908 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:48:48.327872 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf9447c6-99a6-4598-a2cf-4f2419257de2-proxy-tls podName:cf9447c6-99a6-4598-a2cf-4f2419257de2 nodeName:}" failed. No retries permitted until 2026-04-24 16:48:48.827856225 +0000 UTC m=+599.199560428 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/cf9447c6-99a6-4598-a2cf-4f2419257de2-proxy-tls") pod "sequence-graph-30dd6-744fc64586-ptbhn" (UID: "cf9447c6-99a6-4598-a2cf-4f2419257de2") : secret "sequence-graph-30dd6-serving-cert" not found
Apr 24 16:48:48.328338 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:48.328322 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf9447c6-99a6-4598-a2cf-4f2419257de2-openshift-service-ca-bundle\") pod \"sequence-graph-30dd6-744fc64586-ptbhn\" (UID: \"cf9447c6-99a6-4598-a2cf-4f2419257de2\") " pod="kserve-ci-e2e-test/sequence-graph-30dd6-744fc64586-ptbhn"
Apr 24 16:48:48.831793 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:48.831749 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf9447c6-99a6-4598-a2cf-4f2419257de2-proxy-tls\") pod \"sequence-graph-30dd6-744fc64586-ptbhn\" (UID: \"cf9447c6-99a6-4598-a2cf-4f2419257de2\") " pod="kserve-ci-e2e-test/sequence-graph-30dd6-744fc64586-ptbhn"
Apr 24 16:48:48.834325 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:48.834306 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf9447c6-99a6-4598-a2cf-4f2419257de2-proxy-tls\") pod \"sequence-graph-30dd6-744fc64586-ptbhn\" (UID: \"cf9447c6-99a6-4598-a2cf-4f2419257de2\") " pod="kserve-ci-e2e-test/sequence-graph-30dd6-744fc64586-ptbhn"
Apr 24 16:48:48.970051 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:48.970005 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-30dd6-744fc64586-ptbhn"
Apr 24 16:48:49.091753 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:49.091679 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-30dd6-744fc64586-ptbhn"]
Apr 24 16:48:49.095355 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:48:49.095326 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf9447c6_99a6_4598_a2cf_4f2419257de2.slice/crio-d7f9e64f72bee538709bccfa976820bc6b1942bec1e2220a7e7c4c604413480d WatchSource:0}: Error finding container d7f9e64f72bee538709bccfa976820bc6b1942bec1e2220a7e7c4c604413480d: Status 404 returned error can't find the container with id d7f9e64f72bee538709bccfa976820bc6b1942bec1e2220a7e7c4c604413480d
Apr 24 16:48:49.156631 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:49.156602 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-30dd6-744fc64586-ptbhn" event={"ID":"cf9447c6-99a6-4598-a2cf-4f2419257de2","Type":"ContainerStarted","Data":"59f099a48cfea77b4d8a1a414ef601cba7d6ee698261ee92871c90366e7c8098"}
Apr 24 16:48:49.156752 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:49.156636 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-30dd6-744fc64586-ptbhn" event={"ID":"cf9447c6-99a6-4598-a2cf-4f2419257de2","Type":"ContainerStarted","Data":"d7f9e64f72bee538709bccfa976820bc6b1942bec1e2220a7e7c4c604413480d"}
Apr 24 16:48:49.156752 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:49.156703 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-30dd6-744fc64586-ptbhn"
Apr 24 16:48:49.175548 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:49.175497 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-30dd6-744fc64586-ptbhn" podStartSLOduration=1.175482317 podStartE2EDuration="1.175482317s" podCreationTimestamp="2026-04-24 16:48:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:48:49.174112401 +0000 UTC m=+599.545816627" watchObservedRunningTime="2026-04-24 16:48:49.175482317 +0000 UTC m=+599.547186587"
Apr 24 16:48:50.136764 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:50.136734 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-f2l2f_45369a43-7ed3-4f16-a1dc-7f1a61e06fc4/console-operator/1.log"
Apr 24 16:48:50.137496 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:50.137469 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-f2l2f_45369a43-7ed3-4f16-a1dc-7f1a61e06fc4/console-operator/1.log"
Apr 24 16:48:50.139515 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:50.139497 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgfp8_76a7c3cf-c152-4bdc-8d94-50d4af52aeee/ovn-acl-logging/0.log"
Apr 24 16:48:50.140750 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:50.140332 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgfp8_76a7c3cf-c152-4bdc-8d94-50d4af52aeee/ovn-acl-logging/0.log"
Apr 24 16:48:55.164513 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:48:55.164481 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-30dd6-744fc64586-ptbhn"
Apr 24 16:53:50.157212 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:53:50.157183 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-f2l2f_45369a43-7ed3-4f16-a1dc-7f1a61e06fc4/console-operator/1.log"
Apr 24 16:53:50.159353 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:53:50.159257 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-f2l2f_45369a43-7ed3-4f16-a1dc-7f1a61e06fc4/console-operator/1.log"
Apr 24 16:53:50.159959 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:53:50.159935 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgfp8_76a7c3cf-c152-4bdc-8d94-50d4af52aeee/ovn-acl-logging/0.log"
Apr 24 16:53:50.161903 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:53:50.161884 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgfp8_76a7c3cf-c152-4bdc-8d94-50d4af52aeee/ovn-acl-logging/0.log"
Apr 24 16:56:27.040916 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:56:27.040873 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-2c4dd-5f99fd554-5kkch"]
Apr 24 16:56:27.041596 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:56:27.041102 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-2c4dd-5f99fd554-5kkch" podUID="b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7" containerName="switch-graph-2c4dd" containerID="cri-o://3a353963fd4e7fe62b53ec5356f84ca0ebff7269b75b087a3fdd95f55ded2eca" gracePeriod=30
Apr 24 16:56:29.060605 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:56:29.060559 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-2c4dd-5f99fd554-5kkch" podUID="b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7" containerName="switch-graph-2c4dd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 16:56:34.060942 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:56:34.060896 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-2c4dd-5f99fd554-5kkch" podUID="b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7" containerName="switch-graph-2c4dd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 16:56:39.060206 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:56:39.060120 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-2c4dd-5f99fd554-5kkch" podUID="b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7" containerName="switch-graph-2c4dd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 16:56:39.060600 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:56:39.060247 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-2c4dd-5f99fd554-5kkch"
Apr 24 16:56:44.060508 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:56:44.060464 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-2c4dd-5f99fd554-5kkch" podUID="b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7" containerName="switch-graph-2c4dd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 16:56:49.060935 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:56:49.060897 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-2c4dd-5f99fd554-5kkch" podUID="b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7" containerName="switch-graph-2c4dd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 16:56:54.060896 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:56:54.060853 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-2c4dd-5f99fd554-5kkch" podUID="b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7" containerName="switch-graph-2c4dd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 16:56:57.186592 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:56:57.186561 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-2c4dd-5f99fd554-5kkch"
Apr 24 16:56:57.207536 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:56:57.207510 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7-openshift-service-ca-bundle\") pod \"b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7\" (UID: \"b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7\") "
Apr 24 16:56:57.207667 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:56:57.207565 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7-proxy-tls\") pod \"b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7\" (UID: \"b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7\") "
Apr 24 16:56:57.207955 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:56:57.207930 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7" (UID: "b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 16:56:57.209886 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:56:57.209861 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7" (UID: "b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 16:56:57.308646 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:56:57.308563 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7-openshift-service-ca-bundle\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\""
Apr 24 16:56:57.308646 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:56:57.308593 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7-proxy-tls\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\""
Apr 24 16:56:57.530306 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:56:57.530241 2578 generic.go:358] "Generic (PLEG): container finished" podID="b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7" containerID="3a353963fd4e7fe62b53ec5356f84ca0ebff7269b75b087a3fdd95f55ded2eca" exitCode=0
Apr 24 16:56:57.530504 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:56:57.530325 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-2c4dd-5f99fd554-5kkch"
Apr 24 16:56:57.530504 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:56:57.530335 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-2c4dd-5f99fd554-5kkch" event={"ID":"b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7","Type":"ContainerDied","Data":"3a353963fd4e7fe62b53ec5356f84ca0ebff7269b75b087a3fdd95f55ded2eca"}
Apr 24 16:56:57.530504 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:56:57.530377 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-2c4dd-5f99fd554-5kkch" event={"ID":"b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7","Type":"ContainerDied","Data":"3f2e9932280b902fda4e31d2e4baf7e0864fed373c9a0f97813a5d0d46680e91"}
Apr 24 16:56:57.530504 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:56:57.530398 2578 scope.go:117] "RemoveContainer" containerID="3a353963fd4e7fe62b53ec5356f84ca0ebff7269b75b087a3fdd95f55ded2eca"
Apr 24 16:56:57.540423 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:56:57.540207 2578 scope.go:117] "RemoveContainer" containerID="3a353963fd4e7fe62b53ec5356f84ca0ebff7269b75b087a3fdd95f55ded2eca"
Apr 24 16:56:57.540673 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:56:57.540647 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a353963fd4e7fe62b53ec5356f84ca0ebff7269b75b087a3fdd95f55ded2eca\": container with ID starting with 3a353963fd4e7fe62b53ec5356f84ca0ebff7269b75b087a3fdd95f55ded2eca not found: ID does not exist" containerID="3a353963fd4e7fe62b53ec5356f84ca0ebff7269b75b087a3fdd95f55ded2eca"
Apr 24 16:56:57.540764 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:56:57.540682 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a353963fd4e7fe62b53ec5356f84ca0ebff7269b75b087a3fdd95f55ded2eca"} err="failed to get container status \"3a353963fd4e7fe62b53ec5356f84ca0ebff7269b75b087a3fdd95f55ded2eca\": rpc error: code = NotFound desc = could not find container \"3a353963fd4e7fe62b53ec5356f84ca0ebff7269b75b087a3fdd95f55ded2eca\": container with ID starting with 3a353963fd4e7fe62b53ec5356f84ca0ebff7269b75b087a3fdd95f55ded2eca not found: ID does not exist"
Apr 24 16:56:57.554633 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:56:57.554609 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-2c4dd-5f99fd554-5kkch"]
Apr 24 16:56:57.562320 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:56:57.562248 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-2c4dd-5f99fd554-5kkch"]
Apr 24 16:56:58.236616 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:56:58.236579 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7" path="/var/lib/kubelet/pods/b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7/volumes"
Apr 24 16:57:02.841394 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:02.841339 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-30dd6-744fc64586-ptbhn"]
Apr 24 16:57:02.841966 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:02.841564 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-30dd6-744fc64586-ptbhn" podUID="cf9447c6-99a6-4598-a2cf-4f2419257de2" containerName="sequence-graph-30dd6" containerID="cri-o://59f099a48cfea77b4d8a1a414ef601cba7d6ee698261ee92871c90366e7c8098" gracePeriod=30
Apr 24 16:57:05.163442 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:05.163398 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-30dd6-744fc64586-ptbhn" podUID="cf9447c6-99a6-4598-a2cf-4f2419257de2" containerName="sequence-graph-30dd6" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 16:57:10.163462 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:10.163425 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-30dd6-744fc64586-ptbhn" podUID="cf9447c6-99a6-4598-a2cf-4f2419257de2" containerName="sequence-graph-30dd6" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 16:57:15.162786 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:15.162742 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-30dd6-744fc64586-ptbhn" podUID="cf9447c6-99a6-4598-a2cf-4f2419257de2" containerName="sequence-graph-30dd6" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 16:57:15.163161 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:15.162871 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-30dd6-744fc64586-ptbhn"
Apr 24 16:57:20.162740 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:20.162695 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-30dd6-744fc64586-ptbhn" podUID="cf9447c6-99a6-4598-a2cf-4f2419257de2" containerName="sequence-graph-30dd6" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 16:57:25.162964 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:25.162923 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-30dd6-744fc64586-ptbhn" podUID="cf9447c6-99a6-4598-a2cf-4f2419257de2" containerName="sequence-graph-30dd6" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 16:57:27.298453 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:27.298416 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-f6171-6bffd6cbc4-82phr"]
Apr 24 16:57:27.298840 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:27.298689 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7" containerName="switch-graph-2c4dd"
Apr 24 16:57:27.298840 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:27.298700 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7" containerName="switch-graph-2c4dd"
Apr 24 16:57:27.298840 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:27.298761 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="b0aacbe3-2a72-4fbd-ac66-8ba97f59b1d7" containerName="switch-graph-2c4dd"
Apr 24 16:57:27.301568 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:27.301551 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-f6171-6bffd6cbc4-82phr"
Apr 24 16:57:27.304164 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:27.304138 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-f6171-serving-cert\""
Apr 24 16:57:27.304495 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:27.304478 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-f6171-kube-rbac-proxy-sar-config\""
Apr 24 16:57:27.310979 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:27.310952 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-f6171-6bffd6cbc4-82phr"]
Apr 24 16:57:27.439477 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:27.439442 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b064f2e-90a8-488f-98e1-8ae698ce7f7d-proxy-tls\") pod \"ensemble-graph-f6171-6bffd6cbc4-82phr\" (UID: \"2b064f2e-90a8-488f-98e1-8ae698ce7f7d\") " pod="kserve-ci-e2e-test/ensemble-graph-f6171-6bffd6cbc4-82phr"
Apr 24 16:57:27.439641 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:27.439528 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b064f2e-90a8-488f-98e1-8ae698ce7f7d-openshift-service-ca-bundle\") pod \"ensemble-graph-f6171-6bffd6cbc4-82phr\" (UID: \"2b064f2e-90a8-488f-98e1-8ae698ce7f7d\") " pod="kserve-ci-e2e-test/ensemble-graph-f6171-6bffd6cbc4-82phr"
Apr 24 16:57:27.540464 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:27.540424 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b064f2e-90a8-488f-98e1-8ae698ce7f7d-openshift-service-ca-bundle\") pod \"ensemble-graph-f6171-6bffd6cbc4-82phr\" (UID: \"2b064f2e-90a8-488f-98e1-8ae698ce7f7d\") " pod="kserve-ci-e2e-test/ensemble-graph-f6171-6bffd6cbc4-82phr"
Apr 24 16:57:27.540464 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:27.540473 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b064f2e-90a8-488f-98e1-8ae698ce7f7d-proxy-tls\") pod \"ensemble-graph-f6171-6bffd6cbc4-82phr\" (UID: \"2b064f2e-90a8-488f-98e1-8ae698ce7f7d\") " pod="kserve-ci-e2e-test/ensemble-graph-f6171-6bffd6cbc4-82phr"
Apr 24 16:57:27.541119 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:27.541100 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b064f2e-90a8-488f-98e1-8ae698ce7f7d-openshift-service-ca-bundle\") pod \"ensemble-graph-f6171-6bffd6cbc4-82phr\" (UID: \"2b064f2e-90a8-488f-98e1-8ae698ce7f7d\") " pod="kserve-ci-e2e-test/ensemble-graph-f6171-6bffd6cbc4-82phr"
Apr 24 16:57:27.542997 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:27.542975 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b064f2e-90a8-488f-98e1-8ae698ce7f7d-proxy-tls\") pod \"ensemble-graph-f6171-6bffd6cbc4-82phr\" (UID: \"2b064f2e-90a8-488f-98e1-8ae698ce7f7d\") " pod="kserve-ci-e2e-test/ensemble-graph-f6171-6bffd6cbc4-82phr"
Apr 24 16:57:27.611896 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:27.611815 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-f6171-6bffd6cbc4-82phr"
Apr 24 16:57:27.732778 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:27.732668 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-f6171-6bffd6cbc4-82phr"]
Apr 24 16:57:27.735517 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:57:27.735489 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b064f2e_90a8_488f_98e1_8ae698ce7f7d.slice/crio-bde2ef7ab5b3574a9db56241414854c598e952d437d648da85a814f0d4bc64cd WatchSource:0}: Error finding container bde2ef7ab5b3574a9db56241414854c598e952d437d648da85a814f0d4bc64cd: Status 404 returned error can't find the container with id bde2ef7ab5b3574a9db56241414854c598e952d437d648da85a814f0d4bc64cd
Apr 24 16:57:27.737268 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:27.737250 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 16:57:28.621443 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:28.621401 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-f6171-6bffd6cbc4-82phr" event={"ID":"2b064f2e-90a8-488f-98e1-8ae698ce7f7d","Type":"ContainerStarted","Data":"04f97b613c7ed5af35df59b3df9a197af77f605e4e95b4eedf51141abe7856b3"}
Apr 24 16:57:28.621443 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:28.621441 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-f6171-6bffd6cbc4-82phr" event={"ID":"2b064f2e-90a8-488f-98e1-8ae698ce7f7d","Type":"ContainerStarted","Data":"bde2ef7ab5b3574a9db56241414854c598e952d437d648da85a814f0d4bc64cd"}
Apr 24 16:57:28.622031 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:28.621539 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-f6171-6bffd6cbc4-82phr"
Apr 24 16:57:28.642362 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:28.642318 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-f6171-6bffd6cbc4-82phr" podStartSLOduration=1.642278713 podStartE2EDuration="1.642278713s" podCreationTimestamp="2026-04-24 16:57:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:57:28.641955882 +0000 UTC m=+1119.013660110" watchObservedRunningTime="2026-04-24 16:57:28.642278713 +0000 UTC m=+1119.013982938"
Apr 24 16:57:30.163017 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:30.162979 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-30dd6-744fc64586-ptbhn" podUID="cf9447c6-99a6-4598-a2cf-4f2419257de2" containerName="sequence-graph-30dd6" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 16:57:32.981297 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:32.981260 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-30dd6-744fc64586-ptbhn"
Apr 24 16:57:33.087751 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:33.087699 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf9447c6-99a6-4598-a2cf-4f2419257de2-proxy-tls\") pod \"cf9447c6-99a6-4598-a2cf-4f2419257de2\" (UID: \"cf9447c6-99a6-4598-a2cf-4f2419257de2\") "
Apr 24 16:57:33.087932 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:33.087784 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf9447c6-99a6-4598-a2cf-4f2419257de2-openshift-service-ca-bundle\") pod \"cf9447c6-99a6-4598-a2cf-4f2419257de2\" (UID: \"cf9447c6-99a6-4598-a2cf-4f2419257de2\") "
Apr 24 16:57:33.088162 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:33.088128 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf9447c6-99a6-4598-a2cf-4f2419257de2-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "cf9447c6-99a6-4598-a2cf-4f2419257de2" (UID: "cf9447c6-99a6-4598-a2cf-4f2419257de2"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 16:57:33.089914 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:33.089888 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf9447c6-99a6-4598-a2cf-4f2419257de2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "cf9447c6-99a6-4598-a2cf-4f2419257de2" (UID: "cf9447c6-99a6-4598-a2cf-4f2419257de2"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 16:57:33.189177 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:33.189143 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf9447c6-99a6-4598-a2cf-4f2419257de2-openshift-service-ca-bundle\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\""
Apr 24 16:57:33.189177 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:33.189175 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cf9447c6-99a6-4598-a2cf-4f2419257de2-proxy-tls\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\""
Apr 24 16:57:33.635063 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:33.634961 2578 generic.go:358] "Generic (PLEG): container finished" podID="cf9447c6-99a6-4598-a2cf-4f2419257de2" containerID="59f099a48cfea77b4d8a1a414ef601cba7d6ee698261ee92871c90366e7c8098" exitCode=0
Apr 24 16:57:33.635063 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:33.635026 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-30dd6-744fc64586-ptbhn"
Apr 24 16:57:33.635329 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:33.635060 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-30dd6-744fc64586-ptbhn" event={"ID":"cf9447c6-99a6-4598-a2cf-4f2419257de2","Type":"ContainerDied","Data":"59f099a48cfea77b4d8a1a414ef601cba7d6ee698261ee92871c90366e7c8098"}
Apr 24 16:57:33.635329 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:33.635101 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-30dd6-744fc64586-ptbhn" event={"ID":"cf9447c6-99a6-4598-a2cf-4f2419257de2","Type":"ContainerDied","Data":"d7f9e64f72bee538709bccfa976820bc6b1942bec1e2220a7e7c4c604413480d"}
Apr 24 16:57:33.635329 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:33.635121 2578 scope.go:117] "RemoveContainer" containerID="59f099a48cfea77b4d8a1a414ef601cba7d6ee698261ee92871c90366e7c8098"
Apr 24 16:57:33.643352 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:33.643330 2578 scope.go:117] "RemoveContainer" containerID="59f099a48cfea77b4d8a1a414ef601cba7d6ee698261ee92871c90366e7c8098"
Apr 24 16:57:33.643602 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:57:33.643581 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59f099a48cfea77b4d8a1a414ef601cba7d6ee698261ee92871c90366e7c8098\": container with ID starting with 59f099a48cfea77b4d8a1a414ef601cba7d6ee698261ee92871c90366e7c8098 not found: ID does not exist" containerID="59f099a48cfea77b4d8a1a414ef601cba7d6ee698261ee92871c90366e7c8098"
Apr 24 16:57:33.643677 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:33.643614 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59f099a48cfea77b4d8a1a414ef601cba7d6ee698261ee92871c90366e7c8098"} err="failed to get container status \"59f099a48cfea77b4d8a1a414ef601cba7d6ee698261ee92871c90366e7c8098\": rpc error: code = NotFound desc = could not find container \"59f099a48cfea77b4d8a1a414ef601cba7d6ee698261ee92871c90366e7c8098\": container with ID starting with 59f099a48cfea77b4d8a1a414ef601cba7d6ee698261ee92871c90366e7c8098 not found: ID does not exist"
Apr 24 16:57:33.656274 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:33.656251 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-30dd6-744fc64586-ptbhn"]
Apr 24 16:57:33.658720 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:33.658700 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-30dd6-744fc64586-ptbhn"]
Apr 24 16:57:34.237400 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:34.237367 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf9447c6-99a6-4598-a2cf-4f2419257de2" path="/var/lib/kubelet/pods/cf9447c6-99a6-4598-a2cf-4f2419257de2/volumes"
Apr 24 16:57:34.629300 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:34.629200 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-f6171-6bffd6cbc4-82phr"
Apr 24 16:57:37.342243 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:37.342211 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-f6171-6bffd6cbc4-82phr"]
Apr 24 16:57:37.342645 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:37.342457 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-f6171-6bffd6cbc4-82phr" podUID="2b064f2e-90a8-488f-98e1-8ae698ce7f7d" containerName="ensemble-graph-f6171" containerID="cri-o://04f97b613c7ed5af35df59b3df9a197af77f605e4e95b4eedf51141abe7856b3" gracePeriod=30
Apr 24 16:57:39.627900 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:39.627861 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-f6171-6bffd6cbc4-82phr" podUID="2b064f2e-90a8-488f-98e1-8ae698ce7f7d" containerName="ensemble-graph-f6171" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 16:57:44.628342 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:44.628296 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-f6171-6bffd6cbc4-82phr" podUID="2b064f2e-90a8-488f-98e1-8ae698ce7f7d" containerName="ensemble-graph-f6171" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 16:57:49.627829 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:49.627777 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-f6171-6bffd6cbc4-82phr" podUID="2b064f2e-90a8-488f-98e1-8ae698ce7f7d" containerName="ensemble-graph-f6171" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 16:57:49.628216 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:49.627897 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-f6171-6bffd6cbc4-82phr"
Apr 24 16:57:54.627982 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:54.627940 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-f6171-6bffd6cbc4-82phr" podUID="2b064f2e-90a8-488f-98e1-8ae698ce7f7d" containerName="ensemble-graph-f6171" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 16:57:59.628710 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:57:59.628670 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-f6171-6bffd6cbc4-82phr" podUID="2b064f2e-90a8-488f-98e1-8ae698ce7f7d" containerName="ensemble-graph-f6171" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 16:58:02.986054 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:02.986016 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-3eac5-77647df9db-6km8q"]
Apr 24 16:58:02.986445 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:02.986366 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cf9447c6-99a6-4598-a2cf-4f2419257de2" containerName="sequence-graph-30dd6"
Apr 24 16:58:02.986445 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:02.986380 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf9447c6-99a6-4598-a2cf-4f2419257de2" containerName="sequence-graph-30dd6"
Apr 24 16:58:02.986445 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:02.986434 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="cf9447c6-99a6-4598-a2cf-4f2419257de2" containerName="sequence-graph-30dd6"
Apr 24 16:58:02.990575 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:02.990558 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-3eac5-77647df9db-6km8q"
Apr 24 16:58:02.992685 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:02.992663 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-3eac5-kube-rbac-proxy-sar-config\""
Apr 24 16:58:02.992790 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:02.992668 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-3eac5-serving-cert\""
Apr 24 16:58:02.998133 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:02.998110 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-3eac5-77647df9db-6km8q"]
Apr 24 16:58:03.015957 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:03.015931 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7a7516a-b5e9-4aad-9c46-29fd815b5171-openshift-service-ca-bundle\") pod \"sequence-graph-3eac5-77647df9db-6km8q\" (UID: 
\"d7a7516a-b5e9-4aad-9c46-29fd815b5171\") " pod="kserve-ci-e2e-test/sequence-graph-3eac5-77647df9db-6km8q" Apr 24 16:58:03.016071 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:03.015975 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d7a7516a-b5e9-4aad-9c46-29fd815b5171-proxy-tls\") pod \"sequence-graph-3eac5-77647df9db-6km8q\" (UID: \"d7a7516a-b5e9-4aad-9c46-29fd815b5171\") " pod="kserve-ci-e2e-test/sequence-graph-3eac5-77647df9db-6km8q" Apr 24 16:58:03.116596 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:03.116558 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7a7516a-b5e9-4aad-9c46-29fd815b5171-openshift-service-ca-bundle\") pod \"sequence-graph-3eac5-77647df9db-6km8q\" (UID: \"d7a7516a-b5e9-4aad-9c46-29fd815b5171\") " pod="kserve-ci-e2e-test/sequence-graph-3eac5-77647df9db-6km8q" Apr 24 16:58:03.116771 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:03.116625 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d7a7516a-b5e9-4aad-9c46-29fd815b5171-proxy-tls\") pod \"sequence-graph-3eac5-77647df9db-6km8q\" (UID: \"d7a7516a-b5e9-4aad-9c46-29fd815b5171\") " pod="kserve-ci-e2e-test/sequence-graph-3eac5-77647df9db-6km8q" Apr 24 16:58:03.116821 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:58:03.116787 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sequence-graph-3eac5-serving-cert: secret "sequence-graph-3eac5-serving-cert" not found Apr 24 16:58:03.116870 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:58:03.116859 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7a7516a-b5e9-4aad-9c46-29fd815b5171-proxy-tls podName:d7a7516a-b5e9-4aad-9c46-29fd815b5171 nodeName:}" failed. 
No retries permitted until 2026-04-24 16:58:03.616839544 +0000 UTC m=+1153.988543757 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/d7a7516a-b5e9-4aad-9c46-29fd815b5171-proxy-tls") pod "sequence-graph-3eac5-77647df9db-6km8q" (UID: "d7a7516a-b5e9-4aad-9c46-29fd815b5171") : secret "sequence-graph-3eac5-serving-cert" not found Apr 24 16:58:03.117197 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:03.117177 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7a7516a-b5e9-4aad-9c46-29fd815b5171-openshift-service-ca-bundle\") pod \"sequence-graph-3eac5-77647df9db-6km8q\" (UID: \"d7a7516a-b5e9-4aad-9c46-29fd815b5171\") " pod="kserve-ci-e2e-test/sequence-graph-3eac5-77647df9db-6km8q" Apr 24 16:58:03.619992 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:03.619937 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d7a7516a-b5e9-4aad-9c46-29fd815b5171-proxy-tls\") pod \"sequence-graph-3eac5-77647df9db-6km8q\" (UID: \"d7a7516a-b5e9-4aad-9c46-29fd815b5171\") " pod="kserve-ci-e2e-test/sequence-graph-3eac5-77647df9db-6km8q" Apr 24 16:58:03.622688 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:03.622658 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d7a7516a-b5e9-4aad-9c46-29fd815b5171-proxy-tls\") pod \"sequence-graph-3eac5-77647df9db-6km8q\" (UID: \"d7a7516a-b5e9-4aad-9c46-29fd815b5171\") " pod="kserve-ci-e2e-test/sequence-graph-3eac5-77647df9db-6km8q" Apr 24 16:58:03.900414 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:03.900384 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-3eac5-77647df9db-6km8q" Apr 24 16:58:04.021301 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:04.021259 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-3eac5-77647df9db-6km8q"] Apr 24 16:58:04.627991 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:04.627949 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-f6171-6bffd6cbc4-82phr" podUID="2b064f2e-90a8-488f-98e1-8ae698ce7f7d" containerName="ensemble-graph-f6171" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:58:04.734803 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:04.734763 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-3eac5-77647df9db-6km8q" event={"ID":"d7a7516a-b5e9-4aad-9c46-29fd815b5171","Type":"ContainerStarted","Data":"1d1a333403983afeb012b850f63ced2a2b38f203fa15be76ce850573c84dac73"} Apr 24 16:58:04.734803 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:04.734805 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-3eac5-77647df9db-6km8q" event={"ID":"d7a7516a-b5e9-4aad-9c46-29fd815b5171","Type":"ContainerStarted","Data":"684e2b1e56bc0a2930d9e88729571932bac48f45df6840248128fade933f5daa"} Apr 24 16:58:04.735019 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:04.734887 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-3eac5-77647df9db-6km8q" Apr 24 16:58:04.753171 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:04.753123 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-3eac5-77647df9db-6km8q" podStartSLOduration=2.753105167 podStartE2EDuration="2.753105167s" podCreationTimestamp="2026-04-24 16:58:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-04-24 16:58:04.751615393 +0000 UTC m=+1155.123319618" watchObservedRunningTime="2026-04-24 16:58:04.753105167 +0000 UTC m=+1155.124809393" Apr 24 16:58:07.367137 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:58:07.367097 2578 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b064f2e_90a8_488f_98e1_8ae698ce7f7d.slice/crio-conmon-04f97b613c7ed5af35df59b3df9a197af77f605e4e95b4eedf51141abe7856b3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b064f2e_90a8_488f_98e1_8ae698ce7f7d.slice/crio-04f97b613c7ed5af35df59b3df9a197af77f605e4e95b4eedf51141abe7856b3.scope\": RecentStats: unable to find data in memory cache]" Apr 24 16:58:07.367471 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:58:07.367140 2578 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b064f2e_90a8_488f_98e1_8ae698ce7f7d.slice/crio-04f97b613c7ed5af35df59b3df9a197af77f605e4e95b4eedf51141abe7856b3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b064f2e_90a8_488f_98e1_8ae698ce7f7d.slice/crio-conmon-04f97b613c7ed5af35df59b3df9a197af77f605e4e95b4eedf51141abe7856b3.scope\": RecentStats: unable to find data in memory cache]" Apr 24 16:58:07.367584 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:58:07.367554 2578 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b064f2e_90a8_488f_98e1_8ae698ce7f7d.slice/crio-bde2ef7ab5b3574a9db56241414854c598e952d437d648da85a814f0d4bc64cd\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b064f2e_90a8_488f_98e1_8ae698ce7f7d.slice/crio-conmon-04f97b613c7ed5af35df59b3df9a197af77f605e4e95b4eedf51141abe7856b3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b064f2e_90a8_488f_98e1_8ae698ce7f7d.slice/crio-04f97b613c7ed5af35df59b3df9a197af77f605e4e95b4eedf51141abe7856b3.scope\": RecentStats: unable to find data in memory cache]" Apr 24 16:58:07.484299 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:07.484259 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-f6171-6bffd6cbc4-82phr" Apr 24 16:58:07.549084 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:07.549046 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b064f2e-90a8-488f-98e1-8ae698ce7f7d-proxy-tls\") pod \"2b064f2e-90a8-488f-98e1-8ae698ce7f7d\" (UID: \"2b064f2e-90a8-488f-98e1-8ae698ce7f7d\") " Apr 24 16:58:07.549220 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:07.549167 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b064f2e-90a8-488f-98e1-8ae698ce7f7d-openshift-service-ca-bundle\") pod \"2b064f2e-90a8-488f-98e1-8ae698ce7f7d\" (UID: \"2b064f2e-90a8-488f-98e1-8ae698ce7f7d\") " Apr 24 16:58:07.549563 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:07.549539 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b064f2e-90a8-488f-98e1-8ae698ce7f7d-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "2b064f2e-90a8-488f-98e1-8ae698ce7f7d" (UID: "2b064f2e-90a8-488f-98e1-8ae698ce7f7d"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:58:07.551340 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:07.551316 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b064f2e-90a8-488f-98e1-8ae698ce7f7d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2b064f2e-90a8-488f-98e1-8ae698ce7f7d" (UID: "2b064f2e-90a8-488f-98e1-8ae698ce7f7d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:58:07.650163 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:07.650072 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b064f2e-90a8-488f-98e1-8ae698ce7f7d-openshift-service-ca-bundle\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\"" Apr 24 16:58:07.650163 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:07.650105 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2b064f2e-90a8-488f-98e1-8ae698ce7f7d-proxy-tls\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\"" Apr 24 16:58:07.749777 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:07.749740 2578 generic.go:358] "Generic (PLEG): container finished" podID="2b064f2e-90a8-488f-98e1-8ae698ce7f7d" containerID="04f97b613c7ed5af35df59b3df9a197af77f605e4e95b4eedf51141abe7856b3" exitCode=0 Apr 24 16:58:07.749955 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:07.749793 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-f6171-6bffd6cbc4-82phr" event={"ID":"2b064f2e-90a8-488f-98e1-8ae698ce7f7d","Type":"ContainerDied","Data":"04f97b613c7ed5af35df59b3df9a197af77f605e4e95b4eedf51141abe7856b3"} Apr 24 16:58:07.749955 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:07.749811 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-f6171-6bffd6cbc4-82phr" Apr 24 16:58:07.749955 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:07.749825 2578 scope.go:117] "RemoveContainer" containerID="04f97b613c7ed5af35df59b3df9a197af77f605e4e95b4eedf51141abe7856b3" Apr 24 16:58:07.749955 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:07.749814 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-f6171-6bffd6cbc4-82phr" event={"ID":"2b064f2e-90a8-488f-98e1-8ae698ce7f7d","Type":"ContainerDied","Data":"bde2ef7ab5b3574a9db56241414854c598e952d437d648da85a814f0d4bc64cd"} Apr 24 16:58:07.758369 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:07.758348 2578 scope.go:117] "RemoveContainer" containerID="04f97b613c7ed5af35df59b3df9a197af77f605e4e95b4eedf51141abe7856b3" Apr 24 16:58:07.758624 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:58:07.758605 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04f97b613c7ed5af35df59b3df9a197af77f605e4e95b4eedf51141abe7856b3\": container with ID starting with 04f97b613c7ed5af35df59b3df9a197af77f605e4e95b4eedf51141abe7856b3 not found: ID does not exist" containerID="04f97b613c7ed5af35df59b3df9a197af77f605e4e95b4eedf51141abe7856b3" Apr 24 16:58:07.758671 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:07.758632 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04f97b613c7ed5af35df59b3df9a197af77f605e4e95b4eedf51141abe7856b3"} err="failed to get container status \"04f97b613c7ed5af35df59b3df9a197af77f605e4e95b4eedf51141abe7856b3\": rpc error: code = NotFound desc = could not find container \"04f97b613c7ed5af35df59b3df9a197af77f605e4e95b4eedf51141abe7856b3\": container with ID starting with 04f97b613c7ed5af35df59b3df9a197af77f605e4e95b4eedf51141abe7856b3 not found: ID does not exist" Apr 24 16:58:07.770861 ip-10-0-131-47 kubenswrapper[2578]: I0424 
16:58:07.770839 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-f6171-6bffd6cbc4-82phr"] Apr 24 16:58:07.775745 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:07.775723 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-f6171-6bffd6cbc4-82phr"] Apr 24 16:58:08.237419 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:08.237387 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b064f2e-90a8-488f-98e1-8ae698ce7f7d" path="/var/lib/kubelet/pods/2b064f2e-90a8-488f-98e1-8ae698ce7f7d/volumes" Apr 24 16:58:10.747635 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:10.747608 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-3eac5-77647df9db-6km8q" Apr 24 16:58:13.053863 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:13.053829 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-3eac5-77647df9db-6km8q"] Apr 24 16:58:13.054253 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:13.054025 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-3eac5-77647df9db-6km8q" podUID="d7a7516a-b5e9-4aad-9c46-29fd815b5171" containerName="sequence-graph-3eac5" containerID="cri-o://1d1a333403983afeb012b850f63ced2a2b38f203fa15be76ce850573c84dac73" gracePeriod=30 Apr 24 16:58:15.746402 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:15.746353 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-3eac5-77647df9db-6km8q" podUID="d7a7516a-b5e9-4aad-9c46-29fd815b5171" containerName="sequence-graph-3eac5" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:58:20.746819 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:20.746780 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-3eac5-77647df9db-6km8q" 
podUID="d7a7516a-b5e9-4aad-9c46-29fd815b5171" containerName="sequence-graph-3eac5" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:58:25.746096 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:25.746053 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-3eac5-77647df9db-6km8q" podUID="d7a7516a-b5e9-4aad-9c46-29fd815b5171" containerName="sequence-graph-3eac5" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:58:25.746499 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:25.746153 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-3eac5-77647df9db-6km8q" Apr 24 16:58:30.746079 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:30.746030 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-3eac5-77647df9db-6km8q" podUID="d7a7516a-b5e9-4aad-9c46-29fd815b5171" containerName="sequence-graph-3eac5" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:58:35.746523 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:35.746479 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-3eac5-77647df9db-6km8q" podUID="d7a7516a-b5e9-4aad-9c46-29fd815b5171" containerName="sequence-graph-3eac5" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:58:37.563010 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:37.562975 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-510ff-85d6b4bb57-jr287"] Apr 24 16:58:37.563388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:37.563260 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2b064f2e-90a8-488f-98e1-8ae698ce7f7d" containerName="ensemble-graph-f6171" Apr 24 16:58:37.563388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:37.563272 2578 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="2b064f2e-90a8-488f-98e1-8ae698ce7f7d" containerName="ensemble-graph-f6171" Apr 24 16:58:37.563388 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:37.563353 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="2b064f2e-90a8-488f-98e1-8ae698ce7f7d" containerName="ensemble-graph-f6171" Apr 24 16:58:37.566311 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:37.566279 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-510ff-85d6b4bb57-jr287" Apr 24 16:58:37.568759 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:37.568736 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-510ff-kube-rbac-proxy-sar-config\"" Apr 24 16:58:37.569233 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:37.569213 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-510ff-serving-cert\"" Apr 24 16:58:37.580384 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:37.580360 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-510ff-85d6b4bb57-jr287"] Apr 24 16:58:37.688630 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:37.688598 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3238c417-fa6b-4acd-be0a-ae1791599d78-proxy-tls\") pod \"ensemble-graph-510ff-85d6b4bb57-jr287\" (UID: \"3238c417-fa6b-4acd-be0a-ae1791599d78\") " pod="kserve-ci-e2e-test/ensemble-graph-510ff-85d6b4bb57-jr287" Apr 24 16:58:37.688811 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:37.688656 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3238c417-fa6b-4acd-be0a-ae1791599d78-openshift-service-ca-bundle\") pod 
\"ensemble-graph-510ff-85d6b4bb57-jr287\" (UID: \"3238c417-fa6b-4acd-be0a-ae1791599d78\") " pod="kserve-ci-e2e-test/ensemble-graph-510ff-85d6b4bb57-jr287" Apr 24 16:58:37.789483 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:37.789452 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3238c417-fa6b-4acd-be0a-ae1791599d78-proxy-tls\") pod \"ensemble-graph-510ff-85d6b4bb57-jr287\" (UID: \"3238c417-fa6b-4acd-be0a-ae1791599d78\") " pod="kserve-ci-e2e-test/ensemble-graph-510ff-85d6b4bb57-jr287" Apr 24 16:58:37.789626 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:37.789510 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3238c417-fa6b-4acd-be0a-ae1791599d78-openshift-service-ca-bundle\") pod \"ensemble-graph-510ff-85d6b4bb57-jr287\" (UID: \"3238c417-fa6b-4acd-be0a-ae1791599d78\") " pod="kserve-ci-e2e-test/ensemble-graph-510ff-85d6b4bb57-jr287" Apr 24 16:58:37.789687 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:58:37.789628 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/ensemble-graph-510ff-serving-cert: secret "ensemble-graph-510ff-serving-cert" not found Apr 24 16:58:37.789734 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:58:37.789714 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3238c417-fa6b-4acd-be0a-ae1791599d78-proxy-tls podName:3238c417-fa6b-4acd-be0a-ae1791599d78 nodeName:}" failed. No retries permitted until 2026-04-24 16:58:38.289690192 +0000 UTC m=+1188.661394414 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/3238c417-fa6b-4acd-be0a-ae1791599d78-proxy-tls") pod "ensemble-graph-510ff-85d6b4bb57-jr287" (UID: "3238c417-fa6b-4acd-be0a-ae1791599d78") : secret "ensemble-graph-510ff-serving-cert" not found Apr 24 16:58:37.790061 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:37.790042 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3238c417-fa6b-4acd-be0a-ae1791599d78-openshift-service-ca-bundle\") pod \"ensemble-graph-510ff-85d6b4bb57-jr287\" (UID: \"3238c417-fa6b-4acd-be0a-ae1791599d78\") " pod="kserve-ci-e2e-test/ensemble-graph-510ff-85d6b4bb57-jr287" Apr 24 16:58:38.294863 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:38.294822 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3238c417-fa6b-4acd-be0a-ae1791599d78-proxy-tls\") pod \"ensemble-graph-510ff-85d6b4bb57-jr287\" (UID: \"3238c417-fa6b-4acd-be0a-ae1791599d78\") " pod="kserve-ci-e2e-test/ensemble-graph-510ff-85d6b4bb57-jr287" Apr 24 16:58:38.297380 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:38.297362 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3238c417-fa6b-4acd-be0a-ae1791599d78-proxy-tls\") pod \"ensemble-graph-510ff-85d6b4bb57-jr287\" (UID: \"3238c417-fa6b-4acd-be0a-ae1791599d78\") " pod="kserve-ci-e2e-test/ensemble-graph-510ff-85d6b4bb57-jr287" Apr 24 16:58:38.475921 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:38.475886 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-510ff-85d6b4bb57-jr287" Apr 24 16:58:38.601445 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:38.601414 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-510ff-85d6b4bb57-jr287"] Apr 24 16:58:38.605026 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:58:38.604996 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3238c417_fa6b_4acd_be0a_ae1791599d78.slice/crio-02dd4ecf1ff369ff639c8f0f203035751cd302589f0a67d2aca72f0e00ad09e5 WatchSource:0}: Error finding container 02dd4ecf1ff369ff639c8f0f203035751cd302589f0a67d2aca72f0e00ad09e5: Status 404 returned error can't find the container with id 02dd4ecf1ff369ff639c8f0f203035751cd302589f0a67d2aca72f0e00ad09e5 Apr 24 16:58:38.842024 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:38.841931 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-510ff-85d6b4bb57-jr287" event={"ID":"3238c417-fa6b-4acd-be0a-ae1791599d78","Type":"ContainerStarted","Data":"c4a2a90068ef0f1613a976de5a275f85b906751ae5ede6327bdbdf328eeb0a74"} Apr 24 16:58:38.842024 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:38.841968 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-510ff-85d6b4bb57-jr287" event={"ID":"3238c417-fa6b-4acd-be0a-ae1791599d78","Type":"ContainerStarted","Data":"02dd4ecf1ff369ff639c8f0f203035751cd302589f0a67d2aca72f0e00ad09e5"} Apr 24 16:58:38.842218 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:38.842053 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-510ff-85d6b4bb57-jr287" Apr 24 16:58:38.858533 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:38.858488 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-510ff-85d6b4bb57-jr287" 
podStartSLOduration=1.858475171 podStartE2EDuration="1.858475171s" podCreationTimestamp="2026-04-24 16:58:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:58:38.857831922 +0000 UTC m=+1189.229536148" watchObservedRunningTime="2026-04-24 16:58:38.858475171 +0000 UTC m=+1189.230179396" Apr 24 16:58:40.746403 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:40.746355 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-3eac5-77647df9db-6km8q" podUID="d7a7516a-b5e9-4aad-9c46-29fd815b5171" containerName="sequence-graph-3eac5" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:58:43.203420 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:43.203391 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-3eac5-77647df9db-6km8q" Apr 24 16:58:43.338429 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:43.338320 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d7a7516a-b5e9-4aad-9c46-29fd815b5171-proxy-tls\") pod \"d7a7516a-b5e9-4aad-9c46-29fd815b5171\" (UID: \"d7a7516a-b5e9-4aad-9c46-29fd815b5171\") " Apr 24 16:58:43.338429 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:43.338404 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7a7516a-b5e9-4aad-9c46-29fd815b5171-openshift-service-ca-bundle\") pod \"d7a7516a-b5e9-4aad-9c46-29fd815b5171\" (UID: \"d7a7516a-b5e9-4aad-9c46-29fd815b5171\") " Apr 24 16:58:43.338758 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:43.338733 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7a7516a-b5e9-4aad-9c46-29fd815b5171-openshift-service-ca-bundle" (OuterVolumeSpecName: 
"openshift-service-ca-bundle") pod "d7a7516a-b5e9-4aad-9c46-29fd815b5171" (UID: "d7a7516a-b5e9-4aad-9c46-29fd815b5171"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:58:43.340586 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:43.340565 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7a7516a-b5e9-4aad-9c46-29fd815b5171-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d7a7516a-b5e9-4aad-9c46-29fd815b5171" (UID: "d7a7516a-b5e9-4aad-9c46-29fd815b5171"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:58:43.439092 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:43.439060 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d7a7516a-b5e9-4aad-9c46-29fd815b5171-proxy-tls\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\"" Apr 24 16:58:43.439092 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:43.439090 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7a7516a-b5e9-4aad-9c46-29fd815b5171-openshift-service-ca-bundle\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\"" Apr 24 16:58:43.857335 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:43.857299 2578 generic.go:358] "Generic (PLEG): container finished" podID="d7a7516a-b5e9-4aad-9c46-29fd815b5171" containerID="1d1a333403983afeb012b850f63ced2a2b38f203fa15be76ce850573c84dac73" exitCode=0 Apr 24 16:58:43.857503 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:43.857355 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-3eac5-77647df9db-6km8q" Apr 24 16:58:43.857503 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:43.857370 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-3eac5-77647df9db-6km8q" event={"ID":"d7a7516a-b5e9-4aad-9c46-29fd815b5171","Type":"ContainerDied","Data":"1d1a333403983afeb012b850f63ced2a2b38f203fa15be76ce850573c84dac73"} Apr 24 16:58:43.857503 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:43.857396 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-3eac5-77647df9db-6km8q" event={"ID":"d7a7516a-b5e9-4aad-9c46-29fd815b5171","Type":"ContainerDied","Data":"684e2b1e56bc0a2930d9e88729571932bac48f45df6840248128fade933f5daa"} Apr 24 16:58:43.857503 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:43.857412 2578 scope.go:117] "RemoveContainer" containerID="1d1a333403983afeb012b850f63ced2a2b38f203fa15be76ce850573c84dac73" Apr 24 16:58:43.866028 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:43.866008 2578 scope.go:117] "RemoveContainer" containerID="1d1a333403983afeb012b850f63ced2a2b38f203fa15be76ce850573c84dac73" Apr 24 16:58:43.866320 ip-10-0-131-47 kubenswrapper[2578]: E0424 16:58:43.866278 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d1a333403983afeb012b850f63ced2a2b38f203fa15be76ce850573c84dac73\": container with ID starting with 1d1a333403983afeb012b850f63ced2a2b38f203fa15be76ce850573c84dac73 not found: ID does not exist" containerID="1d1a333403983afeb012b850f63ced2a2b38f203fa15be76ce850573c84dac73" Apr 24 16:58:43.866370 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:43.866331 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d1a333403983afeb012b850f63ced2a2b38f203fa15be76ce850573c84dac73"} err="failed to get container status 
\"1d1a333403983afeb012b850f63ced2a2b38f203fa15be76ce850573c84dac73\": rpc error: code = NotFound desc = could not find container \"1d1a333403983afeb012b850f63ced2a2b38f203fa15be76ce850573c84dac73\": container with ID starting with 1d1a333403983afeb012b850f63ced2a2b38f203fa15be76ce850573c84dac73 not found: ID does not exist" Apr 24 16:58:43.898920 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:43.898889 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-3eac5-77647df9db-6km8q"] Apr 24 16:58:43.900036 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:43.900016 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-3eac5-77647df9db-6km8q"] Apr 24 16:58:44.236927 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:44.236899 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7a7516a-b5e9-4aad-9c46-29fd815b5171" path="/var/lib/kubelet/pods/d7a7516a-b5e9-4aad-9c46-29fd815b5171/volumes" Apr 24 16:58:44.850365 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:44.850335 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-510ff-85d6b4bb57-jr287" Apr 24 16:58:50.177826 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:50.177784 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-f2l2f_45369a43-7ed3-4f16-a1dc-7f1a61e06fc4/console-operator/1.log" Apr 24 16:58:50.180681 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:50.180655 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgfp8_76a7c3cf-c152-4bdc-8d94-50d4af52aeee/ovn-acl-logging/0.log" Apr 24 16:58:50.180964 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:50.180946 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-f2l2f_45369a43-7ed3-4f16-a1dc-7f1a61e06fc4/console-operator/1.log" Apr 24 
16:58:50.183841 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:58:50.183820 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgfp8_76a7c3cf-c152-4bdc-8d94-50d4af52aeee/ovn-acl-logging/0.log" Apr 24 16:59:13.312414 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:59:13.312374 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-77cb4-7d9f7db785-bfm8l"] Apr 24 16:59:13.312882 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:59:13.312679 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7a7516a-b5e9-4aad-9c46-29fd815b5171" containerName="sequence-graph-3eac5" Apr 24 16:59:13.312882 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:59:13.312692 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7a7516a-b5e9-4aad-9c46-29fd815b5171" containerName="sequence-graph-3eac5" Apr 24 16:59:13.312882 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:59:13.312756 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7a7516a-b5e9-4aad-9c46-29fd815b5171" containerName="sequence-graph-3eac5" Apr 24 16:59:13.315612 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:59:13.315595 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-77cb4-7d9f7db785-bfm8l" Apr 24 16:59:13.317968 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:59:13.317946 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-77cb4-serving-cert\"" Apr 24 16:59:13.318086 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:59:13.317949 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-77cb4-kube-rbac-proxy-sar-config\"" Apr 24 16:59:13.325827 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:59:13.325806 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-77cb4-7d9f7db785-bfm8l"] Apr 24 16:59:13.366595 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:59:13.366569 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c4103ad4-10d3-4113-9913-dd967ce6405b-proxy-tls\") pod \"sequence-graph-77cb4-7d9f7db785-bfm8l\" (UID: \"c4103ad4-10d3-4113-9913-dd967ce6405b\") " pod="kserve-ci-e2e-test/sequence-graph-77cb4-7d9f7db785-bfm8l" Apr 24 16:59:13.366595 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:59:13.366606 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4103ad4-10d3-4113-9913-dd967ce6405b-openshift-service-ca-bundle\") pod \"sequence-graph-77cb4-7d9f7db785-bfm8l\" (UID: \"c4103ad4-10d3-4113-9913-dd967ce6405b\") " pod="kserve-ci-e2e-test/sequence-graph-77cb4-7d9f7db785-bfm8l" Apr 24 16:59:13.467430 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:59:13.467394 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c4103ad4-10d3-4113-9913-dd967ce6405b-proxy-tls\") pod \"sequence-graph-77cb4-7d9f7db785-bfm8l\" (UID: 
\"c4103ad4-10d3-4113-9913-dd967ce6405b\") " pod="kserve-ci-e2e-test/sequence-graph-77cb4-7d9f7db785-bfm8l" Apr 24 16:59:13.467620 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:59:13.467446 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4103ad4-10d3-4113-9913-dd967ce6405b-openshift-service-ca-bundle\") pod \"sequence-graph-77cb4-7d9f7db785-bfm8l\" (UID: \"c4103ad4-10d3-4113-9913-dd967ce6405b\") " pod="kserve-ci-e2e-test/sequence-graph-77cb4-7d9f7db785-bfm8l" Apr 24 16:59:13.468072 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:59:13.468050 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4103ad4-10d3-4113-9913-dd967ce6405b-openshift-service-ca-bundle\") pod \"sequence-graph-77cb4-7d9f7db785-bfm8l\" (UID: \"c4103ad4-10d3-4113-9913-dd967ce6405b\") " pod="kserve-ci-e2e-test/sequence-graph-77cb4-7d9f7db785-bfm8l" Apr 24 16:59:13.469993 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:59:13.469972 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c4103ad4-10d3-4113-9913-dd967ce6405b-proxy-tls\") pod \"sequence-graph-77cb4-7d9f7db785-bfm8l\" (UID: \"c4103ad4-10d3-4113-9913-dd967ce6405b\") " pod="kserve-ci-e2e-test/sequence-graph-77cb4-7d9f7db785-bfm8l" Apr 24 16:59:13.625649 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:59:13.625548 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-77cb4-7d9f7db785-bfm8l" Apr 24 16:59:13.767435 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:59:13.767404 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-77cb4-7d9f7db785-bfm8l"] Apr 24 16:59:13.770716 ip-10-0-131-47 kubenswrapper[2578]: W0424 16:59:13.770681 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4103ad4_10d3_4113_9913_dd967ce6405b.slice/crio-05b45946f4cd3ce081509646ac8ecc7b0013b54f61b21037bf3ec651277f8afe WatchSource:0}: Error finding container 05b45946f4cd3ce081509646ac8ecc7b0013b54f61b21037bf3ec651277f8afe: Status 404 returned error can't find the container with id 05b45946f4cd3ce081509646ac8ecc7b0013b54f61b21037bf3ec651277f8afe Apr 24 16:59:13.949229 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:59:13.949190 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-77cb4-7d9f7db785-bfm8l" event={"ID":"c4103ad4-10d3-4113-9913-dd967ce6405b","Type":"ContainerStarted","Data":"17b66c716e05015316da2ea43fdc84f149c4049cd88efda4e9223cfbde206fe3"} Apr 24 16:59:13.949229 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:59:13.949230 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-77cb4-7d9f7db785-bfm8l" event={"ID":"c4103ad4-10d3-4113-9913-dd967ce6405b","Type":"ContainerStarted","Data":"05b45946f4cd3ce081509646ac8ecc7b0013b54f61b21037bf3ec651277f8afe"} Apr 24 16:59:13.949521 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:59:13.949263 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-77cb4-7d9f7db785-bfm8l" Apr 24 16:59:19.958546 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:59:19.958509 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-77cb4-7d9f7db785-bfm8l" Apr 24 
16:59:19.975530 ip-10-0-131-47 kubenswrapper[2578]: I0424 16:59:19.975486 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-77cb4-7d9f7db785-bfm8l" podStartSLOduration=6.975472353 podStartE2EDuration="6.975472353s" podCreationTimestamp="2026-04-24 16:59:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:59:13.968649107 +0000 UTC m=+1224.340353333" watchObservedRunningTime="2026-04-24 16:59:19.975472353 +0000 UTC m=+1230.347176578" Apr 24 17:03:50.199677 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:03:50.199645 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-f2l2f_45369a43-7ed3-4f16-a1dc-7f1a61e06fc4/console-operator/1.log" Apr 24 17:03:50.201812 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:03:50.201787 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-f2l2f_45369a43-7ed3-4f16-a1dc-7f1a61e06fc4/console-operator/1.log" Apr 24 17:03:50.202420 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:03:50.202392 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgfp8_76a7c3cf-c152-4bdc-8d94-50d4af52aeee/ovn-acl-logging/0.log" Apr 24 17:03:50.204376 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:03:50.204357 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgfp8_76a7c3cf-c152-4bdc-8d94-50d4af52aeee/ovn-acl-logging/0.log" Apr 24 17:06:52.073318 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:06:52.073261 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-510ff-85d6b4bb57-jr287"] Apr 24 17:06:52.075741 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:06:52.073522 2578 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/ensemble-graph-510ff-85d6b4bb57-jr287" podUID="3238c417-fa6b-4acd-be0a-ae1791599d78" containerName="ensemble-graph-510ff" containerID="cri-o://c4a2a90068ef0f1613a976de5a275f85b906751ae5ede6327bdbdf328eeb0a74" gracePeriod=30 Apr 24 17:06:54.849061 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:06:54.849022 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-510ff-85d6b4bb57-jr287" podUID="3238c417-fa6b-4acd-be0a-ae1791599d78" containerName="ensemble-graph-510ff" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 17:06:59.849615 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:06:59.849574 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-510ff-85d6b4bb57-jr287" podUID="3238c417-fa6b-4acd-be0a-ae1791599d78" containerName="ensemble-graph-510ff" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 17:07:04.849437 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:04.849397 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-510ff-85d6b4bb57-jr287" podUID="3238c417-fa6b-4acd-be0a-ae1791599d78" containerName="ensemble-graph-510ff" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 17:07:04.849812 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:04.849522 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-510ff-85d6b4bb57-jr287" Apr 24 17:07:09.848934 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:09.848849 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-510ff-85d6b4bb57-jr287" podUID="3238c417-fa6b-4acd-be0a-ae1791599d78" containerName="ensemble-graph-510ff" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 17:07:14.848987 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:14.848936 2578 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-510ff-85d6b4bb57-jr287" podUID="3238c417-fa6b-4acd-be0a-ae1791599d78" containerName="ensemble-graph-510ff" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 17:07:19.849412 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:19.849367 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-510ff-85d6b4bb57-jr287" podUID="3238c417-fa6b-4acd-be0a-ae1791599d78" containerName="ensemble-graph-510ff" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 17:07:22.255130 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:22.255096 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-510ff-85d6b4bb57-jr287" Apr 24 17:07:22.346225 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:22.346132 2578 generic.go:358] "Generic (PLEG): container finished" podID="3238c417-fa6b-4acd-be0a-ae1791599d78" containerID="c4a2a90068ef0f1613a976de5a275f85b906751ae5ede6327bdbdf328eeb0a74" exitCode=0 Apr 24 17:07:22.346225 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:22.346200 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-510ff-85d6b4bb57-jr287" Apr 24 17:07:22.346529 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:22.346225 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-510ff-85d6b4bb57-jr287" event={"ID":"3238c417-fa6b-4acd-be0a-ae1791599d78","Type":"ContainerDied","Data":"c4a2a90068ef0f1613a976de5a275f85b906751ae5ede6327bdbdf328eeb0a74"} Apr 24 17:07:22.346529 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:22.346275 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-510ff-85d6b4bb57-jr287" event={"ID":"3238c417-fa6b-4acd-be0a-ae1791599d78","Type":"ContainerDied","Data":"02dd4ecf1ff369ff639c8f0f203035751cd302589f0a67d2aca72f0e00ad09e5"} Apr 24 17:07:22.346529 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:22.346318 2578 scope.go:117] "RemoveContainer" containerID="c4a2a90068ef0f1613a976de5a275f85b906751ae5ede6327bdbdf328eeb0a74" Apr 24 17:07:22.354034 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:22.354015 2578 scope.go:117] "RemoveContainer" containerID="c4a2a90068ef0f1613a976de5a275f85b906751ae5ede6327bdbdf328eeb0a74" Apr 24 17:07:22.354324 ip-10-0-131-47 kubenswrapper[2578]: E0424 17:07:22.354304 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4a2a90068ef0f1613a976de5a275f85b906751ae5ede6327bdbdf328eeb0a74\": container with ID starting with c4a2a90068ef0f1613a976de5a275f85b906751ae5ede6327bdbdf328eeb0a74 not found: ID does not exist" containerID="c4a2a90068ef0f1613a976de5a275f85b906751ae5ede6327bdbdf328eeb0a74" Apr 24 17:07:22.354377 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:22.354335 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4a2a90068ef0f1613a976de5a275f85b906751ae5ede6327bdbdf328eeb0a74"} err="failed to get container status 
\"c4a2a90068ef0f1613a976de5a275f85b906751ae5ede6327bdbdf328eeb0a74\": rpc error: code = NotFound desc = could not find container \"c4a2a90068ef0f1613a976de5a275f85b906751ae5ede6327bdbdf328eeb0a74\": container with ID starting with c4a2a90068ef0f1613a976de5a275f85b906751ae5ede6327bdbdf328eeb0a74 not found: ID does not exist" Apr 24 17:07:22.386640 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:22.386613 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3238c417-fa6b-4acd-be0a-ae1791599d78-openshift-service-ca-bundle\") pod \"3238c417-fa6b-4acd-be0a-ae1791599d78\" (UID: \"3238c417-fa6b-4acd-be0a-ae1791599d78\") " Apr 24 17:07:22.386735 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:22.386669 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3238c417-fa6b-4acd-be0a-ae1791599d78-proxy-tls\") pod \"3238c417-fa6b-4acd-be0a-ae1791599d78\" (UID: \"3238c417-fa6b-4acd-be0a-ae1791599d78\") " Apr 24 17:07:22.386978 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:22.386955 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3238c417-fa6b-4acd-be0a-ae1791599d78-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "3238c417-fa6b-4acd-be0a-ae1791599d78" (UID: "3238c417-fa6b-4acd-be0a-ae1791599d78"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:07:22.388881 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:22.388858 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3238c417-fa6b-4acd-be0a-ae1791599d78-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3238c417-fa6b-4acd-be0a-ae1791599d78" (UID: "3238c417-fa6b-4acd-be0a-ae1791599d78"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:07:22.487370 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:22.487335 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3238c417-fa6b-4acd-be0a-ae1791599d78-openshift-service-ca-bundle\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\"" Apr 24 17:07:22.487370 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:22.487363 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3238c417-fa6b-4acd-be0a-ae1791599d78-proxy-tls\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\"" Apr 24 17:07:22.668322 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:22.668271 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-510ff-85d6b4bb57-jr287"] Apr 24 17:07:22.671113 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:22.671083 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-510ff-85d6b4bb57-jr287"] Apr 24 17:07:24.237526 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:24.237487 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3238c417-fa6b-4acd-be0a-ae1791599d78" path="/var/lib/kubelet/pods/3238c417-fa6b-4acd-be0a-ae1791599d78/volumes" Apr 24 17:07:27.921482 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:27.921443 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-77cb4-7d9f7db785-bfm8l"] Apr 24 17:07:27.921938 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:27.921751 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-77cb4-7d9f7db785-bfm8l" podUID="c4103ad4-10d3-4113-9913-dd967ce6405b" containerName="sequence-graph-77cb4" containerID="cri-o://17b66c716e05015316da2ea43fdc84f149c4049cd88efda4e9223cfbde206fe3" gracePeriod=30 Apr 24 17:07:29.956862 ip-10-0-131-47 
kubenswrapper[2578]: I0424 17:07:29.956818 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-77cb4-7d9f7db785-bfm8l" podUID="c4103ad4-10d3-4113-9913-dd967ce6405b" containerName="sequence-graph-77cb4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 17:07:34.956781 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:34.956741 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-77cb4-7d9f7db785-bfm8l" podUID="c4103ad4-10d3-4113-9913-dd967ce6405b" containerName="sequence-graph-77cb4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 17:07:39.957205 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:39.957156 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-77cb4-7d9f7db785-bfm8l" podUID="c4103ad4-10d3-4113-9913-dd967ce6405b" containerName="sequence-graph-77cb4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 17:07:39.957609 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:39.957260 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-77cb4-7d9f7db785-bfm8l" Apr 24 17:07:44.956759 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:44.956715 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-77cb4-7d9f7db785-bfm8l" podUID="c4103ad4-10d3-4113-9913-dd967ce6405b" containerName="sequence-graph-77cb4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 17:07:49.957211 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:49.957165 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-77cb4-7d9f7db785-bfm8l" podUID="c4103ad4-10d3-4113-9913-dd967ce6405b" containerName="sequence-graph-77cb4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 17:07:52.298787 ip-10-0-131-47 
kubenswrapper[2578]: I0424 17:07:52.298748 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-3827a-79444bd679-ncdhs"] Apr 24 17:07:52.299195 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:52.299054 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3238c417-fa6b-4acd-be0a-ae1791599d78" containerName="ensemble-graph-510ff" Apr 24 17:07:52.299195 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:52.299066 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="3238c417-fa6b-4acd-be0a-ae1791599d78" containerName="ensemble-graph-510ff" Apr 24 17:07:52.299195 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:52.299128 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="3238c417-fa6b-4acd-be0a-ae1791599d78" containerName="ensemble-graph-510ff" Apr 24 17:07:52.302401 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:52.302384 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-3827a-79444bd679-ncdhs" Apr 24 17:07:52.304646 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:52.304624 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-3827a-kube-rbac-proxy-sar-config\"" Apr 24 17:07:52.304719 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:52.304647 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-3827a-serving-cert\"" Apr 24 17:07:52.309791 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:52.309766 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-3827a-79444bd679-ncdhs"] Apr 24 17:07:52.425566 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:52.425526 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/9d298a1c-dd16-416a-82e4-347696632949-proxy-tls\") pod \"splitter-graph-3827a-79444bd679-ncdhs\" (UID: \"9d298a1c-dd16-416a-82e4-347696632949\") " pod="kserve-ci-e2e-test/splitter-graph-3827a-79444bd679-ncdhs" Apr 24 17:07:52.425741 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:52.425596 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d298a1c-dd16-416a-82e4-347696632949-openshift-service-ca-bundle\") pod \"splitter-graph-3827a-79444bd679-ncdhs\" (UID: \"9d298a1c-dd16-416a-82e4-347696632949\") " pod="kserve-ci-e2e-test/splitter-graph-3827a-79444bd679-ncdhs" Apr 24 17:07:52.526401 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:52.526359 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d298a1c-dd16-416a-82e4-347696632949-openshift-service-ca-bundle\") pod \"splitter-graph-3827a-79444bd679-ncdhs\" (UID: \"9d298a1c-dd16-416a-82e4-347696632949\") " pod="kserve-ci-e2e-test/splitter-graph-3827a-79444bd679-ncdhs" Apr 24 17:07:52.526574 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:52.526427 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d298a1c-dd16-416a-82e4-347696632949-proxy-tls\") pod \"splitter-graph-3827a-79444bd679-ncdhs\" (UID: \"9d298a1c-dd16-416a-82e4-347696632949\") " pod="kserve-ci-e2e-test/splitter-graph-3827a-79444bd679-ncdhs" Apr 24 17:07:52.526980 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:52.526960 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d298a1c-dd16-416a-82e4-347696632949-openshift-service-ca-bundle\") pod \"splitter-graph-3827a-79444bd679-ncdhs\" (UID: \"9d298a1c-dd16-416a-82e4-347696632949\") " 
pod="kserve-ci-e2e-test/splitter-graph-3827a-79444bd679-ncdhs" Apr 24 17:07:52.528981 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:52.528957 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d298a1c-dd16-416a-82e4-347696632949-proxy-tls\") pod \"splitter-graph-3827a-79444bd679-ncdhs\" (UID: \"9d298a1c-dd16-416a-82e4-347696632949\") " pod="kserve-ci-e2e-test/splitter-graph-3827a-79444bd679-ncdhs" Apr 24 17:07:52.613392 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:52.613307 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-3827a-79444bd679-ncdhs" Apr 24 17:07:52.732497 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:52.732448 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-3827a-79444bd679-ncdhs"] Apr 24 17:07:52.735192 ip-10-0-131-47 kubenswrapper[2578]: W0424 17:07:52.735164 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d298a1c_dd16_416a_82e4_347696632949.slice/crio-ebc6b0c6bd0ca3598ab9676ee68ae4546c26934a6f3931d06ade51cf77eed77e WatchSource:0}: Error finding container ebc6b0c6bd0ca3598ab9676ee68ae4546c26934a6f3931d06ade51cf77eed77e: Status 404 returned error can't find the container with id ebc6b0c6bd0ca3598ab9676ee68ae4546c26934a6f3931d06ade51cf77eed77e Apr 24 17:07:52.736907 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:52.736887 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 17:07:53.442671 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:53.442633 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-3827a-79444bd679-ncdhs" event={"ID":"9d298a1c-dd16-416a-82e4-347696632949","Type":"ContainerStarted","Data":"9c98889dfcb4bbd1026186339e75a24a7c43a62d7f1d4396a2273e588fc28fc0"} Apr 24 
17:07:53.442671 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:53.442671 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-3827a-79444bd679-ncdhs" event={"ID":"9d298a1c-dd16-416a-82e4-347696632949","Type":"ContainerStarted","Data":"ebc6b0c6bd0ca3598ab9676ee68ae4546c26934a6f3931d06ade51cf77eed77e"} Apr 24 17:07:53.443091 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:53.442694 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-3827a-79444bd679-ncdhs" Apr 24 17:07:53.459600 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:53.459555 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-3827a-79444bd679-ncdhs" podStartSLOduration=1.459540126 podStartE2EDuration="1.459540126s" podCreationTimestamp="2026-04-24 17:07:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:07:53.457551927 +0000 UTC m=+1743.829256165" watchObservedRunningTime="2026-04-24 17:07:53.459540126 +0000 UTC m=+1743.831244351" Apr 24 17:07:54.957036 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:54.956994 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-77cb4-7d9f7db785-bfm8l" podUID="c4103ad4-10d3-4113-9913-dd967ce6405b" containerName="sequence-graph-77cb4" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 17:07:58.090887 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:58.090856 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-77cb4-7d9f7db785-bfm8l" Apr 24 17:07:58.273419 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:58.273335 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4103ad4-10d3-4113-9913-dd967ce6405b-openshift-service-ca-bundle\") pod \"c4103ad4-10d3-4113-9913-dd967ce6405b\" (UID: \"c4103ad4-10d3-4113-9913-dd967ce6405b\") " Apr 24 17:07:58.273584 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:58.273433 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c4103ad4-10d3-4113-9913-dd967ce6405b-proxy-tls\") pod \"c4103ad4-10d3-4113-9913-dd967ce6405b\" (UID: \"c4103ad4-10d3-4113-9913-dd967ce6405b\") " Apr 24 17:07:58.273691 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:58.273666 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4103ad4-10d3-4113-9913-dd967ce6405b-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "c4103ad4-10d3-4113-9913-dd967ce6405b" (UID: "c4103ad4-10d3-4113-9913-dd967ce6405b"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:07:58.275606 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:58.275584 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4103ad4-10d3-4113-9913-dd967ce6405b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c4103ad4-10d3-4113-9913-dd967ce6405b" (UID: "c4103ad4-10d3-4113-9913-dd967ce6405b"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:07:58.374568 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:58.374528 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c4103ad4-10d3-4113-9913-dd967ce6405b-proxy-tls\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\"" Apr 24 17:07:58.374568 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:58.374568 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4103ad4-10d3-4113-9913-dd967ce6405b-openshift-service-ca-bundle\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\"" Apr 24 17:07:58.457587 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:58.457553 2578 generic.go:358] "Generic (PLEG): container finished" podID="c4103ad4-10d3-4113-9913-dd967ce6405b" containerID="17b66c716e05015316da2ea43fdc84f149c4049cd88efda4e9223cfbde206fe3" exitCode=0 Apr 24 17:07:58.457745 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:58.457620 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-77cb4-7d9f7db785-bfm8l" Apr 24 17:07:58.457745 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:58.457612 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-77cb4-7d9f7db785-bfm8l" event={"ID":"c4103ad4-10d3-4113-9913-dd967ce6405b","Type":"ContainerDied","Data":"17b66c716e05015316da2ea43fdc84f149c4049cd88efda4e9223cfbde206fe3"} Apr 24 17:07:58.457745 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:58.457722 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-77cb4-7d9f7db785-bfm8l" event={"ID":"c4103ad4-10d3-4113-9913-dd967ce6405b","Type":"ContainerDied","Data":"05b45946f4cd3ce081509646ac8ecc7b0013b54f61b21037bf3ec651277f8afe"} Apr 24 17:07:58.457745 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:58.457739 2578 scope.go:117] "RemoveContainer" containerID="17b66c716e05015316da2ea43fdc84f149c4049cd88efda4e9223cfbde206fe3" Apr 24 17:07:58.466511 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:58.466491 2578 scope.go:117] "RemoveContainer" containerID="17b66c716e05015316da2ea43fdc84f149c4049cd88efda4e9223cfbde206fe3" Apr 24 17:07:58.466788 ip-10-0-131-47 kubenswrapper[2578]: E0424 17:07:58.466763 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17b66c716e05015316da2ea43fdc84f149c4049cd88efda4e9223cfbde206fe3\": container with ID starting with 17b66c716e05015316da2ea43fdc84f149c4049cd88efda4e9223cfbde206fe3 not found: ID does not exist" containerID="17b66c716e05015316da2ea43fdc84f149c4049cd88efda4e9223cfbde206fe3" Apr 24 17:07:58.466839 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:58.466798 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17b66c716e05015316da2ea43fdc84f149c4049cd88efda4e9223cfbde206fe3"} err="failed to get container status 
\"17b66c716e05015316da2ea43fdc84f149c4049cd88efda4e9223cfbde206fe3\": rpc error: code = NotFound desc = could not find container \"17b66c716e05015316da2ea43fdc84f149c4049cd88efda4e9223cfbde206fe3\": container with ID starting with 17b66c716e05015316da2ea43fdc84f149c4049cd88efda4e9223cfbde206fe3 not found: ID does not exist" Apr 24 17:07:58.478255 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:58.478231 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-77cb4-7d9f7db785-bfm8l"] Apr 24 17:07:58.481853 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:58.481831 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-77cb4-7d9f7db785-bfm8l"] Apr 24 17:07:59.450961 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:07:59.450934 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-3827a-79444bd679-ncdhs" Apr 24 17:08:00.237516 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:00.237484 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4103ad4-10d3-4113-9913-dd967ce6405b" path="/var/lib/kubelet/pods/c4103ad4-10d3-4113-9913-dd967ce6405b/volumes" Apr 24 17:08:02.359968 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:02.359893 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-3827a-79444bd679-ncdhs"] Apr 24 17:08:02.360333 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:02.360110 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-3827a-79444bd679-ncdhs" podUID="9d298a1c-dd16-416a-82e4-347696632949" containerName="splitter-graph-3827a" containerID="cri-o://9c98889dfcb4bbd1026186339e75a24a7c43a62d7f1d4396a2273e588fc28fc0" gracePeriod=30 Apr 24 17:08:04.448877 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:04.448825 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/splitter-graph-3827a-79444bd679-ncdhs" podUID="9d298a1c-dd16-416a-82e4-347696632949" containerName="splitter-graph-3827a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 17:08:09.448857 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:09.448816 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-3827a-79444bd679-ncdhs" podUID="9d298a1c-dd16-416a-82e4-347696632949" containerName="splitter-graph-3827a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 17:08:14.449374 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:14.449335 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-3827a-79444bd679-ncdhs" podUID="9d298a1c-dd16-416a-82e4-347696632949" containerName="splitter-graph-3827a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 17:08:14.449752 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:14.449443 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-3827a-79444bd679-ncdhs" Apr 24 17:08:19.449693 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:19.449649 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-3827a-79444bd679-ncdhs" podUID="9d298a1c-dd16-416a-82e4-347696632949" containerName="splitter-graph-3827a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 17:08:24.449415 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:24.449372 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-3827a-79444bd679-ncdhs" podUID="9d298a1c-dd16-416a-82e4-347696632949" containerName="splitter-graph-3827a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 17:08:28.121634 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:28.121594 2578 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/switch-graph-8edb5-5fc995bf6d-m66br"] Apr 24 17:08:28.122181 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:28.122016 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c4103ad4-10d3-4113-9913-dd967ce6405b" containerName="sequence-graph-77cb4" Apr 24 17:08:28.122181 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:28.122034 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4103ad4-10d3-4113-9913-dd967ce6405b" containerName="sequence-graph-77cb4" Apr 24 17:08:28.122181 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:28.122105 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c4103ad4-10d3-4113-9913-dd967ce6405b" containerName="sequence-graph-77cb4" Apr 24 17:08:28.124958 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:28.124938 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-8edb5-5fc995bf6d-m66br" Apr 24 17:08:28.127428 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:28.127405 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-8edb5-serving-cert\"" Apr 24 17:08:28.127544 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:28.127410 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-8edb5-kube-rbac-proxy-sar-config\"" Apr 24 17:08:28.134639 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:28.134614 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-8edb5-5fc995bf6d-m66br"] Apr 24 17:08:28.195728 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:28.195687 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db54f995-a27f-4f3e-97da-f93dbc9c5e0b-proxy-tls\") pod \"switch-graph-8edb5-5fc995bf6d-m66br\" (UID: \"db54f995-a27f-4f3e-97da-f93dbc9c5e0b\") " 
pod="kserve-ci-e2e-test/switch-graph-8edb5-5fc995bf6d-m66br" Apr 24 17:08:28.195885 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:28.195732 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db54f995-a27f-4f3e-97da-f93dbc9c5e0b-openshift-service-ca-bundle\") pod \"switch-graph-8edb5-5fc995bf6d-m66br\" (UID: \"db54f995-a27f-4f3e-97da-f93dbc9c5e0b\") " pod="kserve-ci-e2e-test/switch-graph-8edb5-5fc995bf6d-m66br" Apr 24 17:08:28.296139 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:28.296102 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db54f995-a27f-4f3e-97da-f93dbc9c5e0b-proxy-tls\") pod \"switch-graph-8edb5-5fc995bf6d-m66br\" (UID: \"db54f995-a27f-4f3e-97da-f93dbc9c5e0b\") " pod="kserve-ci-e2e-test/switch-graph-8edb5-5fc995bf6d-m66br" Apr 24 17:08:28.296139 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:28.296148 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db54f995-a27f-4f3e-97da-f93dbc9c5e0b-openshift-service-ca-bundle\") pod \"switch-graph-8edb5-5fc995bf6d-m66br\" (UID: \"db54f995-a27f-4f3e-97da-f93dbc9c5e0b\") " pod="kserve-ci-e2e-test/switch-graph-8edb5-5fc995bf6d-m66br" Apr 24 17:08:28.296440 ip-10-0-131-47 kubenswrapper[2578]: E0424 17:08:28.296241 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-8edb5-serving-cert: secret "switch-graph-8edb5-serving-cert" not found Apr 24 17:08:28.296440 ip-10-0-131-47 kubenswrapper[2578]: E0424 17:08:28.296342 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db54f995-a27f-4f3e-97da-f93dbc9c5e0b-proxy-tls podName:db54f995-a27f-4f3e-97da-f93dbc9c5e0b nodeName:}" failed. 
No retries permitted until 2026-04-24 17:08:28.796321432 +0000 UTC m=+1779.168025644 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/db54f995-a27f-4f3e-97da-f93dbc9c5e0b-proxy-tls") pod "switch-graph-8edb5-5fc995bf6d-m66br" (UID: "db54f995-a27f-4f3e-97da-f93dbc9c5e0b") : secret "switch-graph-8edb5-serving-cert" not found Apr 24 17:08:28.296805 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:28.296784 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db54f995-a27f-4f3e-97da-f93dbc9c5e0b-openshift-service-ca-bundle\") pod \"switch-graph-8edb5-5fc995bf6d-m66br\" (UID: \"db54f995-a27f-4f3e-97da-f93dbc9c5e0b\") " pod="kserve-ci-e2e-test/switch-graph-8edb5-5fc995bf6d-m66br" Apr 24 17:08:28.800593 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:28.800552 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db54f995-a27f-4f3e-97da-f93dbc9c5e0b-proxy-tls\") pod \"switch-graph-8edb5-5fc995bf6d-m66br\" (UID: \"db54f995-a27f-4f3e-97da-f93dbc9c5e0b\") " pod="kserve-ci-e2e-test/switch-graph-8edb5-5fc995bf6d-m66br" Apr 24 17:08:28.803076 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:28.803053 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db54f995-a27f-4f3e-97da-f93dbc9c5e0b-proxy-tls\") pod \"switch-graph-8edb5-5fc995bf6d-m66br\" (UID: \"db54f995-a27f-4f3e-97da-f93dbc9c5e0b\") " pod="kserve-ci-e2e-test/switch-graph-8edb5-5fc995bf6d-m66br" Apr 24 17:08:29.034797 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:29.034770 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-8edb5-5fc995bf6d-m66br" Apr 24 17:08:29.159761 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:29.159737 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-8edb5-5fc995bf6d-m66br"] Apr 24 17:08:29.161840 ip-10-0-131-47 kubenswrapper[2578]: W0424 17:08:29.161806 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb54f995_a27f_4f3e_97da_f93dbc9c5e0b.slice/crio-895bc8e0cb21e9d28bfd10a6866d9ebc8b2373c35e7721ee4f20735fd9b73cc6 WatchSource:0}: Error finding container 895bc8e0cb21e9d28bfd10a6866d9ebc8b2373c35e7721ee4f20735fd9b73cc6: Status 404 returned error can't find the container with id 895bc8e0cb21e9d28bfd10a6866d9ebc8b2373c35e7721ee4f20735fd9b73cc6 Apr 24 17:08:29.449864 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:29.449823 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-3827a-79444bd679-ncdhs" podUID="9d298a1c-dd16-416a-82e4-347696632949" containerName="splitter-graph-3827a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 17:08:29.550638 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:29.550601 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-8edb5-5fc995bf6d-m66br" event={"ID":"db54f995-a27f-4f3e-97da-f93dbc9c5e0b","Type":"ContainerStarted","Data":"635ccd5d194f72c84d520086c9fd8fcb74de5a490bb968622f6dec02905bb092"} Apr 24 17:08:29.550638 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:29.550639 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-8edb5-5fc995bf6d-m66br" event={"ID":"db54f995-a27f-4f3e-97da-f93dbc9c5e0b","Type":"ContainerStarted","Data":"895bc8e0cb21e9d28bfd10a6866d9ebc8b2373c35e7721ee4f20735fd9b73cc6"} Apr 24 17:08:29.550851 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:29.550751 2578 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-8edb5-5fc995bf6d-m66br" Apr 24 17:08:29.570829 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:29.570781 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-8edb5-5fc995bf6d-m66br" podStartSLOduration=1.570768534 podStartE2EDuration="1.570768534s" podCreationTimestamp="2026-04-24 17:08:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:08:29.568758211 +0000 UTC m=+1779.940462436" watchObservedRunningTime="2026-04-24 17:08:29.570768534 +0000 UTC m=+1779.942472759" Apr 24 17:08:32.560415 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:32.560375 2578 generic.go:358] "Generic (PLEG): container finished" podID="9d298a1c-dd16-416a-82e4-347696632949" containerID="9c98889dfcb4bbd1026186339e75a24a7c43a62d7f1d4396a2273e588fc28fc0" exitCode=0 Apr 24 17:08:32.560784 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:32.560449 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-3827a-79444bd679-ncdhs" event={"ID":"9d298a1c-dd16-416a-82e4-347696632949","Type":"ContainerDied","Data":"9c98889dfcb4bbd1026186339e75a24a7c43a62d7f1d4396a2273e588fc28fc0"} Apr 24 17:08:33.007006 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:33.006983 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-3827a-79444bd679-ncdhs" Apr 24 17:08:33.032187 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:33.032163 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d298a1c-dd16-416a-82e4-347696632949-proxy-tls\") pod \"9d298a1c-dd16-416a-82e4-347696632949\" (UID: \"9d298a1c-dd16-416a-82e4-347696632949\") " Apr 24 17:08:33.032337 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:33.032226 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d298a1c-dd16-416a-82e4-347696632949-openshift-service-ca-bundle\") pod \"9d298a1c-dd16-416a-82e4-347696632949\" (UID: \"9d298a1c-dd16-416a-82e4-347696632949\") " Apr 24 17:08:33.032604 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:33.032581 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d298a1c-dd16-416a-82e4-347696632949-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "9d298a1c-dd16-416a-82e4-347696632949" (UID: "9d298a1c-dd16-416a-82e4-347696632949"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:08:33.034425 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:33.034400 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d298a1c-dd16-416a-82e4-347696632949-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9d298a1c-dd16-416a-82e4-347696632949" (UID: "9d298a1c-dd16-416a-82e4-347696632949"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:08:33.133368 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:33.133253 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d298a1c-dd16-416a-82e4-347696632949-proxy-tls\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\"" Apr 24 17:08:33.133368 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:33.133317 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d298a1c-dd16-416a-82e4-347696632949-openshift-service-ca-bundle\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\"" Apr 24 17:08:33.563779 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:33.563740 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-3827a-79444bd679-ncdhs" event={"ID":"9d298a1c-dd16-416a-82e4-347696632949","Type":"ContainerDied","Data":"ebc6b0c6bd0ca3598ab9676ee68ae4546c26934a6f3931d06ade51cf77eed77e"} Apr 24 17:08:33.564152 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:33.563795 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-3827a-79444bd679-ncdhs" Apr 24 17:08:33.564152 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:33.563795 2578 scope.go:117] "RemoveContainer" containerID="9c98889dfcb4bbd1026186339e75a24a7c43a62d7f1d4396a2273e588fc28fc0" Apr 24 17:08:33.583096 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:33.583070 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-3827a-79444bd679-ncdhs"] Apr 24 17:08:33.588932 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:33.588909 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-3827a-79444bd679-ncdhs"] Apr 24 17:08:34.237523 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:34.237487 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d298a1c-dd16-416a-82e4-347696632949" path="/var/lib/kubelet/pods/9d298a1c-dd16-416a-82e4-347696632949/volumes" Apr 24 17:08:35.558929 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:35.558896 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-8edb5-5fc995bf6d-m66br" Apr 24 17:08:50.225345 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:50.225313 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-f2l2f_45369a43-7ed3-4f16-a1dc-7f1a61e06fc4/console-operator/1.log" Apr 24 17:08:50.227688 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:50.227668 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-f2l2f_45369a43-7ed3-4f16-a1dc-7f1a61e06fc4/console-operator/1.log" Apr 24 17:08:50.227868 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:08:50.227850 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgfp8_76a7c3cf-c152-4bdc-8d94-50d4af52aeee/ovn-acl-logging/0.log" Apr 24 17:08:50.230709 ip-10-0-131-47 
kubenswrapper[2578]: I0424 17:08:50.230691 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgfp8_76a7c3cf-c152-4bdc-8d94-50d4af52aeee/ovn-acl-logging/0.log" Apr 24 17:09:02.572545 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:09:02.572510 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-9c25c-8bff57c84-thhml"] Apr 24 17:09:02.572952 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:09:02.572791 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d298a1c-dd16-416a-82e4-347696632949" containerName="splitter-graph-3827a" Apr 24 17:09:02.572952 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:09:02.572801 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d298a1c-dd16-416a-82e4-347696632949" containerName="splitter-graph-3827a" Apr 24 17:09:02.572952 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:09:02.572853 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d298a1c-dd16-416a-82e4-347696632949" containerName="splitter-graph-3827a" Apr 24 17:09:02.576956 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:09:02.576938 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-9c25c-8bff57c84-thhml" Apr 24 17:09:02.579014 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:09:02.578994 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-9c25c-kube-rbac-proxy-sar-config\"" Apr 24 17:09:02.579136 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:09:02.578997 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-9c25c-serving-cert\"" Apr 24 17:09:02.583946 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:09:02.583679 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-9c25c-8bff57c84-thhml"] Apr 24 17:09:02.667554 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:09:02.667516 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4138b7fb-1a27-4eeb-8100-8e5ff46d968e-proxy-tls\") pod \"splitter-graph-9c25c-8bff57c84-thhml\" (UID: \"4138b7fb-1a27-4eeb-8100-8e5ff46d968e\") " pod="kserve-ci-e2e-test/splitter-graph-9c25c-8bff57c84-thhml" Apr 24 17:09:02.667726 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:09:02.667622 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4138b7fb-1a27-4eeb-8100-8e5ff46d968e-openshift-service-ca-bundle\") pod \"splitter-graph-9c25c-8bff57c84-thhml\" (UID: \"4138b7fb-1a27-4eeb-8100-8e5ff46d968e\") " pod="kserve-ci-e2e-test/splitter-graph-9c25c-8bff57c84-thhml" Apr 24 17:09:02.768788 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:09:02.768732 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4138b7fb-1a27-4eeb-8100-8e5ff46d968e-proxy-tls\") pod \"splitter-graph-9c25c-8bff57c84-thhml\" (UID: 
\"4138b7fb-1a27-4eeb-8100-8e5ff46d968e\") " pod="kserve-ci-e2e-test/splitter-graph-9c25c-8bff57c84-thhml" Apr 24 17:09:02.768997 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:09:02.768801 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4138b7fb-1a27-4eeb-8100-8e5ff46d968e-openshift-service-ca-bundle\") pod \"splitter-graph-9c25c-8bff57c84-thhml\" (UID: \"4138b7fb-1a27-4eeb-8100-8e5ff46d968e\") " pod="kserve-ci-e2e-test/splitter-graph-9c25c-8bff57c84-thhml" Apr 24 17:09:02.768997 ip-10-0-131-47 kubenswrapper[2578]: E0424 17:09:02.768896 2578 secret.go:189] Couldn't get secret kserve-ci-e2e-test/splitter-graph-9c25c-serving-cert: secret "splitter-graph-9c25c-serving-cert" not found Apr 24 17:09:02.768997 ip-10-0-131-47 kubenswrapper[2578]: E0424 17:09:02.768973 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4138b7fb-1a27-4eeb-8100-8e5ff46d968e-proxy-tls podName:4138b7fb-1a27-4eeb-8100-8e5ff46d968e nodeName:}" failed. No retries permitted until 2026-04-24 17:09:03.268954451 +0000 UTC m=+1813.640658656 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/4138b7fb-1a27-4eeb-8100-8e5ff46d968e-proxy-tls") pod "splitter-graph-9c25c-8bff57c84-thhml" (UID: "4138b7fb-1a27-4eeb-8100-8e5ff46d968e") : secret "splitter-graph-9c25c-serving-cert" not found Apr 24 17:09:02.769545 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:09:02.769517 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4138b7fb-1a27-4eeb-8100-8e5ff46d968e-openshift-service-ca-bundle\") pod \"splitter-graph-9c25c-8bff57c84-thhml\" (UID: \"4138b7fb-1a27-4eeb-8100-8e5ff46d968e\") " pod="kserve-ci-e2e-test/splitter-graph-9c25c-8bff57c84-thhml" Apr 24 17:09:03.273185 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:09:03.273144 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4138b7fb-1a27-4eeb-8100-8e5ff46d968e-proxy-tls\") pod \"splitter-graph-9c25c-8bff57c84-thhml\" (UID: \"4138b7fb-1a27-4eeb-8100-8e5ff46d968e\") " pod="kserve-ci-e2e-test/splitter-graph-9c25c-8bff57c84-thhml" Apr 24 17:09:03.275865 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:09:03.275838 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4138b7fb-1a27-4eeb-8100-8e5ff46d968e-proxy-tls\") pod \"splitter-graph-9c25c-8bff57c84-thhml\" (UID: \"4138b7fb-1a27-4eeb-8100-8e5ff46d968e\") " pod="kserve-ci-e2e-test/splitter-graph-9c25c-8bff57c84-thhml" Apr 24 17:09:03.488249 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:09:03.488214 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-9c25c-8bff57c84-thhml" Apr 24 17:09:03.609065 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:09:03.609039 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-9c25c-8bff57c84-thhml"] Apr 24 17:09:03.611464 ip-10-0-131-47 kubenswrapper[2578]: W0424 17:09:03.611434 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4138b7fb_1a27_4eeb_8100_8e5ff46d968e.slice/crio-202bdb537ded9f228f8b13208a65b9f051217713c7e7e493de750b363ce54435 WatchSource:0}: Error finding container 202bdb537ded9f228f8b13208a65b9f051217713c7e7e493de750b363ce54435: Status 404 returned error can't find the container with id 202bdb537ded9f228f8b13208a65b9f051217713c7e7e493de750b363ce54435 Apr 24 17:09:03.647420 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:09:03.647390 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-9c25c-8bff57c84-thhml" event={"ID":"4138b7fb-1a27-4eeb-8100-8e5ff46d968e","Type":"ContainerStarted","Data":"202bdb537ded9f228f8b13208a65b9f051217713c7e7e493de750b363ce54435"} Apr 24 17:09:04.651144 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:09:04.651108 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-9c25c-8bff57c84-thhml" event={"ID":"4138b7fb-1a27-4eeb-8100-8e5ff46d968e","Type":"ContainerStarted","Data":"3d5acec34c8b1fd074b5e7b11bba5ec6f8096e8a8a4aba083c086fe6308ba129"} Apr 24 17:09:04.651534 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:09:04.651162 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-9c25c-8bff57c84-thhml" Apr 24 17:09:04.668071 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:09:04.668025 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-9c25c-8bff57c84-thhml" 
podStartSLOduration=2.668011281 podStartE2EDuration="2.668011281s" podCreationTimestamp="2026-04-24 17:09:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:09:04.666123513 +0000 UTC m=+1815.037827738" watchObservedRunningTime="2026-04-24 17:09:04.668011281 +0000 UTC m=+1815.039715535" Apr 24 17:09:10.660272 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:09:10.660240 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-9c25c-8bff57c84-thhml" Apr 24 17:13:50.248188 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:13:50.248160 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-f2l2f_45369a43-7ed3-4f16-a1dc-7f1a61e06fc4/console-operator/1.log" Apr 24 17:13:50.251577 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:13:50.251550 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgfp8_76a7c3cf-c152-4bdc-8d94-50d4af52aeee/ovn-acl-logging/0.log" Apr 24 17:13:50.253818 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:13:50.253799 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-f2l2f_45369a43-7ed3-4f16-a1dc-7f1a61e06fc4/console-operator/1.log" Apr 24 17:13:50.256698 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:13:50.256679 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgfp8_76a7c3cf-c152-4bdc-8d94-50d4af52aeee/ovn-acl-logging/0.log" Apr 24 17:17:17.296459 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:17:17.296425 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-9c25c-8bff57c84-thhml"] Apr 24 17:17:17.299026 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:17:17.296691 2578 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/splitter-graph-9c25c-8bff57c84-thhml" podUID="4138b7fb-1a27-4eeb-8100-8e5ff46d968e" containerName="splitter-graph-9c25c" containerID="cri-o://3d5acec34c8b1fd074b5e7b11bba5ec6f8096e8a8a4aba083c086fe6308ba129" gracePeriod=30 Apr 24 17:17:20.657917 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:17:20.657872 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-9c25c-8bff57c84-thhml" podUID="4138b7fb-1a27-4eeb-8100-8e5ff46d968e" containerName="splitter-graph-9c25c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 17:17:25.658933 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:17:25.658887 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-9c25c-8bff57c84-thhml" podUID="4138b7fb-1a27-4eeb-8100-8e5ff46d968e" containerName="splitter-graph-9c25c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 17:17:30.657959 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:17:30.657918 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-9c25c-8bff57c84-thhml" podUID="4138b7fb-1a27-4eeb-8100-8e5ff46d968e" containerName="splitter-graph-9c25c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 17:17:30.658353 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:17:30.658022 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-9c25c-8bff57c84-thhml" Apr 24 17:17:35.658406 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:17:35.658361 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-9c25c-8bff57c84-thhml" podUID="4138b7fb-1a27-4eeb-8100-8e5ff46d968e" containerName="splitter-graph-9c25c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 17:17:40.658113 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:17:40.658019 2578 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-9c25c-8bff57c84-thhml" podUID="4138b7fb-1a27-4eeb-8100-8e5ff46d968e" containerName="splitter-graph-9c25c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 17:17:45.658643 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:17:45.658606 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-9c25c-8bff57c84-thhml" podUID="4138b7fb-1a27-4eeb-8100-8e5ff46d968e" containerName="splitter-graph-9c25c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 17:17:47.936219 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:17:47.936188 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-9c25c-8bff57c84-thhml" Apr 24 17:17:48.037186 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:17:48.037149 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4138b7fb-1a27-4eeb-8100-8e5ff46d968e-openshift-service-ca-bundle\") pod \"4138b7fb-1a27-4eeb-8100-8e5ff46d968e\" (UID: \"4138b7fb-1a27-4eeb-8100-8e5ff46d968e\") " Apr 24 17:17:48.037411 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:17:48.037219 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4138b7fb-1a27-4eeb-8100-8e5ff46d968e-proxy-tls\") pod \"4138b7fb-1a27-4eeb-8100-8e5ff46d968e\" (UID: \"4138b7fb-1a27-4eeb-8100-8e5ff46d968e\") " Apr 24 17:17:48.037614 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:17:48.037587 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4138b7fb-1a27-4eeb-8100-8e5ff46d968e-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "4138b7fb-1a27-4eeb-8100-8e5ff46d968e" (UID: "4138b7fb-1a27-4eeb-8100-8e5ff46d968e"). 
InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:17:48.039478 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:17:48.039455 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4138b7fb-1a27-4eeb-8100-8e5ff46d968e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4138b7fb-1a27-4eeb-8100-8e5ff46d968e" (UID: "4138b7fb-1a27-4eeb-8100-8e5ff46d968e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:17:48.138220 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:17:48.138137 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4138b7fb-1a27-4eeb-8100-8e5ff46d968e-proxy-tls\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\"" Apr 24 17:17:48.138220 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:17:48.138170 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4138b7fb-1a27-4eeb-8100-8e5ff46d968e-openshift-service-ca-bundle\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\"" Apr 24 17:17:48.144317 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:17:48.144274 2578 generic.go:358] "Generic (PLEG): container finished" podID="4138b7fb-1a27-4eeb-8100-8e5ff46d968e" containerID="3d5acec34c8b1fd074b5e7b11bba5ec6f8096e8a8a4aba083c086fe6308ba129" exitCode=0 Apr 24 17:17:48.144469 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:17:48.144369 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-9c25c-8bff57c84-thhml" Apr 24 17:17:48.144469 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:17:48.144380 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-9c25c-8bff57c84-thhml" event={"ID":"4138b7fb-1a27-4eeb-8100-8e5ff46d968e","Type":"ContainerDied","Data":"3d5acec34c8b1fd074b5e7b11bba5ec6f8096e8a8a4aba083c086fe6308ba129"} Apr 24 17:17:48.144469 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:17:48.144418 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-9c25c-8bff57c84-thhml" event={"ID":"4138b7fb-1a27-4eeb-8100-8e5ff46d968e","Type":"ContainerDied","Data":"202bdb537ded9f228f8b13208a65b9f051217713c7e7e493de750b363ce54435"} Apr 24 17:17:48.144469 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:17:48.144435 2578 scope.go:117] "RemoveContainer" containerID="3d5acec34c8b1fd074b5e7b11bba5ec6f8096e8a8a4aba083c086fe6308ba129" Apr 24 17:17:48.152784 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:17:48.152766 2578 scope.go:117] "RemoveContainer" containerID="3d5acec34c8b1fd074b5e7b11bba5ec6f8096e8a8a4aba083c086fe6308ba129" Apr 24 17:17:48.153068 ip-10-0-131-47 kubenswrapper[2578]: E0424 17:17:48.153045 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d5acec34c8b1fd074b5e7b11bba5ec6f8096e8a8a4aba083c086fe6308ba129\": container with ID starting with 3d5acec34c8b1fd074b5e7b11bba5ec6f8096e8a8a4aba083c086fe6308ba129 not found: ID does not exist" containerID="3d5acec34c8b1fd074b5e7b11bba5ec6f8096e8a8a4aba083c086fe6308ba129" Apr 24 17:17:48.153152 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:17:48.153073 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d5acec34c8b1fd074b5e7b11bba5ec6f8096e8a8a4aba083c086fe6308ba129"} err="failed to get container status 
\"3d5acec34c8b1fd074b5e7b11bba5ec6f8096e8a8a4aba083c086fe6308ba129\": rpc error: code = NotFound desc = could not find container \"3d5acec34c8b1fd074b5e7b11bba5ec6f8096e8a8a4aba083c086fe6308ba129\": container with ID starting with 3d5acec34c8b1fd074b5e7b11bba5ec6f8096e8a8a4aba083c086fe6308ba129 not found: ID does not exist" Apr 24 17:17:48.164673 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:17:48.164643 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-9c25c-8bff57c84-thhml"] Apr 24 17:17:48.167593 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:17:48.167572 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-9c25c-8bff57c84-thhml"] Apr 24 17:17:48.237438 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:17:48.237407 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4138b7fb-1a27-4eeb-8100-8e5ff46d968e" path="/var/lib/kubelet/pods/4138b7fb-1a27-4eeb-8100-8e5ff46d968e/volumes" Apr 24 17:18:50.271323 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:18:50.271275 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-f2l2f_45369a43-7ed3-4f16-a1dc-7f1a61e06fc4/console-operator/1.log" Apr 24 17:18:50.273815 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:18:50.273795 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgfp8_76a7c3cf-c152-4bdc-8d94-50d4af52aeee/ovn-acl-logging/0.log" Apr 24 17:18:50.275171 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:18:50.275138 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-f2l2f_45369a43-7ed3-4f16-a1dc-7f1a61e06fc4/console-operator/1.log" Apr 24 17:18:50.277670 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:18:50.277651 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgfp8_76a7c3cf-c152-4bdc-8d94-50d4af52aeee/ovn-acl-logging/0.log" Apr 24 17:23:50.291105 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:23:50.291073 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-f2l2f_45369a43-7ed3-4f16-a1dc-7f1a61e06fc4/console-operator/1.log" Apr 24 17:23:50.293735 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:23:50.293711 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgfp8_76a7c3cf-c152-4bdc-8d94-50d4af52aeee/ovn-acl-logging/0.log" Apr 24 17:23:50.295461 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:23:50.295441 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-f2l2f_45369a43-7ed3-4f16-a1dc-7f1a61e06fc4/console-operator/1.log" Apr 24 17:23:50.298038 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:23:50.298021 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgfp8_76a7c3cf-c152-4bdc-8d94-50d4af52aeee/ovn-acl-logging/0.log" Apr 24 17:24:47.251961 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:24:47.251927 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-8edb5-5fc995bf6d-m66br"] Apr 24 17:24:47.252836 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:24:47.252154 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-8edb5-5fc995bf6d-m66br" podUID="db54f995-a27f-4f3e-97da-f93dbc9c5e0b" containerName="switch-graph-8edb5" containerID="cri-o://635ccd5d194f72c84d520086c9fd8fcb74de5a490bb968622f6dec02905bb092" gracePeriod=30 Apr 24 17:24:50.557796 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:24:50.557758 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-8edb5-5fc995bf6d-m66br" podUID="db54f995-a27f-4f3e-97da-f93dbc9c5e0b" 
containerName="switch-graph-8edb5" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 17:24:55.558303 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:24:55.558241 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-8edb5-5fc995bf6d-m66br" podUID="db54f995-a27f-4f3e-97da-f93dbc9c5e0b" containerName="switch-graph-8edb5" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 17:25:00.557481 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:00.557441 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-8edb5-5fc995bf6d-m66br" podUID="db54f995-a27f-4f3e-97da-f93dbc9c5e0b" containerName="switch-graph-8edb5" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 17:25:00.557916 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:00.557571 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-8edb5-5fc995bf6d-m66br" Apr 24 17:25:03.386418 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:03.386390 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-8edb5-5fc995bf6d-m66br_db54f995-a27f-4f3e-97da-f93dbc9c5e0b/switch-graph-8edb5/0.log" Apr 24 17:25:04.200838 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:04.200793 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-8edb5-5fc995bf6d-m66br_db54f995-a27f-4f3e-97da-f93dbc9c5e0b/switch-graph-8edb5/0.log" Apr 24 17:25:05.015926 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:05.015896 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-8edb5-5fc995bf6d-m66br_db54f995-a27f-4f3e-97da-f93dbc9c5e0b/switch-graph-8edb5/0.log" Apr 24 17:25:05.557842 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:05.557795 2578 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/switch-graph-8edb5-5fc995bf6d-m66br" podUID="db54f995-a27f-4f3e-97da-f93dbc9c5e0b" containerName="switch-graph-8edb5" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 17:25:05.858887 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:05.858806 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-8edb5-5fc995bf6d-m66br_db54f995-a27f-4f3e-97da-f93dbc9c5e0b/switch-graph-8edb5/0.log" Apr 24 17:25:06.686564 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:06.686528 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-8edb5-5fc995bf6d-m66br_db54f995-a27f-4f3e-97da-f93dbc9c5e0b/switch-graph-8edb5/0.log" Apr 24 17:25:07.474526 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:07.474495 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-8edb5-5fc995bf6d-m66br_db54f995-a27f-4f3e-97da-f93dbc9c5e0b/switch-graph-8edb5/0.log" Apr 24 17:25:08.260809 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:08.260777 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-8edb5-5fc995bf6d-m66br_db54f995-a27f-4f3e-97da-f93dbc9c5e0b/switch-graph-8edb5/0.log" Apr 24 17:25:09.065323 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:09.065298 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-8edb5-5fc995bf6d-m66br_db54f995-a27f-4f3e-97da-f93dbc9c5e0b/switch-graph-8edb5/0.log" Apr 24 17:25:09.857470 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:09.857440 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-8edb5-5fc995bf6d-m66br_db54f995-a27f-4f3e-97da-f93dbc9c5e0b/switch-graph-8edb5/0.log" Apr 24 17:25:10.558243 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:10.558202 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-8edb5-5fc995bf6d-m66br" 
podUID="db54f995-a27f-4f3e-97da-f93dbc9c5e0b" containerName="switch-graph-8edb5" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 17:25:10.713035 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:10.713002 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-8edb5-5fc995bf6d-m66br_db54f995-a27f-4f3e-97da-f93dbc9c5e0b/switch-graph-8edb5/0.log" Apr 24 17:25:11.577218 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:11.577189 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-8edb5-5fc995bf6d-m66br_db54f995-a27f-4f3e-97da-f93dbc9c5e0b/switch-graph-8edb5/0.log" Apr 24 17:25:12.413742 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:12.413709 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-8edb5-5fc995bf6d-m66br_db54f995-a27f-4f3e-97da-f93dbc9c5e0b/switch-graph-8edb5/0.log" Apr 24 17:25:15.558066 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:15.558026 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-8edb5-5fc995bf6d-m66br" podUID="db54f995-a27f-4f3e-97da-f93dbc9c5e0b" containerName="switch-graph-8edb5" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 17:25:17.396335 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:17.396307 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-8edb5-5fc995bf6d-m66br" Apr 24 17:25:17.397967 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:17.397940 2578 generic.go:358] "Generic (PLEG): container finished" podID="db54f995-a27f-4f3e-97da-f93dbc9c5e0b" containerID="635ccd5d194f72c84d520086c9fd8fcb74de5a490bb968622f6dec02905bb092" exitCode=0 Apr 24 17:25:17.398069 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:17.397976 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-8edb5-5fc995bf6d-m66br" event={"ID":"db54f995-a27f-4f3e-97da-f93dbc9c5e0b","Type":"ContainerDied","Data":"635ccd5d194f72c84d520086c9fd8fcb74de5a490bb968622f6dec02905bb092"} Apr 24 17:25:17.398069 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:17.397998 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-8edb5-5fc995bf6d-m66br" event={"ID":"db54f995-a27f-4f3e-97da-f93dbc9c5e0b","Type":"ContainerDied","Data":"895bc8e0cb21e9d28bfd10a6866d9ebc8b2373c35e7721ee4f20735fd9b73cc6"} Apr 24 17:25:17.398069 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:17.397994 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-8edb5-5fc995bf6d-m66br" Apr 24 17:25:17.398164 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:17.398071 2578 scope.go:117] "RemoveContainer" containerID="635ccd5d194f72c84d520086c9fd8fcb74de5a490bb968622f6dec02905bb092" Apr 24 17:25:17.405748 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:17.405732 2578 scope.go:117] "RemoveContainer" containerID="635ccd5d194f72c84d520086c9fd8fcb74de5a490bb968622f6dec02905bb092" Apr 24 17:25:17.405970 ip-10-0-131-47 kubenswrapper[2578]: E0424 17:25:17.405954 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"635ccd5d194f72c84d520086c9fd8fcb74de5a490bb968622f6dec02905bb092\": container with ID starting with 635ccd5d194f72c84d520086c9fd8fcb74de5a490bb968622f6dec02905bb092 not found: ID does not exist" containerID="635ccd5d194f72c84d520086c9fd8fcb74de5a490bb968622f6dec02905bb092" Apr 24 17:25:17.406019 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:17.405976 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"635ccd5d194f72c84d520086c9fd8fcb74de5a490bb968622f6dec02905bb092"} err="failed to get container status \"635ccd5d194f72c84d520086c9fd8fcb74de5a490bb968622f6dec02905bb092\": rpc error: code = NotFound desc = could not find container \"635ccd5d194f72c84d520086c9fd8fcb74de5a490bb968622f6dec02905bb092\": container with ID starting with 635ccd5d194f72c84d520086c9fd8fcb74de5a490bb968622f6dec02905bb092 not found: ID does not exist" Apr 24 17:25:17.567916 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:17.567814 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db54f995-a27f-4f3e-97da-f93dbc9c5e0b-proxy-tls\") pod \"db54f995-a27f-4f3e-97da-f93dbc9c5e0b\" (UID: \"db54f995-a27f-4f3e-97da-f93dbc9c5e0b\") " Apr 24 17:25:17.567916 ip-10-0-131-47 kubenswrapper[2578]: I0424 
17:25:17.567870 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db54f995-a27f-4f3e-97da-f93dbc9c5e0b-openshift-service-ca-bundle\") pod \"db54f995-a27f-4f3e-97da-f93dbc9c5e0b\" (UID: \"db54f995-a27f-4f3e-97da-f93dbc9c5e0b\") " Apr 24 17:25:17.568242 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:17.568215 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db54f995-a27f-4f3e-97da-f93dbc9c5e0b-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "db54f995-a27f-4f3e-97da-f93dbc9c5e0b" (UID: "db54f995-a27f-4f3e-97da-f93dbc9c5e0b"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 17:25:17.570061 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:17.570038 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db54f995-a27f-4f3e-97da-f93dbc9c5e0b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "db54f995-a27f-4f3e-97da-f93dbc9c5e0b" (UID: "db54f995-a27f-4f3e-97da-f93dbc9c5e0b"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 17:25:17.668991 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:17.668953 2578 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db54f995-a27f-4f3e-97da-f93dbc9c5e0b-proxy-tls\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\"" Apr 24 17:25:17.668991 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:17.668988 2578 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db54f995-a27f-4f3e-97da-f93dbc9c5e0b-openshift-service-ca-bundle\") on node \"ip-10-0-131-47.ec2.internal\" DevicePath \"\"" Apr 24 17:25:17.719061 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:17.719032 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-8edb5-5fc995bf6d-m66br"] Apr 24 17:25:17.722388 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:17.722363 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-8edb5-5fc995bf6d-m66br"] Apr 24 17:25:17.874104 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:17.874024 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-rlzxl_81d7a862-3288-4023-b0d6-2464e9278dac/global-pull-secret-syncer/0.log" Apr 24 17:25:18.011764 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:18.011729 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-dt6kn_46e943b2-628a-486a-adb0-7bb92be03a03/konnectivity-agent/0.log" Apr 24 17:25:18.087395 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:18.087368 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-47.ec2.internal_8b712f12e316b1a6ded9d349ca82d37a/haproxy/0.log" Apr 24 17:25:18.237617 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:18.237583 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="db54f995-a27f-4f3e-97da-f93dbc9c5e0b" path="/var/lib/kubelet/pods/db54f995-a27f-4f3e-97da-f93dbc9c5e0b/volumes" Apr 24 17:25:21.662407 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:21.662378 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-n6v4x_4075e057-13f0-412d-96df-8e124be59b52/cluster-monitoring-operator/0.log" Apr 24 17:25:21.981682 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:21.981651 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dpwmn_f9ce762e-9cdb-4a93-8198-3c06cb490198/node-exporter/0.log" Apr 24 17:25:22.004896 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:22.004873 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dpwmn_f9ce762e-9cdb-4a93-8198-3c06cb490198/kube-rbac-proxy/0.log" Apr 24 17:25:22.025402 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:22.025382 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-dpwmn_f9ce762e-9cdb-4a93-8198-3c06cb490198/init-textfile/0.log" Apr 24 17:25:24.178571 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:24.178538 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-f2l2f_45369a43-7ed3-4f16-a1dc-7f1a61e06fc4/console-operator/1.log" Apr 24 17:25:24.183654 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:24.183628 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-f2l2f_45369a43-7ed3-4f16-a1dc-7f1a61e06fc4/console-operator/2.log" Apr 24 17:25:25.031123 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:25.031098 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-kzb6p_ee65ef05-2b40-4b45-a683-47cafa91b43c/volume-data-source-validator/0.log" Apr 24 17:25:25.285109 
ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:25.285030 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-f77jl/perf-node-gather-daemonset-h5dkl"] Apr 24 17:25:25.285521 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:25.285359 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4138b7fb-1a27-4eeb-8100-8e5ff46d968e" containerName="splitter-graph-9c25c" Apr 24 17:25:25.285521 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:25.285372 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="4138b7fb-1a27-4eeb-8100-8e5ff46d968e" containerName="splitter-graph-9c25c" Apr 24 17:25:25.285521 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:25.285394 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="db54f995-a27f-4f3e-97da-f93dbc9c5e0b" containerName="switch-graph-8edb5" Apr 24 17:25:25.285521 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:25.285401 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="db54f995-a27f-4f3e-97da-f93dbc9c5e0b" containerName="switch-graph-8edb5" Apr 24 17:25:25.285521 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:25.285476 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="db54f995-a27f-4f3e-97da-f93dbc9c5e0b" containerName="switch-graph-8edb5" Apr 24 17:25:25.285521 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:25.285485 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="4138b7fb-1a27-4eeb-8100-8e5ff46d968e" containerName="splitter-graph-9c25c" Apr 24 17:25:25.288183 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:25.288164 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-h5dkl" Apr 24 17:25:25.290360 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:25.290339 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-f77jl\"/\"openshift-service-ca.crt\"" Apr 24 17:25:25.290486 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:25.290379 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-f77jl\"/\"kube-root-ca.crt\"" Apr 24 17:25:25.290978 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:25.290962 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-f77jl\"/\"default-dockercfg-fbxhs\"" Apr 24 17:25:25.297532 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:25.297495 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f77jl/perf-node-gather-daemonset-h5dkl"] Apr 24 17:25:25.328667 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:25.328629 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v796f\" (UniqueName: \"kubernetes.io/projected/bb621468-96ab-4d8c-a895-95107d393b5b-kube-api-access-v796f\") pod \"perf-node-gather-daemonset-h5dkl\" (UID: \"bb621468-96ab-4d8c-a895-95107d393b5b\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-h5dkl" Apr 24 17:25:25.328667 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:25.328668 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/bb621468-96ab-4d8c-a895-95107d393b5b-podres\") pod \"perf-node-gather-daemonset-h5dkl\" (UID: \"bb621468-96ab-4d8c-a895-95107d393b5b\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-h5dkl" Apr 24 17:25:25.328845 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:25.328722 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/bb621468-96ab-4d8c-a895-95107d393b5b-proc\") pod \"perf-node-gather-daemonset-h5dkl\" (UID: \"bb621468-96ab-4d8c-a895-95107d393b5b\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-h5dkl" Apr 24 17:25:25.328845 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:25.328740 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bb621468-96ab-4d8c-a895-95107d393b5b-lib-modules\") pod \"perf-node-gather-daemonset-h5dkl\" (UID: \"bb621468-96ab-4d8c-a895-95107d393b5b\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-h5dkl" Apr 24 17:25:25.328845 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:25.328765 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bb621468-96ab-4d8c-a895-95107d393b5b-sys\") pod \"perf-node-gather-daemonset-h5dkl\" (UID: \"bb621468-96ab-4d8c-a895-95107d393b5b\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-h5dkl" Apr 24 17:25:25.429808 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:25.429777 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v796f\" (UniqueName: \"kubernetes.io/projected/bb621468-96ab-4d8c-a895-95107d393b5b-kube-api-access-v796f\") pod \"perf-node-gather-daemonset-h5dkl\" (UID: \"bb621468-96ab-4d8c-a895-95107d393b5b\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-h5dkl" Apr 24 17:25:25.429808 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:25.429810 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/bb621468-96ab-4d8c-a895-95107d393b5b-podres\") pod \"perf-node-gather-daemonset-h5dkl\" (UID: \"bb621468-96ab-4d8c-a895-95107d393b5b\") " 
pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-h5dkl"
Apr 24 17:25:25.430046 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:25.429839 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/bb621468-96ab-4d8c-a895-95107d393b5b-proc\") pod \"perf-node-gather-daemonset-h5dkl\" (UID: \"bb621468-96ab-4d8c-a895-95107d393b5b\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-h5dkl"
Apr 24 17:25:25.430046 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:25.429858 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bb621468-96ab-4d8c-a895-95107d393b5b-lib-modules\") pod \"perf-node-gather-daemonset-h5dkl\" (UID: \"bb621468-96ab-4d8c-a895-95107d393b5b\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-h5dkl"
Apr 24 17:25:25.430046 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:25.429883 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bb621468-96ab-4d8c-a895-95107d393b5b-sys\") pod \"perf-node-gather-daemonset-h5dkl\" (UID: \"bb621468-96ab-4d8c-a895-95107d393b5b\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-h5dkl"
Apr 24 17:25:25.430046 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:25.429940 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/bb621468-96ab-4d8c-a895-95107d393b5b-proc\") pod \"perf-node-gather-daemonset-h5dkl\" (UID: \"bb621468-96ab-4d8c-a895-95107d393b5b\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-h5dkl"
Apr 24 17:25:25.430046 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:25.429954 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/bb621468-96ab-4d8c-a895-95107d393b5b-podres\") pod \"perf-node-gather-daemonset-h5dkl\" (UID: \"bb621468-96ab-4d8c-a895-95107d393b5b\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-h5dkl"
Apr 24 17:25:25.430046 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:25.429974 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bb621468-96ab-4d8c-a895-95107d393b5b-sys\") pod \"perf-node-gather-daemonset-h5dkl\" (UID: \"bb621468-96ab-4d8c-a895-95107d393b5b\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-h5dkl"
Apr 24 17:25:25.430046 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:25.429983 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bb621468-96ab-4d8c-a895-95107d393b5b-lib-modules\") pod \"perf-node-gather-daemonset-h5dkl\" (UID: \"bb621468-96ab-4d8c-a895-95107d393b5b\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-h5dkl"
Apr 24 17:25:25.438118 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:25.438090 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v796f\" (UniqueName: \"kubernetes.io/projected/bb621468-96ab-4d8c-a895-95107d393b5b-kube-api-access-v796f\") pod \"perf-node-gather-daemonset-h5dkl\" (UID: \"bb621468-96ab-4d8c-a895-95107d393b5b\") " pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-h5dkl"
Apr 24 17:25:25.598856 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:25.598770 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-h5dkl"
Apr 24 17:25:25.699007 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:25.698975 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dzj8d_eefe78cf-7d49-4547-bded-f34c94ebc29b/dns/0.log"
Apr 24 17:25:25.721022 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:25.721001 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-f77jl/perf-node-gather-daemonset-h5dkl"]
Apr 24 17:25:25.722774 ip-10-0-131-47 kubenswrapper[2578]: W0424 17:25:25.722751 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbb621468_96ab_4d8c_a895_95107d393b5b.slice/crio-0b2c553e5926b62cfc25a724a71756e0984e376c6571ebc9061d3f92245c597b WatchSource:0}: Error finding container 0b2c553e5926b62cfc25a724a71756e0984e376c6571ebc9061d3f92245c597b: Status 404 returned error can't find the container with id 0b2c553e5926b62cfc25a724a71756e0984e376c6571ebc9061d3f92245c597b
Apr 24 17:25:25.724169 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:25.724155 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 17:25:25.726030 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:25.726010 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-dzj8d_eefe78cf-7d49-4547-bded-f34c94ebc29b/kube-rbac-proxy/0.log"
Apr 24 17:25:25.875505 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:25.875435 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9nlzl_2c5b2afa-97e0-4381-b83f-848951dec5c9/dns-node-resolver/0.log"
Apr 24 17:25:26.374299 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:26.374250 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7m4xb_e8baf786-1fb8-494a-bdb7-c724c853faa3/node-ca/0.log"
Apr 24 17:25:26.424450 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:26.424415 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-h5dkl" event={"ID":"bb621468-96ab-4d8c-a895-95107d393b5b","Type":"ContainerStarted","Data":"95f7154622feb0bd78bcbb137e810fdf974163263fa068ad44989ca951794c7f"}
Apr 24 17:25:26.424450 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:26.424451 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-h5dkl" event={"ID":"bb621468-96ab-4d8c-a895-95107d393b5b","Type":"ContainerStarted","Data":"0b2c553e5926b62cfc25a724a71756e0984e376c6571ebc9061d3f92245c597b"}
Apr 24 17:25:26.424745 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:26.424476 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-h5dkl"
Apr 24 17:25:26.440178 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:26.440126 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-h5dkl" podStartSLOduration=1.4401067109999999 podStartE2EDuration="1.440106711s" podCreationTimestamp="2026-04-24 17:25:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:25:26.43962296 +0000 UTC m=+2796.811327186" watchObservedRunningTime="2026-04-24 17:25:26.440106711 +0000 UTC m=+2796.811810939"
Apr 24 17:25:27.210986 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:27.210960 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5958786cb4-48wlg_32ea204a-32cf-4de1-bd0e-675e568756f9/router/0.log"
Apr 24 17:25:27.603742 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:27.603667 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-kcgx6_9ae2e73c-423b-4876-a52d-cd4111ca0013/serve-healthcheck-canary/0.log"
Apr 24 17:25:27.945596 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:27.945562 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-8j4f4_f0f4702a-d6d6-410a-9800-fb13b913d223/insights-operator/0.log"
Apr 24 17:25:27.946091 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:27.946074 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-8j4f4_f0f4702a-d6d6-410a-9800-fb13b913d223/insights-operator/1.log"
Apr 24 17:25:28.031356 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:28.031328 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cxjs9_88ca7df2-32d7-469f-9396-016d2ca3b6f3/kube-rbac-proxy/0.log"
Apr 24 17:25:28.051506 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:28.051483 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cxjs9_88ca7df2-32d7-469f-9396-016d2ca3b6f3/exporter/0.log"
Apr 24 17:25:28.071785 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:28.071759 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-cxjs9_88ca7df2-32d7-469f-9396-016d2ca3b6f3/extractor/0.log"
Apr 24 17:25:32.436764 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:32.436734 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-f77jl/perf-node-gather-daemonset-h5dkl"
Apr 24 17:25:34.826074 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:34.826002 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-jnwr7_335d3a61-4224-4da9-adb2-5f83cb395511/kube-storage-version-migrator-operator/1.log"
Apr 24 17:25:34.826990 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:34.826972 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-jnwr7_335d3a61-4224-4da9-adb2-5f83cb395511/kube-storage-version-migrator-operator/0.log"
Apr 24 17:25:36.199063 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:36.199032 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hgbpw_0c733890-1ac2-464e-9672-65bf90aded78/kube-multus-additional-cni-plugins/0.log"
Apr 24 17:25:36.221374 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:36.221352 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hgbpw_0c733890-1ac2-464e-9672-65bf90aded78/egress-router-binary-copy/0.log"
Apr 24 17:25:36.241929 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:36.241903 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hgbpw_0c733890-1ac2-464e-9672-65bf90aded78/cni-plugins/0.log"
Apr 24 17:25:36.276726 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:36.276702 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hgbpw_0c733890-1ac2-464e-9672-65bf90aded78/bond-cni-plugin/0.log"
Apr 24 17:25:36.299239 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:36.299218 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hgbpw_0c733890-1ac2-464e-9672-65bf90aded78/routeoverride-cni/0.log"
Apr 24 17:25:36.323461 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:36.323441 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hgbpw_0c733890-1ac2-464e-9672-65bf90aded78/whereabouts-cni-bincopy/0.log"
Apr 24 17:25:36.345644 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:36.345617 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hgbpw_0c733890-1ac2-464e-9672-65bf90aded78/whereabouts-cni/0.log"
Apr 24 17:25:36.546784 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:36.546708 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x2tv7_33870c17-d8aa-426a-9bde-7d0a1e04404a/kube-multus/0.log"
Apr 24 17:25:36.652850 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:36.652802 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-thz9k_0f368d48-c79b-45b5-8879-9dac1c5cfe3f/network-metrics-daemon/0.log"
Apr 24 17:25:36.672892 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:36.672863 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-thz9k_0f368d48-c79b-45b5-8879-9dac1c5cfe3f/kube-rbac-proxy/0.log"
Apr 24 17:25:37.691038 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:37.691010 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgfp8_76a7c3cf-c152-4bdc-8d94-50d4af52aeee/ovn-controller/0.log"
Apr 24 17:25:37.714171 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:37.714143 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgfp8_76a7c3cf-c152-4bdc-8d94-50d4af52aeee/ovn-acl-logging/0.log"
Apr 24 17:25:37.726257 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:37.726234 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgfp8_76a7c3cf-c152-4bdc-8d94-50d4af52aeee/ovn-acl-logging/1.log"
Apr 24 17:25:37.767703 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:37.767667 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgfp8_76a7c3cf-c152-4bdc-8d94-50d4af52aeee/kube-rbac-proxy-node/0.log"
Apr 24 17:25:37.790398 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:37.790376 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgfp8_76a7c3cf-c152-4bdc-8d94-50d4af52aeee/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 17:25:37.811233 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:37.811195 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgfp8_76a7c3cf-c152-4bdc-8d94-50d4af52aeee/northd/0.log"
Apr 24 17:25:37.831686 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:37.831662 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgfp8_76a7c3cf-c152-4bdc-8d94-50d4af52aeee/nbdb/0.log"
Apr 24 17:25:37.856278 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:37.856259 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgfp8_76a7c3cf-c152-4bdc-8d94-50d4af52aeee/sbdb/0.log"
Apr 24 17:25:37.977435 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:37.977356 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgfp8_76a7c3cf-c152-4bdc-8d94-50d4af52aeee/ovnkube-controller/0.log"
Apr 24 17:25:39.332065 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:39.332038 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-xx69l_0897170b-76cd-4278-802c-03e1d1747af3/check-endpoints/0.log"
Apr 24 17:25:39.404005 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:39.403969 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-hcjpg_7230245a-1622-4c39-9d99-ab2e06ac0daf/network-check-target-container/0.log"
Apr 24 17:25:40.277025 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:40.276996 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-46rrs_705121d3-75ff-4f72-9362-f2f98bdb4bd4/iptables-alerter/0.log"
Apr 24 17:25:40.945222 ip-10-0-131-47 kubenswrapper[2578]: I0424 17:25:40.945191 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-bhmmj_104dda0f-092c-4fa1-98cb-7e6dcc147db2/tuned/0.log"