Apr 17 17:23:50.341097 ip-10-0-140-33 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 17:23:50.847927 ip-10-0-140-33 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:23:50.847927 ip-10-0-140-33 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 17:23:50.847927 ip-10-0-140-33 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:23:50.847927 ip-10-0-140-33 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 17:23:50.847927 ip-10-0-140-33 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:23:50.850435 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.850354 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 17:23:50.855369 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855354 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:23:50.855369 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855368 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:23:50.855369 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855372 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:23:50.855460 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855375 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:23:50.855460 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855379 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:23:50.855460 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855383 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:23:50.855460 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855385 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:23:50.855460 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855388 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:23:50.855460 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855391 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:23:50.855460 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855394 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:23:50.855460 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855396 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:23:50.855460 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855399 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:23:50.855460 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855402 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:23:50.855460 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855404 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:23:50.855460 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855407 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:23:50.855460 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855409 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:23:50.855460 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855420 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:23:50.855460 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855423 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:23:50.855460 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855426 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:23:50.855460 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855428 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:23:50.855460 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855431 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:23:50.855460 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855434 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:23:50.855460 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855437 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:23:50.855943 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855439 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:23:50.855943 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855442 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:23:50.855943 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855445 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:23:50.855943 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855447 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:23:50.855943 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855450 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:23:50.855943 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855453 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:23:50.855943 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855455 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:23:50.855943 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855457 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:23:50.855943 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855461 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:23:50.855943 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855464 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:23:50.855943 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855466 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:23:50.855943 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855469 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:23:50.855943 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855477 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:23:50.855943 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855481 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:23:50.855943 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855483 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:23:50.855943 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855486 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:23:50.855943 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855489 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:23:50.855943 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855491 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:23:50.855943 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855494 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:23:50.855943 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855496 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:23:50.856420 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855499 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:23:50.856420 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855501 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:23:50.856420 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855504 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:23:50.856420 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855506 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:23:50.856420 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855508 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:23:50.856420 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855511 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:23:50.856420 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855517 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:23:50.856420 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855521 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:23:50.856420 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855524 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:23:50.856420 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855527 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:23:50.856420 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855530 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:23:50.856420 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855532 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:23:50.856420 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855535 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:23:50.856420 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855538 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:23:50.856420 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855541 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:23:50.856420 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855543 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:23:50.856420 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855546 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:23:50.856420 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855549 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:23:50.856420 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855551 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:23:50.856892 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855554 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:23:50.856892 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855557 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:23:50.856892 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855559 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:23:50.856892 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855561 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:23:50.856892 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855564 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:23:50.856892 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855568 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:23:50.856892 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855570 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:23:50.856892 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855573 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:23:50.856892 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855575 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:23:50.856892 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855578 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:23:50.856892 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855580 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:23:50.856892 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855583 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:23:50.856892 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855585 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:23:50.856892 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855587 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:23:50.856892 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855591 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:23:50.856892 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855595 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:23:50.856892 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855598 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:23:50.856892 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855601 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:23:50.856892 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855604 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:23:50.857353 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855607 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:23:50.857353 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855610 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:23:50.857353 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855613 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:23:50.857353 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855616 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:23:50.857353 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.855619 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:23:50.857353 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856004 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:23:50.857353 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856011 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:23:50.857353 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856015 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:23:50.857353 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856020 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:23:50.857353 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856024 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:23:50.857353 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856027 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:23:50.857353 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856031 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:23:50.857353 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856034 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:23:50.857353 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856036 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:23:50.857353 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856039 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:23:50.857353 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856041 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:23:50.857353 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856044 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:23:50.857353 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856047 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:23:50.857353 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856049 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:23:50.857833 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856052 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:23:50.857833 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856054 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:23:50.857833 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856057 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:23:50.857833 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856060 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:23:50.857833 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856062 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:23:50.857833 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856065 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:23:50.857833 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856067 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:23:50.857833 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856070 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:23:50.857833 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856073 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:23:50.857833 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856075 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:23:50.857833 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856077 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:23:50.857833 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856080 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:23:50.857833 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856082 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:23:50.857833 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856085 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:23:50.857833 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856088 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:23:50.857833 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856091 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:23:50.857833 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856093 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:23:50.857833 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856095 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:23:50.857833 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856098 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:23:50.857833 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856101 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:23:50.858314 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856104 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:23:50.858314 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856106 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:23:50.858314 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856109 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:23:50.858314 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856111 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:23:50.858314 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856114 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:23:50.858314 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856117 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:23:50.858314 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856119 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:23:50.858314 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856122 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:23:50.858314 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856124 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:23:50.858314 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856127 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:23:50.858314 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856129 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:23:50.858314 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856132 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:23:50.858314 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856134 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:23:50.858314 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856137 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:23:50.858314 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856139 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:23:50.858314 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856143 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:23:50.858314 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856145 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:23:50.858314 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856147 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:23:50.858314 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856150 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:23:50.858314 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856153 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:23:50.858314 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856155 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:23:50.858841 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856157 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:23:50.858841 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856160 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:23:50.858841 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856162 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:23:50.858841 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856165 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:23:50.858841 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856167 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:23:50.858841 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856169 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:23:50.858841 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856172 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:23:50.858841 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856175 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:23:50.858841 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856177 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:23:50.858841 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856181 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:23:50.858841 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856183 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:23:50.858841 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856187 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:23:50.858841 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856189 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:23:50.858841 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856192 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:23:50.858841 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856195 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:23:50.858841 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856197 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:23:50.858841 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856200 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:23:50.858841 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856202 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:23:50.858841 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856204 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:23:50.858841 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856207 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:23:50.859358 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856209 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:23:50.859358 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856212 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:23:50.859358 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856214 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:23:50.859358 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856217 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:23:50.859358 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856219 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:23:50.859358 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856222 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:23:50.859358 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856226 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:23:50.859358 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856229 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:23:50.859358 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856232 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:23:50.859358 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856234 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:23:50.859358 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.856237 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:23:50.859358 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.856984 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 17:23:50.859358 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.856997 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 17:23:50.859358 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857003 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 17:23:50.859358 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857007 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 17:23:50.859358 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857011 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 17:23:50.859358 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857015 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 17:23:50.859358 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857019 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 17:23:50.859358 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857023 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 17:23:50.859358 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857028 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 17:23:50.859358 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857031 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 17:23:50.859875 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857035 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 17:23:50.859875 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857038 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 17:23:50.859875 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857041 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 17:23:50.859875 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857044 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 17 17:23:50.859875 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857047 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 17:23:50.859875 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857051 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 17 17:23:50.859875 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857054 2575 flags.go:64] FLAG: --cloud-config=""
Apr 17 17:23:50.859875 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857056 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 17:23:50.859875 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857059 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 17:23:50.859875 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857064 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 17 17:23:50.859875 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857067 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 17:23:50.859875 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857070 2575 flags.go:64] FLAG: --config-dir=""
Apr 17 17:23:50.859875 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857073 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 17:23:50.859875 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857076 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 17:23:50.859875 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857080 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 17:23:50.859875 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857083 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 17:23:50.859875 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857086 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 17:23:50.859875 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857089 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 17:23:50.859875 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857092 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 17:23:50.859875 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857095 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 17:23:50.859875 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857098 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 17:23:50.859875 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857101 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 17:23:50.859875 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857104 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 17:23:50.859875 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857108 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 17:23:50.859875 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857111 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 17:23:50.860599 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857114 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 17:23:50.860599 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857117 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 17:23:50.860599 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857120 2575 flags.go:64] FLAG: --enable-server="true"
Apr 17 17:23:50.860599 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857123 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 17:23:50.860599 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857127 2575 flags.go:64] FLAG: --event-burst="100"
Apr 17 17:23:50.860599 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857130 2575 flags.go:64] FLAG: --event-qps="50"
Apr 17 17:23:50.860599 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857133 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 17:23:50.860599 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857137 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 17:23:50.860599 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857140 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 17 17:23:50.860599 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857144 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 17:23:50.860599 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857147 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 17:23:50.860599 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857150 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 17:23:50.860599 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857153 2575 flags.go:64] FLAG: --eviction-soft=""
Apr 17 17:23:50.860599 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857156 2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 17:23:50.860599 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857159 2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 17:23:50.860599 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857162 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 17:23:50.860599 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857164 2575 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 17:23:50.860599 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857167 2575 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 17:23:50.860599 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857170 2575 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 17:23:50.860599 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857173 2575 flags.go:64] FLAG: --feature-gates=""
Apr 17 17:23:50.860599 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857176 2575 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 17:23:50.860599 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857179 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 17:23:50.860599 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857182 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 17 17:23:50.860599 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857185 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 17 17:23:50.860599 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857189 2575 flags.go:64] FLAG: --healthz-port="10248"
Apr 17 17:23:50.860599 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857192 2575 flags.go:64] FLAG: --help="false"
Apr 17 17:23:50.861250 ip-10-0-140-33 kubenswrapper[2575]: I0417
17:23:50.857195 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-140-33.ec2.internal" Apr 17 17:23:50.861250 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857198 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 17:23:50.861250 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857201 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 17:23:50.861250 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857204 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 17:23:50.861250 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857208 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 17:23:50.861250 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857211 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 17:23:50.861250 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857214 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 17:23:50.861250 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857217 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 17:23:50.861250 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857220 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 17:23:50.861250 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857223 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 17:23:50.861250 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857226 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 17:23:50.861250 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857230 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 17:23:50.861250 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857232 2575 flags.go:64] FLAG: --kube-reserved="" Apr 17 17:23:50.861250 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857238 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 
17:23:50.861250 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857241 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 17:23:50.861250 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857245 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 17:23:50.861250 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857247 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 17:23:50.861250 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857250 2575 flags.go:64] FLAG: --lock-file="" Apr 17 17:23:50.861250 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857253 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 17:23:50.861250 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857256 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 17:23:50.861250 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857259 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 17:23:50.861250 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857264 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 17:23:50.861250 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857267 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 17:23:50.861803 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857270 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 17:23:50.861803 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857273 2575 flags.go:64] FLAG: --logging-format="text" Apr 17 17:23:50.861803 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857276 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 17:23:50.861803 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857279 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 17:23:50.861803 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857282 2575 flags.go:64] FLAG: --manifest-url="" Apr 17 17:23:50.861803 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857285 2575 
flags.go:64] FLAG: --manifest-url-header="" Apr 17 17:23:50.861803 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857289 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 17:23:50.861803 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857292 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 17:23:50.861803 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857296 2575 flags.go:64] FLAG: --max-pods="110" Apr 17 17:23:50.861803 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857299 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 17:23:50.861803 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857302 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 17:23:50.861803 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857305 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 17:23:50.861803 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857308 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 17:23:50.861803 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857310 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 17:23:50.861803 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857313 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 17:23:50.861803 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857316 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 17:23:50.861803 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857324 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 17:23:50.861803 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857327 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 17:23:50.861803 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857330 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 17:23:50.861803 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857333 2575 flags.go:64] FLAG: --pod-cidr="" Apr 17 17:23:50.861803 
ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857336 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 17:23:50.861803 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857342 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 17:23:50.861803 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857346 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 17:23:50.861803 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857350 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 17 17:23:50.862397 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857353 2575 flags.go:64] FLAG: --port="10250" Apr 17 17:23:50.862397 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857356 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 17:23:50.862397 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857359 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-020775477e46844d9" Apr 17 17:23:50.862397 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857363 2575 flags.go:64] FLAG: --qos-reserved="" Apr 17 17:23:50.862397 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857366 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 17 17:23:50.862397 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857369 2575 flags.go:64] FLAG: --register-node="true" Apr 17 17:23:50.862397 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857372 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 17 17:23:50.862397 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857375 2575 flags.go:64] FLAG: --register-with-taints="" Apr 17 17:23:50.862397 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857378 2575 flags.go:64] FLAG: --registry-burst="10" Apr 17 17:23:50.862397 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857381 2575 flags.go:64] FLAG: --registry-qps="5" Apr 17 17:23:50.862397 ip-10-0-140-33 kubenswrapper[2575]: I0417 
17:23:50.857384 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 17 17:23:50.862397 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857387 2575 flags.go:64] FLAG: --reserved-memory="" Apr 17 17:23:50.862397 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857390 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 17:23:50.862397 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857393 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 17:23:50.862397 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857396 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 17:23:50.862397 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857399 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 17:23:50.862397 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857402 2575 flags.go:64] FLAG: --runonce="false" Apr 17 17:23:50.862397 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857405 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 17:23:50.862397 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857408 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 17:23:50.862397 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857410 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 17 17:23:50.862397 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857413 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 17:23:50.862397 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857419 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 17:23:50.862397 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857422 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 17:23:50.862397 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857425 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 17:23:50.862397 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857428 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 
17:23:50.862397 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857430 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 17:23:50.863032 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857433 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 17:23:50.863032 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857436 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 17:23:50.863032 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857439 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 17:23:50.863032 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857442 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 17:23:50.863032 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857445 2575 flags.go:64] FLAG: --system-cgroups="" Apr 17 17:23:50.863032 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857448 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 17:23:50.863032 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857456 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 17:23:50.863032 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857460 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 17 17:23:50.863032 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857462 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 17:23:50.863032 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857467 2575 flags.go:64] FLAG: --tls-min-version="" Apr 17 17:23:50.863032 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857470 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 17:23:50.863032 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857472 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 17:23:50.863032 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857475 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 17:23:50.863032 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857478 2575 flags.go:64] FLAG: 
--topology-manager-scope="container" Apr 17 17:23:50.863032 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857481 2575 flags.go:64] FLAG: --v="2" Apr 17 17:23:50.863032 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857485 2575 flags.go:64] FLAG: --version="false" Apr 17 17:23:50.863032 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857489 2575 flags.go:64] FLAG: --vmodule="" Apr 17 17:23:50.863032 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857493 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 17:23:50.863032 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.857496 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 17:23:50.863032 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857585 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 17:23:50.863032 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857589 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 17:23:50.863032 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857591 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 17:23:50.863032 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857594 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 17:23:50.863032 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857596 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 17:23:50.863609 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857599 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 17:23:50.863609 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857602 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 17:23:50.863609 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857604 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 17:23:50.863609 ip-10-0-140-33 
kubenswrapper[2575]: W0417 17:23:50.857609 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 17:23:50.863609 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857611 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 17:23:50.863609 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857614 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 17:23:50.863609 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857616 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 17:23:50.863609 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857619 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 17:23:50.863609 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857621 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 17:23:50.863609 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857624 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 17:23:50.863609 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857626 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 17:23:50.863609 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857629 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 17:23:50.863609 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857632 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 17:23:50.863609 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857634 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 17:23:50.863609 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857637 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 17:23:50.863609 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857640 2575 feature_gate.go:328] unrecognized feature gate: 
IrreconcilableMachineConfig Apr 17 17:23:50.863609 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857642 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 17:23:50.863609 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857645 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 17:23:50.863609 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857647 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 17:23:50.863609 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857650 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 17:23:50.864331 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857653 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 17:23:50.864331 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857655 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 17:23:50.864331 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857658 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 17:23:50.864331 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857661 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 17:23:50.864331 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857663 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 17:23:50.864331 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857666 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 17:23:50.864331 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857668 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 17:23:50.864331 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857671 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 17:23:50.864331 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857674 2575 
feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 17:23:50.864331 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857676 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 17:23:50.864331 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857678 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 17:23:50.864331 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857681 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 17:23:50.864331 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857683 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 17:23:50.864331 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857686 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 17:23:50.864331 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857688 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 17:23:50.864331 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857693 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 17:23:50.864331 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857696 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 17:23:50.864331 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857698 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 17:23:50.864331 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857701 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 17:23:50.865048 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857703 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 17:23:50.865048 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857706 2575 feature_gate.go:328] unrecognized feature gate: 
NewOLMCatalogdAPIV1Metas Apr 17 17:23:50.865048 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857708 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 17:23:50.865048 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857712 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 17:23:50.865048 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857716 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 17:23:50.865048 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857720 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 17:23:50.865048 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857723 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 17:23:50.865048 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857725 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 17:23:50.865048 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857728 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 17:23:50.865048 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857731 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 17:23:50.865048 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857733 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 17:23:50.865048 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857736 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 17:23:50.865048 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857740 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 17:23:50.865048 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857743 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 17:23:50.865048 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857746 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 17:23:50.865048 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857749 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 17:23:50.865048 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857752 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 17:23:50.865048 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857754 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 17:23:50.865048 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857757 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 17:23:50.865529 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857759 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 17:23:50.865529 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857762 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 17:23:50.865529 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857764 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 17:23:50.865529 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857768 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 17:23:50.865529 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857770 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 17:23:50.865529 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857773 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 17:23:50.865529 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857775 2575 feature_gate.go:328] unrecognized feature gate: 
AdminNetworkPolicy Apr 17 17:23:50.865529 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857778 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 17:23:50.865529 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857780 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 17:23:50.865529 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857794 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 17:23:50.865529 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857798 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 17:23:50.865529 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857801 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 17:23:50.865529 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857804 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 17:23:50.865529 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857806 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 17:23:50.865529 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857809 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 17:23:50.865529 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857812 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 17:23:50.865529 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857814 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 17:23:50.865529 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857817 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 17:23:50.865529 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857831 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 17:23:50.865529 ip-10-0-140-33 kubenswrapper[2575]: W0417 
17:23:50.857834 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 17:23:50.866043 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857837 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 17:23:50.866043 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857839 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 17 17:23:50.866043 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.857842 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 17:23:50.866043 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.858614 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 17:23:50.866481 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.866462 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 17:23:50.866511 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.866482 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 17:23:50.866537 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866531 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 17:23:50.866537 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866537 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 17:23:50.866592 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866540 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 17:23:50.866592 ip-10-0-140-33 kubenswrapper[2575]: W0417 
17:23:50.866543 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:23:50.866592 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866546 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:23:50.866592 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866549 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:23:50.866592 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866552 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:23:50.866592 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866555 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:23:50.866592 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866558 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:23:50.866592 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866561 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:23:50.866592 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866564 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:23:50.866592 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866566 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:23:50.866592 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866569 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:23:50.866592 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866572 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:23:50.866592 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866575 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:23:50.866592 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866577 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:23:50.866592 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866580 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:23:50.866592 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866583 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:23:50.866592 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866586 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:23:50.866592 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866588 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:23:50.866592 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866591 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:23:50.866592 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866593 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:23:50.867099 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866597 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:23:50.867099 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866600 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:23:50.867099 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866607 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:23:50.867099 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866611 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:23:50.867099 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866613 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:23:50.867099 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866616 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:23:50.867099 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866618 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:23:50.867099 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866620 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:23:50.867099 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866623 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:23:50.867099 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866626 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:23:50.867099 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866628 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:23:50.867099 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866631 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:23:50.867099 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866633 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:23:50.867099 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866637 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:23:50.867099 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866641 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:23:50.867099 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866644 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:23:50.867099 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866647 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:23:50.867099 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866649 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:23:50.867099 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866652 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:23:50.867099 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866654 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:23:50.867588 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866657 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:23:50.867588 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866660 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:23:50.867588 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866662 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:23:50.867588 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866665 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:23:50.867588 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866667 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:23:50.867588 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866670 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:23:50.867588 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866673 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:23:50.867588 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866676 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:23:50.867588 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866678 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:23:50.867588 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866680 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:23:50.867588 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866683 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:23:50.867588 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866686 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:23:50.867588 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866688 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:23:50.867588 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866691 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:23:50.867588 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866694 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:23:50.867588 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866696 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:23:50.867588 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866699 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:23:50.867588 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866702 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:23:50.867588 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866704 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:23:50.868064 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866707 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:23:50.868064 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866709 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:23:50.868064 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866712 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:23:50.868064 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866714 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:23:50.868064 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866717 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:23:50.868064 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866719 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:23:50.868064 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866722 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:23:50.868064 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866724 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:23:50.868064 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866727 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:23:50.868064 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866729 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:23:50.868064 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866732 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:23:50.868064 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866734 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:23:50.868064 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866736 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:23:50.868064 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866739 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:23:50.868064 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866742 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:23:50.868064 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866745 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:23:50.868064 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866747 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:23:50.868064 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866750 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:23:50.868064 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866752 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:23:50.868064 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866755 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:23:50.868541 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866759 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:23:50.868541 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866763 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:23:50.868541 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866766 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:23:50.868541 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866769 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:23:50.868541 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866771 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:23:50.868541 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.866777 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:23:50.868541 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866890 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:23:50.868541 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866896 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:23:50.868541 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866900 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:23:50.868541 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866904 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:23:50.868541 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866908 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:23:50.868541 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866911 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:23:50.868541 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866915 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:23:50.868541 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866918 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:23:50.868541 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866921 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:23:50.868959 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866924 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:23:50.868959 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866926 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:23:50.868959 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866929 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:23:50.868959 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866931 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:23:50.868959 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866934 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:23:50.868959 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866937 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:23:50.868959 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866939 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:23:50.868959 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866942 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:23:50.868959 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866944 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:23:50.868959 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866947 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:23:50.868959 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866949 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:23:50.868959 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866952 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:23:50.868959 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866954 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:23:50.868959 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866958 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:23:50.868959 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866961 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:23:50.868959 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866963 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:23:50.868959 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866966 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:23:50.868959 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866968 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:23:50.868959 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866971 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:23:50.868959 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866973 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:23:50.869442 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866976 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:23:50.869442 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866978 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:23:50.869442 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866981 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:23:50.869442 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866984 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:23:50.869442 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866987 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:23:50.869442 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866989 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:23:50.869442 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866992 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:23:50.869442 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866994 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:23:50.869442 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.866997 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:23:50.869442 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867000 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:23:50.869442 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867002 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:23:50.869442 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867005 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:23:50.869442 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867008 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:23:50.869442 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867010 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:23:50.869442 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867013 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:23:50.869442 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867015 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:23:50.869442 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867018 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:23:50.869442 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867020 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:23:50.869442 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867023 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:23:50.869442 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867025 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:23:50.869951 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867029 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:23:50.869951 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867031 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:23:50.869951 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867034 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:23:50.869951 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867036 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:23:50.869951 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867038 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:23:50.869951 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867041 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:23:50.869951 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867044 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:23:50.869951 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867046 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:23:50.869951 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867049 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:23:50.869951 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867053 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:23:50.869951 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867056 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:23:50.869951 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867059 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:23:50.869951 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867061 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:23:50.869951 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867064 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:23:50.869951 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867066 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:23:50.869951 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867069 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:23:50.869951 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867073 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:23:50.869951 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867075 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:23:50.869951 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867078 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:23:50.870405 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867080 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:23:50.870405 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867083 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:23:50.870405 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867086 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:23:50.870405 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867088 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:23:50.870405 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867091 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:23:50.870405 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867094 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:23:50.870405 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867096 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:23:50.870405 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867099 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:23:50.870405 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867101 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:23:50.870405 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867104 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:23:50.870405 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867106 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:23:50.870405 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867109 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:23:50.870405 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867112 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:23:50.870405 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867114 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:23:50.870405 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867117 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:23:50.870405 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867119 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:23:50.870405 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867122 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:23:50.870405 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:50.867124 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:23:50.870861 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.867129 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:23:50.870861 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.868080 2575 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 17:23:50.870924 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.870912 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 17:23:50.871952 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.871940 2575 server.go:1019] "Starting client certificate rotation"
Apr 17 17:23:50.872053 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.872036 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 17:23:50.872086 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.872078 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 17:23:50.900674 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.900655 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 17:23:50.903202 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.903184 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 17:23:50.920594 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.920572 2575 log.go:25] "Validated CRI v1 runtime API"
Apr 17 17:23:50.926263 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.926246 2575 log.go:25] "Validated CRI v1 image API"
Apr 17 17:23:50.930245 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.930228 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 17:23:50.931908 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.931893 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 17:23:50.934929 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.934902 2575 fs.go:135] Filesystem UUIDs: map[47784a92-0457-4a72-be57-9de2f0401548:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 df61af16-18da-4eff-9351-b3fe27c7330a:/dev/nvme0n1p4]
Apr 17 17:23:50.934986 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.934929 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 17:23:50.941359 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.941259 2575 manager.go:217] Machine: {Timestamp:2026-04-17 17:23:50.939099177 +0000 UTC m=+0.464779789 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098916 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec24fe6da3eb485b3a2fa109e575940a SystemUUID:ec24fe6d-a3eb-485b-3a2f-a109e575940a BootID:2821e35f-a228-4265-a56a-1e55a7e83374 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:9d:ef:cc:f9:e3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:9d:ef:cc:f9:e3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:e6:3b:b4:f7:73:cf Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 17:23:50.941359 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.941355 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 17:23:50.941552 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.941433 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 17:23:50.942592 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.942569 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 17:23:50.942731 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.942595 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-140-33.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 17:23:50.942775 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.942740 2575 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 17:23:50.942775 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.942748 2575 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 17:23:50.942775 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.942761 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 17:23:50.943590 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.943580 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 17:23:50.944953 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.944943 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 17:23:50.945070 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.945061 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 17:23:50.947727 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.947718 2575 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 17:23:50.947759 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.947734 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 17:23:50.947759 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.947746 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 17:23:50.947759 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.947755 2575 kubelet.go:397] "Adding apiserver pod source"
Apr 17 17:23:50.947873 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.947763 2575 apiserver.go:42] "Waiting for node sync before watching
apiserver pods" Apr 17 17:23:50.948862 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.948851 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 17:23:50.948908 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.948868 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 17:23:50.952671 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.952655 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 17:23:50.954662 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.954648 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 17:23:50.956069 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.956057 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 17:23:50.956123 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.956074 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 17:23:50.956123 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.956081 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 17:23:50.956123 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.956118 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 17:23:50.956123 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.956125 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 17:23:50.956305 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.956131 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 17:23:50.956305 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.956137 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 
17:23:50.956305 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.956142 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 17:23:50.956305 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.956148 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 17:23:50.956305 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.956155 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 17:23:50.956305 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.956172 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 17:23:50.956305 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.956181 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 17:23:50.958075 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.958064 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 17:23:50.958075 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.958076 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 17:23:50.961669 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:50.961647 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-140-33.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 17:23:50.961775 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.961763 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-33.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 17:23:50.961883 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:50.961811 2575 reflector.go:200] "Failed to watch" err="failed to list 
*v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 17:23:50.961919 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.961888 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 17:23:50.961952 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.961920 2575 server.go:1295] "Started kubelet" Apr 17 17:23:50.962400 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.962366 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 17:23:50.962729 ip-10-0-140-33 systemd[1]: Started Kubernetes Kubelet. Apr 17 17:23:50.962879 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.962709 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 17:23:50.962957 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.962890 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 17:23:50.964270 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.964257 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 17 17:23:50.964887 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.964867 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 17:23:50.969981 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.969964 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 17:23:50.970067 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.969975 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 17:23:50.970757 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.970708 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 17:23:50.970845 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.970762 2575 
volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 17:23:50.970909 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.970893 2575 factory.go:55] Registering systemd factory Apr 17 17:23:50.970955 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.970926 2575 factory.go:223] Registration of the systemd container factory successfully Apr 17 17:23:50.971017 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.971002 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 17 17:23:50.971065 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.971019 2575 reconciler.go:26] "Reconciler: start to sync state" Apr 17 17:23:50.971157 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.970659 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 17:23:50.971212 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.971169 2575 factory.go:153] Registering CRI-O factory Apr 17 17:23:50.971212 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.971182 2575 factory.go:223] Registration of the crio container factory successfully Apr 17 17:23:50.971287 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.971228 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 17:23:50.971287 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.971252 2575 factory.go:103] Registering Raw factory Apr 17 17:23:50.971287 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.971274 2575 manager.go:1196] Started watching for new ooms in manager Apr 17 17:23:50.971703 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.971691 2575 manager.go:319] Starting recovery of all containers Apr 17 17:23:50.971964 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:50.968520 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-33.ec2.internal.18a734cc80ec5fa7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-33.ec2.internal,UID:ip-10-0-140-33.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-140-33.ec2.internal,},FirstTimestamp:2026-04-17 17:23:50.961897383 +0000 UTC m=+0.487577993,LastTimestamp:2026-04-17 17:23:50.961897383 +0000 UTC m=+0.487577993,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-33.ec2.internal,}" Apr 17 17:23:50.973143 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:50.973124 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-33.ec2.internal\" not found" Apr 17 17:23:50.973465 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:50.973439 2575 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 17:23:50.976429 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:50.976399 2575 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-140-33.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 17 17:23:50.976547 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:50.976521 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 17:23:50.978693 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.978662 2575 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 17 17:23:50.980080 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.980052 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-sqc2x" Apr 17 17:23:50.982775 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.982758 2575 manager.go:324] Recovery completed Apr 17 17:23:50.986549 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.986476 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-sqc2x" Apr 17 17:23:50.988161 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.988149 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:23:50.991225 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.991210 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-33.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:23:50.991292 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.991237 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-33.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:23:50.991292 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.991247 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-33.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:23:50.991708 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.991690 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 17:23:50.991708 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.991707 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 17:23:50.991838 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.991724 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 17 17:23:50.993109 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:50.993049 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-33.ec2.internal.18a734cc82abdfe4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-33.ec2.internal,UID:ip-10-0-140-33.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-140-33.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-140-33.ec2.internal,},FirstTimestamp:2026-04-17 17:23:50.991224804 +0000 UTC m=+0.516905414,LastTimestamp:2026-04-17 17:23:50.991224804 +0000 UTC m=+0.516905414,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-33.ec2.internal,}" Apr 17 17:23:50.994340 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.994327 2575 policy_none.go:49] "None policy: Start" Apr 17 17:23:50.994397 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.994344 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 17:23:50.994397 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:50.994353 2575 state_mem.go:35] "Initializing new in-memory state store" Apr 17 17:23:51.035517 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.035495 2575 manager.go:341] "Starting Device Plugin manager" Apr 17 17:23:51.037712 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:51.035574 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 17:23:51.037712 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.035588 2575 server.go:85] "Starting device plugin registration server" Apr 17 17:23:51.037712 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.035817 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 17:23:51.037712 ip-10-0-140-33 
kubenswrapper[2575]: I0417 17:23:51.035845 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 17:23:51.037712 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.035946 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 17:23:51.037712 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.036018 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 17:23:51.037712 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.036027 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 17:23:51.037712 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:51.036543 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 17:23:51.037712 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:51.036599 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-33.ec2.internal\" not found" Apr 17 17:23:51.089115 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.089089 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 17:23:51.089115 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.089119 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 17:23:51.089233 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.089135 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 17:23:51.089233 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.089142 2575 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 17 17:23:51.089233 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:51.089176 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 17 17:23:51.092046 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.092025 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 17:23:51.136279 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.136237 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 17:23:51.137325 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.137310 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-33.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 17:23:51.137394 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.137336 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-33.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 17:23:51.137394 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.137346 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-33.ec2.internal" event="NodeHasSufficientPID"
Apr 17 17:23:51.137394 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.137368 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-33.ec2.internal"
Apr 17 17:23:51.146233 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.146217 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-140-33.ec2.internal"
Apr 17 17:23:51.146321 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:51.146241 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-140-33.ec2.internal\": node \"ip-10-0-140-33.ec2.internal\" not found"
Apr 17 17:23:51.160589 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:51.160567 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-33.ec2.internal\" not found"
Apr 17 17:23:51.190227 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.190193 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-33.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-140-33.ec2.internal"]
Apr 17 17:23:51.190294 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.190261 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 17:23:51.192033 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.192019 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-33.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 17:23:51.192099 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.192044 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-33.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 17:23:51.192099 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.192058 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-33.ec2.internal" event="NodeHasSufficientPID"
Apr 17 17:23:51.193347 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.193336 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 17:23:51.193507 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.193493 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-33.ec2.internal"
Apr 17 17:23:51.193544 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.193520 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 17:23:51.199019 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.198998 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-33.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 17:23:51.199103 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.199023 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-33.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 17:23:51.199103 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.199034 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-33.ec2.internal" event="NodeHasSufficientPID"
Apr 17 17:23:51.199383 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.199368 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-33.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 17:23:51.199437 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.199391 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-33.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 17:23:51.199437 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.199400 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-33.ec2.internal" event="NodeHasSufficientPID"
Apr 17 17:23:51.200500 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.200486 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-33.ec2.internal"
Apr 17 17:23:51.200571 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.200516 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 17:23:51.201691 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.201677 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-33.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 17:23:51.201751 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.201706 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-33.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 17:23:51.201751 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.201716 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-33.ec2.internal" event="NodeHasSufficientPID"
Apr 17 17:23:51.229533 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:51.229515 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-33.ec2.internal\" not found" node="ip-10-0-140-33.ec2.internal"
Apr 17 17:23:51.232788 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:51.232771 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-33.ec2.internal\" not found" node="ip-10-0-140-33.ec2.internal"
Apr 17 17:23:51.260629 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:51.260611 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-33.ec2.internal\" not found"
Apr 17 17:23:51.272930 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.272914 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c1cd4a3b95297fd9fd160804093bee86-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-33.ec2.internal\" (UID: \"c1cd4a3b95297fd9fd160804093bee86\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-33.ec2.internal"
Apr 17 17:23:51.272993 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.272940 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c1cd4a3b95297fd9fd160804093bee86-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-33.ec2.internal\" (UID: \"c1cd4a3b95297fd9fd160804093bee86\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-33.ec2.internal"
Apr 17 17:23:51.272993 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.272971 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f04a4241565e19b6ce163e5a36620c34-config\") pod \"kube-apiserver-proxy-ip-10-0-140-33.ec2.internal\" (UID: \"f04a4241565e19b6ce163e5a36620c34\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-33.ec2.internal"
Apr 17 17:23:51.361305 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:51.361270 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-33.ec2.internal\" not found"
Apr 17 17:23:51.373652 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.373633 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c1cd4a3b95297fd9fd160804093bee86-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-33.ec2.internal\" (UID: \"c1cd4a3b95297fd9fd160804093bee86\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-33.ec2.internal"
Apr 17 17:23:51.373699 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.373661 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c1cd4a3b95297fd9fd160804093bee86-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-33.ec2.internal\" (UID: \"c1cd4a3b95297fd9fd160804093bee86\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-33.ec2.internal"
Apr 17 17:23:51.373699 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.373685 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f04a4241565e19b6ce163e5a36620c34-config\") pod \"kube-apiserver-proxy-ip-10-0-140-33.ec2.internal\" (UID: \"f04a4241565e19b6ce163e5a36620c34\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-33.ec2.internal"
Apr 17 17:23:51.373763 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.373728 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c1cd4a3b95297fd9fd160804093bee86-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-33.ec2.internal\" (UID: \"c1cd4a3b95297fd9fd160804093bee86\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-33.ec2.internal"
Apr 17 17:23:51.373763 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.373733 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c1cd4a3b95297fd9fd160804093bee86-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-33.ec2.internal\" (UID: \"c1cd4a3b95297fd9fd160804093bee86\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-33.ec2.internal"
Apr 17 17:23:51.373845 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.373763 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f04a4241565e19b6ce163e5a36620c34-config\") pod \"kube-apiserver-proxy-ip-10-0-140-33.ec2.internal\" (UID: \"f04a4241565e19b6ce163e5a36620c34\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-33.ec2.internal"
Apr 17 17:23:51.462061 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:51.461997 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-33.ec2.internal\" not found"
Apr 17 17:23:51.531538 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.531519 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-33.ec2.internal"
Apr 17 17:23:51.534636 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.534585 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-33.ec2.internal"
Apr 17 17:23:51.562721 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:51.562699 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-33.ec2.internal\" not found"
Apr 17 17:23:51.663296 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:51.663257 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-33.ec2.internal\" not found"
Apr 17 17:23:51.763742 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:51.763711 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-33.ec2.internal\" not found"
Apr 17 17:23:51.864412 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:51.864386 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-33.ec2.internal\" not found"
Apr 17 17:23:51.871565 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.871552 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 17:23:51.871690 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.871676 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 17:23:51.903536 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.903514 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 17:23:51.905284 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.905269 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 17:23:51.948879 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.948860 2575 apiserver.go:52] "Watching apiserver"
Apr 17 17:23:51.956810 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.956793 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 17:23:51.957146 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.957128 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-nhs5m","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbqb6","openshift-dns/node-resolver-qd2qr","openshift-image-registry/node-ca-fxfbb","openshift-multus/network-metrics-daemon-mqlsd","openshift-network-diagnostics/network-check-target-tlwhr","openshift-network-operator/iptables-alerter-rj22k","openshift-ovn-kubernetes/ovnkube-node-p2pbl","openshift-cluster-node-tuning-operator/tuned-cx8bv","openshift-multus/multus-additional-cni-plugins-hprdr","openshift-multus/multus-wgh2j"]
Apr 17 17:23:51.959863 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.959849 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-nhs5m"
Apr 17 17:23:51.961190 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.961174 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbqb6"
Apr 17 17:23:51.962317 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.962290 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-jvc75\""
Apr 17 17:23:51.962406 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.962356 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 17:23:51.962406 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.962386 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qd2qr"
Apr 17 17:23:51.962520 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.962452 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 17:23:51.963066 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.963051 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 17:23:51.963139 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.963087 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 17:23:51.963192 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.963142 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 17:23:51.963390 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.963360 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-q47cq\""
Apr 17 17:23:51.963541 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.963528 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fxfbb"
Apr 17 17:23:51.964073 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.964059 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 17:23:51.964266 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.964251 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-7kqdp\""
Apr 17 17:23:51.964337 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.964257 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 17:23:51.965197 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.965140 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqlsd"
Apr 17 17:23:51.965289 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.965213 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tlwhr"
Apr 17 17:23:51.966313 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:51.965506 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-tlwhr" podUID="e4fd5db6-98ca-462f-b950-10c0fe775718" Apr 17 17:23:51.966313 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.966076 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 17:23:51.966313 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.966117 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 17:23:51.966313 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:51.966040 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqlsd" podUID="41eeea20-b1c0-4cb6-8da8-a4a26a60423d" Apr 17 17:23:51.966540 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.966394 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-pgjpq\"" Apr 17 17:23:51.966591 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.966572 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 17:23:51.967563 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.967545 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rj22k" Apr 17 17:23:51.969555 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.969539 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:51.969702 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.969682 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:23:51.969893 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.969863 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-cwrgc\"" Apr 17 17:23:51.970015 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.970001 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 17:23:51.970077 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.970001 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 17:23:51.970077 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.970051 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 17:23:51.970776 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.970760 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:51.971084 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.971071 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-33.ec2.internal" Apr 17 17:23:51.971267 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.971248 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 17:23:51.971506 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.971487 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 17:23:51.971807 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.971791 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 17:23:51.971807 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.971800 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-pkz6h\"" Apr 17 17:23:51.971974 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.971918 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 17:23:51.972110 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.972090 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 17:23:51.972165 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.972151 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 17:23:51.972453 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.972437 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hprdr" Apr 17 17:23:51.972780 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.972632 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:23:51.972780 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.972674 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 17:23:51.972780 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.972756 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-9zw8g\"" Apr 17 17:23:51.973837 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.973808 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wgh2j" Apr 17 17:23:51.974043 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.974027 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 17:23:51.974230 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.974218 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 17:23:51.974518 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.974501 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vktcn\"" Apr 17 17:23:51.974518 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.974503 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 17:23:51.974518 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.974521 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 17:23:51.974679 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.974672 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 17:23:51.975759 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.975716 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 17:23:51.975862 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.975811 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-45gx9\"" Apr 17 17:23:51.977014 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.976999 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/06cbfbab-93e8-4744-be0c-f8d8adfb094d-iptables-alerter-script\") pod \"iptables-alerter-rj22k\" (UID: \"06cbfbab-93e8-4744-be0c-f8d8adfb094d\") " pod="openshift-network-operator/iptables-alerter-rj22k" Apr 17 17:23:51.977094 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977029 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf2lq\" (UniqueName: \"kubernetes.io/projected/adfae8ba-8d02-4f3c-85a7-b2ae828b0579-kube-api-access-cf2lq\") pod \"node-resolver-qd2qr\" (UID: \"adfae8ba-8d02-4f3c-85a7-b2ae828b0579\") " pod="openshift-dns/node-resolver-qd2qr" Apr 17 17:23:51.977094 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977052 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ae488321-502d-450d-a483-234f0aff8bb3-tmp\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:51.977174 
ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977106 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6c57d472-0514-46ce-bf7a-f6af067b3f5d-etc-selinux\") pod \"aws-ebs-csi-driver-node-hbqb6\" (UID: \"6c57d472-0514-46ce-bf7a-f6af067b3f5d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbqb6" Apr 17 17:23:51.977174 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977141 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/70cf02df-8c16-47aa-8b0e-1b1ee895fe07-serviceca\") pod \"node-ca-fxfbb\" (UID: \"70cf02df-8c16-47aa-8b0e-1b1ee895fe07\") " pod="openshift-image-registry/node-ca-fxfbb" Apr 17 17:23:51.977174 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977167 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/adeb035c-390a-4439-9413-491fc20cac69-cnibin\") pod \"multus-additional-cni-plugins-hprdr\" (UID: \"adeb035c-390a-4439-9413-491fc20cac69\") " pod="openshift-multus/multus-additional-cni-plugins-hprdr" Apr 17 17:23:51.977268 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977187 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6tr4\" (UniqueName: \"kubernetes.io/projected/adeb035c-390a-4439-9413-491fc20cac69-kube-api-access-h6tr4\") pod \"multus-additional-cni-plugins-hprdr\" (UID: \"adeb035c-390a-4439-9413-491fc20cac69\") " pod="openshift-multus/multus-additional-cni-plugins-hprdr" Apr 17 17:23:51.977268 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977203 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ccb0409d-b96e-4a69-8ff3-e260b869ecdf-agent-certs\") 
pod \"konnectivity-agent-nhs5m\" (UID: \"ccb0409d-b96e-4a69-8ff3-e260b869ecdf\") " pod="kube-system/konnectivity-agent-nhs5m" Apr 17 17:23:51.977268 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977217 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-etc-openvswitch\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:51.977268 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977253 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-node-log\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:51.977268 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977267 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3ff73a5e-853e-4f01-b8d7-e995977da39f-ovnkube-config\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:51.977469 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977280 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ae488321-502d-450d-a483-234f0aff8bb3-lib-modules\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:51.977469 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977313 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" 
(UniqueName: \"kubernetes.io/configmap/ccb0409d-b96e-4a69-8ff3-e260b869ecdf-konnectivity-ca\") pod \"konnectivity-agent-nhs5m\" (UID: \"ccb0409d-b96e-4a69-8ff3-e260b869ecdf\") " pod="kube-system/konnectivity-agent-nhs5m" Apr 17 17:23:51.977469 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977340 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-run-systemd\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:51.977469 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977355 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-var-lib-openvswitch\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:51.977469 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977395 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-run-ovn\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:51.977469 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977453 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-host-run-ovn-kubernetes\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:51.977681 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977487 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae488321-502d-450d-a483-234f0aff8bb3-host\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:51.977681 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977512 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/06cbfbab-93e8-4744-be0c-f8d8adfb094d-host-slash\") pod \"iptables-alerter-rj22k\" (UID: \"06cbfbab-93e8-4744-be0c-f8d8adfb094d\") " pod="openshift-network-operator/iptables-alerter-rj22k" Apr 17 17:23:51.977681 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977533 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-host-kubelet\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:51.977681 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977547 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6c57d472-0514-46ce-bf7a-f6af067b3f5d-socket-dir\") pod \"aws-ebs-csi-driver-node-hbqb6\" (UID: \"6c57d472-0514-46ce-bf7a-f6af067b3f5d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbqb6" Apr 17 17:23:51.977681 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977561 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-host-run-netns\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:51.977681 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977576 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:51.977681 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977598 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ae488321-502d-450d-a483-234f0aff8bb3-etc-sysctl-d\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:51.977681 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977612 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/adeb035c-390a-4439-9413-491fc20cac69-cni-binary-copy\") pod \"multus-additional-cni-plugins-hprdr\" (UID: \"adeb035c-390a-4439-9413-491fc20cac69\") " pod="openshift-multus/multus-additional-cni-plugins-hprdr" Apr 17 17:23:51.977681 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977625 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70cf02df-8c16-47aa-8b0e-1b1ee895fe07-host\") pod \"node-ca-fxfbb\" (UID: \"70cf02df-8c16-47aa-8b0e-1b1ee895fe07\") " pod="openshift-image-registry/node-ca-fxfbb" Apr 17 17:23:51.977681 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977639 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bspw2\" (UniqueName: 
\"kubernetes.io/projected/06cbfbab-93e8-4744-be0c-f8d8adfb094d-kube-api-access-bspw2\") pod \"iptables-alerter-rj22k\" (UID: \"06cbfbab-93e8-4744-be0c-f8d8adfb094d\") " pod="openshift-network-operator/iptables-alerter-rj22k" Apr 17 17:23:51.977681 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977666 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-log-socket\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:51.977681 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977681 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-host-cni-netd\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:51.978071 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977694 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ae488321-502d-450d-a483-234f0aff8bb3-etc-modprobe-d\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:51.978071 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977710 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ae488321-502d-450d-a483-234f0aff8bb3-etc-sysconfig\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:51.978071 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977738 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3ff73a5e-853e-4f01-b8d7-e995977da39f-env-overrides\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:51.978071 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977751 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3ff73a5e-853e-4f01-b8d7-e995977da39f-ovnkube-script-lib\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:51.978071 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977770 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ae488321-502d-450d-a483-234f0aff8bb3-etc-kubernetes\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:51.978071 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977783 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ae488321-502d-450d-a483-234f0aff8bb3-sys\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:51.978071 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977815 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kmg8\" (UniqueName: \"kubernetes.io/projected/ae488321-502d-450d-a483-234f0aff8bb3-kube-api-access-6kmg8\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " 
pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:51.978071 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977870 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c6ch\" (UniqueName: \"kubernetes.io/projected/e4fd5db6-98ca-462f-b950-10c0fe775718-kube-api-access-2c6ch\") pod \"network-check-target-tlwhr\" (UID: \"e4fd5db6-98ca-462f-b950-10c0fe775718\") " pod="openshift-network-diagnostics/network-check-target-tlwhr" Apr 17 17:23:51.978071 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977887 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ae488321-502d-450d-a483-234f0aff8bb3-var-lib-kubelet\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:51.978071 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977908 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c6gh\" (UniqueName: \"kubernetes.io/projected/6c57d472-0514-46ce-bf7a-f6af067b3f5d-kube-api-access-7c6gh\") pod \"aws-ebs-csi-driver-node-hbqb6\" (UID: \"6c57d472-0514-46ce-bf7a-f6af067b3f5d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbqb6" Apr 17 17:23:51.978071 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977923 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ae488321-502d-450d-a483-234f0aff8bb3-etc-sysctl-conf\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:51.978071 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977937 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"run\" (UniqueName: \"kubernetes.io/host-path/ae488321-502d-450d-a483-234f0aff8bb3-run\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:51.978071 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977952 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6c57d472-0514-46ce-bf7a-f6af067b3f5d-sys-fs\") pod \"aws-ebs-csi-driver-node-hbqb6\" (UID: \"6c57d472-0514-46ce-bf7a-f6af067b3f5d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbqb6" Apr 17 17:23:51.978071 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.977976 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kz4n\" (UniqueName: \"kubernetes.io/projected/41eeea20-b1c0-4cb6-8da8-a4a26a60423d-kube-api-access-2kz4n\") pod \"network-metrics-daemon-mqlsd\" (UID: \"41eeea20-b1c0-4cb6-8da8-a4a26a60423d\") " pod="openshift-multus/network-metrics-daemon-mqlsd" Apr 17 17:23:51.978071 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.978001 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/adeb035c-390a-4439-9413-491fc20cac69-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hprdr\" (UID: \"adeb035c-390a-4439-9413-491fc20cac69\") " pod="openshift-multus/multus-additional-cni-plugins-hprdr" Apr 17 17:23:51.978071 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.978016 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/adfae8ba-8d02-4f3c-85a7-b2ae828b0579-hosts-file\") pod \"node-resolver-qd2qr\" (UID: \"adfae8ba-8d02-4f3c-85a7-b2ae828b0579\") " pod="openshift-dns/node-resolver-qd2qr" Apr 17 
17:23:51.978524 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.978031 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/adeb035c-390a-4439-9413-491fc20cac69-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hprdr\" (UID: \"adeb035c-390a-4439-9413-491fc20cac69\") " pod="openshift-multus/multus-additional-cni-plugins-hprdr" Apr 17 17:23:51.978524 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.978054 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-host-slash\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:51.978524 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.978080 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3ff73a5e-853e-4f01-b8d7-e995977da39f-ovn-node-metrics-cert\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:51.978524 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.978098 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ae488321-502d-450d-a483-234f0aff8bb3-etc-systemd\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:51.978524 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.978123 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/adeb035c-390a-4439-9413-491fc20cac69-system-cni-dir\") pod 
\"multus-additional-cni-plugins-hprdr\" (UID: \"adeb035c-390a-4439-9413-491fc20cac69\") " pod="openshift-multus/multus-additional-cni-plugins-hprdr" Apr 17 17:23:51.978524 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.978139 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/adeb035c-390a-4439-9413-491fc20cac69-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hprdr\" (UID: \"adeb035c-390a-4439-9413-491fc20cac69\") " pod="openshift-multus/multus-additional-cni-plugins-hprdr" Apr 17 17:23:51.978524 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.978156 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/adfae8ba-8d02-4f3c-85a7-b2ae828b0579-tmp-dir\") pod \"node-resolver-qd2qr\" (UID: \"adfae8ba-8d02-4f3c-85a7-b2ae828b0579\") " pod="openshift-dns/node-resolver-qd2qr" Apr 17 17:23:51.978524 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.978181 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h725\" (UniqueName: \"kubernetes.io/projected/70cf02df-8c16-47aa-8b0e-1b1ee895fe07-kube-api-access-8h725\") pod \"node-ca-fxfbb\" (UID: \"70cf02df-8c16-47aa-8b0e-1b1ee895fe07\") " pod="openshift-image-registry/node-ca-fxfbb" Apr 17 17:23:51.978524 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.978198 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41eeea20-b1c0-4cb6-8da8-a4a26a60423d-metrics-certs\") pod \"network-metrics-daemon-mqlsd\" (UID: \"41eeea20-b1c0-4cb6-8da8-a4a26a60423d\") " pod="openshift-multus/network-metrics-daemon-mqlsd" Apr 17 17:23:51.978524 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.978212 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-run-openvswitch\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:51.978524 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.978227 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/adeb035c-390a-4439-9413-491fc20cac69-os-release\") pod \"multus-additional-cni-plugins-hprdr\" (UID: \"adeb035c-390a-4439-9413-491fc20cac69\") " pod="openshift-multus/multus-additional-cni-plugins-hprdr" Apr 17 17:23:51.978524 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.978240 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-systemd-units\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:51.978524 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.978252 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-host-cni-bin\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:51.978524 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.978276 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4qlq\" (UniqueName: \"kubernetes.io/projected/3ff73a5e-853e-4f01-b8d7-e995977da39f-kube-api-access-j4qlq\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:51.978524 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.978291 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ae488321-502d-450d-a483-234f0aff8bb3-etc-tuned\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:51.978524 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.978303 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6c57d472-0514-46ce-bf7a-f6af067b3f5d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hbqb6\" (UID: \"6c57d472-0514-46ce-bf7a-f6af067b3f5d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbqb6" Apr 17 17:23:51.978969 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.978316 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6c57d472-0514-46ce-bf7a-f6af067b3f5d-registration-dir\") pod \"aws-ebs-csi-driver-node-hbqb6\" (UID: \"6c57d472-0514-46ce-bf7a-f6af067b3f5d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbqb6" Apr 17 17:23:51.978969 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.978329 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6c57d472-0514-46ce-bf7a-f6af067b3f5d-device-dir\") pod \"aws-ebs-csi-driver-node-hbqb6\" (UID: \"6c57d472-0514-46ce-bf7a-f6af067b3f5d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbqb6" Apr 17 17:23:51.980946 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.980927 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-140-33.ec2.internal"] Apr 
17 17:23:51.981808 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.981793 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 17:23:51.981894 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.981866 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-33.ec2.internal" Apr 17 17:23:51.982555 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.982542 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 17:23:51.989022 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.989000 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 17:18:50 +0000 UTC" deadline="2027-10-14 12:22:44.369462415 +0000 UTC" Apr 17 17:23:51.989071 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.989022 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13074h58m52.380442647s" Apr 17 17:23:51.994519 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.994502 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-33.ec2.internal"] Apr 17 17:23:51.994601 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:51.994589 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 17:23:52.002584 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.002567 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-g2fkq" Apr 17 
17:23:52.011793 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.011778 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-g2fkq" Apr 17 17:23:52.071610 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.071592 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 17:23:52.074151 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.074135 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:23:52.078578 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.078560 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8h725\" (UniqueName: \"kubernetes.io/projected/70cf02df-8c16-47aa-8b0e-1b1ee895fe07-kube-api-access-8h725\") pod \"node-ca-fxfbb\" (UID: \"70cf02df-8c16-47aa-8b0e-1b1ee895fe07\") " pod="openshift-image-registry/node-ca-fxfbb" Apr 17 17:23:52.078660 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.078599 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41eeea20-b1c0-4cb6-8da8-a4a26a60423d-metrics-certs\") pod \"network-metrics-daemon-mqlsd\" (UID: \"41eeea20-b1c0-4cb6-8da8-a4a26a60423d\") " pod="openshift-multus/network-metrics-daemon-mqlsd" Apr 17 17:23:52.078660 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.078627 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-run-openvswitch\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.078762 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.078656 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-multus-cni-dir\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.078762 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.078692 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/adeb035c-390a-4439-9413-491fc20cac69-os-release\") pod \"multus-additional-cni-plugins-hprdr\" (UID: \"adeb035c-390a-4439-9413-491fc20cac69\") " pod="openshift-multus/multus-additional-cni-plugins-hprdr" Apr 17 17:23:52.078762 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.078716 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-systemd-units\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.078762 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:52.078720 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:23:52.078762 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.078723 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-run-openvswitch\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.078762 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.078748 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-host-cni-bin\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.079061 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.078772 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4qlq\" (UniqueName: \"kubernetes.io/projected/3ff73a5e-853e-4f01-b8d7-e995977da39f-kube-api-access-j4qlq\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.079061 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.078789 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-systemd-units\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.079061 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:52.078810 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41eeea20-b1c0-4cb6-8da8-a4a26a60423d-metrics-certs podName:41eeea20-b1c0-4cb6-8da8-a4a26a60423d nodeName:}" failed. No retries permitted until 2026-04-17 17:23:52.578777957 +0000 UTC m=+2.104458573 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41eeea20-b1c0-4cb6-8da8-a4a26a60423d-metrics-certs") pod "network-metrics-daemon-mqlsd" (UID: "41eeea20-b1c0-4cb6-8da8-a4a26a60423d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:23:52.079061 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.078808 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/adeb035c-390a-4439-9413-491fc20cac69-os-release\") pod \"multus-additional-cni-plugins-hprdr\" (UID: \"adeb035c-390a-4439-9413-491fc20cac69\") " pod="openshift-multus/multus-additional-cni-plugins-hprdr" Apr 17 17:23:52.079061 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.078855 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-host-cni-bin\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.079061 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.078874 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ae488321-502d-450d-a483-234f0aff8bb3-etc-tuned\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:52.079061 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.078909 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6c57d472-0514-46ce-bf7a-f6af067b3f5d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hbqb6\" (UID: \"6c57d472-0514-46ce-bf7a-f6af067b3f5d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbqb6" Apr 17 17:23:52.079061 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.078936 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6c57d472-0514-46ce-bf7a-f6af067b3f5d-registration-dir\") pod \"aws-ebs-csi-driver-node-hbqb6\" (UID: \"6c57d472-0514-46ce-bf7a-f6af067b3f5d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbqb6" Apr 17 17:23:52.079061 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.078965 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-cnibin\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.079061 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.078989 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6c57d472-0514-46ce-bf7a-f6af067b3f5d-device-dir\") pod \"aws-ebs-csi-driver-node-hbqb6\" (UID: \"6c57d472-0514-46ce-bf7a-f6af067b3f5d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbqb6" Apr 17 17:23:52.079061 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079005 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6c57d472-0514-46ce-bf7a-f6af067b3f5d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hbqb6\" (UID: \"6c57d472-0514-46ce-bf7a-f6af067b3f5d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbqb6" Apr 17 17:23:52.079061 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079015 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-etc-kubernetes\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.079061 
ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079042 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/06cbfbab-93e8-4744-be0c-f8d8adfb094d-iptables-alerter-script\") pod \"iptables-alerter-rj22k\" (UID: \"06cbfbab-93e8-4744-be0c-f8d8adfb094d\") " pod="openshift-network-operator/iptables-alerter-rj22k" Apr 17 17:23:52.079061 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079050 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6c57d472-0514-46ce-bf7a-f6af067b3f5d-registration-dir\") pod \"aws-ebs-csi-driver-node-hbqb6\" (UID: \"6c57d472-0514-46ce-bf7a-f6af067b3f5d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbqb6" Apr 17 17:23:52.079061 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079067 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cf2lq\" (UniqueName: \"kubernetes.io/projected/adfae8ba-8d02-4f3c-85a7-b2ae828b0579-kube-api-access-cf2lq\") pod \"node-resolver-qd2qr\" (UID: \"adfae8ba-8d02-4f3c-85a7-b2ae828b0579\") " pod="openshift-dns/node-resolver-qd2qr" Apr 17 17:23:52.079746 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079086 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6c57d472-0514-46ce-bf7a-f6af067b3f5d-device-dir\") pod \"aws-ebs-csi-driver-node-hbqb6\" (UID: \"6c57d472-0514-46ce-bf7a-f6af067b3f5d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbqb6" Apr 17 17:23:52.079746 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079091 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ae488321-502d-450d-a483-234f0aff8bb3-tmp\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " 
pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:52.079746 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079114 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6c57d472-0514-46ce-bf7a-f6af067b3f5d-etc-selinux\") pod \"aws-ebs-csi-driver-node-hbqb6\" (UID: \"6c57d472-0514-46ce-bf7a-f6af067b3f5d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbqb6" Apr 17 17:23:52.079746 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079140 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-host-var-lib-cni-bin\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.079746 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079164 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-hostroot\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.079746 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079188 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-host-run-multus-certs\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.079746 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079215 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/70cf02df-8c16-47aa-8b0e-1b1ee895fe07-serviceca\") pod \"node-ca-fxfbb\" (UID: 
\"70cf02df-8c16-47aa-8b0e-1b1ee895fe07\") " pod="openshift-image-registry/node-ca-fxfbb" Apr 17 17:23:52.079746 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079239 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/adeb035c-390a-4439-9413-491fc20cac69-cnibin\") pod \"multus-additional-cni-plugins-hprdr\" (UID: \"adeb035c-390a-4439-9413-491fc20cac69\") " pod="openshift-multus/multus-additional-cni-plugins-hprdr" Apr 17 17:23:52.079746 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079241 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 17:23:52.079746 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079263 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6tr4\" (UniqueName: \"kubernetes.io/projected/adeb035c-390a-4439-9413-491fc20cac69-kube-api-access-h6tr4\") pod \"multus-additional-cni-plugins-hprdr\" (UID: \"adeb035c-390a-4439-9413-491fc20cac69\") " pod="openshift-multus/multus-additional-cni-plugins-hprdr" Apr 17 17:23:52.079746 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079280 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6c57d472-0514-46ce-bf7a-f6af067b3f5d-etc-selinux\") pod \"aws-ebs-csi-driver-node-hbqb6\" (UID: \"6c57d472-0514-46ce-bf7a-f6af067b3f5d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbqb6" Apr 17 17:23:52.079746 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079288 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ccb0409d-b96e-4a69-8ff3-e260b869ecdf-agent-certs\") pod \"konnectivity-agent-nhs5m\" (UID: \"ccb0409d-b96e-4a69-8ff3-e260b869ecdf\") " 
pod="kube-system/konnectivity-agent-nhs5m" Apr 17 17:23:52.079746 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079315 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-etc-openvswitch\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.079746 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079347 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-node-log\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.079746 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079392 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3ff73a5e-853e-4f01-b8d7-e995977da39f-ovnkube-config\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.079746 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079415 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ae488321-502d-450d-a483-234f0aff8bb3-lib-modules\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:52.079746 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079442 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ccb0409d-b96e-4a69-8ff3-e260b869ecdf-konnectivity-ca\") pod \"konnectivity-agent-nhs5m\" (UID: \"ccb0409d-b96e-4a69-8ff3-e260b869ecdf\") " 
pod="kube-system/konnectivity-agent-nhs5m" Apr 17 17:23:52.079746 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079465 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-run-systemd\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.080570 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079489 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-var-lib-openvswitch\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.080570 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079512 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-run-ovn\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.080570 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079535 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-host-run-ovn-kubernetes\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.080570 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079561 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae488321-502d-450d-a483-234f0aff8bb3-host\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " 
pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:52.080570 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079588 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-system-cni-dir\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.080570 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079597 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/06cbfbab-93e8-4744-be0c-f8d8adfb094d-iptables-alerter-script\") pod \"iptables-alerter-rj22k\" (UID: \"06cbfbab-93e8-4744-be0c-f8d8adfb094d\") " pod="openshift-network-operator/iptables-alerter-rj22k" Apr 17 17:23:52.080570 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079634 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-multus-socket-dir-parent\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.080570 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079664 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/06cbfbab-93e8-4744-be0c-f8d8adfb094d-host-slash\") pod \"iptables-alerter-rj22k\" (UID: \"06cbfbab-93e8-4744-be0c-f8d8adfb094d\") " pod="openshift-network-operator/iptables-alerter-rj22k" Apr 17 17:23:52.080570 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079674 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-var-lib-openvswitch\") pod 
\"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.080570 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079694 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-host-kubelet\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.080570 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079717 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/70cf02df-8c16-47aa-8b0e-1b1ee895fe07-serviceca\") pod \"node-ca-fxfbb\" (UID: \"70cf02df-8c16-47aa-8b0e-1b1ee895fe07\") " pod="openshift-image-registry/node-ca-fxfbb" Apr 17 17:23:52.080570 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079748 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6c57d472-0514-46ce-bf7a-f6af067b3f5d-socket-dir\") pod \"aws-ebs-csi-driver-node-hbqb6\" (UID: \"6c57d472-0514-46ce-bf7a-f6af067b3f5d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbqb6" Apr 17 17:23:52.080570 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079762 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ae488321-502d-450d-a483-234f0aff8bb3-lib-modules\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:52.080570 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079775 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-os-release\") pod 
\"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.080570 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079799 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/adeb035c-390a-4439-9413-491fc20cac69-cnibin\") pod \"multus-additional-cni-plugins-hprdr\" (UID: \"adeb035c-390a-4439-9413-491fc20cac69\") " pod="openshift-multus/multus-additional-cni-plugins-hprdr" Apr 17 17:23:52.080570 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079801 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-host-run-netns\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.080570 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079859 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.081356 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079862 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-host-run-netns\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.081356 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079887 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/ae488321-502d-450d-a483-234f0aff8bb3-etc-sysctl-d\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:52.081356 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079920 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-run-ovn\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.081356 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079916 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-host-run-k8s-cni-cncf-io\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.081356 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079963 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-host-var-lib-kubelet\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.081356 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079989 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-multus-conf-dir\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.081356 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080018 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/adeb035c-390a-4439-9413-491fc20cac69-cni-binary-copy\") pod \"multus-additional-cni-plugins-hprdr\" (UID: \"adeb035c-390a-4439-9413-491fc20cac69\") " pod="openshift-multus/multus-additional-cni-plugins-hprdr" Apr 17 17:23:52.081356 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080043 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/70cf02df-8c16-47aa-8b0e-1b1ee895fe07-host\") pod \"node-ca-fxfbb\" (UID: \"70cf02df-8c16-47aa-8b0e-1b1ee895fe07\") " pod="openshift-image-registry/node-ca-fxfbb" Apr 17 17:23:52.081356 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080030 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-host-kubelet\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.081356 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080071 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bspw2\" (UniqueName: \"kubernetes.io/projected/06cbfbab-93e8-4744-be0c-f8d8adfb094d-kube-api-access-bspw2\") pod \"iptables-alerter-rj22k\" (UID: \"06cbfbab-93e8-4744-be0c-f8d8adfb094d\") " pod="openshift-network-operator/iptables-alerter-rj22k" Apr 17 17:23:52.081356 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080098 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-log-socket\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.081356 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080123 2575 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-host-cni-netd\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.081356 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080148 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ae488321-502d-450d-a483-234f0aff8bb3-etc-modprobe-d\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:52.081356 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080158 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6c57d472-0514-46ce-bf7a-f6af067b3f5d-socket-dir\") pod \"aws-ebs-csi-driver-node-hbqb6\" (UID: \"6c57d472-0514-46ce-bf7a-f6af067b3f5d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbqb6" Apr 17 17:23:52.081356 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080176 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ae488321-502d-450d-a483-234f0aff8bb3-etc-sysconfig\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:52.081356 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080206 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3f7324f4-55c2-40da-9869-47ec0880aec3-cni-binary-copy\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.081356 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080217 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-etc-openvswitch\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.082266 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080212 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3ff73a5e-853e-4f01-b8d7-e995977da39f-ovnkube-config\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.082266 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080236 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3ff73a5e-853e-4f01-b8d7-e995977da39f-env-overrides\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.082266 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080258 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ccb0409d-b96e-4a69-8ff3-e260b869ecdf-konnectivity-ca\") pod \"konnectivity-agent-nhs5m\" (UID: \"ccb0409d-b96e-4a69-8ff3-e260b869ecdf\") " pod="kube-system/konnectivity-agent-nhs5m" Apr 17 17:23:52.082266 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080264 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3ff73a5e-853e-4f01-b8d7-e995977da39f-ovnkube-script-lib\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.082266 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080265 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-node-log\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.082266 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.079964 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-host-run-ovn-kubernetes\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.082266 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080289 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ae488321-502d-450d-a483-234f0aff8bb3-etc-kubernetes\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:52.082266 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080349 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/06cbfbab-93e8-4744-be0c-f8d8adfb094d-host-slash\") pod \"iptables-alerter-rj22k\" (UID: \"06cbfbab-93e8-4744-be0c-f8d8adfb094d\") " pod="openshift-network-operator/iptables-alerter-rj22k" Apr 17 17:23:52.082266 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080393 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae488321-502d-450d-a483-234f0aff8bb3-host\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:52.082266 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080396 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ae488321-502d-450d-a483-234f0aff8bb3-etc-modprobe-d\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:52.082266 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080417 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-log-socket\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.082266 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080428 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-host-cni-netd\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.082266 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080460 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-run-systemd\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.082266 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080497 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ae488321-502d-450d-a483-234f0aff8bb3-etc-sysconfig\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:52.082266 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080499 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/70cf02df-8c16-47aa-8b0e-1b1ee895fe07-host\") pod \"node-ca-fxfbb\" (UID: \"70cf02df-8c16-47aa-8b0e-1b1ee895fe07\") " pod="openshift-image-registry/node-ca-fxfbb" Apr 17 17:23:52.082266 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080536 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.082266 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080556 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ae488321-502d-450d-a483-234f0aff8bb3-sys\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:52.082266 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080698 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ae488321-502d-450d-a483-234f0aff8bb3-etc-kubernetes\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:52.083109 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080738 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3ff73a5e-853e-4f01-b8d7-e995977da39f-ovnkube-script-lib\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.083109 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080787 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/ae488321-502d-450d-a483-234f0aff8bb3-etc-sysctl-d\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:52.083109 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080792 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6kmg8\" (UniqueName: \"kubernetes.io/projected/ae488321-502d-450d-a483-234f0aff8bb3-kube-api-access-6kmg8\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:52.083109 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080795 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ae488321-502d-450d-a483-234f0aff8bb3-sys\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:52.083109 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080842 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2c6ch\" (UniqueName: \"kubernetes.io/projected/e4fd5db6-98ca-462f-b950-10c0fe775718-kube-api-access-2c6ch\") pod \"network-check-target-tlwhr\" (UID: \"e4fd5db6-98ca-462f-b950-10c0fe775718\") " pod="openshift-network-diagnostics/network-check-target-tlwhr" Apr 17 17:23:52.083109 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080915 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3ff73a5e-853e-4f01-b8d7-e995977da39f-env-overrides\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.083109 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.080952 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ae488321-502d-450d-a483-234f0aff8bb3-var-lib-kubelet\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:52.083109 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.081068 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7c6gh\" (UniqueName: \"kubernetes.io/projected/6c57d472-0514-46ce-bf7a-f6af067b3f5d-kube-api-access-7c6gh\") pod \"aws-ebs-csi-driver-node-hbqb6\" (UID: \"6c57d472-0514-46ce-bf7a-f6af067b3f5d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbqb6" Apr 17 17:23:52.083109 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.081098 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ae488321-502d-450d-a483-234f0aff8bb3-etc-sysctl-conf\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:52.083109 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.081072 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ae488321-502d-450d-a483-234f0aff8bb3-var-lib-kubelet\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:52.083109 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.081123 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ae488321-502d-450d-a483-234f0aff8bb3-run\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:52.083109 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.081127 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/adeb035c-390a-4439-9413-491fc20cac69-cni-binary-copy\") pod \"multus-additional-cni-plugins-hprdr\" (UID: \"adeb035c-390a-4439-9413-491fc20cac69\") " pod="openshift-multus/multus-additional-cni-plugins-hprdr" Apr 17 17:23:52.083109 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.081154 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6c57d472-0514-46ce-bf7a-f6af067b3f5d-sys-fs\") pod \"aws-ebs-csi-driver-node-hbqb6\" (UID: \"6c57d472-0514-46ce-bf7a-f6af067b3f5d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbqb6" Apr 17 17:23:52.083109 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.081182 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9gc8\" (UniqueName: \"kubernetes.io/projected/3f7324f4-55c2-40da-9869-47ec0880aec3-kube-api-access-f9gc8\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.083109 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.081207 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2kz4n\" (UniqueName: \"kubernetes.io/projected/41eeea20-b1c0-4cb6-8da8-a4a26a60423d-kube-api-access-2kz4n\") pod \"network-metrics-daemon-mqlsd\" (UID: \"41eeea20-b1c0-4cb6-8da8-a4a26a60423d\") " pod="openshift-multus/network-metrics-daemon-mqlsd" Apr 17 17:23:52.083109 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.081246 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ae488321-502d-450d-a483-234f0aff8bb3-etc-sysctl-conf\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:52.083109 ip-10-0-140-33 kubenswrapper[2575]: 
I0417 17:23:52.081268 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6c57d472-0514-46ce-bf7a-f6af067b3f5d-sys-fs\") pod \"aws-ebs-csi-driver-node-hbqb6\" (UID: \"6c57d472-0514-46ce-bf7a-f6af067b3f5d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbqb6" Apr 17 17:23:52.083575 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.081288 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ae488321-502d-450d-a483-234f0aff8bb3-run\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:52.083575 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.081310 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/adeb035c-390a-4439-9413-491fc20cac69-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hprdr\" (UID: \"adeb035c-390a-4439-9413-491fc20cac69\") " pod="openshift-multus/multus-additional-cni-plugins-hprdr" Apr 17 17:23:52.083575 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.081421 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/adfae8ba-8d02-4f3c-85a7-b2ae828b0579-hosts-file\") pod \"node-resolver-qd2qr\" (UID: \"adfae8ba-8d02-4f3c-85a7-b2ae828b0579\") " pod="openshift-dns/node-resolver-qd2qr" Apr 17 17:23:52.083575 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.081513 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-host-run-netns\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.083575 
ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.081561 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-host-var-lib-cni-multus\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.083575 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.081598 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3f7324f4-55c2-40da-9869-47ec0880aec3-multus-daemon-config\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.083575 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.081654 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/adfae8ba-8d02-4f3c-85a7-b2ae828b0579-hosts-file\") pod \"node-resolver-qd2qr\" (UID: \"adfae8ba-8d02-4f3c-85a7-b2ae828b0579\") " pod="openshift-dns/node-resolver-qd2qr" Apr 17 17:23:52.083575 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.081659 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/adeb035c-390a-4439-9413-491fc20cac69-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hprdr\" (UID: \"adeb035c-390a-4439-9413-491fc20cac69\") " pod="openshift-multus/multus-additional-cni-plugins-hprdr" Apr 17 17:23:52.083575 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.081815 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/adeb035c-390a-4439-9413-491fc20cac69-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hprdr\" (UID: 
\"adeb035c-390a-4439-9413-491fc20cac69\") " pod="openshift-multus/multus-additional-cni-plugins-hprdr" Apr 17 17:23:52.083575 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.081855 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/adeb035c-390a-4439-9413-491fc20cac69-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hprdr\" (UID: \"adeb035c-390a-4439-9413-491fc20cac69\") " pod="openshift-multus/multus-additional-cni-plugins-hprdr" Apr 17 17:23:52.083575 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.081880 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-host-slash\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.083575 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.081916 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3ff73a5e-853e-4f01-b8d7-e995977da39f-ovn-node-metrics-cert\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.083575 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.081939 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3ff73a5e-853e-4f01-b8d7-e995977da39f-host-slash\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.083575 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.081971 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ae488321-502d-450d-a483-234f0aff8bb3-etc-systemd\") pod 
\"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:52.083575 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.081998 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/adeb035c-390a-4439-9413-491fc20cac69-system-cni-dir\") pod \"multus-additional-cni-plugins-hprdr\" (UID: \"adeb035c-390a-4439-9413-491fc20cac69\") " pod="openshift-multus/multus-additional-cni-plugins-hprdr" Apr 17 17:23:52.083575 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.082031 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/adeb035c-390a-4439-9413-491fc20cac69-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hprdr\" (UID: \"adeb035c-390a-4439-9413-491fc20cac69\") " pod="openshift-multus/multus-additional-cni-plugins-hprdr" Apr 17 17:23:52.083575 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.082052 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/adfae8ba-8d02-4f3c-85a7-b2ae828b0579-tmp-dir\") pod \"node-resolver-qd2qr\" (UID: \"adfae8ba-8d02-4f3c-85a7-b2ae828b0579\") " pod="openshift-dns/node-resolver-qd2qr" Apr 17 17:23:52.084107 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.082322 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/adfae8ba-8d02-4f3c-85a7-b2ae828b0579-tmp-dir\") pod \"node-resolver-qd2qr\" (UID: \"adfae8ba-8d02-4f3c-85a7-b2ae828b0579\") " pod="openshift-dns/node-resolver-qd2qr" Apr 17 17:23:52.084107 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.082367 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ccb0409d-b96e-4a69-8ff3-e260b869ecdf-agent-certs\") 
pod \"konnectivity-agent-nhs5m\" (UID: \"ccb0409d-b96e-4a69-8ff3-e260b869ecdf\") " pod="kube-system/konnectivity-agent-nhs5m" Apr 17 17:23:52.084107 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.082369 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/adeb035c-390a-4439-9413-491fc20cac69-system-cni-dir\") pod \"multus-additional-cni-plugins-hprdr\" (UID: \"adeb035c-390a-4439-9413-491fc20cac69\") " pod="openshift-multus/multus-additional-cni-plugins-hprdr" Apr 17 17:23:52.084107 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.082477 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ae488321-502d-450d-a483-234f0aff8bb3-etc-systemd\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:52.084107 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.082986 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/adeb035c-390a-4439-9413-491fc20cac69-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hprdr\" (UID: \"adeb035c-390a-4439-9413-491fc20cac69\") " pod="openshift-multus/multus-additional-cni-plugins-hprdr" Apr 17 17:23:52.084107 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.083076 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ae488321-502d-450d-a483-234f0aff8bb3-tmp\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:52.084107 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.083191 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ae488321-502d-450d-a483-234f0aff8bb3-etc-tuned\") pod 
\"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:52.084289 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.084158 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3ff73a5e-853e-4f01-b8d7-e995977da39f-ovn-node-metrics-cert\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.089043 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.089020 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h725\" (UniqueName: \"kubernetes.io/projected/70cf02df-8c16-47aa-8b0e-1b1ee895fe07-kube-api-access-8h725\") pod \"node-ca-fxfbb\" (UID: \"70cf02df-8c16-47aa-8b0e-1b1ee895fe07\") " pod="openshift-image-registry/node-ca-fxfbb" Apr 17 17:23:52.089361 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.089343 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf2lq\" (UniqueName: \"kubernetes.io/projected/adfae8ba-8d02-4f3c-85a7-b2ae828b0579-kube-api-access-cf2lq\") pod \"node-resolver-qd2qr\" (UID: \"adfae8ba-8d02-4f3c-85a7-b2ae828b0579\") " pod="openshift-dns/node-resolver-qd2qr" Apr 17 17:23:52.089537 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.089516 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4qlq\" (UniqueName: \"kubernetes.io/projected/3ff73a5e-853e-4f01-b8d7-e995977da39f-kube-api-access-j4qlq\") pod \"ovnkube-node-p2pbl\" (UID: \"3ff73a5e-853e-4f01-b8d7-e995977da39f\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.090607 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:52.090586 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 
17:23:52.090723 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:52.090611 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:23:52.090723 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:52.090624 2575 projected.go:194] Error preparing data for projected volume kube-api-access-2c6ch for pod openshift-network-diagnostics/network-check-target-tlwhr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:23:52.090723 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:52.090686 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e4fd5db6-98ca-462f-b950-10c0fe775718-kube-api-access-2c6ch podName:e4fd5db6-98ca-462f-b950-10c0fe775718 nodeName:}" failed. No retries permitted until 2026-04-17 17:23:52.590668623 +0000 UTC m=+2.116349226 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-2c6ch" (UniqueName: "kubernetes.io/projected/e4fd5db6-98ca-462f-b950-10c0fe775718-kube-api-access-2c6ch") pod "network-check-target-tlwhr" (UID: "e4fd5db6-98ca-462f-b950-10c0fe775718") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:23:52.092356 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.092334 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c6gh\" (UniqueName: \"kubernetes.io/projected/6c57d472-0514-46ce-bf7a-f6af067b3f5d-kube-api-access-7c6gh\") pod \"aws-ebs-csi-driver-node-hbqb6\" (UID: \"6c57d472-0514-46ce-bf7a-f6af067b3f5d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbqb6" Apr 17 17:23:52.092712 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.092691 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kmg8\" (UniqueName: \"kubernetes.io/projected/ae488321-502d-450d-a483-234f0aff8bb3-kube-api-access-6kmg8\") pod \"tuned-cx8bv\" (UID: \"ae488321-502d-450d-a483-234f0aff8bb3\") " pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:52.092917 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.092814 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6tr4\" (UniqueName: \"kubernetes.io/projected/adeb035c-390a-4439-9413-491fc20cac69-kube-api-access-h6tr4\") pod \"multus-additional-cni-plugins-hprdr\" (UID: \"adeb035c-390a-4439-9413-491fc20cac69\") " pod="openshift-multus/multus-additional-cni-plugins-hprdr" Apr 17 17:23:52.093909 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.093886 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bspw2\" (UniqueName: \"kubernetes.io/projected/06cbfbab-93e8-4744-be0c-f8d8adfb094d-kube-api-access-bspw2\") pod \"iptables-alerter-rj22k\" (UID: 
\"06cbfbab-93e8-4744-be0c-f8d8adfb094d\") " pod="openshift-network-operator/iptables-alerter-rj22k" Apr 17 17:23:52.094073 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.094057 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kz4n\" (UniqueName: \"kubernetes.io/projected/41eeea20-b1c0-4cb6-8da8-a4a26a60423d-kube-api-access-2kz4n\") pod \"network-metrics-daemon-mqlsd\" (UID: \"41eeea20-b1c0-4cb6-8da8-a4a26a60423d\") " pod="openshift-multus/network-metrics-daemon-mqlsd" Apr 17 17:23:52.103876 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.103856 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:23:52.116387 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.116368 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" Apr 17 17:23:52.135461 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.135441 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hprdr" Apr 17 17:23:52.167029 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:52.167004 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf04a4241565e19b6ce163e5a36620c34.slice/crio-12b415010559e99b9db6f0bb0312e6eb8db764984662a64de62f385644926bad WatchSource:0}: Error finding container 12b415010559e99b9db6f0bb0312e6eb8db764984662a64de62f385644926bad: Status 404 returned error can't find the container with id 12b415010559e99b9db6f0bb0312e6eb8db764984662a64de62f385644926bad Apr 17 17:23:52.168260 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:52.168239 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1cd4a3b95297fd9fd160804093bee86.slice/crio-b40e867dff32e77ed93b626535d2c99845378a00053d1911c52a72746dc2aba1 WatchSource:0}: Error finding container b40e867dff32e77ed93b626535d2c99845378a00053d1911c52a72746dc2aba1: Status 404 returned error can't find the container with id b40e867dff32e77ed93b626535d2c99845378a00053d1911c52a72746dc2aba1 Apr 17 17:23:52.171894 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.171880 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:23:52.183226 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.183201 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-host-var-lib-cni-bin\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.183319 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.183233 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-hostroot\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.183319 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.183259 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-host-run-multus-certs\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.183319 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.183289 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-host-var-lib-cni-bin\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.183319 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.183316 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-system-cni-dir\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.183523 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.183318 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-hostroot\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.183523 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.183341 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-multus-socket-dir-parent\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.183523 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.183356 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-host-run-multus-certs\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.183523 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.183370 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-os-release\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.183523 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.183396 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-host-run-k8s-cni-cncf-io\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.183523 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.183411 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-host-var-lib-kubelet\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.183523 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.183406 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-system-cni-dir\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.183523 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.183419 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-multus-socket-dir-parent\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.183523 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.183445 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-multus-conf-dir\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.183523 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.183481 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-host-run-k8s-cni-cncf-io\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.183523 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.183483 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-host-var-lib-kubelet\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.183523 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.183492 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-multus-conf-dir\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.183523 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.183493 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-os-release\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.184023 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.183544 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3f7324f4-55c2-40da-9869-47ec0880aec3-cni-binary-copy\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.184023 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.183849 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f9gc8\" (UniqueName: \"kubernetes.io/projected/3f7324f4-55c2-40da-9869-47ec0880aec3-kube-api-access-f9gc8\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.184023 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.183881 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-host-run-netns\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.184023 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.183901 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-host-var-lib-cni-multus\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.184023 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.183924 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3f7324f4-55c2-40da-9869-47ec0880aec3-multus-daemon-config\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.184023 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.183947 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-host-var-lib-cni-multus\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.184023 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.183995 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-multus-cni-dir\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.184330 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.184025 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-cnibin\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.184330 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.184052 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-etc-kubernetes\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.184330 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.184055 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-host-run-netns\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.184330 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.184108 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-etc-kubernetes\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.184330 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.184026 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3f7324f4-55c2-40da-9869-47ec0880aec3-cni-binary-copy\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.184330 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.184139 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-cnibin\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.184330 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.184169 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3f7324f4-55c2-40da-9869-47ec0880aec3-multus-cni-dir\") pod \"multus-wgh2j\" (UID: 
\"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.184604 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.184407 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3f7324f4-55c2-40da-9869-47ec0880aec3-multus-daemon-config\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.191469 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.191452 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9gc8\" (UniqueName: \"kubernetes.io/projected/3f7324f4-55c2-40da-9869-47ec0880aec3-kube-api-access-f9gc8\") pod \"multus-wgh2j\" (UID: \"3f7324f4-55c2-40da-9869-47ec0880aec3\") " pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.280344 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.280293 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-nhs5m" Apr 17 17:23:52.285868 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:52.285704 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccb0409d_b96e_4a69_8ff3_e260b869ecdf.slice/crio-5acfc6ac8cdded04537e66b9c8adbdfd1f0d9d10249ec7b7e0601f161c868fd6 WatchSource:0}: Error finding container 5acfc6ac8cdded04537e66b9c8adbdfd1f0d9d10249ec7b7e0601f161c868fd6: Status 404 returned error can't find the container with id 5acfc6ac8cdded04537e66b9c8adbdfd1f0d9d10249ec7b7e0601f161c868fd6 Apr 17 17:23:52.319457 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.319433 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbqb6" Apr 17 17:23:52.325163 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:52.325134 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c57d472_0514_46ce_bf7a_f6af067b3f5d.slice/crio-21087cc308cbdc0243e21eecf640d61e393deaba5452c080b90b319114e79946 WatchSource:0}: Error finding container 21087cc308cbdc0243e21eecf640d61e393deaba5452c080b90b319114e79946: Status 404 returned error can't find the container with id 21087cc308cbdc0243e21eecf640d61e393deaba5452c080b90b319114e79946 Apr 17 17:23:52.343191 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.343176 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qd2qr" Apr 17 17:23:52.349752 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:52.349732 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadfae8ba_8d02_4f3c_85a7_b2ae828b0579.slice/crio-e1339ce9e673d66591ae1d9ebd7d96f328a965f280b91be2f3a58a70d2456b25 WatchSource:0}: Error finding container e1339ce9e673d66591ae1d9ebd7d96f328a965f280b91be2f3a58a70d2456b25: Status 404 returned error can't find the container with id e1339ce9e673d66591ae1d9ebd7d96f328a965f280b91be2f3a58a70d2456b25 Apr 17 17:23:52.356649 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.356631 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-fxfbb" Apr 17 17:23:52.362788 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:52.362768 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70cf02df_8c16_47aa_8b0e_1b1ee895fe07.slice/crio-894faa91bc5f2c54784ace712dbc12275e147dc45c183a0eba01fdf906827510 WatchSource:0}: Error finding container 894faa91bc5f2c54784ace712dbc12275e147dc45c183a0eba01fdf906827510: Status 404 returned error can't find the container with id 894faa91bc5f2c54784ace712dbc12275e147dc45c183a0eba01fdf906827510 Apr 17 17:23:52.374201 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.374176 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rj22k" Apr 17 17:23:52.377128 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:52.377108 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ff73a5e_853e_4f01_b8d7_e995977da39f.slice/crio-86137c929486827a25de905d209a1c1ea8c92cccab331955d5f3b6f33793f69f WatchSource:0}: Error finding container 86137c929486827a25de905d209a1c1ea8c92cccab331955d5f3b6f33793f69f: Status 404 returned error can't find the container with id 86137c929486827a25de905d209a1c1ea8c92cccab331955d5f3b6f33793f69f Apr 17 17:23:52.380769 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:52.380748 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06cbfbab_93e8_4744_be0c_f8d8adfb094d.slice/crio-3c20fe9a4e96d7c312d0c76db4b80947d4d94961ade7b8109891d0af8e47f591 WatchSource:0}: Error finding container 3c20fe9a4e96d7c312d0c76db4b80947d4d94961ade7b8109891d0af8e47f591: Status 404 returned error can't find the container with id 3c20fe9a4e96d7c312d0c76db4b80947d4d94961ade7b8109891d0af8e47f591 Apr 17 17:23:52.396984 ip-10-0-140-33 
kubenswrapper[2575]: W0417 17:23:52.396967 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae488321_502d_450d_a483_234f0aff8bb3.slice/crio-ed5dac496df7b76ac3a121ad17473440a366fbf2e381687d28393b1e50eefecc WatchSource:0}: Error finding container ed5dac496df7b76ac3a121ad17473440a366fbf2e381687d28393b1e50eefecc: Status 404 returned error can't find the container with id ed5dac496df7b76ac3a121ad17473440a366fbf2e381687d28393b1e50eefecc Apr 17 17:23:52.403934 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:52.403916 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadeb035c_390a_4439_9413_491fc20cac69.slice/crio-7c1d1a6bd51b405ef945efebf2d74087bc4c22a4bc2890493f553c736a2973a2 WatchSource:0}: Error finding container 7c1d1a6bd51b405ef945efebf2d74087bc4c22a4bc2890493f553c736a2973a2: Status 404 returned error can't find the container with id 7c1d1a6bd51b405ef945efebf2d74087bc4c22a4bc2890493f553c736a2973a2 Apr 17 17:23:52.440804 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.440781 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-wgh2j" Apr 17 17:23:52.445560 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:23:52.445540 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f7324f4_55c2_40da_9869_47ec0880aec3.slice/crio-1bc2a1b76c6d1ad6543ea85705ec63fedebab7362d169871801449fe93964048 WatchSource:0}: Error finding container 1bc2a1b76c6d1ad6543ea85705ec63fedebab7362d169871801449fe93964048: Status 404 returned error can't find the container with id 1bc2a1b76c6d1ad6543ea85705ec63fedebab7362d169871801449fe93964048 Apr 17 17:23:52.587212 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.587117 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41eeea20-b1c0-4cb6-8da8-a4a26a60423d-metrics-certs\") pod \"network-metrics-daemon-mqlsd\" (UID: \"41eeea20-b1c0-4cb6-8da8-a4a26a60423d\") " pod="openshift-multus/network-metrics-daemon-mqlsd" Apr 17 17:23:52.587340 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:52.587230 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:23:52.587340 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:52.587274 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41eeea20-b1c0-4cb6-8da8-a4a26a60423d-metrics-certs podName:41eeea20-b1c0-4cb6-8da8-a4a26a60423d nodeName:}" failed. No retries permitted until 2026-04-17 17:23:53.587260559 +0000 UTC m=+3.112941155 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41eeea20-b1c0-4cb6-8da8-a4a26a60423d-metrics-certs") pod "network-metrics-daemon-mqlsd" (UID: "41eeea20-b1c0-4cb6-8da8-a4a26a60423d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:23:52.688161 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.688133 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2c6ch\" (UniqueName: \"kubernetes.io/projected/e4fd5db6-98ca-462f-b950-10c0fe775718-kube-api-access-2c6ch\") pod \"network-check-target-tlwhr\" (UID: \"e4fd5db6-98ca-462f-b950-10c0fe775718\") " pod="openshift-network-diagnostics/network-check-target-tlwhr" Apr 17 17:23:52.688331 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:52.688251 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:23:52.688331 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:52.688272 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:23:52.688331 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:52.688283 2575 projected.go:194] Error preparing data for projected volume kube-api-access-2c6ch for pod openshift-network-diagnostics/network-check-target-tlwhr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:23:52.688503 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:52.688345 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e4fd5db6-98ca-462f-b950-10c0fe775718-kube-api-access-2c6ch podName:e4fd5db6-98ca-462f-b950-10c0fe775718 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:23:53.688326939 +0000 UTC m=+3.214007537 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-2c6ch" (UniqueName: "kubernetes.io/projected/e4fd5db6-98ca-462f-b950-10c0fe775718-kube-api-access-2c6ch") pod "network-check-target-tlwhr" (UID: "e4fd5db6-98ca-462f-b950-10c0fe775718") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:23:52.757754 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:52.757723 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:23:53.014413 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:53.014369 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:18:52 +0000 UTC" deadline="2027-12-21 04:11:11.492726481 +0000 UTC" Apr 17 17:23:53.014413 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:53.014411 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14698h47m18.478319485s" Apr 17 17:23:53.129118 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:53.129062 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" event={"ID":"ae488321-502d-450d-a483-234f0aff8bb3","Type":"ContainerStarted","Data":"ed5dac496df7b76ac3a121ad17473440a366fbf2e381687d28393b1e50eefecc"} Apr 17 17:23:53.131556 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:53.131527 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" event={"ID":"3ff73a5e-853e-4f01-b8d7-e995977da39f","Type":"ContainerStarted","Data":"86137c929486827a25de905d209a1c1ea8c92cccab331955d5f3b6f33793f69f"} Apr 17 17:23:53.136671 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:53.136642 2575 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fxfbb" event={"ID":"70cf02df-8c16-47aa-8b0e-1b1ee895fe07","Type":"ContainerStarted","Data":"894faa91bc5f2c54784ace712dbc12275e147dc45c183a0eba01fdf906827510"}
Apr 17 17:23:53.146033 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:53.145974 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-33.ec2.internal" event={"ID":"f04a4241565e19b6ce163e5a36620c34","Type":"ContainerStarted","Data":"12b415010559e99b9db6f0bb0312e6eb8db764984662a64de62f385644926bad"}
Apr 17 17:23:53.150437 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:53.150412 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wgh2j" event={"ID":"3f7324f4-55c2-40da-9869-47ec0880aec3","Type":"ContainerStarted","Data":"1bc2a1b76c6d1ad6543ea85705ec63fedebab7362d169871801449fe93964048"}
Apr 17 17:23:53.155690 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:53.155642 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hprdr" event={"ID":"adeb035c-390a-4439-9413-491fc20cac69","Type":"ContainerStarted","Data":"7c1d1a6bd51b405ef945efebf2d74087bc4c22a4bc2890493f553c736a2973a2"}
Apr 17 17:23:53.158304 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:53.158235 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rj22k" event={"ID":"06cbfbab-93e8-4744-be0c-f8d8adfb094d","Type":"ContainerStarted","Data":"3c20fe9a4e96d7c312d0c76db4b80947d4d94961ade7b8109891d0af8e47f591"}
Apr 17 17:23:53.160598 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:53.160509 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qd2qr" event={"ID":"adfae8ba-8d02-4f3c-85a7-b2ae828b0579","Type":"ContainerStarted","Data":"e1339ce9e673d66591ae1d9ebd7d96f328a965f280b91be2f3a58a70d2456b25"}
Apr 17 17:23:53.176104 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:53.176082 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbqb6" event={"ID":"6c57d472-0514-46ce-bf7a-f6af067b3f5d","Type":"ContainerStarted","Data":"21087cc308cbdc0243e21eecf640d61e393deaba5452c080b90b319114e79946"}
Apr 17 17:23:53.188603 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:53.188579 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nhs5m" event={"ID":"ccb0409d-b96e-4a69-8ff3-e260b869ecdf","Type":"ContainerStarted","Data":"5acfc6ac8cdded04537e66b9c8adbdfd1f0d9d10249ec7b7e0601f161c868fd6"}
Apr 17 17:23:53.212076 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:53.211844 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-33.ec2.internal" event={"ID":"c1cd4a3b95297fd9fd160804093bee86","Type":"ContainerStarted","Data":"b40e867dff32e77ed93b626535d2c99845378a00053d1911c52a72746dc2aba1"}
Apr 17 17:23:53.593966 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:53.593898 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41eeea20-b1c0-4cb6-8da8-a4a26a60423d-metrics-certs\") pod \"network-metrics-daemon-mqlsd\" (UID: \"41eeea20-b1c0-4cb6-8da8-a4a26a60423d\") " pod="openshift-multus/network-metrics-daemon-mqlsd"
Apr 17 17:23:53.594159 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:53.594037 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:23:53.594159 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:53.594100 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41eeea20-b1c0-4cb6-8da8-a4a26a60423d-metrics-certs podName:41eeea20-b1c0-4cb6-8da8-a4a26a60423d nodeName:}" failed. No retries permitted until 2026-04-17 17:23:55.594080909 +0000 UTC m=+5.119761515 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41eeea20-b1c0-4cb6-8da8-a4a26a60423d-metrics-certs") pod "network-metrics-daemon-mqlsd" (UID: "41eeea20-b1c0-4cb6-8da8-a4a26a60423d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:23:53.694743 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:53.694701 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2c6ch\" (UniqueName: \"kubernetes.io/projected/e4fd5db6-98ca-462f-b950-10c0fe775718-kube-api-access-2c6ch\") pod \"network-check-target-tlwhr\" (UID: \"e4fd5db6-98ca-462f-b950-10c0fe775718\") " pod="openshift-network-diagnostics/network-check-target-tlwhr"
Apr 17 17:23:53.694955 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:53.694861 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:23:53.694955 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:53.694882 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:23:53.694955 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:53.694894 2575 projected.go:194] Error preparing data for projected volume kube-api-access-2c6ch for pod openshift-network-diagnostics/network-check-target-tlwhr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:23:53.694955 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:53.694954 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e4fd5db6-98ca-462f-b950-10c0fe775718-kube-api-access-2c6ch podName:e4fd5db6-98ca-462f-b950-10c0fe775718 nodeName:}" failed. No retries permitted until 2026-04-17 17:23:55.694929198 +0000 UTC m=+5.220609808 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-2c6ch" (UniqueName: "kubernetes.io/projected/e4fd5db6-98ca-462f-b950-10c0fe775718-kube-api-access-2c6ch") pod "network-check-target-tlwhr" (UID: "e4fd5db6-98ca-462f-b950-10c0fe775718") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:23:54.014764 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:54.014721 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:18:52 +0000 UTC" deadline="2027-10-26 03:19:55.916933129 +0000 UTC"
Apr 17 17:23:54.014764 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:54.014764 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13353h56m1.902181944s"
Apr 17 17:23:54.090472 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:54.090446 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tlwhr"
Apr 17 17:23:54.090653 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:54.090564 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tlwhr" podUID="e4fd5db6-98ca-462f-b950-10c0fe775718"
Apr 17 17:23:54.091053 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:54.091033 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqlsd"
Apr 17 17:23:54.091157 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:54.091140 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqlsd" podUID="41eeea20-b1c0-4cb6-8da8-a4a26a60423d"
Apr 17 17:23:54.852165 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:54.852074 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-xkj74"]
Apr 17 17:23:54.855889 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:54.855431 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xkj74"
Apr 17 17:23:54.855889 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:54.855504 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xkj74" podUID="909ae085-82dd-4695-a517-7bf565c61c39"
Apr 17 17:23:54.904194 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:54.904164 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/909ae085-82dd-4695-a517-7bf565c61c39-dbus\") pod \"global-pull-secret-syncer-xkj74\" (UID: \"909ae085-82dd-4695-a517-7bf565c61c39\") " pod="kube-system/global-pull-secret-syncer-xkj74"
Apr 17 17:23:54.904338 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:54.904242 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/909ae085-82dd-4695-a517-7bf565c61c39-kubelet-config\") pod \"global-pull-secret-syncer-xkj74\" (UID: \"909ae085-82dd-4695-a517-7bf565c61c39\") " pod="kube-system/global-pull-secret-syncer-xkj74"
Apr 17 17:23:54.904338 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:54.904292 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/909ae085-82dd-4695-a517-7bf565c61c39-original-pull-secret\") pod \"global-pull-secret-syncer-xkj74\" (UID: \"909ae085-82dd-4695-a517-7bf565c61c39\") " pod="kube-system/global-pull-secret-syncer-xkj74"
Apr 17 17:23:55.005574 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:55.004894 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/909ae085-82dd-4695-a517-7bf565c61c39-kubelet-config\") pod \"global-pull-secret-syncer-xkj74\" (UID: \"909ae085-82dd-4695-a517-7bf565c61c39\") " pod="kube-system/global-pull-secret-syncer-xkj74"
Apr 17 17:23:55.005574 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:55.004957 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/909ae085-82dd-4695-a517-7bf565c61c39-original-pull-secret\") pod \"global-pull-secret-syncer-xkj74\" (UID: \"909ae085-82dd-4695-a517-7bf565c61c39\") " pod="kube-system/global-pull-secret-syncer-xkj74"
Apr 17 17:23:55.005574 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:55.004996 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/909ae085-82dd-4695-a517-7bf565c61c39-dbus\") pod \"global-pull-secret-syncer-xkj74\" (UID: \"909ae085-82dd-4695-a517-7bf565c61c39\") " pod="kube-system/global-pull-secret-syncer-xkj74"
Apr 17 17:23:55.005574 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:55.005159 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/909ae085-82dd-4695-a517-7bf565c61c39-dbus\") pod \"global-pull-secret-syncer-xkj74\" (UID: \"909ae085-82dd-4695-a517-7bf565c61c39\") " pod="kube-system/global-pull-secret-syncer-xkj74"
Apr 17 17:23:55.005574 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:55.005222 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/909ae085-82dd-4695-a517-7bf565c61c39-kubelet-config\") pod \"global-pull-secret-syncer-xkj74\" (UID: \"909ae085-82dd-4695-a517-7bf565c61c39\") " pod="kube-system/global-pull-secret-syncer-xkj74"
Apr 17 17:23:55.005574 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:55.005315 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 17:23:55.005574 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:55.005372 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/909ae085-82dd-4695-a517-7bf565c61c39-original-pull-secret podName:909ae085-82dd-4695-a517-7bf565c61c39 nodeName:}" failed. No retries permitted until 2026-04-17 17:23:55.505353665 +0000 UTC m=+5.031034277 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/909ae085-82dd-4695-a517-7bf565c61c39-original-pull-secret") pod "global-pull-secret-syncer-xkj74" (UID: "909ae085-82dd-4695-a517-7bf565c61c39") : object "kube-system"/"original-pull-secret" not registered
Apr 17 17:23:55.508943 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:55.508909 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/909ae085-82dd-4695-a517-7bf565c61c39-original-pull-secret\") pod \"global-pull-secret-syncer-xkj74\" (UID: \"909ae085-82dd-4695-a517-7bf565c61c39\") " pod="kube-system/global-pull-secret-syncer-xkj74"
Apr 17 17:23:55.509373 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:55.509073 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 17:23:55.509373 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:55.509136 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/909ae085-82dd-4695-a517-7bf565c61c39-original-pull-secret podName:909ae085-82dd-4695-a517-7bf565c61c39 nodeName:}" failed. No retries permitted until 2026-04-17 17:23:56.509118995 +0000 UTC m=+6.034799604 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/909ae085-82dd-4695-a517-7bf565c61c39-original-pull-secret") pod "global-pull-secret-syncer-xkj74" (UID: "909ae085-82dd-4695-a517-7bf565c61c39") : object "kube-system"/"original-pull-secret" not registered
Apr 17 17:23:55.610528 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:55.609870 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41eeea20-b1c0-4cb6-8da8-a4a26a60423d-metrics-certs\") pod \"network-metrics-daemon-mqlsd\" (UID: \"41eeea20-b1c0-4cb6-8da8-a4a26a60423d\") " pod="openshift-multus/network-metrics-daemon-mqlsd"
Apr 17 17:23:55.610528 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:55.610095 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:23:55.610528 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:55.610156 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41eeea20-b1c0-4cb6-8da8-a4a26a60423d-metrics-certs podName:41eeea20-b1c0-4cb6-8da8-a4a26a60423d nodeName:}" failed. No retries permitted until 2026-04-17 17:23:59.610136762 +0000 UTC m=+9.135817377 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41eeea20-b1c0-4cb6-8da8-a4a26a60423d-metrics-certs") pod "network-metrics-daemon-mqlsd" (UID: "41eeea20-b1c0-4cb6-8da8-a4a26a60423d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:23:55.711412 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:55.711376 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2c6ch\" (UniqueName: \"kubernetes.io/projected/e4fd5db6-98ca-462f-b950-10c0fe775718-kube-api-access-2c6ch\") pod \"network-check-target-tlwhr\" (UID: \"e4fd5db6-98ca-462f-b950-10c0fe775718\") " pod="openshift-network-diagnostics/network-check-target-tlwhr"
Apr 17 17:23:55.711585 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:55.711537 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:23:55.711585 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:55.711560 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:23:55.711585 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:55.711573 2575 projected.go:194] Error preparing data for projected volume kube-api-access-2c6ch for pod openshift-network-diagnostics/network-check-target-tlwhr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:23:55.711746 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:55.711632 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e4fd5db6-98ca-462f-b950-10c0fe775718-kube-api-access-2c6ch podName:e4fd5db6-98ca-462f-b950-10c0fe775718 nodeName:}" failed. No retries permitted until 2026-04-17 17:23:59.711612547 +0000 UTC m=+9.237293169 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-2c6ch" (UniqueName: "kubernetes.io/projected/e4fd5db6-98ca-462f-b950-10c0fe775718-kube-api-access-2c6ch") pod "network-check-target-tlwhr" (UID: "e4fd5db6-98ca-462f-b950-10c0fe775718") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:23:56.090087 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:56.090054 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tlwhr"
Apr 17 17:23:56.090272 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:56.090173 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tlwhr" podUID="e4fd5db6-98ca-462f-b950-10c0fe775718"
Apr 17 17:23:56.090560 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:56.090537 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqlsd"
Apr 17 17:23:56.090692 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:56.090670 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqlsd" podUID="41eeea20-b1c0-4cb6-8da8-a4a26a60423d"
Apr 17 17:23:56.518943 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:56.518855 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/909ae085-82dd-4695-a517-7bf565c61c39-original-pull-secret\") pod \"global-pull-secret-syncer-xkj74\" (UID: \"909ae085-82dd-4695-a517-7bf565c61c39\") " pod="kube-system/global-pull-secret-syncer-xkj74"
Apr 17 17:23:56.519440 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:56.518988 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 17:23:56.519440 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:56.519050 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/909ae085-82dd-4695-a517-7bf565c61c39-original-pull-secret podName:909ae085-82dd-4695-a517-7bf565c61c39 nodeName:}" failed. No retries permitted until 2026-04-17 17:23:58.519031574 +0000 UTC m=+8.044712182 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/909ae085-82dd-4695-a517-7bf565c61c39-original-pull-secret") pod "global-pull-secret-syncer-xkj74" (UID: "909ae085-82dd-4695-a517-7bf565c61c39") : object "kube-system"/"original-pull-secret" not registered
Apr 17 17:23:57.090460 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:57.090039 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xkj74"
Apr 17 17:23:57.090460 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:57.090178 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xkj74" podUID="909ae085-82dd-4695-a517-7bf565c61c39"
Apr 17 17:23:58.090056 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:58.089556 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tlwhr"
Apr 17 17:23:58.090056 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:58.089684 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tlwhr" podUID="e4fd5db6-98ca-462f-b950-10c0fe775718"
Apr 17 17:23:58.090056 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:58.089949 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqlsd"
Apr 17 17:23:58.090586 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:58.090113 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqlsd" podUID="41eeea20-b1c0-4cb6-8da8-a4a26a60423d"
Apr 17 17:23:58.537273 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:58.537233 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/909ae085-82dd-4695-a517-7bf565c61c39-original-pull-secret\") pod \"global-pull-secret-syncer-xkj74\" (UID: \"909ae085-82dd-4695-a517-7bf565c61c39\") " pod="kube-system/global-pull-secret-syncer-xkj74"
Apr 17 17:23:58.537452 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:58.537391 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 17:23:58.537516 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:58.537462 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/909ae085-82dd-4695-a517-7bf565c61c39-original-pull-secret podName:909ae085-82dd-4695-a517-7bf565c61c39 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:02.537441546 +0000 UTC m=+12.063122142 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/909ae085-82dd-4695-a517-7bf565c61c39-original-pull-secret") pod "global-pull-secret-syncer-xkj74" (UID: "909ae085-82dd-4695-a517-7bf565c61c39") : object "kube-system"/"original-pull-secret" not registered
Apr 17 17:23:59.089724 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:59.089607 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xkj74"
Apr 17 17:23:59.089961 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:59.089731 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xkj74" podUID="909ae085-82dd-4695-a517-7bf565c61c39"
Apr 17 17:23:59.644459 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:59.644305 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41eeea20-b1c0-4cb6-8da8-a4a26a60423d-metrics-certs\") pod \"network-metrics-daemon-mqlsd\" (UID: \"41eeea20-b1c0-4cb6-8da8-a4a26a60423d\") " pod="openshift-multus/network-metrics-daemon-mqlsd"
Apr 17 17:23:59.644942 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:59.644500 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:23:59.644942 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:59.644563 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41eeea20-b1c0-4cb6-8da8-a4a26a60423d-metrics-certs podName:41eeea20-b1c0-4cb6-8da8-a4a26a60423d nodeName:}" failed. No retries permitted until 2026-04-17 17:24:07.644543701 +0000 UTC m=+17.170224324 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41eeea20-b1c0-4cb6-8da8-a4a26a60423d-metrics-certs") pod "network-metrics-daemon-mqlsd" (UID: "41eeea20-b1c0-4cb6-8da8-a4a26a60423d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:23:59.745368 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:23:59.745314 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2c6ch\" (UniqueName: \"kubernetes.io/projected/e4fd5db6-98ca-462f-b950-10c0fe775718-kube-api-access-2c6ch\") pod \"network-check-target-tlwhr\" (UID: \"e4fd5db6-98ca-462f-b950-10c0fe775718\") " pod="openshift-network-diagnostics/network-check-target-tlwhr"
Apr 17 17:23:59.745570 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:59.745498 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:23:59.745570 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:59.745522 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:23:59.745570 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:59.745535 2575 projected.go:194] Error preparing data for projected volume kube-api-access-2c6ch for pod openshift-network-diagnostics/network-check-target-tlwhr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:23:59.745761 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:23:59.745595 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e4fd5db6-98ca-462f-b950-10c0fe775718-kube-api-access-2c6ch podName:e4fd5db6-98ca-462f-b950-10c0fe775718 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:07.745576032 +0000 UTC m=+17.271256644 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-2c6ch" (UniqueName: "kubernetes.io/projected/e4fd5db6-98ca-462f-b950-10c0fe775718-kube-api-access-2c6ch") pod "network-check-target-tlwhr" (UID: "e4fd5db6-98ca-462f-b950-10c0fe775718") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:24:00.090196 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:00.090162 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tlwhr"
Apr 17 17:24:00.090358 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:00.090165 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqlsd"
Apr 17 17:24:00.090358 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:00.090273 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tlwhr" podUID="e4fd5db6-98ca-462f-b950-10c0fe775718"
Apr 17 17:24:00.090483 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:00.090384 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqlsd" podUID="41eeea20-b1c0-4cb6-8da8-a4a26a60423d"
Apr 17 17:24:01.091329 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:01.090818 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xkj74"
Apr 17 17:24:01.091329 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:01.090943 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xkj74" podUID="909ae085-82dd-4695-a517-7bf565c61c39"
Apr 17 17:24:02.090085 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:02.090051 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqlsd"
Apr 17 17:24:02.090278 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:02.090051 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tlwhr"
Apr 17 17:24:02.090278 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:02.090185 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqlsd" podUID="41eeea20-b1c0-4cb6-8da8-a4a26a60423d"
Apr 17 17:24:02.090278 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:02.090256 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tlwhr" podUID="e4fd5db6-98ca-462f-b950-10c0fe775718"
Apr 17 17:24:02.569115 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:02.569078 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/909ae085-82dd-4695-a517-7bf565c61c39-original-pull-secret\") pod \"global-pull-secret-syncer-xkj74\" (UID: \"909ae085-82dd-4695-a517-7bf565c61c39\") " pod="kube-system/global-pull-secret-syncer-xkj74"
Apr 17 17:24:02.569564 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:02.569272 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 17:24:02.569564 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:02.569358 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/909ae085-82dd-4695-a517-7bf565c61c39-original-pull-secret podName:909ae085-82dd-4695-a517-7bf565c61c39 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:10.569335517 +0000 UTC m=+20.095016115 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/909ae085-82dd-4695-a517-7bf565c61c39-original-pull-secret") pod "global-pull-secret-syncer-xkj74" (UID: "909ae085-82dd-4695-a517-7bf565c61c39") : object "kube-system"/"original-pull-secret" not registered
Apr 17 17:24:03.090325 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:03.090295 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xkj74"
Apr 17 17:24:03.090517 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:03.090422 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xkj74" podUID="909ae085-82dd-4695-a517-7bf565c61c39"
Apr 17 17:24:04.090284 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:04.090249 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tlwhr"
Apr 17 17:24:04.090669 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:04.090249 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqlsd"
Apr 17 17:24:04.090669 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:04.090361 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tlwhr" podUID="e4fd5db6-98ca-462f-b950-10c0fe775718"
Apr 17 17:24:04.090669 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:04.090430 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqlsd" podUID="41eeea20-b1c0-4cb6-8da8-a4a26a60423d"
Apr 17 17:24:05.089656 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:05.089630 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xkj74"
Apr 17 17:24:05.089842 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:05.089748 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xkj74" podUID="909ae085-82dd-4695-a517-7bf565c61c39"
Apr 17 17:24:06.089464 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:06.089431 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tlwhr"
Apr 17 17:24:06.089920 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:06.089443 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqlsd"
Apr 17 17:24:06.089920 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:06.089577 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tlwhr" podUID="e4fd5db6-98ca-462f-b950-10c0fe775718"
Apr 17 17:24:06.089920 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:06.089660 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqlsd" podUID="41eeea20-b1c0-4cb6-8da8-a4a26a60423d"
Apr 17 17:24:07.090429 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:07.090399 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xkj74"
Apr 17 17:24:07.090951 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:07.090522 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xkj74" podUID="909ae085-82dd-4695-a517-7bf565c61c39"
Apr 17 17:24:07.706786 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:07.706756 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41eeea20-b1c0-4cb6-8da8-a4a26a60423d-metrics-certs\") pod \"network-metrics-daemon-mqlsd\" (UID: \"41eeea20-b1c0-4cb6-8da8-a4a26a60423d\") " pod="openshift-multus/network-metrics-daemon-mqlsd"
Apr 17 17:24:07.706993 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:07.706934 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:24:07.707048 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:07.706997 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41eeea20-b1c0-4cb6-8da8-a4a26a60423d-metrics-certs podName:41eeea20-b1c0-4cb6-8da8-a4a26a60423d nodeName:}" failed. No retries permitted until 2026-04-17 17:24:23.706981559 +0000 UTC m=+33.232662161 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41eeea20-b1c0-4cb6-8da8-a4a26a60423d-metrics-certs") pod "network-metrics-daemon-mqlsd" (UID: "41eeea20-b1c0-4cb6-8da8-a4a26a60423d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:07.807478 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:07.807447 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2c6ch\" (UniqueName: \"kubernetes.io/projected/e4fd5db6-98ca-462f-b950-10c0fe775718-kube-api-access-2c6ch\") pod \"network-check-target-tlwhr\" (UID: \"e4fd5db6-98ca-462f-b950-10c0fe775718\") " pod="openshift-network-diagnostics/network-check-target-tlwhr" Apr 17 17:24:07.807661 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:07.807594 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:24:07.807661 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:07.807614 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:24:07.807661 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:07.807625 2575 projected.go:194] Error preparing data for projected volume kube-api-access-2c6ch for pod openshift-network-diagnostics/network-check-target-tlwhr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:07.807812 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:07.807683 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e4fd5db6-98ca-462f-b950-10c0fe775718-kube-api-access-2c6ch podName:e4fd5db6-98ca-462f-b950-10c0fe775718 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:24:23.807666372 +0000 UTC m=+33.333346987 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-2c6ch" (UniqueName: "kubernetes.io/projected/e4fd5db6-98ca-462f-b950-10c0fe775718-kube-api-access-2c6ch") pod "network-check-target-tlwhr" (UID: "e4fd5db6-98ca-462f-b950-10c0fe775718") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:08.090178 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:08.090148 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tlwhr" Apr 17 17:24:08.090178 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:08.090168 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqlsd" Apr 17 17:24:08.090491 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:08.090252 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tlwhr" podUID="e4fd5db6-98ca-462f-b950-10c0fe775718" Apr 17 17:24:08.090491 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:08.090399 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mqlsd" podUID="41eeea20-b1c0-4cb6-8da8-a4a26a60423d" Apr 17 17:24:09.089369 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:09.089339 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xkj74" Apr 17 17:24:09.089546 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:09.089455 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xkj74" podUID="909ae085-82dd-4695-a517-7bf565c61c39" Apr 17 17:24:10.089577 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:10.089542 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tlwhr" Apr 17 17:24:10.089947 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:10.089601 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqlsd" Apr 17 17:24:10.089947 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:10.089692 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mqlsd" podUID="41eeea20-b1c0-4cb6-8da8-a4a26a60423d" Apr 17 17:24:10.089947 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:10.089863 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tlwhr" podUID="e4fd5db6-98ca-462f-b950-10c0fe775718" Apr 17 17:24:10.628790 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:10.628768 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/909ae085-82dd-4695-a517-7bf565c61c39-original-pull-secret\") pod \"global-pull-secret-syncer-xkj74\" (UID: \"909ae085-82dd-4695-a517-7bf565c61c39\") " pod="kube-system/global-pull-secret-syncer-xkj74" Apr 17 17:24:10.628899 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:10.628887 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:10.628937 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:10.628933 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/909ae085-82dd-4695-a517-7bf565c61c39-original-pull-secret podName:909ae085-82dd-4695-a517-7bf565c61c39 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:26.628920577 +0000 UTC m=+36.154601174 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/909ae085-82dd-4695-a517-7bf565c61c39-original-pull-secret") pod "global-pull-secret-syncer-xkj74" (UID: "909ae085-82dd-4695-a517-7bf565c61c39") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:24:11.091579 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:11.091387 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xkj74" Apr 17 17:24:11.092046 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:11.091620 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xkj74" podUID="909ae085-82dd-4695-a517-7bf565c61c39" Apr 17 17:24:11.246839 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:11.246796 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wgh2j" event={"ID":"3f7324f4-55c2-40da-9869-47ec0880aec3","Type":"ContainerStarted","Data":"250050fbf3a37a602d6a512f85c482d7e8a254a229f8bce81e5f69f411d4889e"} Apr 17 17:24:11.248493 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:11.248392 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" event={"ID":"ae488321-502d-450d-a483-234f0aff8bb3","Type":"ContainerStarted","Data":"6e8ebc05ffa2bfc38019c8254fede9fe29a185f75812051ee335b2c0bfef1042"} Apr 17 17:24:11.249999 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:11.249972 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" event={"ID":"3ff73a5e-853e-4f01-b8d7-e995977da39f","Type":"ContainerStarted","Data":"3de1e8563a38daf387096e5df2d7694129f2c7c054167fe4e812c2ac81607eb5"} Apr 
17 17:24:11.251658 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:11.251632 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-33.ec2.internal" event={"ID":"f04a4241565e19b6ce163e5a36620c34","Type":"ContainerStarted","Data":"5aee499ee08aa6d7c5b85f1a5914ebe521d2ab435e27c51fd23999e88189505c"} Apr 17 17:24:11.266084 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:11.265993 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-wgh2j" podStartSLOduration=2.102079676 podStartE2EDuration="20.265970539s" podCreationTimestamp="2026-04-17 17:23:51 +0000 UTC" firstStartedPulling="2026-04-17 17:23:52.446760633 +0000 UTC m=+1.972441230" lastFinishedPulling="2026-04-17 17:24:10.610651497 +0000 UTC m=+20.136332093" observedRunningTime="2026-04-17 17:24:11.263890084 +0000 UTC m=+20.789570704" watchObservedRunningTime="2026-04-17 17:24:11.265970539 +0000 UTC m=+20.791651159" Apr 17 17:24:11.286785 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:11.286693 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-cx8bv" podStartSLOduration=2.080545897 podStartE2EDuration="20.286678873s" podCreationTimestamp="2026-04-17 17:23:51 +0000 UTC" firstStartedPulling="2026-04-17 17:23:52.399146919 +0000 UTC m=+1.924827517" lastFinishedPulling="2026-04-17 17:24:10.605279893 +0000 UTC m=+20.130960493" observedRunningTime="2026-04-17 17:24:11.286049197 +0000 UTC m=+20.811729832" watchObservedRunningTime="2026-04-17 17:24:11.286678873 +0000 UTC m=+20.812359492" Apr 17 17:24:11.305800 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:11.305749 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-33.ec2.internal" podStartSLOduration=20.305731173 podStartE2EDuration="20.305731173s" podCreationTimestamp="2026-04-17 17:23:51 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:24:11.305546206 +0000 UTC m=+20.831226818" watchObservedRunningTime="2026-04-17 17:24:11.305731173 +0000 UTC m=+20.831411793" Apr 17 17:24:12.090267 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:12.090099 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tlwhr" Apr 17 17:24:12.090433 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:12.090162 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqlsd" Apr 17 17:24:12.090433 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:12.090350 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tlwhr" podUID="e4fd5db6-98ca-462f-b950-10c0fe775718" Apr 17 17:24:12.090433 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:12.090409 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mqlsd" podUID="41eeea20-b1c0-4cb6-8da8-a4a26a60423d" Apr 17 17:24:12.254914 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:12.254877 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qd2qr" event={"ID":"adfae8ba-8d02-4f3c-85a7-b2ae828b0579","Type":"ContainerStarted","Data":"742230898ad6676a815dc4492ab0c449cfb1c8ca1fa27c35fa0d113ee61b4a1f"} Apr 17 17:24:12.256263 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:12.256233 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbqb6" event={"ID":"6c57d472-0514-46ce-bf7a-f6af067b3f5d","Type":"ContainerStarted","Data":"65b09a66b1f21dad8e59f5f1c4e578e26287dc46c6112e80976f533c3daa7954"} Apr 17 17:24:12.257704 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:12.257672 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nhs5m" event={"ID":"ccb0409d-b96e-4a69-8ff3-e260b869ecdf","Type":"ContainerStarted","Data":"d10d0e2cddc26cadd521e2ced9912b676b6067c14e46cb50eccb6c3e16bb77a0"} Apr 17 17:24:12.259287 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:12.259260 2575 generic.go:358] "Generic (PLEG): container finished" podID="c1cd4a3b95297fd9fd160804093bee86" containerID="71dc471e4cb461ff5b382ec080d7762178af945175d4c77f1e8c4e308def2e64" exitCode=0 Apr 17 17:24:12.259406 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:12.259319 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-33.ec2.internal" event={"ID":"c1cd4a3b95297fd9fd160804093bee86","Type":"ContainerDied","Data":"71dc471e4cb461ff5b382ec080d7762178af945175d4c77f1e8c4e308def2e64"} Apr 17 17:24:12.262315 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:12.262283 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" 
event={"ID":"3ff73a5e-853e-4f01-b8d7-e995977da39f","Type":"ContainerStarted","Data":"366d3f15c5b93d8948baaeeea19c1c36d3313c37e146b51121d2e5774e3b2a74"} Apr 17 17:24:12.262315 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:12.262314 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" event={"ID":"3ff73a5e-853e-4f01-b8d7-e995977da39f","Type":"ContainerStarted","Data":"c160fa8e5904aba4c07174945861fb1ceb98d6c035cdaf3e91a229d1693f8ed1"} Apr 17 17:24:12.262478 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:12.262327 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" event={"ID":"3ff73a5e-853e-4f01-b8d7-e995977da39f","Type":"ContainerStarted","Data":"5c27c3c236d1ee8b299c31d174d46b0824adbfb52beb6eb07932cfa97f13a3ca"} Apr 17 17:24:12.262478 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:12.262338 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" event={"ID":"3ff73a5e-853e-4f01-b8d7-e995977da39f","Type":"ContainerStarted","Data":"572b418bd137ea9f143a8e5e3b4975417a232ea2e56902fc1b6aa4403d41de50"} Apr 17 17:24:12.262478 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:12.262349 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" event={"ID":"3ff73a5e-853e-4f01-b8d7-e995977da39f","Type":"ContainerStarted","Data":"81e815e4bd98592af82693cade8229a0d92d3b6961516068f02963cf70581ab3"} Apr 17 17:24:12.263675 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:12.263653 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fxfbb" event={"ID":"70cf02df-8c16-47aa-8b0e-1b1ee895fe07","Type":"ContainerStarted","Data":"54e6de31e86fac4014eb849a998cdce0a416ce7a8a85b48a0490958dfb7686e7"} Apr 17 17:24:12.265226 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:12.265204 2575 generic.go:358] "Generic (PLEG): container finished" 
podID="adeb035c-390a-4439-9413-491fc20cac69" containerID="5cc5af0a840daa0e948096d9636509c71a2756917cebc0ccf79b5fa4afc88602" exitCode=0 Apr 17 17:24:12.265324 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:12.265262 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hprdr" event={"ID":"adeb035c-390a-4439-9413-491fc20cac69","Type":"ContainerDied","Data":"5cc5af0a840daa0e948096d9636509c71a2756917cebc0ccf79b5fa4afc88602"} Apr 17 17:24:12.266772 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:12.266749 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rj22k" event={"ID":"06cbfbab-93e8-4744-be0c-f8d8adfb094d","Type":"ContainerStarted","Data":"04c654216b50bac9ed5eab452f24da570ce7d46b96d0d04b8d39e01418bd830b"} Apr 17 17:24:12.268245 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:12.268212 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-qd2qr" podStartSLOduration=3.015633893 podStartE2EDuration="21.268201594s" podCreationTimestamp="2026-04-17 17:23:51 +0000 UTC" firstStartedPulling="2026-04-17 17:23:52.351226397 +0000 UTC m=+1.876906998" lastFinishedPulling="2026-04-17 17:24:10.603794097 +0000 UTC m=+20.129474699" observedRunningTime="2026-04-17 17:24:12.268055734 +0000 UTC m=+21.793736354" watchObservedRunningTime="2026-04-17 17:24:12.268201594 +0000 UTC m=+21.793882213" Apr 17 17:24:12.280641 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:12.280598 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-fxfbb" podStartSLOduration=3.067339799 podStartE2EDuration="21.280584425s" podCreationTimestamp="2026-04-17 17:23:51 +0000 UTC" firstStartedPulling="2026-04-17 17:23:52.364418133 +0000 UTC m=+1.890098730" lastFinishedPulling="2026-04-17 17:24:10.577662756 +0000 UTC m=+20.103343356" observedRunningTime="2026-04-17 17:24:12.280364877 +0000 UTC 
m=+21.806045498" watchObservedRunningTime="2026-04-17 17:24:12.280584425 +0000 UTC m=+21.806265046" Apr 17 17:24:12.344090 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:12.344004 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-nhs5m" podStartSLOduration=3.053400628 podStartE2EDuration="21.343987753s" podCreationTimestamp="2026-04-17 17:23:51 +0000 UTC" firstStartedPulling="2026-04-17 17:23:52.287170291 +0000 UTC m=+1.812850887" lastFinishedPulling="2026-04-17 17:24:10.577757407 +0000 UTC m=+20.103438012" observedRunningTime="2026-04-17 17:24:12.328383282 +0000 UTC m=+21.854063935" watchObservedRunningTime="2026-04-17 17:24:12.343987753 +0000 UTC m=+21.869668371" Apr 17 17:24:12.344464 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:12.344427 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-rj22k" podStartSLOduration=3.148483731 podStartE2EDuration="21.344419026s" podCreationTimestamp="2026-04-17 17:23:51 +0000 UTC" firstStartedPulling="2026-04-17 17:23:52.382084357 +0000 UTC m=+1.907764954" lastFinishedPulling="2026-04-17 17:24:10.578019646 +0000 UTC m=+20.103700249" observedRunningTime="2026-04-17 17:24:12.344121462 +0000 UTC m=+21.869802082" watchObservedRunningTime="2026-04-17 17:24:12.344419026 +0000 UTC m=+21.870099647" Apr 17 17:24:12.763386 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:12.763363 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 17:24:13.050834 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:13.050727 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T17:24:12.763382265Z","UUID":"a631d9e7-baad-4b79-b007-11b4e6ede439","Handler":null,"Name":"","Endpoint":""} Apr 17 
17:24:13.052808 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:13.052641 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 17:24:13.052808 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:13.052670 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 17:24:13.093478 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:13.093452 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xkj74" Apr 17 17:24:13.093686 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:13.093668 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-xkj74" podUID="909ae085-82dd-4695-a517-7bf565c61c39" Apr 17 17:24:13.270013 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:13.269975 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbqb6" event={"ID":"6c57d472-0514-46ce-bf7a-f6af067b3f5d","Type":"ContainerStarted","Data":"49a56ee556d53bf629837ec46fa05d67cc96a6372a91557bf6077302d698c8bd"} Apr 17 17:24:13.272292 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:13.272261 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-33.ec2.internal" event={"ID":"c1cd4a3b95297fd9fd160804093bee86","Type":"ContainerStarted","Data":"0907b612947a078caebbc5f91b95e3b64b22bec2a7eecf4b3e331b7387ae2c4f"} Apr 17 17:24:13.297202 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:13.297159 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-33.ec2.internal" podStartSLOduration=22.297143083 podStartE2EDuration="22.297143083s" podCreationTimestamp="2026-04-17 17:23:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:24:13.297116316 +0000 UTC m=+22.822796956" watchObservedRunningTime="2026-04-17 17:24:13.297143083 +0000 UTC m=+22.822823703" Apr 17 17:24:14.089631 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:14.089455 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqlsd" Apr 17 17:24:14.089859 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:14.089736 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqlsd" podUID="41eeea20-b1c0-4cb6-8da8-a4a26a60423d" Apr 17 17:24:14.089859 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:14.089455 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tlwhr" Apr 17 17:24:14.089859 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:14.089846 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-tlwhr" podUID="e4fd5db6-98ca-462f-b950-10c0fe775718" Apr 17 17:24:14.276430 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:14.276394 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" event={"ID":"3ff73a5e-853e-4f01-b8d7-e995977da39f","Type":"ContainerStarted","Data":"ebc6d2731016a2461804e0237adec92194991c17d12d4eb0dcd3a675ecfe5957"} Apr 17 17:24:14.278040 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:14.278009 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbqb6" event={"ID":"6c57d472-0514-46ce-bf7a-f6af067b3f5d","Type":"ContainerStarted","Data":"8e29b8a2ad464eef1d5b2efc5e64bf176a5c3e5623d0e4496427fed4e224ad7d"} Apr 17 17:24:15.092297 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:15.092271 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xkj74" Apr 17 17:24:15.092439 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:15.092391 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-xkj74" podUID="909ae085-82dd-4695-a517-7bf565c61c39" Apr 17 17:24:15.557842 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:15.557788 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-nhs5m" Apr 17 17:24:15.558575 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:15.558554 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-nhs5m" Apr 17 17:24:15.573139 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:15.573094 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbqb6" podStartSLOduration=3.256245253 podStartE2EDuration="24.573077075s" podCreationTimestamp="2026-04-17 17:23:51 +0000 UTC" firstStartedPulling="2026-04-17 17:23:52.326587421 +0000 UTC m=+1.852268018" lastFinishedPulling="2026-04-17 17:24:13.643419237 +0000 UTC m=+23.169099840" observedRunningTime="2026-04-17 17:24:14.295262272 +0000 UTC m=+23.820942891" watchObservedRunningTime="2026-04-17 17:24:15.573077075 +0000 UTC m=+25.098757694" Apr 17 17:24:16.089759 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:16.089728 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tlwhr" Apr 17 17:24:16.089944 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:16.089735 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqlsd" Apr 17 17:24:16.089944 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:16.089859 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-tlwhr" podUID="e4fd5db6-98ca-462f-b950-10c0fe775718" Apr 17 17:24:16.090032 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:16.089955 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqlsd" podUID="41eeea20-b1c0-4cb6-8da8-a4a26a60423d" Apr 17 17:24:16.281421 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:16.281391 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-nhs5m" Apr 17 17:24:16.282056 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:16.281906 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-nhs5m" Apr 17 17:24:17.092894 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:17.092723 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xkj74" Apr 17 17:24:17.092894 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:17.092813 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-xkj74" podUID="909ae085-82dd-4695-a517-7bf565c61c39" Apr 17 17:24:17.286485 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:17.286273 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" event={"ID":"3ff73a5e-853e-4f01-b8d7-e995977da39f","Type":"ContainerStarted","Data":"a5799077b4b478319f963d5ed095a943d3434cc9870831b78daca27b139c7dec"} Apr 17 17:24:17.286650 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:17.286531 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:24:17.286650 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:17.286570 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:24:17.300491 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:17.300471 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:24:17.317946 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:17.317904 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" podStartSLOduration=7.549370028 podStartE2EDuration="26.317888955s" podCreationTimestamp="2026-04-17 17:23:51 +0000 UTC" firstStartedPulling="2026-04-17 17:23:52.378850789 +0000 UTC m=+1.904531386" lastFinishedPulling="2026-04-17 17:24:11.147369716 +0000 UTC m=+20.673050313" observedRunningTime="2026-04-17 17:24:17.316874633 +0000 UTC m=+26.842555253" watchObservedRunningTime="2026-04-17 17:24:17.317888955 +0000 UTC m=+26.843569574" Apr 17 17:24:18.089788 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:18.089750 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tlwhr" Apr 17 17:24:18.089949 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:18.089761 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqlsd" Apr 17 17:24:18.089949 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:18.089863 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tlwhr" podUID="e4fd5db6-98ca-462f-b950-10c0fe775718" Apr 17 17:24:18.089949 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:18.089926 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mqlsd" podUID="41eeea20-b1c0-4cb6-8da8-a4a26a60423d" Apr 17 17:24:18.290498 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:18.290462 2575 generic.go:358] "Generic (PLEG): container finished" podID="adeb035c-390a-4439-9413-491fc20cac69" containerID="75186a33406792aba592e274868d2baaae2f1e91634a435055d2ecbf6a241d90" exitCode=0 Apr 17 17:24:18.290975 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:18.290545 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hprdr" event={"ID":"adeb035c-390a-4439-9413-491fc20cac69","Type":"ContainerDied","Data":"75186a33406792aba592e274868d2baaae2f1e91634a435055d2ecbf6a241d90"} Apr 17 17:24:18.291255 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:18.291237 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:24:18.307815 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:18.307795 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl" Apr 17 17:24:19.060638 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:19.060614 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xkj74"] Apr 17 17:24:19.060752 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:19.060745 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xkj74" Apr 17 17:24:19.060876 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:19.060858 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-xkj74" podUID="909ae085-82dd-4695-a517-7bf565c61c39" Apr 17 17:24:19.061196 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:19.061177 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mqlsd"] Apr 17 17:24:19.061262 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:19.061251 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqlsd" Apr 17 17:24:19.061356 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:19.061340 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqlsd" podUID="41eeea20-b1c0-4cb6-8da8-a4a26a60423d" Apr 17 17:24:19.063035 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:19.062867 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-tlwhr"] Apr 17 17:24:19.063035 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:19.062948 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tlwhr" Apr 17 17:24:19.063117 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:19.063039 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-tlwhr" podUID="e4fd5db6-98ca-462f-b950-10c0fe775718" Apr 17 17:24:19.293877 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:19.293841 2575 generic.go:358] "Generic (PLEG): container finished" podID="adeb035c-390a-4439-9413-491fc20cac69" containerID="0073e9445f5a7bfebe08ae6345004a405a841d4f9a686deba20729fcb2d096cc" exitCode=0 Apr 17 17:24:19.294311 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:19.293930 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hprdr" event={"ID":"adeb035c-390a-4439-9413-491fc20cac69","Type":"ContainerDied","Data":"0073e9445f5a7bfebe08ae6345004a405a841d4f9a686deba20729fcb2d096cc"} Apr 17 17:24:20.297745 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:20.297514 2575 generic.go:358] "Generic (PLEG): container finished" podID="adeb035c-390a-4439-9413-491fc20cac69" containerID="1e428a69152c6548355c07987c13801ff67505f550d2eda62358f4628513891b" exitCode=0 Apr 17 17:24:20.298115 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:20.297591 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hprdr" event={"ID":"adeb035c-390a-4439-9413-491fc20cac69","Type":"ContainerDied","Data":"1e428a69152c6548355c07987c13801ff67505f550d2eda62358f4628513891b"} Apr 17 17:24:21.090198 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:21.090172 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqlsd" Apr 17 17:24:21.090370 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:21.090252 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mqlsd" podUID="41eeea20-b1c0-4cb6-8da8-a4a26a60423d" Apr 17 17:24:21.090370 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:21.090349 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xkj74" Apr 17 17:24:21.090478 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:21.090455 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xkj74" podUID="909ae085-82dd-4695-a517-7bf565c61c39" Apr 17 17:24:21.090531 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:21.090509 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tlwhr" Apr 17 17:24:21.090603 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:21.090586 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tlwhr" podUID="e4fd5db6-98ca-462f-b950-10c0fe775718" Apr 17 17:24:23.092956 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:23.092922 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tlwhr" Apr 17 17:24:23.092956 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:23.092953 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-xkj74" Apr 17 17:24:23.093737 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:23.092927 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqlsd" Apr 17 17:24:23.093737 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:23.093020 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-tlwhr" podUID="e4fd5db6-98ca-462f-b950-10c0fe775718" Apr 17 17:24:23.093737 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:23.093114 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqlsd" podUID="41eeea20-b1c0-4cb6-8da8-a4a26a60423d" Apr 17 17:24:23.093737 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:23.093187 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-xkj74" podUID="909ae085-82dd-4695-a517-7bf565c61c39" Apr 17 17:24:23.728994 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:23.728958 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41eeea20-b1c0-4cb6-8da8-a4a26a60423d-metrics-certs\") pod \"network-metrics-daemon-mqlsd\" (UID: \"41eeea20-b1c0-4cb6-8da8-a4a26a60423d\") " pod="openshift-multus/network-metrics-daemon-mqlsd" Apr 17 17:24:23.729150 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:23.729060 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:23.729150 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:23.729121 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41eeea20-b1c0-4cb6-8da8-a4a26a60423d-metrics-certs podName:41eeea20-b1c0-4cb6-8da8-a4a26a60423d nodeName:}" failed. No retries permitted until 2026-04-17 17:24:55.729104827 +0000 UTC m=+65.254785430 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41eeea20-b1c0-4cb6-8da8-a4a26a60423d-metrics-certs") pod "network-metrics-daemon-mqlsd" (UID: "41eeea20-b1c0-4cb6-8da8-a4a26a60423d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:23.788690 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:23.788665 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-33.ec2.internal" event="NodeReady" Apr 17 17:24:23.788850 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:23.788805 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 17:24:23.829959 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:23.829923 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2c6ch\" (UniqueName: \"kubernetes.io/projected/e4fd5db6-98ca-462f-b950-10c0fe775718-kube-api-access-2c6ch\") pod \"network-check-target-tlwhr\" (UID: \"e4fd5db6-98ca-462f-b950-10c0fe775718\") " pod="openshift-network-diagnostics/network-check-target-tlwhr" Apr 17 17:24:23.830124 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:23.830097 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:24:23.830124 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:23.830121 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:24:23.830248 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:23.830134 2575 projected.go:194] Error preparing data for projected volume kube-api-access-2c6ch for pod openshift-network-diagnostics/network-check-target-tlwhr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Apr 17 17:24:23.830248 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:23.830192 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e4fd5db6-98ca-462f-b950-10c0fe775718-kube-api-access-2c6ch podName:e4fd5db6-98ca-462f-b950-10c0fe775718 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:55.830173858 +0000 UTC m=+65.355854475 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-2c6ch" (UniqueName: "kubernetes.io/projected/e4fd5db6-98ca-462f-b950-10c0fe775718-kube-api-access-2c6ch") pod "network-check-target-tlwhr" (UID: "e4fd5db6-98ca-462f-b950-10c0fe775718") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:23.831304 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:23.831279 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7788f749ff-bpwmb"] Apr 17 17:24:23.868755 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:23.868694 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wqv6n"] Apr 17 17:24:23.868903 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:23.868877 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" Apr 17 17:24:23.871233 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:23.871212 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 17:24:23.871479 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:23.871465 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jk79w\"" Apr 17 17:24:23.871533 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:23.871476 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 17:24:23.871579 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:23.871540 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 17:24:23.885689 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:23.885668 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 17:24:23.888885 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:23.888865 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7788f749ff-bpwmb"] Apr 17 17:24:23.888885 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:23.888887 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wqv6n"] Apr 17 17:24:23.889002 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:23.888969 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-wqv6n" Apr 17 17:24:23.890832 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:23.890803 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 17:24:23.890906 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:23.890811 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 17:24:23.890906 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:23.890857 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qbw75\"" Apr 17 17:24:23.942046 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:23.942011 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6r887"] Apr 17 17:24:23.956702 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:23.956677 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6r887"] Apr 17 17:24:23.956834 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:23.956808 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6r887" Apr 17 17:24:23.959076 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:23.959057 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hd29f\"" Apr 17 17:24:23.959230 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:23.959057 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 17:24:23.959304 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:23.959090 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 17:24:23.959445 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:23.959165 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 17:24:24.031288 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.031255 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb895382-4679-4cac-97c0-92e3122b7ba0-config-volume\") pod \"dns-default-wqv6n\" (UID: \"cb895382-4679-4cac-97c0-92e3122b7ba0\") " pod="openshift-dns/dns-default-wqv6n" Apr 17 17:24:24.031288 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.031299 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb895382-4679-4cac-97c0-92e3122b7ba0-metrics-tls\") pod \"dns-default-wqv6n\" (UID: \"cb895382-4679-4cac-97c0-92e3122b7ba0\") " pod="openshift-dns/dns-default-wqv6n" Apr 17 17:24:24.031489 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.031326 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/2d29d8c9-146b-4e7e-988e-c3d984ff39e7-cert\") pod \"ingress-canary-6r887\" (UID: \"2d29d8c9-146b-4e7e-988e-c3d984ff39e7\") " pod="openshift-ingress-canary/ingress-canary-6r887" Apr 17 17:24:24.031489 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.031356 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt925\" (UniqueName: \"kubernetes.io/projected/cb895382-4679-4cac-97c0-92e3122b7ba0-kube-api-access-qt925\") pod \"dns-default-wqv6n\" (UID: \"cb895382-4679-4cac-97c0-92e3122b7ba0\") " pod="openshift-dns/dns-default-wqv6n" Apr 17 17:24:24.031489 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.031400 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cplp9\" (UniqueName: \"kubernetes.io/projected/2d29d8c9-146b-4e7e-988e-c3d984ff39e7-kube-api-access-cplp9\") pod \"ingress-canary-6r887\" (UID: \"2d29d8c9-146b-4e7e-988e-c3d984ff39e7\") " pod="openshift-ingress-canary/ingress-canary-6r887" Apr 17 17:24:24.031489 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.031425 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-tls\") pod \"image-registry-7788f749ff-bpwmb\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") " pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" Apr 17 17:24:24.031681 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.031487 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-image-registry-private-configuration\") pod \"image-registry-7788f749ff-bpwmb\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") " pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" Apr 17 
17:24:24.031681 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.031589 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-ca-trust-extracted\") pod \"image-registry-7788f749ff-bpwmb\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") " pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" Apr 17 17:24:24.031681 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.031615 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-bound-sa-token\") pod \"image-registry-7788f749ff-bpwmb\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") " pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" Apr 17 17:24:24.031681 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.031647 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-certificates\") pod \"image-registry-7788f749ff-bpwmb\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") " pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" Apr 17 17:24:24.031911 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.031728 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4prr\" (UniqueName: \"kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-kube-api-access-d4prr\") pod \"image-registry-7788f749ff-bpwmb\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") " pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" Apr 17 17:24:24.031911 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.031797 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cb895382-4679-4cac-97c0-92e3122b7ba0-tmp-dir\") pod \"dns-default-wqv6n\" (UID: \"cb895382-4679-4cac-97c0-92e3122b7ba0\") " pod="openshift-dns/dns-default-wqv6n" Apr 17 17:24:24.031911 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.031848 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-installation-pull-secrets\") pod \"image-registry-7788f749ff-bpwmb\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") " pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" Apr 17 17:24:24.031911 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.031894 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-trusted-ca\") pod \"image-registry-7788f749ff-bpwmb\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") " pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" Apr 17 17:24:24.132797 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.132711 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-ca-trust-extracted\") pod \"image-registry-7788f749ff-bpwmb\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") " pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" Apr 17 17:24:24.132797 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.132758 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-bound-sa-token\") pod \"image-registry-7788f749ff-bpwmb\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") " pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" Apr 17 
17:24:24.132797 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.132796 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-certificates\") pod \"image-registry-7788f749ff-bpwmb\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") " pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" Apr 17 17:24:24.133446 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.132813 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4prr\" (UniqueName: \"kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-kube-api-access-d4prr\") pod \"image-registry-7788f749ff-bpwmb\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") " pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" Apr 17 17:24:24.133446 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.132867 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cb895382-4679-4cac-97c0-92e3122b7ba0-tmp-dir\") pod \"dns-default-wqv6n\" (UID: \"cb895382-4679-4cac-97c0-92e3122b7ba0\") " pod="openshift-dns/dns-default-wqv6n" Apr 17 17:24:24.133446 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.132912 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-installation-pull-secrets\") pod \"image-registry-7788f749ff-bpwmb\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") " pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" Apr 17 17:24:24.133446 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.132946 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-trusted-ca\") pod 
\"image-registry-7788f749ff-bpwmb\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") " pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" Apr 17 17:24:24.133446 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.132971 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb895382-4679-4cac-97c0-92e3122b7ba0-config-volume\") pod \"dns-default-wqv6n\" (UID: \"cb895382-4679-4cac-97c0-92e3122b7ba0\") " pod="openshift-dns/dns-default-wqv6n" Apr 17 17:24:24.133446 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.132994 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb895382-4679-4cac-97c0-92e3122b7ba0-metrics-tls\") pod \"dns-default-wqv6n\" (UID: \"cb895382-4679-4cac-97c0-92e3122b7ba0\") " pod="openshift-dns/dns-default-wqv6n" Apr 17 17:24:24.133446 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.133193 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d29d8c9-146b-4e7e-988e-c3d984ff39e7-cert\") pod \"ingress-canary-6r887\" (UID: \"2d29d8c9-146b-4e7e-988e-c3d984ff39e7\") " pod="openshift-ingress-canary/ingress-canary-6r887" Apr 17 17:24:24.133446 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.133231 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qt925\" (UniqueName: \"kubernetes.io/projected/cb895382-4679-4cac-97c0-92e3122b7ba0-kube-api-access-qt925\") pod \"dns-default-wqv6n\" (UID: \"cb895382-4679-4cac-97c0-92e3122b7ba0\") " pod="openshift-dns/dns-default-wqv6n" Apr 17 17:24:24.133446 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.133260 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cb895382-4679-4cac-97c0-92e3122b7ba0-tmp-dir\") pod \"dns-default-wqv6n\" (UID: 
\"cb895382-4679-4cac-97c0-92e3122b7ba0\") " pod="openshift-dns/dns-default-wqv6n" Apr 17 17:24:24.133446 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.133270 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cplp9\" (UniqueName: \"kubernetes.io/projected/2d29d8c9-146b-4e7e-988e-c3d984ff39e7-kube-api-access-cplp9\") pod \"ingress-canary-6r887\" (UID: \"2d29d8c9-146b-4e7e-988e-c3d984ff39e7\") " pod="openshift-ingress-canary/ingress-canary-6r887" Apr 17 17:24:24.133446 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.133295 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-tls\") pod \"image-registry-7788f749ff-bpwmb\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") " pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" Apr 17 17:24:24.133446 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.133329 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-image-registry-private-configuration\") pod \"image-registry-7788f749ff-bpwmb\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") " pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" Apr 17 17:24:24.133446 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:24.133335 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:24:24.133446 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:24.133433 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:24:24.133952 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:24.133460 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/cb895382-4679-4cac-97c0-92e3122b7ba0-metrics-tls podName:cb895382-4679-4cac-97c0-92e3122b7ba0 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:24.633441268 +0000 UTC m=+34.159121870 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cb895382-4679-4cac-97c0-92e3122b7ba0-metrics-tls") pod "dns-default-wqv6n" (UID: "cb895382-4679-4cac-97c0-92e3122b7ba0") : secret "dns-default-metrics-tls" not found Apr 17 17:24:24.133952 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:24.133466 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:24:24.133952 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:24.133479 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7788f749ff-bpwmb: secret "image-registry-tls" not found Apr 17 17:24:24.133952 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:24.133483 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d29d8c9-146b-4e7e-988e-c3d984ff39e7-cert podName:2d29d8c9-146b-4e7e-988e-c3d984ff39e7 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:24.633465894 +0000 UTC m=+34.159146507 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2d29d8c9-146b-4e7e-988e-c3d984ff39e7-cert") pod "ingress-canary-6r887" (UID: "2d29d8c9-146b-4e7e-988e-c3d984ff39e7") : secret "canary-serving-cert" not found Apr 17 17:24:24.133952 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:24.133521 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-tls podName:5dd7aab7-9b6e-47df-9f4f-c7e5041066d0 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:24:24.633510529 +0000 UTC m=+34.159191139 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-tls") pod "image-registry-7788f749ff-bpwmb" (UID: "5dd7aab7-9b6e-47df-9f4f-c7e5041066d0") : secret "image-registry-tls" not found Apr 17 17:24:24.133952 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.133910 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb895382-4679-4cac-97c0-92e3122b7ba0-config-volume\") pod \"dns-default-wqv6n\" (UID: \"cb895382-4679-4cac-97c0-92e3122b7ba0\") " pod="openshift-dns/dns-default-wqv6n" Apr 17 17:24:24.139766 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.139740 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-ca-trust-extracted\") pod \"image-registry-7788f749ff-bpwmb\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") " pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" Apr 17 17:24:24.141276 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.141072 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-trusted-ca\") pod \"image-registry-7788f749ff-bpwmb\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") " pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" Apr 17 17:24:24.144282 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.144240 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-bound-sa-token\") pod \"image-registry-7788f749ff-bpwmb\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") " 
pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" Apr 17 17:24:24.144368 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.144285 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt925\" (UniqueName: \"kubernetes.io/projected/cb895382-4679-4cac-97c0-92e3122b7ba0-kube-api-access-qt925\") pod \"dns-default-wqv6n\" (UID: \"cb895382-4679-4cac-97c0-92e3122b7ba0\") " pod="openshift-dns/dns-default-wqv6n" Apr 17 17:24:24.144368 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.144323 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-image-registry-private-configuration\") pod \"image-registry-7788f749ff-bpwmb\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") " pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" Apr 17 17:24:24.144368 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.144349 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4prr\" (UniqueName: \"kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-kube-api-access-d4prr\") pod \"image-registry-7788f749ff-bpwmb\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") " pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" Apr 17 17:24:24.144536 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.144387 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cplp9\" (UniqueName: \"kubernetes.io/projected/2d29d8c9-146b-4e7e-988e-c3d984ff39e7-kube-api-access-cplp9\") pod \"ingress-canary-6r887\" (UID: \"2d29d8c9-146b-4e7e-988e-c3d984ff39e7\") " pod="openshift-ingress-canary/ingress-canary-6r887" Apr 17 17:24:24.145164 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.145144 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-installation-pull-secrets\") pod \"image-registry-7788f749ff-bpwmb\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") " pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" Apr 17 17:24:24.149556 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.149537 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-certificates\") pod \"image-registry-7788f749ff-bpwmb\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") " pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" Apr 17 17:24:24.637093 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.637057 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb895382-4679-4cac-97c0-92e3122b7ba0-metrics-tls\") pod \"dns-default-wqv6n\" (UID: \"cb895382-4679-4cac-97c0-92e3122b7ba0\") " pod="openshift-dns/dns-default-wqv6n" Apr 17 17:24:24.637093 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.637102 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d29d8c9-146b-4e7e-988e-c3d984ff39e7-cert\") pod \"ingress-canary-6r887\" (UID: \"2d29d8c9-146b-4e7e-988e-c3d984ff39e7\") " pod="openshift-ingress-canary/ingress-canary-6r887" Apr 17 17:24:24.637337 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:24.637212 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:24:24.637337 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:24.637219 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:24:24.637337 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:24.637248 2575 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-tls\") pod \"image-registry-7788f749ff-bpwmb\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") " pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" Apr 17 17:24:24.637337 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:24.637292 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d29d8c9-146b-4e7e-988e-c3d984ff39e7-cert podName:2d29d8c9-146b-4e7e-988e-c3d984ff39e7 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:25.637273023 +0000 UTC m=+35.162953633 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2d29d8c9-146b-4e7e-988e-c3d984ff39e7-cert") pod "ingress-canary-6r887" (UID: "2d29d8c9-146b-4e7e-988e-c3d984ff39e7") : secret "canary-serving-cert" not found Apr 17 17:24:24.637337 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:24.637311 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:24:24.637337 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:24.637318 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb895382-4679-4cac-97c0-92e3122b7ba0-metrics-tls podName:cb895382-4679-4cac-97c0-92e3122b7ba0 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:25.637308547 +0000 UTC m=+35.162989157 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cb895382-4679-4cac-97c0-92e3122b7ba0-metrics-tls") pod "dns-default-wqv6n" (UID: "cb895382-4679-4cac-97c0-92e3122b7ba0") : secret "dns-default-metrics-tls" not found Apr 17 17:24:24.637337 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:24.637322 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7788f749ff-bpwmb: secret "image-registry-tls" not found Apr 17 17:24:24.637657 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:24.637369 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-tls podName:5dd7aab7-9b6e-47df-9f4f-c7e5041066d0 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:25.637353165 +0000 UTC m=+35.163033776 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-tls") pod "image-registry-7788f749ff-bpwmb" (UID: "5dd7aab7-9b6e-47df-9f4f-c7e5041066d0") : secret "image-registry-tls" not found Apr 17 17:24:25.089413 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:25.089380 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xkj74" Apr 17 17:24:25.089644 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:25.089380 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqlsd" Apr 17 17:24:25.089644 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:25.089380 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tlwhr" Apr 17 17:24:25.092996 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:25.092972 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 17:24:25.093119 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:25.093074 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 17:24:25.093305 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:25.093289 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7c8rp\"" Apr 17 17:24:25.093356 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:25.092976 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-zp45b\"" Apr 17 17:24:25.093506 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:25.093485 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 17:24:25.093629 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:25.093515 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 17:24:25.646508 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:25.646467 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb895382-4679-4cac-97c0-92e3122b7ba0-metrics-tls\") pod \"dns-default-wqv6n\" (UID: \"cb895382-4679-4cac-97c0-92e3122b7ba0\") " pod="openshift-dns/dns-default-wqv6n" Apr 17 17:24:25.647051 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:25.646519 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/2d29d8c9-146b-4e7e-988e-c3d984ff39e7-cert\") pod \"ingress-canary-6r887\" (UID: \"2d29d8c9-146b-4e7e-988e-c3d984ff39e7\") " pod="openshift-ingress-canary/ingress-canary-6r887" Apr 17 17:24:25.647051 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:25.646560 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-tls\") pod \"image-registry-7788f749ff-bpwmb\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") " pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" Apr 17 17:24:25.647051 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:25.646625 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:24:25.647051 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:25.646662 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:24:25.647051 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:25.646670 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:24:25.647051 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:25.646684 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7788f749ff-bpwmb: secret "image-registry-tls" not found Apr 17 17:24:25.647051 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:25.646713 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb895382-4679-4cac-97c0-92e3122b7ba0-metrics-tls podName:cb895382-4679-4cac-97c0-92e3122b7ba0 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:27.646691605 +0000 UTC m=+37.172372205 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cb895382-4679-4cac-97c0-92e3122b7ba0-metrics-tls") pod "dns-default-wqv6n" (UID: "cb895382-4679-4cac-97c0-92e3122b7ba0") : secret "dns-default-metrics-tls" not found Apr 17 17:24:25.647051 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:25.646730 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d29d8c9-146b-4e7e-988e-c3d984ff39e7-cert podName:2d29d8c9-146b-4e7e-988e-c3d984ff39e7 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:27.646722723 +0000 UTC m=+37.172403319 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2d29d8c9-146b-4e7e-988e-c3d984ff39e7-cert") pod "ingress-canary-6r887" (UID: "2d29d8c9-146b-4e7e-988e-c3d984ff39e7") : secret "canary-serving-cert" not found Apr 17 17:24:25.647051 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:25.646740 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-tls podName:5dd7aab7-9b6e-47df-9f4f-c7e5041066d0 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:27.646735157 +0000 UTC m=+37.172415755 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-tls") pod "image-registry-7788f749ff-bpwmb" (UID: "5dd7aab7-9b6e-47df-9f4f-c7e5041066d0") : secret "image-registry-tls" not found Apr 17 17:24:26.654555 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:26.654515 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/909ae085-82dd-4695-a517-7bf565c61c39-original-pull-secret\") pod \"global-pull-secret-syncer-xkj74\" (UID: \"909ae085-82dd-4695-a517-7bf565c61c39\") " pod="kube-system/global-pull-secret-syncer-xkj74" Apr 17 17:24:26.657579 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:26.657548 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/909ae085-82dd-4695-a517-7bf565c61c39-original-pull-secret\") pod \"global-pull-secret-syncer-xkj74\" (UID: \"909ae085-82dd-4695-a517-7bf565c61c39\") " pod="kube-system/global-pull-secret-syncer-xkj74" Apr 17 17:24:26.902644 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:26.902613 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-xkj74" Apr 17 17:24:27.050617 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:27.050589 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xkj74"] Apr 17 17:24:27.054206 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:24:27.054183 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod909ae085_82dd_4695_a517_7bf565c61c39.slice/crio-c00ad86499b82cc0c767de1c5c68143ed845401a87802b6a91c0f8e04af74994 WatchSource:0}: Error finding container c00ad86499b82cc0c767de1c5c68143ed845401a87802b6a91c0f8e04af74994: Status 404 returned error can't find the container with id c00ad86499b82cc0c767de1c5c68143ed845401a87802b6a91c0f8e04af74994 Apr 17 17:24:27.312007 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:27.311852 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xkj74" event={"ID":"909ae085-82dd-4695-a517-7bf565c61c39","Type":"ContainerStarted","Data":"c00ad86499b82cc0c767de1c5c68143ed845401a87802b6a91c0f8e04af74994"} Apr 17 17:24:27.314333 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:27.314303 2575 generic.go:358] "Generic (PLEG): container finished" podID="adeb035c-390a-4439-9413-491fc20cac69" containerID="5c995b19f5c558bbc5e4c09a256b48e9348f800d3f4d2b8a6951c4d5d2168bb3" exitCode=0 Apr 17 17:24:27.314461 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:27.314349 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hprdr" event={"ID":"adeb035c-390a-4439-9413-491fc20cac69","Type":"ContainerDied","Data":"5c995b19f5c558bbc5e4c09a256b48e9348f800d3f4d2b8a6951c4d5d2168bb3"} Apr 17 17:24:27.659754 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:27.659669 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/cb895382-4679-4cac-97c0-92e3122b7ba0-metrics-tls\") pod \"dns-default-wqv6n\" (UID: \"cb895382-4679-4cac-97c0-92e3122b7ba0\") " pod="openshift-dns/dns-default-wqv6n" Apr 17 17:24:27.660200 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:27.659789 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d29d8c9-146b-4e7e-988e-c3d984ff39e7-cert\") pod \"ingress-canary-6r887\" (UID: \"2d29d8c9-146b-4e7e-988e-c3d984ff39e7\") " pod="openshift-ingress-canary/ingress-canary-6r887" Apr 17 17:24:27.660200 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:27.659801 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:24:27.660200 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:27.659818 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-tls\") pod \"image-registry-7788f749ff-bpwmb\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") " pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" Apr 17 17:24:27.660200 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:27.659886 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb895382-4679-4cac-97c0-92e3122b7ba0-metrics-tls podName:cb895382-4679-4cac-97c0-92e3122b7ba0 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:31.659861651 +0000 UTC m=+41.185542249 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cb895382-4679-4cac-97c0-92e3122b7ba0-metrics-tls") pod "dns-default-wqv6n" (UID: "cb895382-4679-4cac-97c0-92e3122b7ba0") : secret "dns-default-metrics-tls" not found Apr 17 17:24:27.660200 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:27.659920 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:24:27.660200 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:27.659944 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:24:27.660200 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:27.659958 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7788f749ff-bpwmb: secret "image-registry-tls" not found Apr 17 17:24:27.660200 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:27.659961 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d29d8c9-146b-4e7e-988e-c3d984ff39e7-cert podName:2d29d8c9-146b-4e7e-988e-c3d984ff39e7 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:31.659949079 +0000 UTC m=+41.185629680 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2d29d8c9-146b-4e7e-988e-c3d984ff39e7-cert") pod "ingress-canary-6r887" (UID: "2d29d8c9-146b-4e7e-988e-c3d984ff39e7") : secret "canary-serving-cert" not found Apr 17 17:24:27.660200 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:27.660006 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-tls podName:5dd7aab7-9b6e-47df-9f4f-c7e5041066d0 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:31.65999549 +0000 UTC m=+41.185676088 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-tls") pod "image-registry-7788f749ff-bpwmb" (UID: "5dd7aab7-9b6e-47df-9f4f-c7e5041066d0") : secret "image-registry-tls" not found Apr 17 17:24:28.319385 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:28.319348 2575 generic.go:358] "Generic (PLEG): container finished" podID="adeb035c-390a-4439-9413-491fc20cac69" containerID="9b2839ca82e5fb83ba30721e63190fd3d2540d811b94c1cf20cb957cf785ccc9" exitCode=0 Apr 17 17:24:28.319538 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:28.319406 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hprdr" event={"ID":"adeb035c-390a-4439-9413-491fc20cac69","Type":"ContainerDied","Data":"9b2839ca82e5fb83ba30721e63190fd3d2540d811b94c1cf20cb957cf785ccc9"} Apr 17 17:24:29.324412 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:29.324377 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hprdr" event={"ID":"adeb035c-390a-4439-9413-491fc20cac69","Type":"ContainerStarted","Data":"7559b932fa20a05a929411b18008ccd49cff90442d86f75cab6edd3541c171f3"} Apr 17 17:24:29.349125 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:29.348173 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-hprdr" podStartSLOduration=4.285125658 podStartE2EDuration="38.348150722s" podCreationTimestamp="2026-04-17 17:23:51 +0000 UTC" firstStartedPulling="2026-04-17 17:23:52.405509803 +0000 UTC m=+1.931190403" lastFinishedPulling="2026-04-17 17:24:26.468534866 +0000 UTC m=+35.994215467" observedRunningTime="2026-04-17 17:24:29.34599923 +0000 UTC m=+38.871679853" watchObservedRunningTime="2026-04-17 17:24:29.348150722 +0000 UTC m=+38.873831343" Apr 17 17:24:31.693609 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:31.693562 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb895382-4679-4cac-97c0-92e3122b7ba0-metrics-tls\") pod \"dns-default-wqv6n\" (UID: \"cb895382-4679-4cac-97c0-92e3122b7ba0\") " pod="openshift-dns/dns-default-wqv6n" Apr 17 17:24:31.693609 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:31.693616 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d29d8c9-146b-4e7e-988e-c3d984ff39e7-cert\") pod \"ingress-canary-6r887\" (UID: \"2d29d8c9-146b-4e7e-988e-c3d984ff39e7\") " pod="openshift-ingress-canary/ingress-canary-6r887" Apr 17 17:24:31.694153 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:31.693652 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-tls\") pod \"image-registry-7788f749ff-bpwmb\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") " pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" Apr 17 17:24:31.694153 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:31.693734 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:24:31.694153 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:31.693756 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:24:31.694153 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:31.693767 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7788f749ff-bpwmb: secret "image-registry-tls" not found Apr 17 17:24:31.694153 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:31.693793 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:24:31.694153 ip-10-0-140-33 
kubenswrapper[2575]: E0417 17:24:31.693809 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb895382-4679-4cac-97c0-92e3122b7ba0-metrics-tls podName:cb895382-4679-4cac-97c0-92e3122b7ba0 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:39.693790997 +0000 UTC m=+49.219471612 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cb895382-4679-4cac-97c0-92e3122b7ba0-metrics-tls") pod "dns-default-wqv6n" (UID: "cb895382-4679-4cac-97c0-92e3122b7ba0") : secret "dns-default-metrics-tls" not found Apr 17 17:24:31.694153 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:31.693844 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-tls podName:5dd7aab7-9b6e-47df-9f4f-c7e5041066d0 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:39.693819514 +0000 UTC m=+49.219500111 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-tls") pod "image-registry-7788f749ff-bpwmb" (UID: "5dd7aab7-9b6e-47df-9f4f-c7e5041066d0") : secret "image-registry-tls" not found Apr 17 17:24:31.694153 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:31.693861 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d29d8c9-146b-4e7e-988e-c3d984ff39e7-cert podName:2d29d8c9-146b-4e7e-988e-c3d984ff39e7 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:39.693852652 +0000 UTC m=+49.219533250 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2d29d8c9-146b-4e7e-988e-c3d984ff39e7-cert") pod "ingress-canary-6r887" (UID: "2d29d8c9-146b-4e7e-988e-c3d984ff39e7") : secret "canary-serving-cert" not found Apr 17 17:24:32.330817 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:32.330783 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xkj74" event={"ID":"909ae085-82dd-4695-a517-7bf565c61c39","Type":"ContainerStarted","Data":"7fd354fe1c1a3661a28bc4af961f16bf580b9c0bc05cb539ccac135e8b714b93"} Apr 17 17:24:32.346608 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:32.346558 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-xkj74" podStartSLOduration=33.284545712 podStartE2EDuration="38.346543977s" podCreationTimestamp="2026-04-17 17:23:54 +0000 UTC" firstStartedPulling="2026-04-17 17:24:27.055747563 +0000 UTC m=+36.581428160" lastFinishedPulling="2026-04-17 17:24:32.117745829 +0000 UTC m=+41.643426425" observedRunningTime="2026-04-17 17:24:32.345582114 +0000 UTC m=+41.871262733" watchObservedRunningTime="2026-04-17 17:24:32.346543977 +0000 UTC m=+41.872224598" Apr 17 17:24:39.752182 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:39.752135 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb895382-4679-4cac-97c0-92e3122b7ba0-metrics-tls\") pod \"dns-default-wqv6n\" (UID: \"cb895382-4679-4cac-97c0-92e3122b7ba0\") " pod="openshift-dns/dns-default-wqv6n" Apr 17 17:24:39.752182 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:39.752182 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d29d8c9-146b-4e7e-988e-c3d984ff39e7-cert\") pod \"ingress-canary-6r887\" (UID: \"2d29d8c9-146b-4e7e-988e-c3d984ff39e7\") " pod="openshift-ingress-canary/ingress-canary-6r887" Apr 17 
17:24:39.752649 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:39.752209 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-tls\") pod \"image-registry-7788f749ff-bpwmb\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") " pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" Apr 17 17:24:39.752649 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:39.752284 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:24:39.752649 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:39.752303 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:24:39.752649 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:39.752306 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:24:39.752649 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:39.752386 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7788f749ff-bpwmb: secret "image-registry-tls" not found Apr 17 17:24:39.752649 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:39.752373 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb895382-4679-4cac-97c0-92e3122b7ba0-metrics-tls podName:cb895382-4679-4cac-97c0-92e3122b7ba0 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:55.752357803 +0000 UTC m=+65.278038413 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cb895382-4679-4cac-97c0-92e3122b7ba0-metrics-tls") pod "dns-default-wqv6n" (UID: "cb895382-4679-4cac-97c0-92e3122b7ba0") : secret "dns-default-metrics-tls" not found
Apr 17 17:24:39.752649 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:39.752423 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d29d8c9-146b-4e7e-988e-c3d984ff39e7-cert podName:2d29d8c9-146b-4e7e-988e-c3d984ff39e7 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:55.752411566 +0000 UTC m=+65.278092167 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2d29d8c9-146b-4e7e-988e-c3d984ff39e7-cert") pod "ingress-canary-6r887" (UID: "2d29d8c9-146b-4e7e-988e-c3d984ff39e7") : secret "canary-serving-cert" not found
Apr 17 17:24:39.752649 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:39.752434 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-tls podName:5dd7aab7-9b6e-47df-9f4f-c7e5041066d0 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:55.752428001 +0000 UTC m=+65.278108598 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-tls") pod "image-registry-7788f749ff-bpwmb" (UID: "5dd7aab7-9b6e-47df-9f4f-c7e5041066d0") : secret "image-registry-tls" not found
Apr 17 17:24:50.308195 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:50.308168 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p2pbl"
Apr 17 17:24:55.762121 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:55.762080 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb895382-4679-4cac-97c0-92e3122b7ba0-metrics-tls\") pod \"dns-default-wqv6n\" (UID: \"cb895382-4679-4cac-97c0-92e3122b7ba0\") " pod="openshift-dns/dns-default-wqv6n"
Apr 17 17:24:55.762121 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:55.762120 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d29d8c9-146b-4e7e-988e-c3d984ff39e7-cert\") pod \"ingress-canary-6r887\" (UID: \"2d29d8c9-146b-4e7e-988e-c3d984ff39e7\") " pod="openshift-ingress-canary/ingress-canary-6r887"
Apr 17 17:24:55.762676 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:55.762139 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41eeea20-b1c0-4cb6-8da8-a4a26a60423d-metrics-certs\") pod \"network-metrics-daemon-mqlsd\" (UID: \"41eeea20-b1c0-4cb6-8da8-a4a26a60423d\") " pod="openshift-multus/network-metrics-daemon-mqlsd"
Apr 17 17:24:55.762676 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:55.762161 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-tls\") pod \"image-registry-7788f749ff-bpwmb\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") " pod="openshift-image-registry/image-registry-7788f749ff-bpwmb"
Apr 17 17:24:55.762676 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:55.762227 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 17:24:55.762676 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:55.762276 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 17:24:55.762676 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:55.762289 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7788f749ff-bpwmb: secret "image-registry-tls" not found
Apr 17 17:24:55.762676 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:55.762289 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb895382-4679-4cac-97c0-92e3122b7ba0-metrics-tls podName:cb895382-4679-4cac-97c0-92e3122b7ba0 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:27.762272486 +0000 UTC m=+97.287953087 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cb895382-4679-4cac-97c0-92e3122b7ba0-metrics-tls") pod "dns-default-wqv6n" (UID: "cb895382-4679-4cac-97c0-92e3122b7ba0") : secret "dns-default-metrics-tls" not found
Apr 17 17:24:55.762676 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:55.762229 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 17:24:55.762676 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:55.762344 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-tls podName:5dd7aab7-9b6e-47df-9f4f-c7e5041066d0 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:27.762327954 +0000 UTC m=+97.288008551 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-tls") pod "image-registry-7788f749ff-bpwmb" (UID: "5dd7aab7-9b6e-47df-9f4f-c7e5041066d0") : secret "image-registry-tls" not found
Apr 17 17:24:55.762676 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:55.762405 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d29d8c9-146b-4e7e-988e-c3d984ff39e7-cert podName:2d29d8c9-146b-4e7e-988e-c3d984ff39e7 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:27.762381941 +0000 UTC m=+97.288062548 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2d29d8c9-146b-4e7e-988e-c3d984ff39e7-cert") pod "ingress-canary-6r887" (UID: "2d29d8c9-146b-4e7e-988e-c3d984ff39e7") : secret "canary-serving-cert" not found
Apr 17 17:24:55.764447 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:55.764428 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 17:24:55.773118 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:55.773102 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 17:24:55.773207 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:24:55.773147 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41eeea20-b1c0-4cb6-8da8-a4a26a60423d-metrics-certs podName:41eeea20-b1c0-4cb6-8da8-a4a26a60423d nodeName:}" failed. No retries permitted until 2026-04-17 17:25:59.773134705 +0000 UTC m=+129.298815305 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41eeea20-b1c0-4cb6-8da8-a4a26a60423d-metrics-certs") pod "network-metrics-daemon-mqlsd" (UID: "41eeea20-b1c0-4cb6-8da8-a4a26a60423d") : secret "metrics-daemon-secret" not found
Apr 17 17:24:55.863337 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:55.863314 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2c6ch\" (UniqueName: \"kubernetes.io/projected/e4fd5db6-98ca-462f-b950-10c0fe775718-kube-api-access-2c6ch\") pod \"network-check-target-tlwhr\" (UID: \"e4fd5db6-98ca-462f-b950-10c0fe775718\") " pod="openshift-network-diagnostics/network-check-target-tlwhr"
Apr 17 17:24:55.865980 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:55.865962 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 17:24:55.876375 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:55.876355 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 17:24:55.887399 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:55.887379 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c6ch\" (UniqueName: \"kubernetes.io/projected/e4fd5db6-98ca-462f-b950-10c0fe775718-kube-api-access-2c6ch\") pod \"network-check-target-tlwhr\" (UID: \"e4fd5db6-98ca-462f-b950-10c0fe775718\") " pod="openshift-network-diagnostics/network-check-target-tlwhr"
Apr 17 17:24:56.019040 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:56.018967 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-zp45b\""
Apr 17 17:24:56.027811 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:56.027796 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-tlwhr"
Apr 17 17:24:56.151426 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:56.151397 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-tlwhr"]
Apr 17 17:24:56.155673 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:24:56.155643 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4fd5db6_98ca_462f_b950_10c0fe775718.slice/crio-85b29e56cc58c3be4aaad1e1cd48bedfe6ec0daed783d827af17e808a6e7fa17 WatchSource:0}: Error finding container 85b29e56cc58c3be4aaad1e1cd48bedfe6ec0daed783d827af17e808a6e7fa17: Status 404 returned error can't find the container with id 85b29e56cc58c3be4aaad1e1cd48bedfe6ec0daed783d827af17e808a6e7fa17
Apr 17 17:24:56.376791 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:56.376712 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-tlwhr" event={"ID":"e4fd5db6-98ca-462f-b950-10c0fe775718","Type":"ContainerStarted","Data":"85b29e56cc58c3be4aaad1e1cd48bedfe6ec0daed783d827af17e808a6e7fa17"}
Apr 17 17:24:59.383420 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:59.383388 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-tlwhr" event={"ID":"e4fd5db6-98ca-462f-b950-10c0fe775718","Type":"ContainerStarted","Data":"2afea52e5010ea4c0c1c01af115e58f73c07234f936e955a1a7ef86ffdb203eb"}
Apr 17 17:24:59.383781 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:59.383617 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-tlwhr"
Apr 17 17:24:59.401157 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:24:59.401113 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-tlwhr" podStartSLOduration=65.721096924 podStartE2EDuration="1m8.401102132s" podCreationTimestamp="2026-04-17 17:23:51 +0000 UTC" firstStartedPulling="2026-04-17 17:24:56.157945996 +0000 UTC m=+65.683626609" lastFinishedPulling="2026-04-17 17:24:58.83795122 +0000 UTC m=+68.363631817" observedRunningTime="2026-04-17 17:24:59.400686152 +0000 UTC m=+68.926366770" watchObservedRunningTime="2026-04-17 17:24:59.401102132 +0000 UTC m=+68.926782799"
Apr 17 17:25:27.790100 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:27.789979 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb895382-4679-4cac-97c0-92e3122b7ba0-metrics-tls\") pod \"dns-default-wqv6n\" (UID: \"cb895382-4679-4cac-97c0-92e3122b7ba0\") " pod="openshift-dns/dns-default-wqv6n"
Apr 17 17:25:27.790100 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:27.790022 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d29d8c9-146b-4e7e-988e-c3d984ff39e7-cert\") pod \"ingress-canary-6r887\" (UID: \"2d29d8c9-146b-4e7e-988e-c3d984ff39e7\") " pod="openshift-ingress-canary/ingress-canary-6r887"
Apr 17 17:25:27.790100 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:27.790065 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-tls\") pod \"image-registry-7788f749ff-bpwmb\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") " pod="openshift-image-registry/image-registry-7788f749ff-bpwmb"
Apr 17 17:25:27.790733 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:27.790137 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 17:25:27.790733 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:27.790150 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 17:25:27.790733 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:27.790161 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7788f749ff-bpwmb: secret "image-registry-tls" not found
Apr 17 17:25:27.790733 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:27.790193 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 17:25:27.790733 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:27.790222 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb895382-4679-4cac-97c0-92e3122b7ba0-metrics-tls podName:cb895382-4679-4cac-97c0-92e3122b7ba0 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:31.790205625 +0000 UTC m=+161.315886225 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cb895382-4679-4cac-97c0-92e3122b7ba0-metrics-tls") pod "dns-default-wqv6n" (UID: "cb895382-4679-4cac-97c0-92e3122b7ba0") : secret "dns-default-metrics-tls" not found
Apr 17 17:25:27.790733 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:27.790239 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-tls podName:5dd7aab7-9b6e-47df-9f4f-c7e5041066d0 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:31.790232309 +0000 UTC m=+161.315912906 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-tls") pod "image-registry-7788f749ff-bpwmb" (UID: "5dd7aab7-9b6e-47df-9f4f-c7e5041066d0") : secret "image-registry-tls" not found
Apr 17 17:25:27.790733 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:27.790252 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d29d8c9-146b-4e7e-988e-c3d984ff39e7-cert podName:2d29d8c9-146b-4e7e-988e-c3d984ff39e7 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:31.790245347 +0000 UTC m=+161.315925944 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2d29d8c9-146b-4e7e-988e-c3d984ff39e7-cert") pod "ingress-canary-6r887" (UID: "2d29d8c9-146b-4e7e-988e-c3d984ff39e7") : secret "canary-serving-cert" not found
Apr 17 17:25:30.387982 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:30.387951 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-tlwhr"
Apr 17 17:25:50.467252 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.467215 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nk6g8"]
Apr 17 17:25:50.469372 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.469349 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nk6g8"
Apr 17 17:25:50.470811 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.470784 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrgfm"]
Apr 17 17:25:50.471668 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.471649 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 17 17:25:50.471732 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.471665 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-ljt6z\""
Apr 17 17:25:50.471732 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.471649 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 17 17:25:50.472683 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.472667 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-9dmmr"]
Apr 17 17:25:50.472801 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.472787 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrgfm"
Apr 17 17:25:50.474363 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.474347 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-9dmmr"
Apr 17 17:25:50.475068 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.475048 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 17 17:25:50.475264 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.475247 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 17 17:25:50.475650 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.475632 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-j4xhs\""
Apr 17 17:25:50.475737 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.475646 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 17 17:25:50.475737 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.475632 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 17 17:25:50.476486 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.476471 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 17 17:25:50.476789 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.476776 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 17:25:50.476853 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.476780 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 17 17:25:50.476934 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.476918 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 17:25:50.477052 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.477039 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-vxbr4\""
Apr 17 17:25:50.481043 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.481024 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nk6g8"]
Apr 17 17:25:50.481413 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.481391 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 17 17:25:50.485177 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.485052 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrgfm"]
Apr 17 17:25:50.485876 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.485857 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-9dmmr"]
Apr 17 17:25:50.571524 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.571499 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7cjnp"]
Apr 17 17:25:50.573243 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.573229 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-hclv4"]
Apr 17 17:25:50.573377 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.573361 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7cjnp"
Apr 17 17:25:50.575237 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.575218 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hclv4"
Apr 17 17:25:50.575915 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.575891 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 17 17:25:50.575915 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.575910 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 17 17:25:50.576063 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.575897 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 17 17:25:50.576119 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.576068 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-ctj42"]
Apr 17 17:25:50.576343 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.576327 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-fdgkw\""
Apr 17 17:25:50.576419 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.576330 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 17 17:25:50.578031 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.578014 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-ctj42"
Apr 17 17:25:50.580800 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.580778 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 17:25:50.580800 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.580800 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 17 17:25:50.581053 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.581037 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 17:25:50.581197 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.581176 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 17 17:25:50.581197 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.581192 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-6dpg5\""
Apr 17 17:25:50.581544 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.581525 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 17 17:25:50.581626 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.581555 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 17 17:25:50.581626 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.581527 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 17 17:25:50.581626 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.581596 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-4558x\""
Apr 17 17:25:50.582085 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.582069 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 17 17:25:50.587207 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.587186 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 17 17:25:50.588582 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.588566 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7cjnp"]
Apr 17 17:25:50.590905 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.590888 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-ctj42"]
Apr 17 17:25:50.602978 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.602961 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-hclv4"]
Apr 17 17:25:50.643707 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.643683 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3cccdeb-674a-4c4a-882c-679c52c9c0a9-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-qrgfm\" (UID: \"f3cccdeb-674a-4c4a-882c-679c52c9c0a9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrgfm"
Apr 17 17:25:50.643808 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.643718 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3cccdeb-674a-4c4a-882c-679c52c9c0a9-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-qrgfm\" (UID: \"f3cccdeb-674a-4c4a-882c-679c52c9c0a9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrgfm"
Apr 17 17:25:50.643808 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.643736 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvwxb\" (UniqueName: \"kubernetes.io/projected/f3cccdeb-674a-4c4a-882c-679c52c9c0a9-kube-api-access-mvwxb\") pod \"kube-storage-version-migrator-operator-6769c5d45-qrgfm\" (UID: \"f3cccdeb-674a-4c4a-882c-679c52c9c0a9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrgfm"
Apr 17 17:25:50.643808 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.643755 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq4t5\" (UniqueName: \"kubernetes.io/projected/fc8eeee7-5da1-42b9-bbb1-9eaca3ec2a13-kube-api-access-dq4t5\") pod \"volume-data-source-validator-7c6cbb6c87-nk6g8\" (UID: \"fc8eeee7-5da1-42b9-bbb1-9eaca3ec2a13\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nk6g8"
Apr 17 17:25:50.643930 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.643815 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb7ea451-8e84-4c6c-9ca4-85e14c54d30a-service-ca-bundle\") pod \"insights-operator-585dfdc468-9dmmr\" (UID: \"eb7ea451-8e84-4c6c-9ca4-85e14c54d30a\") " pod="openshift-insights/insights-operator-585dfdc468-9dmmr"
Apr 17 17:25:50.643930 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.643881 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb7ea451-8e84-4c6c-9ca4-85e14c54d30a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-9dmmr\" (UID: \"eb7ea451-8e84-4c6c-9ca4-85e14c54d30a\") " pod="openshift-insights/insights-operator-585dfdc468-9dmmr"
Apr 17 17:25:50.643930 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.643905 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb7ea451-8e84-4c6c-9ca4-85e14c54d30a-serving-cert\") pod \"insights-operator-585dfdc468-9dmmr\" (UID: \"eb7ea451-8e84-4c6c-9ca4-85e14c54d30a\") " pod="openshift-insights/insights-operator-585dfdc468-9dmmr"
Apr 17 17:25:50.644024 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.643965 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/eb7ea451-8e84-4c6c-9ca4-85e14c54d30a-tmp\") pod \"insights-operator-585dfdc468-9dmmr\" (UID: \"eb7ea451-8e84-4c6c-9ca4-85e14c54d30a\") " pod="openshift-insights/insights-operator-585dfdc468-9dmmr"
Apr 17 17:25:50.644024 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.643981 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/eb7ea451-8e84-4c6c-9ca4-85e14c54d30a-snapshots\") pod \"insights-operator-585dfdc468-9dmmr\" (UID: \"eb7ea451-8e84-4c6c-9ca4-85e14c54d30a\") " pod="openshift-insights/insights-operator-585dfdc468-9dmmr"
Apr 17 17:25:50.644024 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.643996 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f2kc\" (UniqueName: \"kubernetes.io/projected/eb7ea451-8e84-4c6c-9ca4-85e14c54d30a-kube-api-access-6f2kc\") pod \"insights-operator-585dfdc468-9dmmr\" (UID: \"eb7ea451-8e84-4c6c-9ca4-85e14c54d30a\") " pod="openshift-insights/insights-operator-585dfdc468-9dmmr"
Apr 17 17:25:50.671274 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.671248 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-6c56c99d76-p95sx"]
Apr 17 17:25:50.673195 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.673181 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-6c56c99d76-p95sx"
Apr 17 17:25:50.675044 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.675027 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 17 17:25:50.675222 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.675206 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 17 17:25:50.675483 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.675468 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 17 17:25:50.675483 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.675480 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 17 17:25:50.675613 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.675546 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 17 17:25:50.675613 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.675554 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-hndhk\""
Apr 17 17:25:50.675613 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.675583 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 17 17:25:50.689113 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.689093 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-6c56c99d76-p95sx"]
Apr 17 17:25:50.744712 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.744661 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3cccdeb-674a-4c4a-882c-679c52c9c0a9-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-qrgfm\" (UID: \"f3cccdeb-674a-4c4a-882c-679c52c9c0a9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrgfm"
Apr 17 17:25:50.744712 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.744692 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5549e89-45e0-41a3-8e5a-7a240546ad14-config\") pod \"service-ca-operator-d6fc45fc5-7cjnp\" (UID: \"b5549e89-45e0-41a3-8e5a-7a240546ad14\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7cjnp"
Apr 17 17:25:50.744878 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.744714 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5549e89-45e0-41a3-8e5a-7a240546ad14-serving-cert\") pod \"service-ca-operator-d6fc45fc5-7cjnp\" (UID: \"b5549e89-45e0-41a3-8e5a-7a240546ad14\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7cjnp"
Apr 17 17:25:50.744878 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.744759 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb7ea451-8e84-4c6c-9ca4-85e14c54d30a-service-ca-bundle\") pod \"insights-operator-585dfdc468-9dmmr\" (UID: \"eb7ea451-8e84-4c6c-9ca4-85e14c54d30a\") " pod="openshift-insights/insights-operator-585dfdc468-9dmmr"
Apr 17 17:25:50.744986 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.744900 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9e0a184-477b-45d6-835a-1606a973a5cf-config\") pod \"console-operator-9d4b6777b-ctj42\" (UID: \"a9e0a184-477b-45d6-835a-1606a973a5cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-ctj42"
Apr 17 17:25:50.744986 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.744932 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glr6v\" (UniqueName: \"kubernetes.io/projected/a9e0a184-477b-45d6-835a-1606a973a5cf-kube-api-access-glr6v\") pod \"console-operator-9d4b6777b-ctj42\" (UID: \"a9e0a184-477b-45d6-835a-1606a973a5cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-ctj42"
Apr 17 17:25:50.745058 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.745016 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb7ea451-8e84-4c6c-9ca4-85e14c54d30a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-9dmmr\" (UID: \"eb7ea451-8e84-4c6c-9ca4-85e14c54d30a\") " pod="openshift-insights/insights-operator-585dfdc468-9dmmr"
Apr 17 17:25:50.745058 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.745046 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb7ea451-8e84-4c6c-9ca4-85e14c54d30a-serving-cert\") pod \"insights-operator-585dfdc468-9dmmr\" (UID: \"eb7ea451-8e84-4c6c-9ca4-85e14c54d30a\") " pod="openshift-insights/insights-operator-585dfdc468-9dmmr"
Apr 17 17:25:50.745136 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.745064 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a9e0a184-477b-45d6-835a-1606a973a5cf-trusted-ca\") pod \"console-operator-9d4b6777b-ctj42\" (UID: \"a9e0a184-477b-45d6-835a-1606a973a5cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-ctj42"
Apr 17 17:25:50.745136 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.745105 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9e0a184-477b-45d6-835a-1606a973a5cf-serving-cert\") pod \"console-operator-9d4b6777b-ctj42\" (UID: \"a9e0a184-477b-45d6-835a-1606a973a5cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-ctj42"
Apr 17 17:25:50.745231 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.745135 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/eb7ea451-8e84-4c6c-9ca4-85e14c54d30a-snapshots\") pod \"insights-operator-585dfdc468-9dmmr\" (UID: \"eb7ea451-8e84-4c6c-9ca4-85e14c54d30a\") " pod="openshift-insights/insights-operator-585dfdc468-9dmmr"
Apr 17 17:25:50.745231 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.745161 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6f2kc\" (UniqueName: \"kubernetes.io/projected/eb7ea451-8e84-4c6c-9ca4-85e14c54d30a-kube-api-access-6f2kc\") pod \"insights-operator-585dfdc468-9dmmr\" (UID: \"eb7ea451-8e84-4c6c-9ca4-85e14c54d30a\") " pod="openshift-insights/insights-operator-585dfdc468-9dmmr"
Apr 17 17:25:50.745231 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.745178 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/eb7ea451-8e84-4c6c-9ca4-85e14c54d30a-tmp\") pod \"insights-operator-585dfdc468-9dmmr\" (UID: \"eb7ea451-8e84-4c6c-9ca4-85e14c54d30a\") " pod="openshift-insights/insights-operator-585dfdc468-9dmmr"
Apr 17 17:25:50.745231 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.745212 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3cccdeb-674a-4c4a-882c-679c52c9c0a9-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-qrgfm\" (UID: \"f3cccdeb-674a-4c4a-882c-679c52c9c0a9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrgfm"
Apr 17 17:25:50.745461 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.745436 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb7ea451-8e84-4c6c-9ca4-85e14c54d30a-service-ca-bundle\") pod \"insights-operator-585dfdc468-9dmmr\" (UID: \"eb7ea451-8e84-4c6c-9ca4-85e14c54d30a\") " pod="openshift-insights/insights-operator-585dfdc468-9dmmr"
Apr 17 17:25:50.745593 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.745572 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mvwxb\" (UniqueName: \"kubernetes.io/projected/f3cccdeb-674a-4c4a-882c-679c52c9c0a9-kube-api-access-mvwxb\") pod \"kube-storage-version-migrator-operator-6769c5d45-qrgfm\" (UID: \"f3cccdeb-674a-4c4a-882c-679c52c9c0a9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrgfm"
Apr 17 17:25:50.745712 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.745691 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dq4t5\" (UniqueName: \"kubernetes.io/projected/fc8eeee7-5da1-42b9-bbb1-9eaca3ec2a13-kube-api-access-dq4t5\") pod \"volume-data-source-validator-7c6cbb6c87-nk6g8\" (UID: \"fc8eeee7-5da1-42b9-bbb1-9eaca3ec2a13\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nk6g8"
Apr 17 17:25:50.745896 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.745801 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3d9c3135-c5a6-4882-8da7-486b724f3469-cluster-monitoring-operator-tls\") pod 
\"cluster-monitoring-operator-75587bd455-hclv4\" (UID: \"3d9c3135-c5a6-4882-8da7-486b724f3469\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hclv4" Apr 17 17:25:50.745896 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.745814 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/eb7ea451-8e84-4c6c-9ca4-85e14c54d30a-snapshots\") pod \"insights-operator-585dfdc468-9dmmr\" (UID: \"eb7ea451-8e84-4c6c-9ca4-85e14c54d30a\") " pod="openshift-insights/insights-operator-585dfdc468-9dmmr" Apr 17 17:25:50.745896 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.745752 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/eb7ea451-8e84-4c6c-9ca4-85e14c54d30a-tmp\") pod \"insights-operator-585dfdc468-9dmmr\" (UID: \"eb7ea451-8e84-4c6c-9ca4-85e14c54d30a\") " pod="openshift-insights/insights-operator-585dfdc468-9dmmr" Apr 17 17:25:50.745896 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.745811 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb7ea451-8e84-4c6c-9ca4-85e14c54d30a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-9dmmr\" (UID: \"eb7ea451-8e84-4c6c-9ca4-85e14c54d30a\") " pod="openshift-insights/insights-operator-585dfdc468-9dmmr" Apr 17 17:25:50.745896 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.745894 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv7kb\" (UniqueName: \"kubernetes.io/projected/b5549e89-45e0-41a3-8e5a-7a240546ad14-kube-api-access-jv7kb\") pod \"service-ca-operator-d6fc45fc5-7cjnp\" (UID: \"b5549e89-45e0-41a3-8e5a-7a240546ad14\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7cjnp" Apr 17 17:25:50.746208 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.745923 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/3d9c3135-c5a6-4882-8da7-486b724f3469-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-hclv4\" (UID: \"3d9c3135-c5a6-4882-8da7-486b724f3469\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hclv4" Apr 17 17:25:50.746208 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.745971 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnx5t\" (UniqueName: \"kubernetes.io/projected/3d9c3135-c5a6-4882-8da7-486b724f3469-kube-api-access-gnx5t\") pod \"cluster-monitoring-operator-75587bd455-hclv4\" (UID: \"3d9c3135-c5a6-4882-8da7-486b724f3469\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hclv4" Apr 17 17:25:50.746320 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.746304 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3cccdeb-674a-4c4a-882c-679c52c9c0a9-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-qrgfm\" (UID: \"f3cccdeb-674a-4c4a-882c-679c52c9c0a9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrgfm" Apr 17 17:25:50.747126 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.747106 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3cccdeb-674a-4c4a-882c-679c52c9c0a9-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-qrgfm\" (UID: \"f3cccdeb-674a-4c4a-882c-679c52c9c0a9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrgfm" Apr 17 17:25:50.747357 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.747343 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/eb7ea451-8e84-4c6c-9ca4-85e14c54d30a-serving-cert\") pod \"insights-operator-585dfdc468-9dmmr\" (UID: \"eb7ea451-8e84-4c6c-9ca4-85e14c54d30a\") " pod="openshift-insights/insights-operator-585dfdc468-9dmmr" Apr 17 17:25:50.753870 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.753819 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f2kc\" (UniqueName: \"kubernetes.io/projected/eb7ea451-8e84-4c6c-9ca4-85e14c54d30a-kube-api-access-6f2kc\") pod \"insights-operator-585dfdc468-9dmmr\" (UID: \"eb7ea451-8e84-4c6c-9ca4-85e14c54d30a\") " pod="openshift-insights/insights-operator-585dfdc468-9dmmr" Apr 17 17:25:50.755302 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.755281 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvwxb\" (UniqueName: \"kubernetes.io/projected/f3cccdeb-674a-4c4a-882c-679c52c9c0a9-kube-api-access-mvwxb\") pod \"kube-storage-version-migrator-operator-6769c5d45-qrgfm\" (UID: \"f3cccdeb-674a-4c4a-882c-679c52c9c0a9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrgfm" Apr 17 17:25:50.756128 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.756108 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq4t5\" (UniqueName: \"kubernetes.io/projected/fc8eeee7-5da1-42b9-bbb1-9eaca3ec2a13-kube-api-access-dq4t5\") pod \"volume-data-source-validator-7c6cbb6c87-nk6g8\" (UID: \"fc8eeee7-5da1-42b9-bbb1-9eaca3ec2a13\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nk6g8" Apr 17 17:25:50.782037 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.782019 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nk6g8" Apr 17 17:25:50.787624 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.787609 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrgfm" Apr 17 17:25:50.793150 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.793130 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-9dmmr" Apr 17 17:25:50.849861 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.846960 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a9e0a184-477b-45d6-835a-1606a973a5cf-trusted-ca\") pod \"console-operator-9d4b6777b-ctj42\" (UID: \"a9e0a184-477b-45d6-835a-1606a973a5cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-ctj42" Apr 17 17:25:50.849861 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.847018 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9a77eac-7075-491c-a2b9-080151e1cac9-service-ca-bundle\") pod \"router-default-6c56c99d76-p95sx\" (UID: \"e9a77eac-7075-491c-a2b9-080151e1cac9\") " pod="openshift-ingress/router-default-6c56c99d76-p95sx" Apr 17 17:25:50.849861 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.847073 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9e0a184-477b-45d6-835a-1606a973a5cf-serving-cert\") pod \"console-operator-9d4b6777b-ctj42\" (UID: \"a9e0a184-477b-45d6-835a-1606a973a5cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-ctj42" Apr 17 17:25:50.849861 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.847100 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e9a77eac-7075-491c-a2b9-080151e1cac9-metrics-certs\") pod \"router-default-6c56c99d76-p95sx\" (UID: \"e9a77eac-7075-491c-a2b9-080151e1cac9\") " pod="openshift-ingress/router-default-6c56c99d76-p95sx" Apr 17 17:25:50.849861 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.847131 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e9a77eac-7075-491c-a2b9-080151e1cac9-stats-auth\") pod \"router-default-6c56c99d76-p95sx\" (UID: \"e9a77eac-7075-491c-a2b9-080151e1cac9\") " pod="openshift-ingress/router-default-6c56c99d76-p95sx" Apr 17 17:25:50.849861 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.847165 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5d46\" (UniqueName: \"kubernetes.io/projected/e9a77eac-7075-491c-a2b9-080151e1cac9-kube-api-access-q5d46\") pod \"router-default-6c56c99d76-p95sx\" (UID: \"e9a77eac-7075-491c-a2b9-080151e1cac9\") " pod="openshift-ingress/router-default-6c56c99d76-p95sx" Apr 17 17:25:50.849861 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.847191 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e9a77eac-7075-491c-a2b9-080151e1cac9-default-certificate\") pod \"router-default-6c56c99d76-p95sx\" (UID: \"e9a77eac-7075-491c-a2b9-080151e1cac9\") " pod="openshift-ingress/router-default-6c56c99d76-p95sx" Apr 17 17:25:50.849861 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.847228 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3d9c3135-c5a6-4882-8da7-486b724f3469-cluster-monitoring-operator-tls\") pod 
\"cluster-monitoring-operator-75587bd455-hclv4\" (UID: \"3d9c3135-c5a6-4882-8da7-486b724f3469\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hclv4" Apr 17 17:25:50.849861 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.847263 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jv7kb\" (UniqueName: \"kubernetes.io/projected/b5549e89-45e0-41a3-8e5a-7a240546ad14-kube-api-access-jv7kb\") pod \"service-ca-operator-d6fc45fc5-7cjnp\" (UID: \"b5549e89-45e0-41a3-8e5a-7a240546ad14\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7cjnp" Apr 17 17:25:50.849861 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.847291 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/3d9c3135-c5a6-4882-8da7-486b724f3469-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-hclv4\" (UID: \"3d9c3135-c5a6-4882-8da7-486b724f3469\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hclv4" Apr 17 17:25:50.849861 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.847347 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gnx5t\" (UniqueName: \"kubernetes.io/projected/3d9c3135-c5a6-4882-8da7-486b724f3469-kube-api-access-gnx5t\") pod \"cluster-monitoring-operator-75587bd455-hclv4\" (UID: \"3d9c3135-c5a6-4882-8da7-486b724f3469\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hclv4" Apr 17 17:25:50.849861 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.847377 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5549e89-45e0-41a3-8e5a-7a240546ad14-config\") pod \"service-ca-operator-d6fc45fc5-7cjnp\" (UID: \"b5549e89-45e0-41a3-8e5a-7a240546ad14\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7cjnp" Apr 17 17:25:50.849861 
ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.847411 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5549e89-45e0-41a3-8e5a-7a240546ad14-serving-cert\") pod \"service-ca-operator-d6fc45fc5-7cjnp\" (UID: \"b5549e89-45e0-41a3-8e5a-7a240546ad14\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7cjnp" Apr 17 17:25:50.849861 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.847447 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9e0a184-477b-45d6-835a-1606a973a5cf-config\") pod \"console-operator-9d4b6777b-ctj42\" (UID: \"a9e0a184-477b-45d6-835a-1606a973a5cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-ctj42" Apr 17 17:25:50.849861 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.847476 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-glr6v\" (UniqueName: \"kubernetes.io/projected/a9e0a184-477b-45d6-835a-1606a973a5cf-kube-api-access-glr6v\") pod \"console-operator-9d4b6777b-ctj42\" (UID: \"a9e0a184-477b-45d6-835a-1606a973a5cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-ctj42" Apr 17 17:25:50.849861 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:50.848236 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 17:25:50.850810 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:50.848317 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d9c3135-c5a6-4882-8da7-486b724f3469-cluster-monitoring-operator-tls podName:3d9c3135-c5a6-4882-8da7-486b724f3469 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:51.3482955 +0000 UTC m=+120.873976097 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/3d9c3135-c5a6-4882-8da7-486b724f3469-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-hclv4" (UID: "3d9c3135-c5a6-4882-8da7-486b724f3469") : secret "cluster-monitoring-operator-tls" not found Apr 17 17:25:50.850810 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.849768 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a9e0a184-477b-45d6-835a-1606a973a5cf-trusted-ca\") pod \"console-operator-9d4b6777b-ctj42\" (UID: \"a9e0a184-477b-45d6-835a-1606a973a5cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-ctj42" Apr 17 17:25:50.850810 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.849770 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/3d9c3135-c5a6-4882-8da7-486b724f3469-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-hclv4\" (UID: \"3d9c3135-c5a6-4882-8da7-486b724f3469\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hclv4" Apr 17 17:25:50.850810 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.850136 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5549e89-45e0-41a3-8e5a-7a240546ad14-config\") pod \"service-ca-operator-d6fc45fc5-7cjnp\" (UID: \"b5549e89-45e0-41a3-8e5a-7a240546ad14\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7cjnp" Apr 17 17:25:50.851201 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.851179 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9e0a184-477b-45d6-835a-1606a973a5cf-config\") pod \"console-operator-9d4b6777b-ctj42\" (UID: \"a9e0a184-477b-45d6-835a-1606a973a5cf\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-ctj42" Apr 17 17:25:50.859555 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.859327 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9e0a184-477b-45d6-835a-1606a973a5cf-serving-cert\") pod \"console-operator-9d4b6777b-ctj42\" (UID: \"a9e0a184-477b-45d6-835a-1606a973a5cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-ctj42" Apr 17 17:25:50.860474 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.860436 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnx5t\" (UniqueName: \"kubernetes.io/projected/3d9c3135-c5a6-4882-8da7-486b724f3469-kube-api-access-gnx5t\") pod \"cluster-monitoring-operator-75587bd455-hclv4\" (UID: \"3d9c3135-c5a6-4882-8da7-486b724f3469\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hclv4" Apr 17 17:25:50.860561 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.860523 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-glr6v\" (UniqueName: \"kubernetes.io/projected/a9e0a184-477b-45d6-835a-1606a973a5cf-kube-api-access-glr6v\") pod \"console-operator-9d4b6777b-ctj42\" (UID: \"a9e0a184-477b-45d6-835a-1606a973a5cf\") " pod="openshift-console-operator/console-operator-9d4b6777b-ctj42" Apr 17 17:25:50.860710 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.860684 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5549e89-45e0-41a3-8e5a-7a240546ad14-serving-cert\") pod \"service-ca-operator-d6fc45fc5-7cjnp\" (UID: \"b5549e89-45e0-41a3-8e5a-7a240546ad14\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7cjnp" Apr 17 17:25:50.862332 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.862153 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv7kb\" (UniqueName: 
\"kubernetes.io/projected/b5549e89-45e0-41a3-8e5a-7a240546ad14-kube-api-access-jv7kb\") pod \"service-ca-operator-d6fc45fc5-7cjnp\" (UID: \"b5549e89-45e0-41a3-8e5a-7a240546ad14\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7cjnp" Apr 17 17:25:50.883517 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.883492 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7cjnp" Apr 17 17:25:50.895142 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.894922 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-ctj42" Apr 17 17:25:50.920875 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.917526 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrgfm"] Apr 17 17:25:50.926909 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:25:50.926877 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3cccdeb_674a_4c4a_882c_679c52c9c0a9.slice/crio-64321682c2a410b6390a53959e4c81e5ff95c89f845e8b3f63240f38264cc33c WatchSource:0}: Error finding container 64321682c2a410b6390a53959e4c81e5ff95c89f845e8b3f63240f38264cc33c: Status 404 returned error can't find the container with id 64321682c2a410b6390a53959e4c81e5ff95c89f845e8b3f63240f38264cc33c Apr 17 17:25:50.949151 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.949097 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9a77eac-7075-491c-a2b9-080151e1cac9-service-ca-bundle\") pod \"router-default-6c56c99d76-p95sx\" (UID: \"e9a77eac-7075-491c-a2b9-080151e1cac9\") " pod="openshift-ingress/router-default-6c56c99d76-p95sx" Apr 17 17:25:50.949272 ip-10-0-140-33 kubenswrapper[2575]: 
I0417 17:25:50.949163 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e9a77eac-7075-491c-a2b9-080151e1cac9-metrics-certs\") pod \"router-default-6c56c99d76-p95sx\" (UID: \"e9a77eac-7075-491c-a2b9-080151e1cac9\") " pod="openshift-ingress/router-default-6c56c99d76-p95sx" Apr 17 17:25:50.949272 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.949236 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e9a77eac-7075-491c-a2b9-080151e1cac9-stats-auth\") pod \"router-default-6c56c99d76-p95sx\" (UID: \"e9a77eac-7075-491c-a2b9-080151e1cac9\") " pod="openshift-ingress/router-default-6c56c99d76-p95sx" Apr 17 17:25:50.949389 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.949281 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q5d46\" (UniqueName: \"kubernetes.io/projected/e9a77eac-7075-491c-a2b9-080151e1cac9-kube-api-access-q5d46\") pod \"router-default-6c56c99d76-p95sx\" (UID: \"e9a77eac-7075-491c-a2b9-080151e1cac9\") " pod="openshift-ingress/router-default-6c56c99d76-p95sx" Apr 17 17:25:50.949389 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.949310 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e9a77eac-7075-491c-a2b9-080151e1cac9-default-certificate\") pod \"router-default-6c56c99d76-p95sx\" (UID: \"e9a77eac-7075-491c-a2b9-080151e1cac9\") " pod="openshift-ingress/router-default-6c56c99d76-p95sx" Apr 17 17:25:50.949389 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:50.949364 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e9a77eac-7075-491c-a2b9-080151e1cac9-service-ca-bundle podName:e9a77eac-7075-491c-a2b9-080151e1cac9 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:25:51.449340526 +0000 UTC m=+120.975021149 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e9a77eac-7075-491c-a2b9-080151e1cac9-service-ca-bundle") pod "router-default-6c56c99d76-p95sx" (UID: "e9a77eac-7075-491c-a2b9-080151e1cac9") : configmap references non-existent config key: service-ca.crt Apr 17 17:25:50.951570 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.951549 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 17 17:25:50.951647 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.951635 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 17 17:25:50.951713 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.951554 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 17 17:25:50.959981 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:50.959937 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 17:25:50.960085 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:50.960011 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9a77eac-7075-491c-a2b9-080151e1cac9-metrics-certs podName:e9a77eac-7075-491c-a2b9-080151e1cac9 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:51.459988734 +0000 UTC m=+120.985669351 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e9a77eac-7075-491c-a2b9-080151e1cac9-metrics-certs") pod "router-default-6c56c99d76-p95sx" (UID: "e9a77eac-7075-491c-a2b9-080151e1cac9") : secret "router-metrics-certs-default" not found Apr 17 17:25:50.962234 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.962048 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 17:25:50.963883 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.963808 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e9a77eac-7075-491c-a2b9-080151e1cac9-stats-auth\") pod \"router-default-6c56c99d76-p95sx\" (UID: \"e9a77eac-7075-491c-a2b9-080151e1cac9\") " pod="openshift-ingress/router-default-6c56c99d76-p95sx" Apr 17 17:25:50.964360 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.964332 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e9a77eac-7075-491c-a2b9-080151e1cac9-default-certificate\") pod \"router-default-6c56c99d76-p95sx\" (UID: \"e9a77eac-7075-491c-a2b9-080151e1cac9\") " pod="openshift-ingress/router-default-6c56c99d76-p95sx" Apr 17 17:25:50.972799 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.972762 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 17:25:50.983981 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:50.983962 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5d46\" (UniqueName: \"kubernetes.io/projected/e9a77eac-7075-491c-a2b9-080151e1cac9-kube-api-access-q5d46\") pod \"router-default-6c56c99d76-p95sx\" (UID: \"e9a77eac-7075-491c-a2b9-080151e1cac9\") " pod="openshift-ingress/router-default-6c56c99d76-p95sx" Apr 17 17:25:51.018650 ip-10-0-140-33 
kubenswrapper[2575]: I0417 17:25:51.018622 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7cjnp"]
Apr 17 17:25:51.021261 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:25:51.021235 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5549e89_45e0_41a3_8e5a_7a240546ad14.slice/crio-e8d4c7fafb1f0ea4a9ca6b4a04259cad467ef72c662f67def4435c5d9b1493a9 WatchSource:0}: Error finding container e8d4c7fafb1f0ea4a9ca6b4a04259cad467ef72c662f67def4435c5d9b1493a9: Status 404 returned error can't find the container with id e8d4c7fafb1f0ea4a9ca6b4a04259cad467ef72c662f67def4435c5d9b1493a9
Apr 17 17:25:51.031737 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:51.031714 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-ctj42"]
Apr 17 17:25:51.034634 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:25:51.034613 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9e0a184_477b_45d6_835a_1606a973a5cf.slice/crio-162b2262f8ec4bcedb85c884d1f0370ae8de4e080e72b2f6877cb7ec885cf655 WatchSource:0}: Error finding container 162b2262f8ec4bcedb85c884d1f0370ae8de4e080e72b2f6877cb7ec885cf655: Status 404 returned error can't find the container with id 162b2262f8ec4bcedb85c884d1f0370ae8de4e080e72b2f6877cb7ec885cf655
Apr 17 17:25:51.142679 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:51.142651 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-9dmmr"]
Apr 17 17:25:51.145535 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:25:51.145510 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb7ea451_8e84_4c6c_9ca4_85e14c54d30a.slice/crio-2590aa80897509e437c5f4b513046d378da00487d2684e6d8967578abd8adb51 WatchSource:0}: Error finding container 2590aa80897509e437c5f4b513046d378da00487d2684e6d8967578abd8adb51: Status 404 returned error can't find the container with id 2590aa80897509e437c5f4b513046d378da00487d2684e6d8967578abd8adb51
Apr 17 17:25:51.155134 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:51.155113 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nk6g8"]
Apr 17 17:25:51.157649 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:25:51.157628 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc8eeee7_5da1_42b9_bbb1_9eaca3ec2a13.slice/crio-92a0af0f583b75759efcab29b5acd3dc64bf8c2d76a2fd17effaf45aefa575e2 WatchSource:0}: Error finding container 92a0af0f583b75759efcab29b5acd3dc64bf8c2d76a2fd17effaf45aefa575e2: Status 404 returned error can't find the container with id 92a0af0f583b75759efcab29b5acd3dc64bf8c2d76a2fd17effaf45aefa575e2
Apr 17 17:25:51.352483 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:51.352400 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3d9c3135-c5a6-4882-8da7-486b724f3469-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-hclv4\" (UID: \"3d9c3135-c5a6-4882-8da7-486b724f3469\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hclv4"
Apr 17 17:25:51.352607 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:51.352521 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 17:25:51.352607 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:51.352575 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d9c3135-c5a6-4882-8da7-486b724f3469-cluster-monitoring-operator-tls podName:3d9c3135-c5a6-4882-8da7-486b724f3469 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:52.352561496 +0000 UTC m=+121.878242093 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/3d9c3135-c5a6-4882-8da7-486b724f3469-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-hclv4" (UID: "3d9c3135-c5a6-4882-8da7-486b724f3469") : secret "cluster-monitoring-operator-tls" not found
Apr 17 17:25:51.453535 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:51.453504 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9a77eac-7075-491c-a2b9-080151e1cac9-service-ca-bundle\") pod \"router-default-6c56c99d76-p95sx\" (UID: \"e9a77eac-7075-491c-a2b9-080151e1cac9\") " pod="openshift-ingress/router-default-6c56c99d76-p95sx"
Apr 17 17:25:51.453704 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:51.453688 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e9a77eac-7075-491c-a2b9-080151e1cac9-service-ca-bundle podName:e9a77eac-7075-491c-a2b9-080151e1cac9 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:52.453667589 +0000 UTC m=+121.979348200 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e9a77eac-7075-491c-a2b9-080151e1cac9-service-ca-bundle") pod "router-default-6c56c99d76-p95sx" (UID: "e9a77eac-7075-491c-a2b9-080151e1cac9") : configmap references non-existent config key: service-ca.crt
Apr 17 17:25:51.485737 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:51.485702 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ctj42" event={"ID":"a9e0a184-477b-45d6-835a-1606a973a5cf","Type":"ContainerStarted","Data":"162b2262f8ec4bcedb85c884d1f0370ae8de4e080e72b2f6877cb7ec885cf655"}
Apr 17 17:25:51.486683 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:51.486654 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrgfm" event={"ID":"f3cccdeb-674a-4c4a-882c-679c52c9c0a9","Type":"ContainerStarted","Data":"64321682c2a410b6390a53959e4c81e5ff95c89f845e8b3f63240f38264cc33c"}
Apr 17 17:25:51.487626 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:51.487598 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-9dmmr" event={"ID":"eb7ea451-8e84-4c6c-9ca4-85e14c54d30a","Type":"ContainerStarted","Data":"2590aa80897509e437c5f4b513046d378da00487d2684e6d8967578abd8adb51"}
Apr 17 17:25:51.488542 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:51.488523 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7cjnp" event={"ID":"b5549e89-45e0-41a3-8e5a-7a240546ad14","Type":"ContainerStarted","Data":"e8d4c7fafb1f0ea4a9ca6b4a04259cad467ef72c662f67def4435c5d9b1493a9"}
Apr 17 17:25:51.489339 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:51.489323 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nk6g8" event={"ID":"fc8eeee7-5da1-42b9-bbb1-9eaca3ec2a13","Type":"ContainerStarted","Data":"92a0af0f583b75759efcab29b5acd3dc64bf8c2d76a2fd17effaf45aefa575e2"}
Apr 17 17:25:51.555142 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:51.555106 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e9a77eac-7075-491c-a2b9-080151e1cac9-metrics-certs\") pod \"router-default-6c56c99d76-p95sx\" (UID: \"e9a77eac-7075-491c-a2b9-080151e1cac9\") " pod="openshift-ingress/router-default-6c56c99d76-p95sx"
Apr 17 17:25:51.555322 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:51.555268 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 17:25:51.555409 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:51.555353 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9a77eac-7075-491c-a2b9-080151e1cac9-metrics-certs podName:e9a77eac-7075-491c-a2b9-080151e1cac9 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:52.55533053 +0000 UTC m=+122.081011133 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e9a77eac-7075-491c-a2b9-080151e1cac9-metrics-certs") pod "router-default-6c56c99d76-p95sx" (UID: "e9a77eac-7075-491c-a2b9-080151e1cac9") : secret "router-metrics-certs-default" not found
Apr 17 17:25:52.364378 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:52.364332 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3d9c3135-c5a6-4882-8da7-486b724f3469-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-hclv4\" (UID: \"3d9c3135-c5a6-4882-8da7-486b724f3469\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hclv4"
Apr 17 17:25:52.364576 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:52.364512 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 17:25:52.364644 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:52.364579 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d9c3135-c5a6-4882-8da7-486b724f3469-cluster-monitoring-operator-tls podName:3d9c3135-c5a6-4882-8da7-486b724f3469 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:54.364560733 +0000 UTC m=+123.890241335 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/3d9c3135-c5a6-4882-8da7-486b724f3469-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-hclv4" (UID: "3d9c3135-c5a6-4882-8da7-486b724f3469") : secret "cluster-monitoring-operator-tls" not found
Apr 17 17:25:52.465941 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:52.465215 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9a77eac-7075-491c-a2b9-080151e1cac9-service-ca-bundle\") pod \"router-default-6c56c99d76-p95sx\" (UID: \"e9a77eac-7075-491c-a2b9-080151e1cac9\") " pod="openshift-ingress/router-default-6c56c99d76-p95sx"
Apr 17 17:25:52.465941 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:52.465528 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e9a77eac-7075-491c-a2b9-080151e1cac9-service-ca-bundle podName:e9a77eac-7075-491c-a2b9-080151e1cac9 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:54.465507472 +0000 UTC m=+123.991188073 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e9a77eac-7075-491c-a2b9-080151e1cac9-service-ca-bundle") pod "router-default-6c56c99d76-p95sx" (UID: "e9a77eac-7075-491c-a2b9-080151e1cac9") : configmap references non-existent config key: service-ca.crt
Apr 17 17:25:52.566523 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:52.566450 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e9a77eac-7075-491c-a2b9-080151e1cac9-metrics-certs\") pod \"router-default-6c56c99d76-p95sx\" (UID: \"e9a77eac-7075-491c-a2b9-080151e1cac9\") " pod="openshift-ingress/router-default-6c56c99d76-p95sx"
Apr 17 17:25:52.566963 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:52.566676 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 17:25:52.566963 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:52.566741 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9a77eac-7075-491c-a2b9-080151e1cac9-metrics-certs podName:e9a77eac-7075-491c-a2b9-080151e1cac9 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:54.566720985 +0000 UTC m=+124.092401588 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e9a77eac-7075-491c-a2b9-080151e1cac9-metrics-certs") pod "router-default-6c56c99d76-p95sx" (UID: "e9a77eac-7075-491c-a2b9-080151e1cac9") : secret "router-metrics-certs-default" not found
Apr 17 17:25:54.383362 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:54.383315 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3d9c3135-c5a6-4882-8da7-486b724f3469-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-hclv4\" (UID: \"3d9c3135-c5a6-4882-8da7-486b724f3469\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hclv4"
Apr 17 17:25:54.383772 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:54.383492 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 17:25:54.383772 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:54.383575 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d9c3135-c5a6-4882-8da7-486b724f3469-cluster-monitoring-operator-tls podName:3d9c3135-c5a6-4882-8da7-486b724f3469 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:58.383553972 +0000 UTC m=+127.909234576 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/3d9c3135-c5a6-4882-8da7-486b724f3469-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-hclv4" (UID: "3d9c3135-c5a6-4882-8da7-486b724f3469") : secret "cluster-monitoring-operator-tls" not found
Apr 17 17:25:54.484443 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:54.484404 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9a77eac-7075-491c-a2b9-080151e1cac9-service-ca-bundle\") pod \"router-default-6c56c99d76-p95sx\" (UID: \"e9a77eac-7075-491c-a2b9-080151e1cac9\") " pod="openshift-ingress/router-default-6c56c99d76-p95sx"
Apr 17 17:25:54.484637 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:54.484609 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e9a77eac-7075-491c-a2b9-080151e1cac9-service-ca-bundle podName:e9a77eac-7075-491c-a2b9-080151e1cac9 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:58.484586863 +0000 UTC m=+128.010267481 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e9a77eac-7075-491c-a2b9-080151e1cac9-service-ca-bundle") pod "router-default-6c56c99d76-p95sx" (UID: "e9a77eac-7075-491c-a2b9-080151e1cac9") : configmap references non-existent config key: service-ca.crt
Apr 17 17:25:54.584941 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:54.584906 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e9a77eac-7075-491c-a2b9-080151e1cac9-metrics-certs\") pod \"router-default-6c56c99d76-p95sx\" (UID: \"e9a77eac-7075-491c-a2b9-080151e1cac9\") " pod="openshift-ingress/router-default-6c56c99d76-p95sx"
Apr 17 17:25:54.585110 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:54.585064 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 17:25:54.585166 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:54.585135 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9a77eac-7075-491c-a2b9-080151e1cac9-metrics-certs podName:e9a77eac-7075-491c-a2b9-080151e1cac9 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:58.585114317 +0000 UTC m=+128.110794931 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e9a77eac-7075-491c-a2b9-080151e1cac9-metrics-certs") pod "router-default-6c56c99d76-p95sx" (UID: "e9a77eac-7075-491c-a2b9-080151e1cac9") : secret "router-metrics-certs-default" not found
Apr 17 17:25:55.500694 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:55.500653 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7cjnp" event={"ID":"b5549e89-45e0-41a3-8e5a-7a240546ad14","Type":"ContainerStarted","Data":"3ee937c73571907a5956021a05b6c28f85576b32e89a0b1c97409bfc6425deaf"}
Apr 17 17:25:55.502016 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:55.501988 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nk6g8" event={"ID":"fc8eeee7-5da1-42b9-bbb1-9eaca3ec2a13","Type":"ContainerStarted","Data":"0a3569a03ba55a2396073131f4389161ed33f64c4919992c161c6d42fd85b688"}
Apr 17 17:25:55.503380 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:55.503363 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctj42_a9e0a184-477b-45d6-835a-1606a973a5cf/console-operator/0.log"
Apr 17 17:25:55.503473 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:55.503398 2575 generic.go:358] "Generic (PLEG): container finished" podID="a9e0a184-477b-45d6-835a-1606a973a5cf" containerID="3dd8e0bf64479e55fb67e91b12996570ddc0c95770beffc36fead859f43a87a6" exitCode=255
Apr 17 17:25:55.503526 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:55.503468 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ctj42" event={"ID":"a9e0a184-477b-45d6-835a-1606a973a5cf","Type":"ContainerDied","Data":"3dd8e0bf64479e55fb67e91b12996570ddc0c95770beffc36fead859f43a87a6"}
Apr 17 17:25:55.503665 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:55.503647 2575 scope.go:117] "RemoveContainer" containerID="3dd8e0bf64479e55fb67e91b12996570ddc0c95770beffc36fead859f43a87a6"
Apr 17 17:25:55.504794 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:55.504765 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrgfm" event={"ID":"f3cccdeb-674a-4c4a-882c-679c52c9c0a9","Type":"ContainerStarted","Data":"57ae9ca5e13d13fb29c5178e98a94d4e4644b20a2435d7fd76b6f5dc18b2547f"}
Apr 17 17:25:55.506448 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:55.506424 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-9dmmr" event={"ID":"eb7ea451-8e84-4c6c-9ca4-85e14c54d30a","Type":"ContainerStarted","Data":"7d402a66352796fcfcefca93db1bfc371fb6c36f68c15c5ae0ba5b462156104f"}
Apr 17 17:25:55.560073 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:55.560015 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-nk6g8" podStartSLOduration=1.885337528 podStartE2EDuration="5.559994602s" podCreationTimestamp="2026-04-17 17:25:50 +0000 UTC" firstStartedPulling="2026-04-17 17:25:51.159297204 +0000 UTC m=+120.684977802" lastFinishedPulling="2026-04-17 17:25:54.833954278 +0000 UTC m=+124.359634876" observedRunningTime="2026-04-17 17:25:55.558140785 +0000 UTC m=+125.083821405" watchObservedRunningTime="2026-04-17 17:25:55.559994602 +0000 UTC m=+125.085675222"
Apr 17 17:25:55.560665 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:55.560633 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7cjnp" podStartSLOduration=1.742491089 podStartE2EDuration="5.560625224s" podCreationTimestamp="2026-04-17 17:25:50 +0000 UTC" firstStartedPulling="2026-04-17 17:25:51.023093132 +0000 UTC m=+120.548773729" lastFinishedPulling="2026-04-17 17:25:54.841227268 +0000 UTC m=+124.366907864" observedRunningTime="2026-04-17 17:25:55.534497167 +0000 UTC m=+125.060177785" watchObservedRunningTime="2026-04-17 17:25:55.560625224 +0000 UTC m=+125.086305844"
Apr 17 17:25:55.585892 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:55.585812 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrgfm" podStartSLOduration=1.680603963 podStartE2EDuration="5.58579544s" podCreationTimestamp="2026-04-17 17:25:50 +0000 UTC" firstStartedPulling="2026-04-17 17:25:50.92895308 +0000 UTC m=+120.454633677" lastFinishedPulling="2026-04-17 17:25:54.834144549 +0000 UTC m=+124.359825154" observedRunningTime="2026-04-17 17:25:55.584492349 +0000 UTC m=+125.110172970" watchObservedRunningTime="2026-04-17 17:25:55.58579544 +0000 UTC m=+125.111476059"
Apr 17 17:25:55.626677 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:55.626622 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-9dmmr" podStartSLOduration=1.936898984 podStartE2EDuration="5.626605026s" podCreationTimestamp="2026-04-17 17:25:50 +0000 UTC" firstStartedPulling="2026-04-17 17:25:51.147327558 +0000 UTC m=+120.673008155" lastFinishedPulling="2026-04-17 17:25:54.8370336 +0000 UTC m=+124.362714197" observedRunningTime="2026-04-17 17:25:55.624151907 +0000 UTC m=+125.149832527" watchObservedRunningTime="2026-04-17 17:25:55.626605026 +0000 UTC m=+125.152285646"
Apr 17 17:25:56.510632 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:56.510604 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctj42_a9e0a184-477b-45d6-835a-1606a973a5cf/console-operator/1.log"
Apr 17 17:25:56.511103 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:56.510983 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctj42_a9e0a184-477b-45d6-835a-1606a973a5cf/console-operator/0.log"
Apr 17 17:25:56.511103 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:56.511014 2575 generic.go:358] "Generic (PLEG): container finished" podID="a9e0a184-477b-45d6-835a-1606a973a5cf" containerID="c1a440c86559a5a524f9fea61137f15a601b86b4d351acf598a369909de64350" exitCode=255
Apr 17 17:25:56.511220 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:56.511140 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ctj42" event={"ID":"a9e0a184-477b-45d6-835a-1606a973a5cf","Type":"ContainerDied","Data":"c1a440c86559a5a524f9fea61137f15a601b86b4d351acf598a369909de64350"}
Apr 17 17:25:56.511220 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:56.511199 2575 scope.go:117] "RemoveContainer" containerID="3dd8e0bf64479e55fb67e91b12996570ddc0c95770beffc36fead859f43a87a6"
Apr 17 17:25:56.511408 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:56.511388 2575 scope.go:117] "RemoveContainer" containerID="c1a440c86559a5a524f9fea61137f15a601b86b4d351acf598a369909de64350"
Apr 17 17:25:56.511638 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:56.511617 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-ctj42_openshift-console-operator(a9e0a184-477b-45d6-835a-1606a973a5cf)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ctj42" podUID="a9e0a184-477b-45d6-835a-1606a973a5cf"
Apr 17 17:25:56.694255 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:56.694224 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-gbxth"]
Apr 17 17:25:56.696898 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:56.696882 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gbxth"
Apr 17 17:25:56.700342 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:56.700315 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-ltgmw\""
Apr 17 17:25:56.704729 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:56.704707 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6x2g\" (UniqueName: \"kubernetes.io/projected/931285f5-ffa4-47a1-9453-a80716fcd1e5-kube-api-access-k6x2g\") pod \"network-check-source-8894fc9bd-gbxth\" (UID: \"931285f5-ffa4-47a1-9453-a80716fcd1e5\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gbxth"
Apr 17 17:25:56.709096 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:56.709076 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-gbxth"]
Apr 17 17:25:56.805281 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:56.805211 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k6x2g\" (UniqueName: \"kubernetes.io/projected/931285f5-ffa4-47a1-9453-a80716fcd1e5-kube-api-access-k6x2g\") pod \"network-check-source-8894fc9bd-gbxth\" (UID: \"931285f5-ffa4-47a1-9453-a80716fcd1e5\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gbxth"
Apr 17 17:25:56.812570 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:56.812545 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6x2g\" (UniqueName: \"kubernetes.io/projected/931285f5-ffa4-47a1-9453-a80716fcd1e5-kube-api-access-k6x2g\") pod \"network-check-source-8894fc9bd-gbxth\" (UID: \"931285f5-ffa4-47a1-9453-a80716fcd1e5\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gbxth"
Apr 17 17:25:57.004596 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:57.004562 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gbxth"
Apr 17 17:25:57.120474 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:57.120446 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-gbxth"]
Apr 17 17:25:57.123638 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:25:57.123613 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod931285f5_ffa4_47a1_9453_a80716fcd1e5.slice/crio-1796329b8002b47df2e02a5b5dc0981e058193e81faa54350d1565a492fe3988 WatchSource:0}: Error finding container 1796329b8002b47df2e02a5b5dc0981e058193e81faa54350d1565a492fe3988: Status 404 returned error can't find the container with id 1796329b8002b47df2e02a5b5dc0981e058193e81faa54350d1565a492fe3988
Apr 17 17:25:57.515248 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:57.515222 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctj42_a9e0a184-477b-45d6-835a-1606a973a5cf/console-operator/1.log"
Apr 17 17:25:57.515715 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:57.515606 2575 scope.go:117] "RemoveContainer" containerID="c1a440c86559a5a524f9fea61137f15a601b86b4d351acf598a369909de64350"
Apr 17 17:25:57.515837 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:57.515807 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-ctj42_openshift-console-operator(a9e0a184-477b-45d6-835a-1606a973a5cf)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ctj42" podUID="a9e0a184-477b-45d6-835a-1606a973a5cf"
Apr 17 17:25:57.516641 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:57.516619 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gbxth" event={"ID":"931285f5-ffa4-47a1-9453-a80716fcd1e5","Type":"ContainerStarted","Data":"45bca523d45f9d8903a2d2793148d6797a46b3e5090555d3906ebd5c1f01c289"}
Apr 17 17:25:57.516707 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:57.516652 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gbxth" event={"ID":"931285f5-ffa4-47a1-9453-a80716fcd1e5","Type":"ContainerStarted","Data":"1796329b8002b47df2e02a5b5dc0981e058193e81faa54350d1565a492fe3988"}
Apr 17 17:25:57.556303 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:57.556254 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-gbxth" podStartSLOduration=1.556238703 podStartE2EDuration="1.556238703s" podCreationTimestamp="2026-04-17 17:25:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:25:57.556071936 +0000 UTC m=+127.081752547" watchObservedRunningTime="2026-04-17 17:25:57.556238703 +0000 UTC m=+127.081919323"
Apr 17 17:25:58.416870 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:58.416833 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3d9c3135-c5a6-4882-8da7-486b724f3469-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-hclv4\" (UID: \"3d9c3135-c5a6-4882-8da7-486b724f3469\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hclv4"
Apr 17 17:25:58.417066 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:58.416952 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 17:25:58.417066 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:58.417008 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d9c3135-c5a6-4882-8da7-486b724f3469-cluster-monitoring-operator-tls podName:3d9c3135-c5a6-4882-8da7-486b724f3469 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:06.416994036 +0000 UTC m=+135.942674633 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/3d9c3135-c5a6-4882-8da7-486b724f3469-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-hclv4" (UID: "3d9c3135-c5a6-4882-8da7-486b724f3469") : secret "cluster-monitoring-operator-tls" not found
Apr 17 17:25:58.517264 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:58.517235 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9a77eac-7075-491c-a2b9-080151e1cac9-service-ca-bundle\") pod \"router-default-6c56c99d76-p95sx\" (UID: \"e9a77eac-7075-491c-a2b9-080151e1cac9\") " pod="openshift-ingress/router-default-6c56c99d76-p95sx"
Apr 17 17:25:58.517625 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:58.517399 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e9a77eac-7075-491c-a2b9-080151e1cac9-service-ca-bundle podName:e9a77eac-7075-491c-a2b9-080151e1cac9 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:06.517381766 +0000 UTC m=+136.043062391 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e9a77eac-7075-491c-a2b9-080151e1cac9-service-ca-bundle") pod "router-default-6c56c99d76-p95sx" (UID: "e9a77eac-7075-491c-a2b9-080151e1cac9") : configmap references non-existent config key: service-ca.crt
Apr 17 17:25:58.617927 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:58.617894 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e9a77eac-7075-491c-a2b9-080151e1cac9-metrics-certs\") pod \"router-default-6c56c99d76-p95sx\" (UID: \"e9a77eac-7075-491c-a2b9-080151e1cac9\") " pod="openshift-ingress/router-default-6c56c99d76-p95sx"
Apr 17 17:25:58.618044 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:58.617976 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 17:25:58.618044 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:58.618031 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9a77eac-7075-491c-a2b9-080151e1cac9-metrics-certs podName:e9a77eac-7075-491c-a2b9-080151e1cac9 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:06.61801835 +0000 UTC m=+136.143698952 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e9a77eac-7075-491c-a2b9-080151e1cac9-metrics-certs") pod "router-default-6c56c99d76-p95sx" (UID: "e9a77eac-7075-491c-a2b9-080151e1cac9") : secret "router-metrics-certs-default" not found
Apr 17 17:25:59.006637 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:59.006610 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qd2qr_adfae8ba-8d02-4f3c-85a7-b2ae828b0579/dns-node-resolver/0.log"
Apr 17 17:25:59.794675 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:59.794649 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fxfbb_70cf02df-8c16-47aa-8b0e-1b1ee895fe07/node-ca/0.log"
Apr 17 17:25:59.827015 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:25:59.826990 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41eeea20-b1c0-4cb6-8da8-a4a26a60423d-metrics-certs\") pod \"network-metrics-daemon-mqlsd\" (UID: \"41eeea20-b1c0-4cb6-8da8-a4a26a60423d\") " pod="openshift-multus/network-metrics-daemon-mqlsd"
Apr 17 17:25:59.827139 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:59.827122 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 17:25:59.827203 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:25:59.827194 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41eeea20-b1c0-4cb6-8da8-a4a26a60423d-metrics-certs podName:41eeea20-b1c0-4cb6-8da8-a4a26a60423d nodeName:}" failed. No retries permitted until 2026-04-17 17:28:01.827178589 +0000 UTC m=+251.352859185 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41eeea20-b1c0-4cb6-8da8-a4a26a60423d-metrics-certs") pod "network-metrics-daemon-mqlsd" (UID: "41eeea20-b1c0-4cb6-8da8-a4a26a60423d") : secret "metrics-daemon-secret" not found
Apr 17 17:26:00.896053 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:00.896015 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-ctj42"
Apr 17 17:26:00.896053 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:00.896058 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-ctj42"
Apr 17 17:26:00.896458 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:00.896401 2575 scope.go:117] "RemoveContainer" containerID="c1a440c86559a5a524f9fea61137f15a601b86b4d351acf598a369909de64350"
Apr 17 17:26:00.896590 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:26:00.896572 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-ctj42_openshift-console-operator(a9e0a184-477b-45d6-835a-1606a973a5cf)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ctj42" podUID="a9e0a184-477b-45d6-835a-1606a973a5cf"
Apr 17 17:26:01.407603 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:01.407573 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-qrgfm_f3cccdeb-674a-4c4a-882c-679c52c9c0a9/kube-storage-version-migrator-operator/0.log"
Apr 17 17:26:06.481332 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:06.481297 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3d9c3135-c5a6-4882-8da7-486b724f3469-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-hclv4\" (UID: \"3d9c3135-c5a6-4882-8da7-486b724f3469\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hclv4"
Apr 17 17:26:06.481680 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:26:06.481433 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 17:26:06.481680 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:26:06.481505 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d9c3135-c5a6-4882-8da7-486b724f3469-cluster-monitoring-operator-tls podName:3d9c3135-c5a6-4882-8da7-486b724f3469 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:22.481490558 +0000 UTC m=+152.007171160 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/3d9c3135-c5a6-4882-8da7-486b724f3469-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-hclv4" (UID: "3d9c3135-c5a6-4882-8da7-486b724f3469") : secret "cluster-monitoring-operator-tls" not found
Apr 17 17:26:06.582407 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:06.582377 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9a77eac-7075-491c-a2b9-080151e1cac9-service-ca-bundle\") pod \"router-default-6c56c99d76-p95sx\" (UID: \"e9a77eac-7075-491c-a2b9-080151e1cac9\") " pod="openshift-ingress/router-default-6c56c99d76-p95sx"
Apr 17 17:26:06.582562 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:26:06.582548 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e9a77eac-7075-491c-a2b9-080151e1cac9-service-ca-bundle podName:e9a77eac-7075-491c-a2b9-080151e1cac9 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:22.582532646 +0000 UTC m=+152.108213260 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e9a77eac-7075-491c-a2b9-080151e1cac9-service-ca-bundle") pod "router-default-6c56c99d76-p95sx" (UID: "e9a77eac-7075-491c-a2b9-080151e1cac9") : configmap references non-existent config key: service-ca.crt
Apr 17 17:26:06.683136 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:06.683101 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e9a77eac-7075-491c-a2b9-080151e1cac9-metrics-certs\") pod \"router-default-6c56c99d76-p95sx\" (UID: \"e9a77eac-7075-491c-a2b9-080151e1cac9\") " pod="openshift-ingress/router-default-6c56c99d76-p95sx"
Apr 17 17:26:06.683281 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:26:06.683239 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 17:26:06.683317 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:26:06.683302 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9a77eac-7075-491c-a2b9-080151e1cac9-metrics-certs podName:e9a77eac-7075-491c-a2b9-080151e1cac9 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:22.683287078 +0000 UTC m=+152.208967674 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e9a77eac-7075-491c-a2b9-080151e1cac9-metrics-certs") pod "router-default-6c56c99d76-p95sx" (UID: "e9a77eac-7075-491c-a2b9-080151e1cac9") : secret "router-metrics-certs-default" not found Apr 17 17:26:12.089955 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:12.089924 2575 scope.go:117] "RemoveContainer" containerID="c1a440c86559a5a524f9fea61137f15a601b86b4d351acf598a369909de64350" Apr 17 17:26:12.556837 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:12.556794 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctj42_a9e0a184-477b-45d6-835a-1606a973a5cf/console-operator/1.log" Apr 17 17:26:12.556993 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:12.556884 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ctj42" event={"ID":"a9e0a184-477b-45d6-835a-1606a973a5cf","Type":"ContainerStarted","Data":"bd90777af2de46265ca52c105d202b35e657ad97624c569f7fd6f20d5509e0ca"} Apr 17 17:26:12.557171 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:12.557151 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-ctj42" Apr 17 17:26:12.573390 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:12.573277 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-ctj42" podStartSLOduration=18.768443391 podStartE2EDuration="22.57326493s" podCreationTimestamp="2026-04-17 17:25:50 +0000 UTC" firstStartedPulling="2026-04-17 17:25:51.036300937 +0000 UTC m=+120.561981537" lastFinishedPulling="2026-04-17 17:25:54.841122465 +0000 UTC m=+124.366803076" observedRunningTime="2026-04-17 17:26:12.573068995 +0000 UTC m=+142.098749615" watchObservedRunningTime="2026-04-17 17:26:12.57326493 +0000 UTC m=+142.098945549" Apr 17 
17:26:13.100686 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:13.100656 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-ctj42" Apr 17 17:26:20.528839 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.528789 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-5kx2x"] Apr 17 17:26:20.530790 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.530772 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-5kx2x" Apr 17 17:26:20.533619 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.533596 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-pks5m\"" Apr 17 17:26:20.533720 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.533600 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 17 17:26:20.533720 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.533634 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 17 17:26:20.541427 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.541408 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-5kx2x"] Apr 17 17:26:20.597874 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.597843 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/663900d4-af09-4c17-bf7a-cf4e08cc616a-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-5kx2x\" (UID: \"663900d4-af09-4c17-bf7a-cf4e08cc616a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5kx2x" Apr 17 
17:26:20.597995 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.597881 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/663900d4-af09-4c17-bf7a-cf4e08cc616a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-5kx2x\" (UID: \"663900d4-af09-4c17-bf7a-cf4e08cc616a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5kx2x" Apr 17 17:26:20.629458 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.629428 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-gp2f2"] Apr 17 17:26:20.631670 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.631646 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-js8tl"] Apr 17 17:26:20.631843 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.631809 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-gp2f2" Apr 17 17:26:20.633916 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.633898 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-js8tl" Apr 17 17:26:20.634354 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.634339 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 17:26:20.634886 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.634868 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 17:26:20.634972 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.634892 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-qw9f2\"" Apr 17 17:26:20.635846 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.635812 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 17:26:20.635934 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.635857 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 17:26:20.635934 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.635928 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-fcb89\"" Apr 17 17:26:20.654253 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.654228 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-gp2f2"] Apr 17 17:26:20.655403 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.655380 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-js8tl"] Apr 17 17:26:20.698427 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.698396 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/75caf319-8363-49ec-8384-f25af95fb9d1-crio-socket\") pod \"insights-runtime-extractor-js8tl\" (UID: \"75caf319-8363-49ec-8384-f25af95fb9d1\") " pod="openshift-insights/insights-runtime-extractor-js8tl" Apr 17 17:26:20.698564 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.698443 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8gtq\" (UniqueName: \"kubernetes.io/projected/6e03d97b-c9c0-40c2-bba9-99c7c0c1b298-kube-api-access-v8gtq\") pod \"downloads-6bcc868b7-gp2f2\" (UID: \"6e03d97b-c9c0-40c2-bba9-99c7c0c1b298\") " pod="openshift-console/downloads-6bcc868b7-gp2f2" Apr 17 17:26:20.698564 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.698470 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/663900d4-af09-4c17-bf7a-cf4e08cc616a-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-5kx2x\" (UID: \"663900d4-af09-4c17-bf7a-cf4e08cc616a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5kx2x" Apr 17 17:26:20.698564 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.698508 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/663900d4-af09-4c17-bf7a-cf4e08cc616a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-5kx2x\" (UID: \"663900d4-af09-4c17-bf7a-cf4e08cc616a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5kx2x" Apr 17 17:26:20.698661 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.698571 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/75caf319-8363-49ec-8384-f25af95fb9d1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-js8tl\" (UID: \"75caf319-8363-49ec-8384-f25af95fb9d1\") 
" pod="openshift-insights/insights-runtime-extractor-js8tl" Apr 17 17:26:20.698661 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.698592 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/75caf319-8363-49ec-8384-f25af95fb9d1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-js8tl\" (UID: \"75caf319-8363-49ec-8384-f25af95fb9d1\") " pod="openshift-insights/insights-runtime-extractor-js8tl" Apr 17 17:26:20.698661 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.698613 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlh5m\" (UniqueName: \"kubernetes.io/projected/75caf319-8363-49ec-8384-f25af95fb9d1-kube-api-access-rlh5m\") pod \"insights-runtime-extractor-js8tl\" (UID: \"75caf319-8363-49ec-8384-f25af95fb9d1\") " pod="openshift-insights/insights-runtime-extractor-js8tl" Apr 17 17:26:20.698661 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.698650 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/75caf319-8363-49ec-8384-f25af95fb9d1-data-volume\") pod \"insights-runtime-extractor-js8tl\" (UID: \"75caf319-8363-49ec-8384-f25af95fb9d1\") " pod="openshift-insights/insights-runtime-extractor-js8tl" Apr 17 17:26:20.699119 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.699094 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/663900d4-af09-4c17-bf7a-cf4e08cc616a-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-5kx2x\" (UID: \"663900d4-af09-4c17-bf7a-cf4e08cc616a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5kx2x" Apr 17 17:26:20.700882 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.700865 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/663900d4-af09-4c17-bf7a-cf4e08cc616a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-5kx2x\" (UID: \"663900d4-af09-4c17-bf7a-cf4e08cc616a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-5kx2x" Apr 17 17:26:20.799296 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.799203 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/75caf319-8363-49ec-8384-f25af95fb9d1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-js8tl\" (UID: \"75caf319-8363-49ec-8384-f25af95fb9d1\") " pod="openshift-insights/insights-runtime-extractor-js8tl" Apr 17 17:26:20.799296 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.799244 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/75caf319-8363-49ec-8384-f25af95fb9d1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-js8tl\" (UID: \"75caf319-8363-49ec-8384-f25af95fb9d1\") " pod="openshift-insights/insights-runtime-extractor-js8tl" Apr 17 17:26:20.799296 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.799261 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rlh5m\" (UniqueName: \"kubernetes.io/projected/75caf319-8363-49ec-8384-f25af95fb9d1-kube-api-access-rlh5m\") pod \"insights-runtime-extractor-js8tl\" (UID: \"75caf319-8363-49ec-8384-f25af95fb9d1\") " pod="openshift-insights/insights-runtime-extractor-js8tl" Apr 17 17:26:20.799296 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.799290 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/75caf319-8363-49ec-8384-f25af95fb9d1-data-volume\") pod \"insights-runtime-extractor-js8tl\" (UID: \"75caf319-8363-49ec-8384-f25af95fb9d1\") " 
pod="openshift-insights/insights-runtime-extractor-js8tl" Apr 17 17:26:20.799603 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.799499 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/75caf319-8363-49ec-8384-f25af95fb9d1-crio-socket\") pod \"insights-runtime-extractor-js8tl\" (UID: \"75caf319-8363-49ec-8384-f25af95fb9d1\") " pod="openshift-insights/insights-runtime-extractor-js8tl" Apr 17 17:26:20.799603 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.799568 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8gtq\" (UniqueName: \"kubernetes.io/projected/6e03d97b-c9c0-40c2-bba9-99c7c0c1b298-kube-api-access-v8gtq\") pod \"downloads-6bcc868b7-gp2f2\" (UID: \"6e03d97b-c9c0-40c2-bba9-99c7c0c1b298\") " pod="openshift-console/downloads-6bcc868b7-gp2f2" Apr 17 17:26:20.799603 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.799593 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/75caf319-8363-49ec-8384-f25af95fb9d1-data-volume\") pod \"insights-runtime-extractor-js8tl\" (UID: \"75caf319-8363-49ec-8384-f25af95fb9d1\") " pod="openshift-insights/insights-runtime-extractor-js8tl" Apr 17 17:26:20.799743 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.799617 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/75caf319-8363-49ec-8384-f25af95fb9d1-crio-socket\") pod \"insights-runtime-extractor-js8tl\" (UID: \"75caf319-8363-49ec-8384-f25af95fb9d1\") " pod="openshift-insights/insights-runtime-extractor-js8tl" Apr 17 17:26:20.799743 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.799728 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/75caf319-8363-49ec-8384-f25af95fb9d1-kube-rbac-proxy-cm\") pod 
\"insights-runtime-extractor-js8tl\" (UID: \"75caf319-8363-49ec-8384-f25af95fb9d1\") " pod="openshift-insights/insights-runtime-extractor-js8tl" Apr 17 17:26:20.801646 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.801615 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/75caf319-8363-49ec-8384-f25af95fb9d1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-js8tl\" (UID: \"75caf319-8363-49ec-8384-f25af95fb9d1\") " pod="openshift-insights/insights-runtime-extractor-js8tl" Apr 17 17:26:20.808222 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.808197 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlh5m\" (UniqueName: \"kubernetes.io/projected/75caf319-8363-49ec-8384-f25af95fb9d1-kube-api-access-rlh5m\") pod \"insights-runtime-extractor-js8tl\" (UID: \"75caf319-8363-49ec-8384-f25af95fb9d1\") " pod="openshift-insights/insights-runtime-extractor-js8tl" Apr 17 17:26:20.810330 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.810312 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8gtq\" (UniqueName: \"kubernetes.io/projected/6e03d97b-c9c0-40c2-bba9-99c7c0c1b298-kube-api-access-v8gtq\") pod \"downloads-6bcc868b7-gp2f2\" (UID: \"6e03d97b-c9c0-40c2-bba9-99c7c0c1b298\") " pod="openshift-console/downloads-6bcc868b7-gp2f2" Apr 17 17:26:20.839741 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.839715 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-5kx2x" Apr 17 17:26:20.941326 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.941299 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-gp2f2" Apr 17 17:26:20.947029 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.947009 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-js8tl" Apr 17 17:26:20.962737 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:20.962717 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-5kx2x"] Apr 17 17:26:20.965639 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:26:20.965607 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod663900d4_af09_4c17_bf7a_cf4e08cc616a.slice/crio-27a04aceb0f3e1ac8cbe1d4eb81c10dd72cc864d4d2c0bb04cba1d3b1a7a653a WatchSource:0}: Error finding container 27a04aceb0f3e1ac8cbe1d4eb81c10dd72cc864d4d2c0bb04cba1d3b1a7a653a: Status 404 returned error can't find the container with id 27a04aceb0f3e1ac8cbe1d4eb81c10dd72cc864d4d2c0bb04cba1d3b1a7a653a Apr 17 17:26:21.089331 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:21.089301 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-gp2f2"] Apr 17 17:26:21.093474 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:26:21.093450 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e03d97b_c9c0_40c2_bba9_99c7c0c1b298.slice/crio-c02d0fbbf575a1b5a838d12842382fdd09b057382d5c1255d944bc8231ad603d WatchSource:0}: Error finding container c02d0fbbf575a1b5a838d12842382fdd09b057382d5c1255d944bc8231ad603d: Status 404 returned error can't find the container with id c02d0fbbf575a1b5a838d12842382fdd09b057382d5c1255d944bc8231ad603d Apr 17 17:26:21.106211 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:21.106189 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-js8tl"] 
Apr 17 17:26:21.109237 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:26:21.109215 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75caf319_8363_49ec_8384_f25af95fb9d1.slice/crio-5ce3f84d897ba8e82276872312eb25fb2fdfb26bce9b64e052b363b7dc421093 WatchSource:0}: Error finding container 5ce3f84d897ba8e82276872312eb25fb2fdfb26bce9b64e052b363b7dc421093: Status 404 returned error can't find the container with id 5ce3f84d897ba8e82276872312eb25fb2fdfb26bce9b64e052b363b7dc421093 Apr 17 17:26:21.582728 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:21.582685 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-gp2f2" event={"ID":"6e03d97b-c9c0-40c2-bba9-99c7c0c1b298","Type":"ContainerStarted","Data":"c02d0fbbf575a1b5a838d12842382fdd09b057382d5c1255d944bc8231ad603d"} Apr 17 17:26:21.584365 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:21.584328 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-js8tl" event={"ID":"75caf319-8363-49ec-8384-f25af95fb9d1","Type":"ContainerStarted","Data":"64d8f439759a716ac024f47b54f2745513610a7c38502f43b1f07b6e79a41530"} Apr 17 17:26:21.584365 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:21.584364 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-js8tl" event={"ID":"75caf319-8363-49ec-8384-f25af95fb9d1","Type":"ContainerStarted","Data":"5ce3f84d897ba8e82276872312eb25fb2fdfb26bce9b64e052b363b7dc421093"} Apr 17 17:26:21.585445 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:21.585414 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-5kx2x" event={"ID":"663900d4-af09-4c17-bf7a-cf4e08cc616a","Type":"ContainerStarted","Data":"27a04aceb0f3e1ac8cbe1d4eb81c10dd72cc864d4d2c0bb04cba1d3b1a7a653a"} Apr 17 17:26:22.519783 ip-10-0-140-33 
kubenswrapper[2575]: I0417 17:26:22.519713 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3d9c3135-c5a6-4882-8da7-486b724f3469-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-hclv4\" (UID: \"3d9c3135-c5a6-4882-8da7-486b724f3469\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hclv4" Apr 17 17:26:22.522766 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:22.522736 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3d9c3135-c5a6-4882-8da7-486b724f3469-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-hclv4\" (UID: \"3d9c3135-c5a6-4882-8da7-486b724f3469\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hclv4" Apr 17 17:26:22.590762 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:22.590719 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-js8tl" event={"ID":"75caf319-8363-49ec-8384-f25af95fb9d1","Type":"ContainerStarted","Data":"e879129616810ebff15c228f9cc123db1974ed090ff193cc254201a22db80dfd"} Apr 17 17:26:22.592485 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:22.592450 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-5kx2x" event={"ID":"663900d4-af09-4c17-bf7a-cf4e08cc616a","Type":"ContainerStarted","Data":"92fbbfcc59431020713a83310f5ed7598cdf66ec3efa101ce594b8283fc34826"} Apr 17 17:26:22.608773 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:22.608318 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-5kx2x" podStartSLOduration=1.57923138 podStartE2EDuration="2.608300288s" podCreationTimestamp="2026-04-17 17:26:20 +0000 UTC" 
firstStartedPulling="2026-04-17 17:26:20.967420463 +0000 UTC m=+150.493101063" lastFinishedPulling="2026-04-17 17:26:21.996489367 +0000 UTC m=+151.522169971" observedRunningTime="2026-04-17 17:26:22.607486437 +0000 UTC m=+152.133167058" watchObservedRunningTime="2026-04-17 17:26:22.608300288 +0000 UTC m=+152.133980908" Apr 17 17:26:22.621580 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:22.621215 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9a77eac-7075-491c-a2b9-080151e1cac9-service-ca-bundle\") pod \"router-default-6c56c99d76-p95sx\" (UID: \"e9a77eac-7075-491c-a2b9-080151e1cac9\") " pod="openshift-ingress/router-default-6c56c99d76-p95sx" Apr 17 17:26:22.622053 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:22.622028 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9a77eac-7075-491c-a2b9-080151e1cac9-service-ca-bundle\") pod \"router-default-6c56c99d76-p95sx\" (UID: \"e9a77eac-7075-491c-a2b9-080151e1cac9\") " pod="openshift-ingress/router-default-6c56c99d76-p95sx" Apr 17 17:26:22.691938 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:22.691870 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-6dpg5\"" Apr 17 17:26:22.699860 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:22.699812 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hclv4" Apr 17 17:26:22.722073 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:22.722039 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e9a77eac-7075-491c-a2b9-080151e1cac9-metrics-certs\") pod \"router-default-6c56c99d76-p95sx\" (UID: \"e9a77eac-7075-491c-a2b9-080151e1cac9\") " pod="openshift-ingress/router-default-6c56c99d76-p95sx" Apr 17 17:26:22.724858 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:22.724793 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e9a77eac-7075-491c-a2b9-080151e1cac9-metrics-certs\") pod \"router-default-6c56c99d76-p95sx\" (UID: \"e9a77eac-7075-491c-a2b9-080151e1cac9\") " pod="openshift-ingress/router-default-6c56c99d76-p95sx" Apr 17 17:26:22.785000 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:22.784648 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-hndhk\"" Apr 17 17:26:22.792409 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:22.792384 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-6c56c99d76-p95sx"
Apr 17 17:26:22.839212 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:22.839186 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-hclv4"]
Apr 17 17:26:23.199510 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:26:23.199436 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d9c3135_c5a6_4882_8da7_486b724f3469.slice/crio-ee1fd8fc0c7475ed4bb062692d7bdc6d7c301ba7a16b3fd39d8b8ad242c9b75f WatchSource:0}: Error finding container ee1fd8fc0c7475ed4bb062692d7bdc6d7c301ba7a16b3fd39d8b8ad242c9b75f: Status 404 returned error can't find the container with id ee1fd8fc0c7475ed4bb062692d7bdc6d7c301ba7a16b3fd39d8b8ad242c9b75f
Apr 17 17:26:23.334075 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:23.334050 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-6c56c99d76-p95sx"]
Apr 17 17:26:23.338298 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:26:23.338271 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9a77eac_7075_491c_a2b9_080151e1cac9.slice/crio-3b3a41cda3e291aadd1bb975b0e741470bcb770b7661d8608098ea9f647a95dc WatchSource:0}: Error finding container 3b3a41cda3e291aadd1bb975b0e741470bcb770b7661d8608098ea9f647a95dc: Status 404 returned error can't find the container with id 3b3a41cda3e291aadd1bb975b0e741470bcb770b7661d8608098ea9f647a95dc
Apr 17 17:26:23.597343 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:23.597306 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-js8tl" event={"ID":"75caf319-8363-49ec-8384-f25af95fb9d1","Type":"ContainerStarted","Data":"673f05af497ec016114f8fb5645c11a2fb4d16943fcf931cd2bc86712c1fb4c5"}
Apr 17 17:26:23.598692 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:23.598663 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-6c56c99d76-p95sx" event={"ID":"e9a77eac-7075-491c-a2b9-080151e1cac9","Type":"ContainerStarted","Data":"bfa9de0a47bf32e829b4b1cb3e9550a15bb18469bf6bc743762d4d5a8203e5a1"}
Apr 17 17:26:23.598818 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:23.598699 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-6c56c99d76-p95sx" event={"ID":"e9a77eac-7075-491c-a2b9-080151e1cac9","Type":"ContainerStarted","Data":"3b3a41cda3e291aadd1bb975b0e741470bcb770b7661d8608098ea9f647a95dc"}
Apr 17 17:26:23.599885 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:23.599858 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hclv4" event={"ID":"3d9c3135-c5a6-4882-8da7-486b724f3469","Type":"ContainerStarted","Data":"ee1fd8fc0c7475ed4bb062692d7bdc6d7c301ba7a16b3fd39d8b8ad242c9b75f"}
Apr 17 17:26:23.614752 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:23.614554 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-js8tl" podStartSLOduration=1.5459230609999999 podStartE2EDuration="3.614536623s" podCreationTimestamp="2026-04-17 17:26:20 +0000 UTC" firstStartedPulling="2026-04-17 17:26:21.163648261 +0000 UTC m=+150.689328861" lastFinishedPulling="2026-04-17 17:26:23.232261819 +0000 UTC m=+152.757942423" observedRunningTime="2026-04-17 17:26:23.613853152 +0000 UTC m=+153.139533772" watchObservedRunningTime="2026-04-17 17:26:23.614536623 +0000 UTC m=+153.140217245"
Apr 17 17:26:23.635574 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:23.635532 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-6c56c99d76-p95sx" podStartSLOduration=33.635521001 podStartE2EDuration="33.635521001s" podCreationTimestamp="2026-04-17 17:25:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:26:23.634841777 +0000 UTC m=+153.160522392" watchObservedRunningTime="2026-04-17 17:26:23.635521001 +0000 UTC m=+153.161201619"
Apr 17 17:26:23.793601 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:23.793564 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-6c56c99d76-p95sx"
Apr 17 17:26:23.796653 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:23.796628 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-6c56c99d76-p95sx"
Apr 17 17:26:24.604181 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:24.603720 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-6c56c99d76-p95sx"
Apr 17 17:26:24.607271 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:24.605739 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-6c56c99d76-p95sx"
Apr 17 17:26:25.608038 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:25.608000 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hclv4" event={"ID":"3d9c3135-c5a6-4882-8da7-486b724f3469","Type":"ContainerStarted","Data":"6e6737ecb2deca048fefb7d808544c83d0c92c7260fbee3d664e6d4e3d70675c"}
Apr 17 17:26:25.625813 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:25.625765 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-hclv4" podStartSLOduration=33.803462761 podStartE2EDuration="35.625748379s" podCreationTimestamp="2026-04-17 17:25:50 +0000 UTC" firstStartedPulling="2026-04-17 17:26:23.20197894 +0000 UTC m=+152.727659539" lastFinishedPulling="2026-04-17 17:26:25.024264555 +0000 UTC m=+154.549945157" observedRunningTime="2026-04-17 17:26:25.624346045 +0000 UTC m=+155.150026663" watchObservedRunningTime="2026-04-17 17:26:25.625748379 +0000 UTC m=+155.151429001"
Apr 17 17:26:26.879596 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:26:26.879521 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" podUID="5dd7aab7-9b6e-47df-9f4f-c7e5041066d0"
Apr 17 17:26:26.896669 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:26:26.896631 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-wqv6n" podUID="cb895382-4679-4cac-97c0-92e3122b7ba0"
Apr 17 17:26:26.968244 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:26:26.968198 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-6r887" podUID="2d29d8c9-146b-4e7e-988e-c3d984ff39e7"
Apr 17 17:26:27.613302 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:27.613270 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7788f749ff-bpwmb"
Apr 17 17:26:28.110216 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:26:28.110169 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-mqlsd" podUID="41eeea20-b1c0-4cb6-8da8-a4a26a60423d"
Apr 17 17:26:28.604701 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:28.604671 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-m7vkk"]
Apr 17 17:26:28.630672 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:28.630642 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-m7vkk"]
Apr 17 17:26:28.631050 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:28.630964 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-m7vkk"
Apr 17 17:26:28.633542 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:28.633512 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 17 17:26:28.633542 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:28.633531 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 17:26:28.633993 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:28.633803 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 17 17:26:28.633993 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:28.633895 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-9kpvv\""
Apr 17 17:26:28.780460 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:28.780415 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/15d3f0c8-e1a6-41dc-b07d-e2ec68fa4a9c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-m7vkk\" (UID: \"15d3f0c8-e1a6-41dc-b07d-e2ec68fa4a9c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m7vkk"
Apr 17 17:26:28.780695 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:28.780478 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp7vp\" (UniqueName: \"kubernetes.io/projected/15d3f0c8-e1a6-41dc-b07d-e2ec68fa4a9c-kube-api-access-wp7vp\") pod \"prometheus-operator-5676c8c784-m7vkk\" (UID: \"15d3f0c8-e1a6-41dc-b07d-e2ec68fa4a9c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m7vkk"
Apr 17 17:26:28.780695 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:28.780507 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/15d3f0c8-e1a6-41dc-b07d-e2ec68fa4a9c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-m7vkk\" (UID: \"15d3f0c8-e1a6-41dc-b07d-e2ec68fa4a9c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m7vkk"
Apr 17 17:26:28.780695 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:28.780626 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/15d3f0c8-e1a6-41dc-b07d-e2ec68fa4a9c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-m7vkk\" (UID: \"15d3f0c8-e1a6-41dc-b07d-e2ec68fa4a9c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m7vkk"
Apr 17 17:26:28.881728 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:28.881646 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/15d3f0c8-e1a6-41dc-b07d-e2ec68fa4a9c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-m7vkk\" (UID: \"15d3f0c8-e1a6-41dc-b07d-e2ec68fa4a9c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m7vkk"
Apr 17 17:26:28.881728 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:28.881710 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wp7vp\" (UniqueName: \"kubernetes.io/projected/15d3f0c8-e1a6-41dc-b07d-e2ec68fa4a9c-kube-api-access-wp7vp\") pod \"prometheus-operator-5676c8c784-m7vkk\" (UID: \"15d3f0c8-e1a6-41dc-b07d-e2ec68fa4a9c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m7vkk"
Apr 17 17:26:28.881978 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:28.881743 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/15d3f0c8-e1a6-41dc-b07d-e2ec68fa4a9c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-m7vkk\" (UID: \"15d3f0c8-e1a6-41dc-b07d-e2ec68fa4a9c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m7vkk"
Apr 17 17:26:28.881978 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:28.881902 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/15d3f0c8-e1a6-41dc-b07d-e2ec68fa4a9c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-m7vkk\" (UID: \"15d3f0c8-e1a6-41dc-b07d-e2ec68fa4a9c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m7vkk"
Apr 17 17:26:28.882481 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:28.882421 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/15d3f0c8-e1a6-41dc-b07d-e2ec68fa4a9c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-m7vkk\" (UID: \"15d3f0c8-e1a6-41dc-b07d-e2ec68fa4a9c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m7vkk"
Apr 17 17:26:28.895219 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:28.895189 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/15d3f0c8-e1a6-41dc-b07d-e2ec68fa4a9c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-m7vkk\" (UID: \"15d3f0c8-e1a6-41dc-b07d-e2ec68fa4a9c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m7vkk"
Apr 17 17:26:28.895219 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:28.895207 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/15d3f0c8-e1a6-41dc-b07d-e2ec68fa4a9c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-m7vkk\" (UID: \"15d3f0c8-e1a6-41dc-b07d-e2ec68fa4a9c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m7vkk"
Apr 17 17:26:28.897040 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:28.897018 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp7vp\" (UniqueName: \"kubernetes.io/projected/15d3f0c8-e1a6-41dc-b07d-e2ec68fa4a9c-kube-api-access-wp7vp\") pod \"prometheus-operator-5676c8c784-m7vkk\" (UID: \"15d3f0c8-e1a6-41dc-b07d-e2ec68fa4a9c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m7vkk"
Apr 17 17:26:28.943103 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:28.943077 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-m7vkk"
Apr 17 17:26:29.085542 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:29.085507 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-m7vkk"]
Apr 17 17:26:29.088921 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:26:29.088885 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15d3f0c8_e1a6_41dc_b07d_e2ec68fa4a9c.slice/crio-d74cdf649a950dff4035ff52c8e6e35456786eb8b5a55c84dd99db11ff52d8b5 WatchSource:0}: Error finding container d74cdf649a950dff4035ff52c8e6e35456786eb8b5a55c84dd99db11ff52d8b5: Status 404 returned error can't find the container with id d74cdf649a950dff4035ff52c8e6e35456786eb8b5a55c84dd99db11ff52d8b5
Apr 17 17:26:29.620326 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:29.620276 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-m7vkk" event={"ID":"15d3f0c8-e1a6-41dc-b07d-e2ec68fa4a9c","Type":"ContainerStarted","Data":"d74cdf649a950dff4035ff52c8e6e35456786eb8b5a55c84dd99db11ff52d8b5"}
Apr 17 17:26:31.807980 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:31.807941 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb895382-4679-4cac-97c0-92e3122b7ba0-metrics-tls\") pod \"dns-default-wqv6n\" (UID: \"cb895382-4679-4cac-97c0-92e3122b7ba0\") " pod="openshift-dns/dns-default-wqv6n"
Apr 17 17:26:31.808449 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:31.808039 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d29d8c9-146b-4e7e-988e-c3d984ff39e7-cert\") pod \"ingress-canary-6r887\" (UID: \"2d29d8c9-146b-4e7e-988e-c3d984ff39e7\") " pod="openshift-ingress-canary/ingress-canary-6r887"
Apr 17 17:26:31.808449 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:31.808064 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-tls\") pod \"image-registry-7788f749ff-bpwmb\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") " pod="openshift-image-registry/image-registry-7788f749ff-bpwmb"
Apr 17 17:26:31.810623 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:31.810601 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb895382-4679-4cac-97c0-92e3122b7ba0-metrics-tls\") pod \"dns-default-wqv6n\" (UID: \"cb895382-4679-4cac-97c0-92e3122b7ba0\") " pod="openshift-dns/dns-default-wqv6n"
Apr 17 17:26:31.810942 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:31.810922 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-tls\") pod \"image-registry-7788f749ff-bpwmb\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") " pod="openshift-image-registry/image-registry-7788f749ff-bpwmb"
Apr 17 17:26:31.811001 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:31.810921 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d29d8c9-146b-4e7e-988e-c3d984ff39e7-cert\") pod \"ingress-canary-6r887\" (UID: \"2d29d8c9-146b-4e7e-988e-c3d984ff39e7\") " pod="openshift-ingress-canary/ingress-canary-6r887"
Apr 17 17:26:31.815895 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:31.815876 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-jk79w\""
Apr 17 17:26:31.823880 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:31.823861 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7788f749ff-bpwmb"
Apr 17 17:26:37.211543 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:37.211513 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7788f749ff-bpwmb"]
Apr 17 17:26:37.214601 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:26:37.214571 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dd7aab7_9b6e_47df_9f4f_c7e5041066d0.slice/crio-388d8a0690ca17906d435059d0950eff2fb749013a4e36b0f6bbedae8333fecd WatchSource:0}: Error finding container 388d8a0690ca17906d435059d0950eff2fb749013a4e36b0f6bbedae8333fecd: Status 404 returned error can't find the container with id 388d8a0690ca17906d435059d0950eff2fb749013a4e36b0f6bbedae8333fecd
Apr 17 17:26:37.644851 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:37.644787 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-m7vkk" event={"ID":"15d3f0c8-e1a6-41dc-b07d-e2ec68fa4a9c","Type":"ContainerStarted","Data":"6e85369a974dd4361da8b2c901812fae241c2a10f4687ce0f416eacf41919205"}
Apr 17 17:26:37.644851 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:37.644847 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-m7vkk" event={"ID":"15d3f0c8-e1a6-41dc-b07d-e2ec68fa4a9c","Type":"ContainerStarted","Data":"39a7fca890ce92580f99724ae1b8ccdc20c1123565cf15c1e8f26251c1c98b99"}
Apr 17 17:26:37.646493 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:37.646468 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-gp2f2" event={"ID":"6e03d97b-c9c0-40c2-bba9-99c7c0c1b298","Type":"ContainerStarted","Data":"cfa4cfdad58193e8aaeb4a6d84909453e449c8f34857dde3472f8dd21e6144f8"}
Apr 17 17:26:37.646754 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:37.646730 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-gp2f2"
Apr 17 17:26:37.648231 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:37.648203 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" event={"ID":"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0","Type":"ContainerStarted","Data":"ba532f764adf55077f1a8df8d63997fae72c314ecae9ce8da098ed4b770ec288"}
Apr 17 17:26:37.648339 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:37.648240 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" event={"ID":"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0","Type":"ContainerStarted","Data":"388d8a0690ca17906d435059d0950eff2fb749013a4e36b0f6bbedae8333fecd"}
Apr 17 17:26:37.648339 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:37.648312 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7788f749ff-bpwmb"
Apr 17 17:26:37.652765 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:37.652743 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-gp2f2"
Apr 17 17:26:37.662910 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:37.662869 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-m7vkk" podStartSLOduration=1.673679102 podStartE2EDuration="9.662855679s" podCreationTimestamp="2026-04-17 17:26:28 +0000 UTC" firstStartedPulling="2026-04-17 17:26:29.090974424 +0000 UTC m=+158.616655021" lastFinishedPulling="2026-04-17 17:26:37.080151 +0000 UTC m=+166.605831598" observedRunningTime="2026-04-17 17:26:37.662341501 +0000 UTC m=+167.188022142" watchObservedRunningTime="2026-04-17 17:26:37.662855679 +0000 UTC m=+167.188536300"
Apr 17 17:26:37.683113 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:37.683070 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" podStartSLOduration=166.683055516 podStartE2EDuration="2m46.683055516s" podCreationTimestamp="2026-04-17 17:23:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:26:37.681742626 +0000 UTC m=+167.207423259" watchObservedRunningTime="2026-04-17 17:26:37.683055516 +0000 UTC m=+167.208736138"
Apr 17 17:26:37.698055 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:37.698012 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-gp2f2" podStartSLOduration=1.663518029 podStartE2EDuration="17.697997782s" podCreationTimestamp="2026-04-17 17:26:20 +0000 UTC" firstStartedPulling="2026-04-17 17:26:21.095318621 +0000 UTC m=+150.620999217" lastFinishedPulling="2026-04-17 17:26:37.129798356 +0000 UTC m=+166.655478970" observedRunningTime="2026-04-17 17:26:37.696881354 +0000 UTC m=+167.222561978" watchObservedRunningTime="2026-04-17 17:26:37.697997782 +0000 UTC m=+167.223678400"
Apr 17 17:26:39.090370 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:39.090315 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqlsd"
Apr 17 17:26:39.985387 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:39.985354 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-x2x5j"]
Apr 17 17:26:40.021051 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.021021 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-x2x5j"
Apr 17 17:26:40.023176 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.023080 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 17:26:40.023176 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.023085 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 17:26:40.023176 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.023163 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-s79cd\""
Apr 17 17:26:40.023464 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.023447 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 17:26:40.089722 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.089695 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wqv6n"
Apr 17 17:26:40.091995 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.091970 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qbw75\""
Apr 17 17:26:40.100784 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.100758 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wqv6n"
Apr 17 17:26:40.185012 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.184970 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjppp\" (UniqueName: \"kubernetes.io/projected/3af1ae5b-3d27-42ae-84c6-15885f57a08c-kube-api-access-sjppp\") pod \"node-exporter-x2x5j\" (UID: \"3af1ae5b-3d27-42ae-84c6-15885f57a08c\") " pod="openshift-monitoring/node-exporter-x2x5j"
Apr 17 17:26:40.185159 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.185017 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3af1ae5b-3d27-42ae-84c6-15885f57a08c-sys\") pod \"node-exporter-x2x5j\" (UID: \"3af1ae5b-3d27-42ae-84c6-15885f57a08c\") " pod="openshift-monitoring/node-exporter-x2x5j"
Apr 17 17:26:40.185225 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.185152 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3af1ae5b-3d27-42ae-84c6-15885f57a08c-node-exporter-accelerators-collector-config\") pod \"node-exporter-x2x5j\" (UID: \"3af1ae5b-3d27-42ae-84c6-15885f57a08c\") " pod="openshift-monitoring/node-exporter-x2x5j"
Apr 17 17:26:40.185225 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.185200 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3af1ae5b-3d27-42ae-84c6-15885f57a08c-root\") pod \"node-exporter-x2x5j\" (UID: \"3af1ae5b-3d27-42ae-84c6-15885f57a08c\") " pod="openshift-monitoring/node-exporter-x2x5j"
Apr 17 17:26:40.185331 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.185242 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3af1ae5b-3d27-42ae-84c6-15885f57a08c-node-exporter-tls\") pod \"node-exporter-x2x5j\" (UID: \"3af1ae5b-3d27-42ae-84c6-15885f57a08c\") " pod="openshift-monitoring/node-exporter-x2x5j"
Apr 17 17:26:40.185331 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.185323 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3af1ae5b-3d27-42ae-84c6-15885f57a08c-metrics-client-ca\") pod \"node-exporter-x2x5j\" (UID: \"3af1ae5b-3d27-42ae-84c6-15885f57a08c\") " pod="openshift-monitoring/node-exporter-x2x5j"
Apr 17 17:26:40.185423 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.185356 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3af1ae5b-3d27-42ae-84c6-15885f57a08c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-x2x5j\" (UID: \"3af1ae5b-3d27-42ae-84c6-15885f57a08c\") " pod="openshift-monitoring/node-exporter-x2x5j"
Apr 17 17:26:40.185476 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.185455 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3af1ae5b-3d27-42ae-84c6-15885f57a08c-node-exporter-textfile\") pod \"node-exporter-x2x5j\" (UID: \"3af1ae5b-3d27-42ae-84c6-15885f57a08c\") " pod="openshift-monitoring/node-exporter-x2x5j"
Apr 17 17:26:40.185514 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.185485 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3af1ae5b-3d27-42ae-84c6-15885f57a08c-node-exporter-wtmp\") pod \"node-exporter-x2x5j\" (UID: \"3af1ae5b-3d27-42ae-84c6-15885f57a08c\") " pod="openshift-monitoring/node-exporter-x2x5j"
Apr 17 17:26:40.238089 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.238011 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wqv6n"]
Apr 17 17:26:40.243225 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:26:40.243194 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb895382_4679_4cac_97c0_92e3122b7ba0.slice/crio-e43699deb871ebba79eb558985341414a2d2f6f84e137dacb8adb44f1342eb1a WatchSource:0}: Error finding container e43699deb871ebba79eb558985341414a2d2f6f84e137dacb8adb44f1342eb1a: Status 404 returned error can't find the container with id e43699deb871ebba79eb558985341414a2d2f6f84e137dacb8adb44f1342eb1a
Apr 17 17:26:40.286597 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.286569 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3af1ae5b-3d27-42ae-84c6-15885f57a08c-node-exporter-accelerators-collector-config\") pod \"node-exporter-x2x5j\" (UID: \"3af1ae5b-3d27-42ae-84c6-15885f57a08c\") " pod="openshift-monitoring/node-exporter-x2x5j"
Apr 17 17:26:40.286688 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.286612 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3af1ae5b-3d27-42ae-84c6-15885f57a08c-root\") pod \"node-exporter-x2x5j\" (UID: \"3af1ae5b-3d27-42ae-84c6-15885f57a08c\") " pod="openshift-monitoring/node-exporter-x2x5j"
Apr 17 17:26:40.286688 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.286647 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3af1ae5b-3d27-42ae-84c6-15885f57a08c-node-exporter-tls\") pod \"node-exporter-x2x5j\" (UID: \"3af1ae5b-3d27-42ae-84c6-15885f57a08c\") " pod="openshift-monitoring/node-exporter-x2x5j"
Apr 17 17:26:40.286688 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.286683 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3af1ae5b-3d27-42ae-84c6-15885f57a08c-metrics-client-ca\") pod \"node-exporter-x2x5j\" (UID: \"3af1ae5b-3d27-42ae-84c6-15885f57a08c\") " pod="openshift-monitoring/node-exporter-x2x5j"
Apr 17 17:26:40.287182 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.286713 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3af1ae5b-3d27-42ae-84c6-15885f57a08c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-x2x5j\" (UID: \"3af1ae5b-3d27-42ae-84c6-15885f57a08c\") " pod="openshift-monitoring/node-exporter-x2x5j"
Apr 17 17:26:40.287182 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.286765 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3af1ae5b-3d27-42ae-84c6-15885f57a08c-node-exporter-textfile\") pod \"node-exporter-x2x5j\" (UID: \"3af1ae5b-3d27-42ae-84c6-15885f57a08c\") " pod="openshift-monitoring/node-exporter-x2x5j"
Apr 17 17:26:40.287182 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.286785 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3af1ae5b-3d27-42ae-84c6-15885f57a08c-root\") pod \"node-exporter-x2x5j\" (UID: \"3af1ae5b-3d27-42ae-84c6-15885f57a08c\") " pod="openshift-monitoring/node-exporter-x2x5j"
Apr 17 17:26:40.287182 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.286791 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3af1ae5b-3d27-42ae-84c6-15885f57a08c-node-exporter-wtmp\") pod \"node-exporter-x2x5j\" (UID: \"3af1ae5b-3d27-42ae-84c6-15885f57a08c\") " pod="openshift-monitoring/node-exporter-x2x5j"
Apr 17 17:26:40.287182 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.286877 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjppp\" (UniqueName: \"kubernetes.io/projected/3af1ae5b-3d27-42ae-84c6-15885f57a08c-kube-api-access-sjppp\") pod \"node-exporter-x2x5j\" (UID: \"3af1ae5b-3d27-42ae-84c6-15885f57a08c\") " pod="openshift-monitoring/node-exporter-x2x5j"
Apr 17 17:26:40.287182 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.286912 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3af1ae5b-3d27-42ae-84c6-15885f57a08c-sys\") pod \"node-exporter-x2x5j\" (UID: \"3af1ae5b-3d27-42ae-84c6-15885f57a08c\") " pod="openshift-monitoring/node-exporter-x2x5j"
Apr 17 17:26:40.287182 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.286927 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3af1ae5b-3d27-42ae-84c6-15885f57a08c-node-exporter-wtmp\") pod \"node-exporter-x2x5j\" (UID: \"3af1ae5b-3d27-42ae-84c6-15885f57a08c\") " pod="openshift-monitoring/node-exporter-x2x5j"
Apr 17 17:26:40.287182 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:26:40.286964 2575 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 17 17:26:40.287182 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.287011 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3af1ae5b-3d27-42ae-84c6-15885f57a08c-sys\") pod \"node-exporter-x2x5j\" (UID: \"3af1ae5b-3d27-42ae-84c6-15885f57a08c\") " pod="openshift-monitoring/node-exporter-x2x5j"
Apr 17 17:26:40.287182 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:26:40.287043 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3af1ae5b-3d27-42ae-84c6-15885f57a08c-node-exporter-tls podName:3af1ae5b-3d27-42ae-84c6-15885f57a08c nodeName:}" failed. No retries permitted until 2026-04-17 17:26:40.787021775 +0000 UTC m=+170.312702392 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/3af1ae5b-3d27-42ae-84c6-15885f57a08c-node-exporter-tls") pod "node-exporter-x2x5j" (UID: "3af1ae5b-3d27-42ae-84c6-15885f57a08c") : secret "node-exporter-tls" not found
Apr 17 17:26:40.287630 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.287187 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3af1ae5b-3d27-42ae-84c6-15885f57a08c-node-exporter-textfile\") pod \"node-exporter-x2x5j\" (UID: \"3af1ae5b-3d27-42ae-84c6-15885f57a08c\") " pod="openshift-monitoring/node-exporter-x2x5j"
Apr 17 17:26:40.287630 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.287271 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3af1ae5b-3d27-42ae-84c6-15885f57a08c-node-exporter-accelerators-collector-config\") pod \"node-exporter-x2x5j\" (UID: \"3af1ae5b-3d27-42ae-84c6-15885f57a08c\") " pod="openshift-monitoring/node-exporter-x2x5j"
Apr 17 17:26:40.287630 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.287428 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3af1ae5b-3d27-42ae-84c6-15885f57a08c-metrics-client-ca\") pod \"node-exporter-x2x5j\" (UID: \"3af1ae5b-3d27-42ae-84c6-15885f57a08c\") " pod="openshift-monitoring/node-exporter-x2x5j"
Apr 17 17:26:40.289350 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.289328 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3af1ae5b-3d27-42ae-84c6-15885f57a08c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-x2x5j\" (UID: \"3af1ae5b-3d27-42ae-84c6-15885f57a08c\") " pod="openshift-monitoring/node-exporter-x2x5j"
Apr 17 17:26:40.297594 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.297570 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjppp\" (UniqueName: \"kubernetes.io/projected/3af1ae5b-3d27-42ae-84c6-15885f57a08c-kube-api-access-sjppp\") pod \"node-exporter-x2x5j\" (UID: \"3af1ae5b-3d27-42ae-84c6-15885f57a08c\") " pod="openshift-monitoring/node-exporter-x2x5j"
Apr 17 17:26:40.659150 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.659096 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wqv6n" event={"ID":"cb895382-4679-4cac-97c0-92e3122b7ba0","Type":"ContainerStarted","Data":"e43699deb871ebba79eb558985341414a2d2f6f84e137dacb8adb44f1342eb1a"}
Apr 17 17:26:40.792432 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.792389 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3af1ae5b-3d27-42ae-84c6-15885f57a08c-node-exporter-tls\") pod \"node-exporter-x2x5j\" (UID: \"3af1ae5b-3d27-42ae-84c6-15885f57a08c\") " pod="openshift-monitoring/node-exporter-x2x5j"
Apr 17 17:26:40.794967 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.794941 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3af1ae5b-3d27-42ae-84c6-15885f57a08c-node-exporter-tls\") pod \"node-exporter-x2x5j\" (UID: \"3af1ae5b-3d27-42ae-84c6-15885f57a08c\") " pod="openshift-monitoring/node-exporter-x2x5j"
Apr 17 17:26:40.931854 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:40.931769 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-x2x5j"
Apr 17 17:26:40.945185 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:26:40.945141 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3af1ae5b_3d27_42ae_84c6_15885f57a08c.slice/crio-37f9d33a4dfe95be99e688d8382785fd1b17404266adba2cd60a985061a17215 WatchSource:0}: Error finding container 37f9d33a4dfe95be99e688d8382785fd1b17404266adba2cd60a985061a17215: Status 404 returned error can't find the container with id 37f9d33a4dfe95be99e688d8382785fd1b17404266adba2cd60a985061a17215
Apr 17 17:26:41.093180 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:41.092982 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6r887"
Apr 17 17:26:41.096161 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:41.096131 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hd29f\""
Apr 17 17:26:41.104200 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:41.104174 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6r887" Apr 17 17:26:41.280557 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:41.278850 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6r887"] Apr 17 17:26:41.285612 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:26:41.285582 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d29d8c9_146b_4e7e_988e_c3d984ff39e7.slice/crio-f914ea4d71c40af26f1761305b37f7a97d0599720a48f412cf6a2598f0353ff8 WatchSource:0}: Error finding container f914ea4d71c40af26f1761305b37f7a97d0599720a48f412cf6a2598f0353ff8: Status 404 returned error can't find the container with id f914ea4d71c40af26f1761305b37f7a97d0599720a48f412cf6a2598f0353ff8 Apr 17 17:26:41.662886 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:41.662844 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6r887" event={"ID":"2d29d8c9-146b-4e7e-988e-c3d984ff39e7","Type":"ContainerStarted","Data":"f914ea4d71c40af26f1761305b37f7a97d0599720a48f412cf6a2598f0353ff8"} Apr 17 17:26:41.664047 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:41.664018 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-x2x5j" event={"ID":"3af1ae5b-3d27-42ae-84c6-15885f57a08c","Type":"ContainerStarted","Data":"37f9d33a4dfe95be99e688d8382785fd1b17404266adba2cd60a985061a17215"} Apr 17 17:26:42.044903 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.044836 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-dcc87df79-scgw5"] Apr 17 17:26:42.064583 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.063900 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-dcc87df79-scgw5"] Apr 17 17:26:42.064583 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.064071 2575 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" Apr 17 17:26:42.066784 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.066756 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 17 17:26:42.067014 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.066993 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 17 17:26:42.067265 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.067246 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 17 17:26:42.067482 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.067463 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-5v2u6bdo5dj8a\"" Apr 17 17:26:42.067734 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.067716 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-d4f5t\"" Apr 17 17:26:42.068007 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.067984 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 17 17:26:42.068216 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.068193 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 17 17:26:42.103662 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.103521 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: 
\"kubernetes.io/secret/f5901f9d-b096-438d-a602-d00595295e12-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-dcc87df79-scgw5\" (UID: \"f5901f9d-b096-438d-a602-d00595295e12\") " pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" Apr 17 17:26:42.103662 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.103564 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f5901f9d-b096-438d-a602-d00595295e12-metrics-client-ca\") pod \"thanos-querier-dcc87df79-scgw5\" (UID: \"f5901f9d-b096-438d-a602-d00595295e12\") " pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" Apr 17 17:26:42.103662 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.103615 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/f5901f9d-b096-438d-a602-d00595295e12-secret-thanos-querier-tls\") pod \"thanos-querier-dcc87df79-scgw5\" (UID: \"f5901f9d-b096-438d-a602-d00595295e12\") " pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" Apr 17 17:26:42.104173 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.103691 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8wtb\" (UniqueName: \"kubernetes.io/projected/f5901f9d-b096-438d-a602-d00595295e12-kube-api-access-l8wtb\") pod \"thanos-querier-dcc87df79-scgw5\" (UID: \"f5901f9d-b096-438d-a602-d00595295e12\") " pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" Apr 17 17:26:42.104173 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.103754 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f5901f9d-b096-438d-a602-d00595295e12-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-dcc87df79-scgw5\" 
(UID: \"f5901f9d-b096-438d-a602-d00595295e12\") " pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" Apr 17 17:26:42.104173 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.103787 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f5901f9d-b096-438d-a602-d00595295e12-secret-grpc-tls\") pod \"thanos-querier-dcc87df79-scgw5\" (UID: \"f5901f9d-b096-438d-a602-d00595295e12\") " pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" Apr 17 17:26:42.104173 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.103841 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f5901f9d-b096-438d-a602-d00595295e12-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-dcc87df79-scgw5\" (UID: \"f5901f9d-b096-438d-a602-d00595295e12\") " pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" Apr 17 17:26:42.104173 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.103886 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/f5901f9d-b096-438d-a602-d00595295e12-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-dcc87df79-scgw5\" (UID: \"f5901f9d-b096-438d-a602-d00595295e12\") " pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" Apr 17 17:26:42.205042 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.204384 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f5901f9d-b096-438d-a602-d00595295e12-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-dcc87df79-scgw5\" (UID: \"f5901f9d-b096-438d-a602-d00595295e12\") " 
pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" Apr 17 17:26:42.205042 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.204439 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f5901f9d-b096-438d-a602-d00595295e12-secret-grpc-tls\") pod \"thanos-querier-dcc87df79-scgw5\" (UID: \"f5901f9d-b096-438d-a602-d00595295e12\") " pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" Apr 17 17:26:42.205042 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.204479 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f5901f9d-b096-438d-a602-d00595295e12-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-dcc87df79-scgw5\" (UID: \"f5901f9d-b096-438d-a602-d00595295e12\") " pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" Apr 17 17:26:42.205042 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.204507 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/f5901f9d-b096-438d-a602-d00595295e12-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-dcc87df79-scgw5\" (UID: \"f5901f9d-b096-438d-a602-d00595295e12\") " pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" Apr 17 17:26:42.205042 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.204600 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/f5901f9d-b096-438d-a602-d00595295e12-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-dcc87df79-scgw5\" (UID: \"f5901f9d-b096-438d-a602-d00595295e12\") " pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" Apr 17 17:26:42.205042 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.204624 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f5901f9d-b096-438d-a602-d00595295e12-metrics-client-ca\") pod \"thanos-querier-dcc87df79-scgw5\" (UID: \"f5901f9d-b096-438d-a602-d00595295e12\") " pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" Apr 17 17:26:42.205042 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.204663 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/f5901f9d-b096-438d-a602-d00595295e12-secret-thanos-querier-tls\") pod \"thanos-querier-dcc87df79-scgw5\" (UID: \"f5901f9d-b096-438d-a602-d00595295e12\") " pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" Apr 17 17:26:42.205042 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.204692 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8wtb\" (UniqueName: \"kubernetes.io/projected/f5901f9d-b096-438d-a602-d00595295e12-kube-api-access-l8wtb\") pod \"thanos-querier-dcc87df79-scgw5\" (UID: \"f5901f9d-b096-438d-a602-d00595295e12\") " pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" Apr 17 17:26:42.205928 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.205900 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f5901f9d-b096-438d-a602-d00595295e12-metrics-client-ca\") pod \"thanos-querier-dcc87df79-scgw5\" (UID: \"f5901f9d-b096-438d-a602-d00595295e12\") " pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" Apr 17 17:26:42.208368 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.208317 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f5901f9d-b096-438d-a602-d00595295e12-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-dcc87df79-scgw5\" 
(UID: \"f5901f9d-b096-438d-a602-d00595295e12\") " pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" Apr 17 17:26:42.208612 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.208588 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f5901f9d-b096-438d-a602-d00595295e12-secret-grpc-tls\") pod \"thanos-querier-dcc87df79-scgw5\" (UID: \"f5901f9d-b096-438d-a602-d00595295e12\") " pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" Apr 17 17:26:42.209535 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.209476 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/f5901f9d-b096-438d-a602-d00595295e12-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-dcc87df79-scgw5\" (UID: \"f5901f9d-b096-438d-a602-d00595295e12\") " pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" Apr 17 17:26:42.211104 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.211055 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/f5901f9d-b096-438d-a602-d00595295e12-secret-thanos-querier-tls\") pod \"thanos-querier-dcc87df79-scgw5\" (UID: \"f5901f9d-b096-438d-a602-d00595295e12\") " pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" Apr 17 17:26:42.211748 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.211722 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f5901f9d-b096-438d-a602-d00595295e12-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-dcc87df79-scgw5\" (UID: \"f5901f9d-b096-438d-a602-d00595295e12\") " pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" Apr 17 17:26:42.212922 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.212895 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/f5901f9d-b096-438d-a602-d00595295e12-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-dcc87df79-scgw5\" (UID: \"f5901f9d-b096-438d-a602-d00595295e12\") " pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" Apr 17 17:26:42.215324 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.215304 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8wtb\" (UniqueName: \"kubernetes.io/projected/f5901f9d-b096-438d-a602-d00595295e12-kube-api-access-l8wtb\") pod \"thanos-querier-dcc87df79-scgw5\" (UID: \"f5901f9d-b096-438d-a602-d00595295e12\") " pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" Apr 17 17:26:42.378869 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.378767 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" Apr 17 17:26:42.545290 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:42.545230 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-dcc87df79-scgw5"] Apr 17 17:26:42.699307 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:26:42.699235 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5901f9d_b096_438d_a602_d00595295e12.slice/crio-9925090cc836fc8b755be6ac2c47a310af42492b7a16c7e2588e4929bc5601a3 WatchSource:0}: Error finding container 9925090cc836fc8b755be6ac2c47a310af42492b7a16c7e2588e4929bc5601a3: Status 404 returned error can't find the container with id 9925090cc836fc8b755be6ac2c47a310af42492b7a16c7e2588e4929bc5601a3 Apr 17 17:26:43.671630 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:43.671558 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wqv6n" 
event={"ID":"cb895382-4679-4cac-97c0-92e3122b7ba0","Type":"ContainerStarted","Data":"f6808ea412f5d4f5ac31d94908629cb550a276b8de6030e0ae626d8f06b45954"} Apr 17 17:26:43.672756 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:43.672727 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" event={"ID":"f5901f9d-b096-438d-a602-d00595295e12","Type":"ContainerStarted","Data":"9925090cc836fc8b755be6ac2c47a310af42492b7a16c7e2588e4929bc5601a3"} Apr 17 17:26:44.677332 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:44.677293 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6r887" event={"ID":"2d29d8c9-146b-4e7e-988e-c3d984ff39e7","Type":"ContainerStarted","Data":"77f1ede4c8b23bb825f81411be9fa89ca544d1d560931d3c0c34bde04000c946"} Apr 17 17:26:44.679275 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:44.679245 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wqv6n" event={"ID":"cb895382-4679-4cac-97c0-92e3122b7ba0","Type":"ContainerStarted","Data":"b1f79b81a8543a7096a2224fd0b7eaf8986f4091a642883d3ce7471648ae5fc3"} Apr 17 17:26:44.679424 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:44.679386 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-wqv6n" Apr 17 17:26:44.698136 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:44.698068 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6r887" podStartSLOduration=139.295427866 podStartE2EDuration="2m21.698047608s" podCreationTimestamp="2026-04-17 17:24:23 +0000 UTC" firstStartedPulling="2026-04-17 17:26:41.288246693 +0000 UTC m=+170.813927299" lastFinishedPulling="2026-04-17 17:26:43.690866437 +0000 UTC m=+173.216547041" observedRunningTime="2026-04-17 17:26:44.696765896 +0000 UTC m=+174.222446526" watchObservedRunningTime="2026-04-17 17:26:44.698047608 +0000 UTC 
m=+174.223728228" Apr 17 17:26:44.715247 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:44.715181 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wqv6n" podStartSLOduration=138.629113183 podStartE2EDuration="2m21.715160136s" podCreationTimestamp="2026-04-17 17:24:23 +0000 UTC" firstStartedPulling="2026-04-17 17:26:40.245999039 +0000 UTC m=+169.771679648" lastFinishedPulling="2026-04-17 17:26:43.332045978 +0000 UTC m=+172.857726601" observedRunningTime="2026-04-17 17:26:44.714374079 +0000 UTC m=+174.240054713" watchObservedRunningTime="2026-04-17 17:26:44.715160136 +0000 UTC m=+174.240840754" Apr 17 17:26:44.854217 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:44.854182 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-94d6497b6-5v4vj"] Apr 17 17:26:44.877545 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:44.877501 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-94d6497b6-5v4vj"] Apr 17 17:26:44.877715 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:44.877622 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-94d6497b6-5v4vj" Apr 17 17:26:44.879798 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:44.879773 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 17:26:44.880255 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:44.880088 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 17:26:44.880255 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:44.880247 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 17:26:44.880255 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:44.880256 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 17:26:44.880463 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:44.880430 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-7hcbl\"" Apr 17 17:26:44.880704 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:44.880688 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 17:26:44.885308 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:44.885197 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 17:26:44.931867 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:44.931744 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-console-serving-cert\") pod \"console-94d6497b6-5v4vj\" (UID: \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\") " pod="openshift-console/console-94d6497b6-5v4vj" Apr 17 17:26:44.931867 ip-10-0-140-33 
kubenswrapper[2575]: I0417 17:26:44.931797 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-service-ca\") pod \"console-94d6497b6-5v4vj\" (UID: \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\") " pod="openshift-console/console-94d6497b6-5v4vj" Apr 17 17:26:44.932045 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:44.931888 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khsg6\" (UniqueName: \"kubernetes.io/projected/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-kube-api-access-khsg6\") pod \"console-94d6497b6-5v4vj\" (UID: \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\") " pod="openshift-console/console-94d6497b6-5v4vj" Apr 17 17:26:44.932045 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:44.931964 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-trusted-ca-bundle\") pod \"console-94d6497b6-5v4vj\" (UID: \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\") " pod="openshift-console/console-94d6497b6-5v4vj" Apr 17 17:26:44.932045 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:44.932030 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-console-config\") pod \"console-94d6497b6-5v4vj\" (UID: \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\") " pod="openshift-console/console-94d6497b6-5v4vj" Apr 17 17:26:44.932251 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:44.932058 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-console-oauth-config\") pod 
\"console-94d6497b6-5v4vj\" (UID: \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\") " pod="openshift-console/console-94d6497b6-5v4vj" Apr 17 17:26:44.932251 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:44.932087 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-oauth-serving-cert\") pod \"console-94d6497b6-5v4vj\" (UID: \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\") " pod="openshift-console/console-94d6497b6-5v4vj" Apr 17 17:26:45.033626 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:45.033583 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-trusted-ca-bundle\") pod \"console-94d6497b6-5v4vj\" (UID: \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\") " pod="openshift-console/console-94d6497b6-5v4vj" Apr 17 17:26:45.033800 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:45.033660 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-console-config\") pod \"console-94d6497b6-5v4vj\" (UID: \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\") " pod="openshift-console/console-94d6497b6-5v4vj" Apr 17 17:26:45.033800 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:45.033702 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-console-oauth-config\") pod \"console-94d6497b6-5v4vj\" (UID: \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\") " pod="openshift-console/console-94d6497b6-5v4vj" Apr 17 17:26:45.033800 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:45.033732 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-oauth-serving-cert\") pod \"console-94d6497b6-5v4vj\" (UID: \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\") " pod="openshift-console/console-94d6497b6-5v4vj" Apr 17 17:26:45.033800 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:45.033795 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-console-serving-cert\") pod \"console-94d6497b6-5v4vj\" (UID: \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\") " pod="openshift-console/console-94d6497b6-5v4vj" Apr 17 17:26:45.034289 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:45.033813 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-service-ca\") pod \"console-94d6497b6-5v4vj\" (UID: \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\") " pod="openshift-console/console-94d6497b6-5v4vj" Apr 17 17:26:45.034289 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:45.033875 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khsg6\" (UniqueName: \"kubernetes.io/projected/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-kube-api-access-khsg6\") pod \"console-94d6497b6-5v4vj\" (UID: \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\") " pod="openshift-console/console-94d6497b6-5v4vj" Apr 17 17:26:45.035612 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:45.035585 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-console-config\") pod \"console-94d6497b6-5v4vj\" (UID: \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\") " pod="openshift-console/console-94d6497b6-5v4vj" Apr 17 17:26:45.035612 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:45.035597 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-trusted-ca-bundle\") pod \"console-94d6497b6-5v4vj\" (UID: \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\") " pod="openshift-console/console-94d6497b6-5v4vj" Apr 17 17:26:45.035788 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:45.035633 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-oauth-serving-cert\") pod \"console-94d6497b6-5v4vj\" (UID: \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\") " pod="openshift-console/console-94d6497b6-5v4vj" Apr 17 17:26:45.036266 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:45.036216 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-service-ca\") pod \"console-94d6497b6-5v4vj\" (UID: \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\") " pod="openshift-console/console-94d6497b6-5v4vj" Apr 17 17:26:45.039029 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:45.038931 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-console-serving-cert\") pod \"console-94d6497b6-5v4vj\" (UID: \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\") " pod="openshift-console/console-94d6497b6-5v4vj" Apr 17 17:26:45.039029 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:45.039003 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-console-oauth-config\") pod \"console-94d6497b6-5v4vj\" (UID: \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\") " pod="openshift-console/console-94d6497b6-5v4vj" Apr 17 17:26:45.051348 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:45.051304 2575 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-khsg6\" (UniqueName: \"kubernetes.io/projected/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-kube-api-access-khsg6\") pod \"console-94d6497b6-5v4vj\" (UID: \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\") " pod="openshift-console/console-94d6497b6-5v4vj" Apr 17 17:26:45.190010 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:45.189918 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-94d6497b6-5v4vj" Apr 17 17:26:45.566338 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:45.566305 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-94d6497b6-5v4vj"] Apr 17 17:26:45.581600 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:26:45.581572 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ff7bac2_ae68_4ee4_9ad8_1c56d1af0b2c.slice/crio-4888449be8621870e795ed15126ed8a36e630dfc7705599bdd55ac556c5a09d0 WatchSource:0}: Error finding container 4888449be8621870e795ed15126ed8a36e630dfc7705599bdd55ac556c5a09d0: Status 404 returned error can't find the container with id 4888449be8621870e795ed15126ed8a36e630dfc7705599bdd55ac556c5a09d0 Apr 17 17:26:45.683705 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:45.683667 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-94d6497b6-5v4vj" event={"ID":"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c","Type":"ContainerStarted","Data":"4888449be8621870e795ed15126ed8a36e630dfc7705599bdd55ac556c5a09d0"} Apr 17 17:26:46.690194 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:46.690111 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" event={"ID":"f5901f9d-b096-438d-a602-d00595295e12","Type":"ContainerStarted","Data":"e3a7e4764dc1c1453edf2e6036398a62aee6854043ea201a22e04ab922a3a7a7"} Apr 17 17:26:46.690194 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:46.690158 2575 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" event={"ID":"f5901f9d-b096-438d-a602-d00595295e12","Type":"ContainerStarted","Data":"cd1859ebe32d3712bcfa0c0732650cba354dfa6f05441e938496e8ea2f18ce8d"} Apr 17 17:26:46.690194 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:46.690172 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" event={"ID":"f5901f9d-b096-438d-a602-d00595295e12","Type":"ContainerStarted","Data":"1d971e091cf96782c8443df3b5c72a93d9d6aaa3dd82d05afb7e34478a2cc0b7"} Apr 17 17:26:49.700495 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:49.700460 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-94d6497b6-5v4vj" event={"ID":"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c","Type":"ContainerStarted","Data":"ece0d8ff4ad7f188e01a40afe5533c1e2da72abf616dae15c3220a500ec542e9"} Apr 17 17:26:49.702983 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:49.702955 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" event={"ID":"f5901f9d-b096-438d-a602-d00595295e12","Type":"ContainerStarted","Data":"cd361a9e0a50018c75c4fcdeb6aa0312adfea460924c6a7950e75a8d925f0c7a"} Apr 17 17:26:49.702983 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:49.702989 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" event={"ID":"f5901f9d-b096-438d-a602-d00595295e12","Type":"ContainerStarted","Data":"a62cce8d61d15e23e679a736541cf35a175f7be54460c19919867936b04fb68f"} Apr 17 17:26:49.703137 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:49.703003 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" event={"ID":"f5901f9d-b096-438d-a602-d00595295e12","Type":"ContainerStarted","Data":"c314497126b46f729e141d5c94f93c7019ac22d01f3ab9d4a8f71c638656bc7b"} Apr 17 17:26:49.703177 
ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:49.703149 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" Apr 17 17:26:49.720350 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:49.720309 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-94d6497b6-5v4vj" podStartSLOduration=2.184137031 podStartE2EDuration="5.720296943s" podCreationTimestamp="2026-04-17 17:26:44 +0000 UTC" firstStartedPulling="2026-04-17 17:26:45.583797681 +0000 UTC m=+175.109478292" lastFinishedPulling="2026-04-17 17:26:49.119957587 +0000 UTC m=+178.645638204" observedRunningTime="2026-04-17 17:26:49.719386921 +0000 UTC m=+179.245067552" watchObservedRunningTime="2026-04-17 17:26:49.720296943 +0000 UTC m=+179.245977561" Apr 17 17:26:49.744260 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:49.744216 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" podStartSLOduration=1.6871988199999999 podStartE2EDuration="7.744203749s" podCreationTimestamp="2026-04-17 17:26:42 +0000 UTC" firstStartedPulling="2026-04-17 17:26:42.718732661 +0000 UTC m=+172.244413261" lastFinishedPulling="2026-04-17 17:26:48.775737582 +0000 UTC m=+178.301418190" observedRunningTime="2026-04-17 17:26:49.743256951 +0000 UTC m=+179.268937580" watchObservedRunningTime="2026-04-17 17:26:49.744203749 +0000 UTC m=+179.269884359" Apr 17 17:26:51.188742 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:26:51.188692 2575 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8b4514b20ab8125dc4f2ee9661f8363c837031926b40e7c54a36d1efa08456d1: reading manifest sha256:8b4514b20ab8125dc4f2ee9661f8363c837031926b40e7c54a36d1efa08456d1 in 
quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8b4514b20ab8125dc4f2ee9661f8363c837031926b40e7c54a36d1efa08456d1" Apr 17 17:26:51.189150 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:26:51.188897 2575 kuberuntime_manager.go:1358] "Unhandled Error" err="init container &Container{Name:init-textfile,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8b4514b20ab8125dc4f2ee9661f8363c837031926b40e7c54a36d1efa08456d1,Command:[/bin/sh -c [[ ! -d /node_exporter/collectors/init ]] || find /node_exporter/collectors/init -perm /111 -type f -exec {} \\;],Args:[],WorkingDir:/var/node_exporter/textfile,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:TMPDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{1 -3} {} 1m DecimalSI},memory: {{1048576 0} {} 1Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:node-exporter-textfile,ReadOnly:false,MountPath:/var/node_exporter/textfile,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:node-exporter-wtmp,ReadOnly:true,MountPath:/var/log/wtmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sjppp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMess
agePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-exporter-x2x5j_openshift-monitoring(3af1ae5b-3d27-42ae-84c6-15885f57a08c): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8b4514b20ab8125dc4f2ee9661f8363c837031926b40e7c54a36d1efa08456d1: reading manifest sha256:8b4514b20ab8125dc4f2ee9661f8363c837031926b40e7c54a36d1efa08456d1 in quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" logger="UnhandledError" Apr 17 17:26:51.190128 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:26:51.190084 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init-textfile\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8b4514b20ab8125dc4f2ee9661f8363c837031926b40e7c54a36d1efa08456d1: reading manifest sha256:8b4514b20ab8125dc4f2ee9661f8363c837031926b40e7c54a36d1efa08456d1 in quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="openshift-monitoring/node-exporter-x2x5j" podUID="3af1ae5b-3d27-42ae-84c6-15885f57a08c" Apr 17 17:26:51.709479 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:26:51.709430 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init-textfile\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8b4514b20ab8125dc4f2ee9661f8363c837031926b40e7c54a36d1efa08456d1\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source 
docker://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8b4514b20ab8125dc4f2ee9661f8363c837031926b40e7c54a36d1efa08456d1: reading manifest sha256:8b4514b20ab8125dc4f2ee9661f8363c837031926b40e7c54a36d1efa08456d1 in quay.io/openshift-release-dev/ocp-v4.0-art-dev: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="openshift-monitoring/node-exporter-x2x5j" podUID="3af1ae5b-3d27-42ae-84c6-15885f57a08c" Apr 17 17:26:51.829105 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:51.829076 2575 patch_prober.go:28] interesting pod/image-registry-7788f749ff-bpwmb container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 17:26:51.829256 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:51.829125 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" podUID="5dd7aab7-9b6e-47df-9f4f-c7e5041066d0" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:26:54.686104 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:54.686070 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wqv6n" Apr 17 17:26:55.191343 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:55.191309 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-94d6497b6-5v4vj" Apr 17 17:26:55.191343 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:55.191347 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-94d6497b6-5v4vj" Apr 17 17:26:55.196043 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:55.196020 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-console/console-94d6497b6-5v4vj" Apr 17 17:26:55.711490 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:55.711464 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-dcc87df79-scgw5" Apr 17 17:26:55.723783 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:55.723756 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-94d6497b6-5v4vj" Apr 17 17:26:58.655301 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:26:58.655272 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" Apr 17 17:27:00.289227 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:00.289198 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-94d6497b6-5v4vj"] Apr 17 17:27:00.734619 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:00.734589 2575 generic.go:358] "Generic (PLEG): container finished" podID="eb7ea451-8e84-4c6c-9ca4-85e14c54d30a" containerID="7d402a66352796fcfcefca93db1bfc371fb6c36f68c15c5ae0ba5b462156104f" exitCode=0 Apr 17 17:27:00.734765 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:00.734673 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-9dmmr" event={"ID":"eb7ea451-8e84-4c6c-9ca4-85e14c54d30a","Type":"ContainerDied","Data":"7d402a66352796fcfcefca93db1bfc371fb6c36f68c15c5ae0ba5b462156104f"} Apr 17 17:27:00.735030 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:00.735013 2575 scope.go:117] "RemoveContainer" containerID="7d402a66352796fcfcefca93db1bfc371fb6c36f68c15c5ae0ba5b462156104f" Apr 17 17:27:01.740066 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:01.740028 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-9dmmr" 
event={"ID":"eb7ea451-8e84-4c6c-9ca4-85e14c54d30a","Type":"ContainerStarted","Data":"b4f7970902b2e59d2f024b307a6f39da6751c53c1ef8236e2035d2e80cf268dc"} Apr 17 17:27:01.741653 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:01.741591 2575 generic.go:358] "Generic (PLEG): container finished" podID="b5549e89-45e0-41a3-8e5a-7a240546ad14" containerID="3ee937c73571907a5956021a05b6c28f85576b32e89a0b1c97409bfc6425deaf" exitCode=0 Apr 17 17:27:01.741996 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:01.741954 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7cjnp" event={"ID":"b5549e89-45e0-41a3-8e5a-7a240546ad14","Type":"ContainerDied","Data":"3ee937c73571907a5956021a05b6c28f85576b32e89a0b1c97409bfc6425deaf"} Apr 17 17:27:01.742499 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:01.742462 2575 scope.go:117] "RemoveContainer" containerID="3ee937c73571907a5956021a05b6c28f85576b32e89a0b1c97409bfc6425deaf" Apr 17 17:27:02.429897 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:02.429865 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wqv6n_cb895382-4679-4cac-97c0-92e3122b7ba0/dns/0.log" Apr 17 17:27:02.629030 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:02.629003 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wqv6n_cb895382-4679-4cac-97c0-92e3122b7ba0/kube-rbac-proxy/0.log" Apr 17 17:27:02.746047 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:02.745969 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-7cjnp" event={"ID":"b5549e89-45e0-41a3-8e5a-7a240546ad14","Type":"ContainerStarted","Data":"984d059fc7223aa1469badcbb5a84fd4487e95538c67e5522121874ba8f912c0"} Apr 17 17:27:03.228817 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:03.228784 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_node-resolver-qd2qr_adfae8ba-8d02-4f3c-85a7-b2ae828b0579/dns-node-resolver/0.log" Apr 17 17:27:03.429798 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:03.429767 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-6c56c99d76-p95sx_e9a77eac-7075-491c-a2b9-080151e1cac9/router/0.log" Apr 17 17:27:03.628996 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:03.628971 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-6r887_2d29d8c9-146b-4e7e-988e-c3d984ff39e7/serve-healthcheck-canary/0.log" Apr 17 17:27:05.756164 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:05.756130 2575 generic.go:358] "Generic (PLEG): container finished" podID="3af1ae5b-3d27-42ae-84c6-15885f57a08c" containerID="dc364d02fd8f97ee4fb75941aca27fb9446b57e081bb5f84c31a8afef14c589a" exitCode=0 Apr 17 17:27:05.756529 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:05.756206 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-x2x5j" event={"ID":"3af1ae5b-3d27-42ae-84c6-15885f57a08c","Type":"ContainerDied","Data":"dc364d02fd8f97ee4fb75941aca27fb9446b57e081bb5f84c31a8afef14c589a"} Apr 17 17:27:06.760950 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:06.760904 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-x2x5j" event={"ID":"3af1ae5b-3d27-42ae-84c6-15885f57a08c","Type":"ContainerStarted","Data":"cfb2ae382f33ef005cc384c8061f0b4e0ed475000b7c4bafa916b03e003a9214"} Apr 17 17:27:06.760950 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:06.760951 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-x2x5j" event={"ID":"3af1ae5b-3d27-42ae-84c6-15885f57a08c","Type":"ContainerStarted","Data":"14b574e513dea639d588eca6a47c27ff670f4890774bae8bddf5f50245b9237e"} Apr 17 17:27:06.780264 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:06.780220 2575 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-x2x5j" podStartSLOduration=3.896359053 podStartE2EDuration="27.780207225s" podCreationTimestamp="2026-04-17 17:26:39 +0000 UTC" firstStartedPulling="2026-04-17 17:26:40.949100757 +0000 UTC m=+170.474781354" lastFinishedPulling="2026-04-17 17:27:04.832948915 +0000 UTC m=+194.358629526" observedRunningTime="2026-04-17 17:27:06.779346868 +0000 UTC m=+196.305027486" watchObservedRunningTime="2026-04-17 17:27:06.780207225 +0000 UTC m=+196.305887844" Apr 17 17:27:11.015732 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:11.015706 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7788f749ff-bpwmb"] Apr 17 17:27:25.309000 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.308937 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-94d6497b6-5v4vj" podUID="7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c" containerName="console" containerID="cri-o://ece0d8ff4ad7f188e01a40afe5533c1e2da72abf616dae15c3220a500ec542e9" gracePeriod=15 Apr 17 17:27:25.603206 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.603187 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-94d6497b6-5v4vj_7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c/console/0.log" Apr 17 17:27:25.603316 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.603261 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-94d6497b6-5v4vj" Apr 17 17:27:25.795770 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.795733 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-console-config\") pod \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\" (UID: \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\") " Apr 17 17:27:25.795981 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.795783 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-service-ca\") pod \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\" (UID: \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\") " Apr 17 17:27:25.795981 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.795810 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-oauth-serving-cert\") pod \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\" (UID: \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\") " Apr 17 17:27:25.795981 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.795848 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khsg6\" (UniqueName: \"kubernetes.io/projected/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-kube-api-access-khsg6\") pod \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\" (UID: \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\") " Apr 17 17:27:25.795981 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.795874 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-console-serving-cert\") pod \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\" (UID: \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\") " Apr 17 17:27:25.795981 ip-10-0-140-33 
kubenswrapper[2575]: I0417 17:27:25.795890 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-console-oauth-config\") pod \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\" (UID: \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\") " Apr 17 17:27:25.795981 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.795914 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-trusted-ca-bundle\") pod \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\" (UID: \"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c\") " Apr 17 17:27:25.796288 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.796206 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-console-config" (OuterVolumeSpecName: "console-config") pod "7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c" (UID: "7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:27:25.796373 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.796343 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c" (UID: "7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:27:25.796454 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.796426 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-service-ca" (OuterVolumeSpecName: "service-ca") pod "7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c" (UID: "7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:27:25.796509 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.796466 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c" (UID: "7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:27:25.798284 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.798259 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c" (UID: "7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:27:25.798368 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.798348 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-kube-api-access-khsg6" (OuterVolumeSpecName: "kube-api-access-khsg6") pod "7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c" (UID: "7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c"). InnerVolumeSpecName "kube-api-access-khsg6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:27:25.798450 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.798428 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c" (UID: "7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:27:25.831718 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.831678 2575 generic.go:358] "Generic (PLEG): container finished" podID="f3cccdeb-674a-4c4a-882c-679c52c9c0a9" containerID="57ae9ca5e13d13fb29c5178e98a94d4e4644b20a2435d7fd76b6f5dc18b2547f" exitCode=0 Apr 17 17:27:25.831819 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.831751 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrgfm" event={"ID":"f3cccdeb-674a-4c4a-882c-679c52c9c0a9","Type":"ContainerDied","Data":"57ae9ca5e13d13fb29c5178e98a94d4e4644b20a2435d7fd76b6f5dc18b2547f"} Apr 17 17:27:25.832120 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.832102 2575 scope.go:117] "RemoveContainer" containerID="57ae9ca5e13d13fb29c5178e98a94d4e4644b20a2435d7fd76b6f5dc18b2547f" Apr 17 17:27:25.832941 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.832928 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-94d6497b6-5v4vj_7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c/console/0.log" Apr 17 17:27:25.833020 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.832959 2575 generic.go:358] "Generic (PLEG): container finished" podID="7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c" containerID="ece0d8ff4ad7f188e01a40afe5533c1e2da72abf616dae15c3220a500ec542e9" exitCode=2 Apr 17 17:27:25.833020 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.832987 2575 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-94d6497b6-5v4vj" event={"ID":"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c","Type":"ContainerDied","Data":"ece0d8ff4ad7f188e01a40afe5533c1e2da72abf616dae15c3220a500ec542e9"} Apr 17 17:27:25.833129 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.833022 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-94d6497b6-5v4vj" event={"ID":"7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c","Type":"ContainerDied","Data":"4888449be8621870e795ed15126ed8a36e630dfc7705599bdd55ac556c5a09d0"} Apr 17 17:27:25.833129 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.833035 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-94d6497b6-5v4vj" Apr 17 17:27:25.833129 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.833040 2575 scope.go:117] "RemoveContainer" containerID="ece0d8ff4ad7f188e01a40afe5533c1e2da72abf616dae15c3220a500ec542e9" Apr 17 17:27:25.842961 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.842947 2575 scope.go:117] "RemoveContainer" containerID="ece0d8ff4ad7f188e01a40afe5533c1e2da72abf616dae15c3220a500ec542e9" Apr 17 17:27:25.843210 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:27:25.843192 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ece0d8ff4ad7f188e01a40afe5533c1e2da72abf616dae15c3220a500ec542e9\": container with ID starting with ece0d8ff4ad7f188e01a40afe5533c1e2da72abf616dae15c3220a500ec542e9 not found: ID does not exist" containerID="ece0d8ff4ad7f188e01a40afe5533c1e2da72abf616dae15c3220a500ec542e9" Apr 17 17:27:25.843258 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.843219 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ece0d8ff4ad7f188e01a40afe5533c1e2da72abf616dae15c3220a500ec542e9"} err="failed to get container status 
\"ece0d8ff4ad7f188e01a40afe5533c1e2da72abf616dae15c3220a500ec542e9\": rpc error: code = NotFound desc = could not find container \"ece0d8ff4ad7f188e01a40afe5533c1e2da72abf616dae15c3220a500ec542e9\": container with ID starting with ece0d8ff4ad7f188e01a40afe5533c1e2da72abf616dae15c3220a500ec542e9 not found: ID does not exist" Apr 17 17:27:25.867768 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.867746 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-94d6497b6-5v4vj"] Apr 17 17:27:25.874620 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.874597 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-94d6497b6-5v4vj"] Apr 17 17:27:25.897428 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.897399 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-console-oauth-config\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:27:25.897428 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.897423 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-trusted-ca-bundle\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:27:25.897545 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.897438 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-console-config\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:27:25.897545 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.897448 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-service-ca\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:27:25.897545 ip-10-0-140-33 
kubenswrapper[2575]: I0417 17:27:25.897457 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-oauth-serving-cert\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\""
Apr 17 17:27:25.897545 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.897466 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-khsg6\" (UniqueName: \"kubernetes.io/projected/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-kube-api-access-khsg6\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\""
Apr 17 17:27:25.897545 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:25.897476 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c-console-serving-cert\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\""
Apr 17 17:27:26.838634 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:26.838595 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-qrgfm" event={"ID":"f3cccdeb-674a-4c4a-882c-679c52c9c0a9","Type":"ContainerStarted","Data":"487ae16f20a0ed35b9d5209dc8c8d1d94355285b9e8d3fc43dd416972406e5f4"}
Apr 17 17:27:27.093476 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:27.093391 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c" path="/var/lib/kubelet/pods/7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c/volumes"
Apr 17 17:27:36.033970 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.033902 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" podUID="5dd7aab7-9b6e-47df-9f4f-c7e5041066d0" containerName="registry" containerID="cri-o://ba532f764adf55077f1a8df8d63997fae72c314ecae9ce8da098ed4b770ec288" gracePeriod=30
Apr 17 17:27:36.259800 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.259775 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7788f749ff-bpwmb"
Apr 17 17:27:36.274153 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.274133 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-image-registry-private-configuration\") pod \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") "
Apr 17 17:27:36.274224 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.274164 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-trusted-ca\") pod \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") "
Apr 17 17:27:36.274224 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.274210 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4prr\" (UniqueName: \"kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-kube-api-access-d4prr\") pod \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") "
Apr 17 17:27:36.274392 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.274374 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-ca-trust-extracted\") pod \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") "
Apr 17 17:27:36.274442 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.274412 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-bound-sa-token\") pod \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") "
Apr 17 17:27:36.274442 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.274436 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-installation-pull-secrets\") pod \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") "
Apr 17 17:27:36.274512 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.274475 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-certificates\") pod \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") "
Apr 17 17:27:36.274568 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.274513 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-tls\") pod \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\" (UID: \"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0\") "
Apr 17 17:27:36.274692 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.274666 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "5dd7aab7-9b6e-47df-9f4f-c7e5041066d0" (UID: "5dd7aab7-9b6e-47df-9f4f-c7e5041066d0"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:27:36.274813 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.274794 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-trusted-ca\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\""
Apr 17 17:27:36.275117 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.275093 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "5dd7aab7-9b6e-47df-9f4f-c7e5041066d0" (UID: "5dd7aab7-9b6e-47df-9f4f-c7e5041066d0"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:27:36.276674 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.276650 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "5dd7aab7-9b6e-47df-9f4f-c7e5041066d0" (UID: "5dd7aab7-9b6e-47df-9f4f-c7e5041066d0"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:27:36.276856 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.276836 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "5dd7aab7-9b6e-47df-9f4f-c7e5041066d0" (UID: "5dd7aab7-9b6e-47df-9f4f-c7e5041066d0"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:27:36.277076 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.277057 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "5dd7aab7-9b6e-47df-9f4f-c7e5041066d0" (UID: "5dd7aab7-9b6e-47df-9f4f-c7e5041066d0"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:27:36.277169 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.277148 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "5dd7aab7-9b6e-47df-9f4f-c7e5041066d0" (UID: "5dd7aab7-9b6e-47df-9f4f-c7e5041066d0"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:27:36.277303 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.277289 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-kube-api-access-d4prr" (OuterVolumeSpecName: "kube-api-access-d4prr") pod "5dd7aab7-9b6e-47df-9f4f-c7e5041066d0" (UID: "5dd7aab7-9b6e-47df-9f4f-c7e5041066d0"). InnerVolumeSpecName "kube-api-access-d4prr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:27:36.283523 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.283497 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "5dd7aab7-9b6e-47df-9f4f-c7e5041066d0" (UID: "5dd7aab7-9b6e-47df-9f4f-c7e5041066d0"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:27:36.375194 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.375119 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d4prr\" (UniqueName: \"kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-kube-api-access-d4prr\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\""
Apr 17 17:27:36.375194 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.375146 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-ca-trust-extracted\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\""
Apr 17 17:27:36.375194 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.375157 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-bound-sa-token\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\""
Apr 17 17:27:36.375194 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.375167 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-installation-pull-secrets\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\""
Apr 17 17:27:36.375194 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.375176 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-certificates\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\""
Apr 17 17:27:36.375194 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.375186 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-registry-tls\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\""
Apr 17 17:27:36.375194 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.375195 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0-image-registry-private-configuration\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\""
Apr 17 17:27:36.868137 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.868100 2575 generic.go:358] "Generic (PLEG): container finished" podID="5dd7aab7-9b6e-47df-9f4f-c7e5041066d0" containerID="ba532f764adf55077f1a8df8d63997fae72c314ecae9ce8da098ed4b770ec288" exitCode=0
Apr 17 17:27:36.868299 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.868190 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7788f749ff-bpwmb"
Apr 17 17:27:36.868299 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.868187 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" event={"ID":"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0","Type":"ContainerDied","Data":"ba532f764adf55077f1a8df8d63997fae72c314ecae9ce8da098ed4b770ec288"}
Apr 17 17:27:36.868299 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.868296 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7788f749ff-bpwmb" event={"ID":"5dd7aab7-9b6e-47df-9f4f-c7e5041066d0","Type":"ContainerDied","Data":"388d8a0690ca17906d435059d0950eff2fb749013a4e36b0f6bbedae8333fecd"}
Apr 17 17:27:36.868409 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.868311 2575 scope.go:117] "RemoveContainer" containerID="ba532f764adf55077f1a8df8d63997fae72c314ecae9ce8da098ed4b770ec288"
Apr 17 17:27:36.876956 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.876937 2575 scope.go:117] "RemoveContainer" containerID="ba532f764adf55077f1a8df8d63997fae72c314ecae9ce8da098ed4b770ec288"
Apr 17 17:27:36.877194 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:27:36.877175 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba532f764adf55077f1a8df8d63997fae72c314ecae9ce8da098ed4b770ec288\": container with ID starting with ba532f764adf55077f1a8df8d63997fae72c314ecae9ce8da098ed4b770ec288 not found: ID does not exist" containerID="ba532f764adf55077f1a8df8d63997fae72c314ecae9ce8da098ed4b770ec288"
Apr 17 17:27:36.877237 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.877202 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba532f764adf55077f1a8df8d63997fae72c314ecae9ce8da098ed4b770ec288"} err="failed to get container status \"ba532f764adf55077f1a8df8d63997fae72c314ecae9ce8da098ed4b770ec288\": rpc error: code = NotFound desc = could not find container \"ba532f764adf55077f1a8df8d63997fae72c314ecae9ce8da098ed4b770ec288\": container with ID starting with ba532f764adf55077f1a8df8d63997fae72c314ecae9ce8da098ed4b770ec288 not found: ID does not exist"
Apr 17 17:27:36.905075 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.905051 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7788f749ff-bpwmb"]
Apr 17 17:27:36.927498 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:36.927473 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7788f749ff-bpwmb"]
Apr 17 17:27:37.093615 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:27:37.093585 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dd7aab7-9b6e-47df-9f4f-c7e5041066d0" path="/var/lib/kubelet/pods/5dd7aab7-9b6e-47df-9f4f-c7e5041066d0/volumes"
Apr 17 17:28:01.875538 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:01.875495 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41eeea20-b1c0-4cb6-8da8-a4a26a60423d-metrics-certs\") pod \"network-metrics-daemon-mqlsd\" (UID: \"41eeea20-b1c0-4cb6-8da8-a4a26a60423d\") " pod="openshift-multus/network-metrics-daemon-mqlsd"
Apr 17 17:28:01.877836 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:01.877800 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41eeea20-b1c0-4cb6-8da8-a4a26a60423d-metrics-certs\") pod \"network-metrics-daemon-mqlsd\" (UID: \"41eeea20-b1c0-4cb6-8da8-a4a26a60423d\") " pod="openshift-multus/network-metrics-daemon-mqlsd"
Apr 17 17:28:01.893021 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:01.893000 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7c8rp\""
Apr 17 17:28:01.901489 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:01.901472 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqlsd"
Apr 17 17:28:02.022684 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:02.022652 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mqlsd"]
Apr 17 17:28:02.026882 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:28:02.026848 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41eeea20_b1c0_4cb6_8da8_a4a26a60423d.slice/crio-7f1daef9a69fb6ec67c4f211dc3a786e126fe5333de80b494d4a2741ad07cfcc WatchSource:0}: Error finding container 7f1daef9a69fb6ec67c4f211dc3a786e126fe5333de80b494d4a2741ad07cfcc: Status 404 returned error can't find the container with id 7f1daef9a69fb6ec67c4f211dc3a786e126fe5333de80b494d4a2741ad07cfcc
Apr 17 17:28:02.950030 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:02.949992 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mqlsd" event={"ID":"41eeea20-b1c0-4cb6-8da8-a4a26a60423d","Type":"ContainerStarted","Data":"7f1daef9a69fb6ec67c4f211dc3a786e126fe5333de80b494d4a2741ad07cfcc"}
Apr 17 17:28:03.954350 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:03.954314 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mqlsd" event={"ID":"41eeea20-b1c0-4cb6-8da8-a4a26a60423d","Type":"ContainerStarted","Data":"3e19d4ca8f04dbf7925cc9ada56e19b6f03e9c74c72eedfed03999aa36c5289c"}
Apr 17 17:28:03.954350 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:03.954352 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mqlsd" event={"ID":"41eeea20-b1c0-4cb6-8da8-a4a26a60423d","Type":"ContainerStarted","Data":"325713dd3618ea68d2caf2f9db3609aa437a3196821002661404dbeb8d22cb28"}
Apr 17 17:28:03.975344 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:03.975293 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-mqlsd" podStartSLOduration=251.802594608 podStartE2EDuration="4m12.975280404s" podCreationTimestamp="2026-04-17 17:23:51 +0000 UTC" firstStartedPulling="2026-04-17 17:28:02.028610408 +0000 UTC m=+251.554291008" lastFinishedPulling="2026-04-17 17:28:03.201296204 +0000 UTC m=+252.726976804" observedRunningTime="2026-04-17 17:28:03.974338733 +0000 UTC m=+253.500019352" watchObservedRunningTime="2026-04-17 17:28:03.975280404 +0000 UTC m=+253.500961023"
Apr 17 17:28:13.288137 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.288058 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-68b44597bb-6s4j7"]
Apr 17 17:28:13.288648 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.288435 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5dd7aab7-9b6e-47df-9f4f-c7e5041066d0" containerName="registry"
Apr 17 17:28:13.288648 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.288453 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd7aab7-9b6e-47df-9f4f-c7e5041066d0" containerName="registry"
Apr 17 17:28:13.288648 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.288467 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c" containerName="console"
Apr 17 17:28:13.288648 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.288473 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c" containerName="console"
Apr 17 17:28:13.288648 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.288523 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5dd7aab7-9b6e-47df-9f4f-c7e5041066d0" containerName="registry"
Apr 17 17:28:13.288648 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.288535 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="7ff7bac2-ae68-4ee4-9ad8-1c56d1af0b2c" containerName="console"
Apr 17 17:28:13.291801 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.291779 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68b44597bb-6s4j7"
Apr 17 17:28:13.294024 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.293998 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 17 17:28:13.294193 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.293996 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 17 17:28:13.294492 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.294470 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 17 17:28:13.294665 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.294507 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 17 17:28:13.295273 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.295224 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-7hcbl\""
Apr 17 17:28:13.295414 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.295308 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 17 17:28:13.301090 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.301063 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 17 17:28:13.301303 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.301281 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68b44597bb-6s4j7"]
Apr 17 17:28:13.371680 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.371645 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3122264d-c697-4bd6-92fd-fa3f7076f5b4-console-oauth-config\") pod \"console-68b44597bb-6s4j7\" (UID: \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\") " pod="openshift-console/console-68b44597bb-6s4j7"
Apr 17 17:28:13.371680 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.371680 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3122264d-c697-4bd6-92fd-fa3f7076f5b4-console-config\") pod \"console-68b44597bb-6s4j7\" (UID: \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\") " pod="openshift-console/console-68b44597bb-6s4j7"
Apr 17 17:28:13.371923 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.371699 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3122264d-c697-4bd6-92fd-fa3f7076f5b4-service-ca\") pod \"console-68b44597bb-6s4j7\" (UID: \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\") " pod="openshift-console/console-68b44597bb-6s4j7"
Apr 17 17:28:13.371923 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.371725 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqqg6\" (UniqueName: \"kubernetes.io/projected/3122264d-c697-4bd6-92fd-fa3f7076f5b4-kube-api-access-nqqg6\") pod \"console-68b44597bb-6s4j7\" (UID: \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\") " pod="openshift-console/console-68b44597bb-6s4j7"
Apr 17 17:28:13.371923 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.371790 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3122264d-c697-4bd6-92fd-fa3f7076f5b4-oauth-serving-cert\") pod \"console-68b44597bb-6s4j7\" (UID: \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\") " pod="openshift-console/console-68b44597bb-6s4j7"
Apr 17 17:28:13.371923 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.371849 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3122264d-c697-4bd6-92fd-fa3f7076f5b4-console-serving-cert\") pod \"console-68b44597bb-6s4j7\" (UID: \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\") " pod="openshift-console/console-68b44597bb-6s4j7"
Apr 17 17:28:13.371923 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.371880 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3122264d-c697-4bd6-92fd-fa3f7076f5b4-trusted-ca-bundle\") pod \"console-68b44597bb-6s4j7\" (UID: \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\") " pod="openshift-console/console-68b44597bb-6s4j7"
Apr 17 17:28:13.472924 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.472890 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3122264d-c697-4bd6-92fd-fa3f7076f5b4-console-serving-cert\") pod \"console-68b44597bb-6s4j7\" (UID: \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\") " pod="openshift-console/console-68b44597bb-6s4j7"
Apr 17 17:28:13.473092 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.472935 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3122264d-c697-4bd6-92fd-fa3f7076f5b4-trusted-ca-bundle\") pod \"console-68b44597bb-6s4j7\" (UID: \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\") " pod="openshift-console/console-68b44597bb-6s4j7"
Apr 17 17:28:13.473092 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.473052 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3122264d-c697-4bd6-92fd-fa3f7076f5b4-console-oauth-config\") pod \"console-68b44597bb-6s4j7\" (UID: \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\") " pod="openshift-console/console-68b44597bb-6s4j7"
Apr 17 17:28:13.473177 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.473091 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3122264d-c697-4bd6-92fd-fa3f7076f5b4-console-config\") pod \"console-68b44597bb-6s4j7\" (UID: \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\") " pod="openshift-console/console-68b44597bb-6s4j7"
Apr 17 17:28:13.473177 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.473114 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3122264d-c697-4bd6-92fd-fa3f7076f5b4-service-ca\") pod \"console-68b44597bb-6s4j7\" (UID: \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\") " pod="openshift-console/console-68b44597bb-6s4j7"
Apr 17 17:28:13.473177 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.473156 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqqg6\" (UniqueName: \"kubernetes.io/projected/3122264d-c697-4bd6-92fd-fa3f7076f5b4-kube-api-access-nqqg6\") pod \"console-68b44597bb-6s4j7\" (UID: \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\") " pod="openshift-console/console-68b44597bb-6s4j7"
Apr 17 17:28:13.473312 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.473230 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3122264d-c697-4bd6-92fd-fa3f7076f5b4-oauth-serving-cert\") pod \"console-68b44597bb-6s4j7\" (UID: \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\") " pod="openshift-console/console-68b44597bb-6s4j7"
Apr 17 17:28:13.473892 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.473870 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3122264d-c697-4bd6-92fd-fa3f7076f5b4-console-config\") pod \"console-68b44597bb-6s4j7\" (UID: \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\") " pod="openshift-console/console-68b44597bb-6s4j7"
Apr 17 17:28:13.474016 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.473925 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3122264d-c697-4bd6-92fd-fa3f7076f5b4-trusted-ca-bundle\") pod \"console-68b44597bb-6s4j7\" (UID: \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\") " pod="openshift-console/console-68b44597bb-6s4j7"
Apr 17 17:28:13.474016 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.473930 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3122264d-c697-4bd6-92fd-fa3f7076f5b4-oauth-serving-cert\") pod \"console-68b44597bb-6s4j7\" (UID: \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\") " pod="openshift-console/console-68b44597bb-6s4j7"
Apr 17 17:28:13.474095 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.474011 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3122264d-c697-4bd6-92fd-fa3f7076f5b4-service-ca\") pod \"console-68b44597bb-6s4j7\" (UID: \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\") " pod="openshift-console/console-68b44597bb-6s4j7"
Apr 17 17:28:13.475463 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.475436 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3122264d-c697-4bd6-92fd-fa3f7076f5b4-console-serving-cert\") pod \"console-68b44597bb-6s4j7\" (UID: \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\") " pod="openshift-console/console-68b44597bb-6s4j7"
Apr 17 17:28:13.475558 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.475531 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3122264d-c697-4bd6-92fd-fa3f7076f5b4-console-oauth-config\") pod \"console-68b44597bb-6s4j7\" (UID: \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\") " pod="openshift-console/console-68b44597bb-6s4j7"
Apr 17 17:28:13.481723 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.481705 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqqg6\" (UniqueName: \"kubernetes.io/projected/3122264d-c697-4bd6-92fd-fa3f7076f5b4-kube-api-access-nqqg6\") pod \"console-68b44597bb-6s4j7\" (UID: \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\") " pod="openshift-console/console-68b44597bb-6s4j7"
Apr 17 17:28:13.603900 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.603793 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68b44597bb-6s4j7"
Apr 17 17:28:13.725066 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.725041 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-68b44597bb-6s4j7"]
Apr 17 17:28:13.727583 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:28:13.727558 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3122264d_c697_4bd6_92fd_fa3f7076f5b4.slice/crio-70be57e0f3eeb8f497f93ea5c8176aa4e01a7b08df1906c7a25006f5c4d30c06 WatchSource:0}: Error finding container 70be57e0f3eeb8f497f93ea5c8176aa4e01a7b08df1906c7a25006f5c4d30c06: Status 404 returned error can't find the container with id 70be57e0f3eeb8f497f93ea5c8176aa4e01a7b08df1906c7a25006f5c4d30c06
Apr 17 17:28:13.987782 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.987701 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68b44597bb-6s4j7" event={"ID":"3122264d-c697-4bd6-92fd-fa3f7076f5b4","Type":"ContainerStarted","Data":"56e30b98e06df2e0315b69c097fbe43bbac035cfda9cdef8d7890e73857361aa"}
Apr 17 17:28:13.987782 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:13.987735 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68b44597bb-6s4j7" event={"ID":"3122264d-c697-4bd6-92fd-fa3f7076f5b4","Type":"ContainerStarted","Data":"70be57e0f3eeb8f497f93ea5c8176aa4e01a7b08df1906c7a25006f5c4d30c06"}
Apr 17 17:28:14.008713 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:14.008660 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-68b44597bb-6s4j7" podStartSLOduration=1.008646501 podStartE2EDuration="1.008646501s" podCreationTimestamp="2026-04-17 17:28:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:28:14.007404566 +0000 UTC m=+263.533085197" watchObservedRunningTime="2026-04-17 17:28:14.008646501 +0000 UTC m=+263.534327121"
Apr 17 17:28:23.604641 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:23.604599 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-68b44597bb-6s4j7"
Apr 17 17:28:23.604641 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:23.604642 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-68b44597bb-6s4j7"
Apr 17 17:28:23.609582 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:23.609560 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-68b44597bb-6s4j7"
Apr 17 17:28:24.022792 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:24.022762 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-68b44597bb-6s4j7"
Apr 17 17:28:50.984926 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:50.984895 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctj42_a9e0a184-477b-45d6-835a-1606a973a5cf/console-operator/1.log"
Apr 17 17:28:50.985488 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:50.985123 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctj42_a9e0a184-477b-45d6-835a-1606a973a5cf/console-operator/1.log"
Apr 17 17:28:50.996361 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:28:50.996341 2575 kubelet.go:1628] "Image garbage collection succeeded"
Apr 17 17:29:52.985785 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:52.985698 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wsk7c"]
Apr 17 17:29:52.988068 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:52.988052 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wsk7c"
Apr 17 17:29:52.990424 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:52.990394 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 17 17:29:52.990424 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:52.990413 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 17 17:29:52.990424 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:52.990423 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 17 17:29:52.990665 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:52.990460 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-gkpwx\""
Apr 17 17:29:52.998680 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:52.998655 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wsk7c"]
Apr 17 17:29:53.128965 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:53.128930 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/457112d6-d835-4478-bef8-e7f779b79154-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-wsk7c\" (UID: \"457112d6-d835-4478-bef8-e7f779b79154\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wsk7c"
Apr 17 17:29:53.129130 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:53.128979 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7k59\" (UniqueName: \"kubernetes.io/projected/457112d6-d835-4478-bef8-e7f779b79154-kube-api-access-n7k59\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-wsk7c\" (UID: \"457112d6-d835-4478-bef8-e7f779b79154\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wsk7c"
Apr 17 17:29:53.229963 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:53.229928 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/457112d6-d835-4478-bef8-e7f779b79154-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-wsk7c\" (UID: \"457112d6-d835-4478-bef8-e7f779b79154\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wsk7c"
Apr 17 17:29:53.230144 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:53.229982 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7k59\" (UniqueName: \"kubernetes.io/projected/457112d6-d835-4478-bef8-e7f779b79154-kube-api-access-n7k59\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-wsk7c\" (UID: \"457112d6-d835-4478-bef8-e7f779b79154\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wsk7c"
Apr 17 17:29:53.235392 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:53.233173 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/457112d6-d835-4478-bef8-e7f779b79154-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-wsk7c\" (UID: \"457112d6-d835-4478-bef8-e7f779b79154\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wsk7c"
Apr 17 17:29:53.238562 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:53.238502 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7k59\" (UniqueName: \"kubernetes.io/projected/457112d6-d835-4478-bef8-e7f779b79154-kube-api-access-n7k59\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-wsk7c\" (UID: \"457112d6-d835-4478-bef8-e7f779b79154\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wsk7c"
Apr 17 17:29:53.298319 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:53.298288 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wsk7c"
Apr 17 17:29:53.419657 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:53.419632 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wsk7c"]
Apr 17 17:29:53.422154 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:29:53.422131 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod457112d6_d835_4478_bef8_e7f779b79154.slice/crio-8f1d3d20ddeef803240ee7e9713e93e5c2950b5b13423c5ca859325add9cbcea WatchSource:0}: Error finding container 8f1d3d20ddeef803240ee7e9713e93e5c2950b5b13423c5ca859325add9cbcea: Status 404 returned error can't find the container with id 8f1d3d20ddeef803240ee7e9713e93e5c2950b5b13423c5ca859325add9cbcea
Apr 17 17:29:53.424031 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:53.424011 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 17:29:54.274715 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:54.274669 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wsk7c"
event={"ID":"457112d6-d835-4478-bef8-e7f779b79154","Type":"ContainerStarted","Data":"8f1d3d20ddeef803240ee7e9713e93e5c2950b5b13423c5ca859325add9cbcea"} Apr 17 17:29:57.285456 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:57.285421 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wsk7c" event={"ID":"457112d6-d835-4478-bef8-e7f779b79154","Type":"ContainerStarted","Data":"9f42f3a7724dc3462a15c3e613970c05c54bcc90d1981549535d82c6f4e60575"} Apr 17 17:29:57.285839 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:57.285581 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wsk7c" Apr 17 17:29:57.311902 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:57.311795 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wsk7c" podStartSLOduration=1.927394269 podStartE2EDuration="5.311779767s" podCreationTimestamp="2026-04-17 17:29:52 +0000 UTC" firstStartedPulling="2026-04-17 17:29:53.424191962 +0000 UTC m=+362.949872567" lastFinishedPulling="2026-04-17 17:29:56.808577462 +0000 UTC m=+366.334258065" observedRunningTime="2026-04-17 17:29:57.310394229 +0000 UTC m=+366.836074848" watchObservedRunningTime="2026-04-17 17:29:57.311779767 +0000 UTC m=+366.837460386" Apr 17 17:29:58.005195 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:58.005161 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-78gq9"] Apr 17 17:29:58.008466 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:58.008448 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-78gq9" Apr 17 17:29:58.010652 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:58.010632 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-d9wx9\"" Apr 17 17:29:58.010766 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:58.010653 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 17 17:29:58.010766 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:58.010669 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 17 17:29:58.016081 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:58.016058 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-78gq9"] Apr 17 17:29:58.075860 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:58.075807 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3dd44c6e-b753-4c39-bb6f-bcf9d9e3d47f-certificates\") pod \"keda-admission-cf49989db-78gq9\" (UID: \"3dd44c6e-b753-4c39-bb6f-bcf9d9e3d47f\") " pod="openshift-keda/keda-admission-cf49989db-78gq9" Apr 17 17:29:58.075860 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:58.075862 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66gxf\" (UniqueName: \"kubernetes.io/projected/3dd44c6e-b753-4c39-bb6f-bcf9d9e3d47f-kube-api-access-66gxf\") pod \"keda-admission-cf49989db-78gq9\" (UID: \"3dd44c6e-b753-4c39-bb6f-bcf9d9e3d47f\") " pod="openshift-keda/keda-admission-cf49989db-78gq9" Apr 17 17:29:58.177440 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:58.177407 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/3dd44c6e-b753-4c39-bb6f-bcf9d9e3d47f-certificates\") pod \"keda-admission-cf49989db-78gq9\" (UID: \"3dd44c6e-b753-4c39-bb6f-bcf9d9e3d47f\") " pod="openshift-keda/keda-admission-cf49989db-78gq9" Apr 17 17:29:58.177611 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:58.177451 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-66gxf\" (UniqueName: \"kubernetes.io/projected/3dd44c6e-b753-4c39-bb6f-bcf9d9e3d47f-kube-api-access-66gxf\") pod \"keda-admission-cf49989db-78gq9\" (UID: \"3dd44c6e-b753-4c39-bb6f-bcf9d9e3d47f\") " pod="openshift-keda/keda-admission-cf49989db-78gq9" Apr 17 17:29:58.180069 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:58.180029 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3dd44c6e-b753-4c39-bb6f-bcf9d9e3d47f-certificates\") pod \"keda-admission-cf49989db-78gq9\" (UID: \"3dd44c6e-b753-4c39-bb6f-bcf9d9e3d47f\") " pod="openshift-keda/keda-admission-cf49989db-78gq9" Apr 17 17:29:58.196902 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:58.196876 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-66gxf\" (UniqueName: \"kubernetes.io/projected/3dd44c6e-b753-4c39-bb6f-bcf9d9e3d47f-kube-api-access-66gxf\") pod \"keda-admission-cf49989db-78gq9\" (UID: \"3dd44c6e-b753-4c39-bb6f-bcf9d9e3d47f\") " pod="openshift-keda/keda-admission-cf49989db-78gq9" Apr 17 17:29:58.320033 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:58.319948 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-78gq9" Apr 17 17:29:58.455589 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:58.454873 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-78gq9"] Apr 17 17:29:58.457995 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:29:58.457969 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dd44c6e_b753_4c39_bb6f_bcf9d9e3d47f.slice/crio-5abb04a9d813ef58b81405b2c45112f307046b52d0a9cc7aa329fc6796a50c1d WatchSource:0}: Error finding container 5abb04a9d813ef58b81405b2c45112f307046b52d0a9cc7aa329fc6796a50c1d: Status 404 returned error can't find the container with id 5abb04a9d813ef58b81405b2c45112f307046b52d0a9cc7aa329fc6796a50c1d Apr 17 17:29:59.292992 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:29:59.292961 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-78gq9" event={"ID":"3dd44c6e-b753-4c39-bb6f-bcf9d9e3d47f","Type":"ContainerStarted","Data":"5abb04a9d813ef58b81405b2c45112f307046b52d0a9cc7aa329fc6796a50c1d"} Apr 17 17:30:00.297683 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:30:00.297651 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-78gq9" event={"ID":"3dd44c6e-b753-4c39-bb6f-bcf9d9e3d47f","Type":"ContainerStarted","Data":"b3f606d3954f7726ceff015a28be88f2999176e7a098d04a462c320f31b0a048"} Apr 17 17:30:00.298129 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:30:00.297745 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-78gq9" Apr 17 17:30:00.315417 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:30:00.315372 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-78gq9" podStartSLOduration=2.008891179 podStartE2EDuration="3.315358507s" 
podCreationTimestamp="2026-04-17 17:29:57 +0000 UTC" firstStartedPulling="2026-04-17 17:29:58.459167471 +0000 UTC m=+367.984848068" lastFinishedPulling="2026-04-17 17:29:59.765634796 +0000 UTC m=+369.291315396" observedRunningTime="2026-04-17 17:30:00.314218174 +0000 UTC m=+369.839898786" watchObservedRunningTime="2026-04-17 17:30:00.315358507 +0000 UTC m=+369.841039126" Apr 17 17:30:18.291384 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:30:18.291353 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wsk7c" Apr 17 17:30:21.303691 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:30:21.303662 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-78gq9" Apr 17 17:31:06.464773 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:06.464687 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-dxlsk"] Apr 17 17:31:06.468026 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:06.468005 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-85dd7cfb4d-dxlsk" Apr 17 17:31:06.470180 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:06.470160 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 17 17:31:06.470272 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:06.470221 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-brh2c\"" Apr 17 17:31:06.470272 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:06.470241 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 17 17:31:06.471006 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:06.470993 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 17 17:31:06.477282 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:06.477261 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-dxlsk"] Apr 17 17:31:06.518911 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:06.518882 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxfpd\" (UniqueName: \"kubernetes.io/projected/547d3209-efdd-4252-9124-c50a585f1de5-kube-api-access-sxfpd\") pod \"kserve-controller-manager-85dd7cfb4d-dxlsk\" (UID: \"547d3209-efdd-4252-9124-c50a585f1de5\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-dxlsk" Apr 17 17:31:06.519013 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:06.518917 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/547d3209-efdd-4252-9124-c50a585f1de5-cert\") pod \"kserve-controller-manager-85dd7cfb4d-dxlsk\" (UID: \"547d3209-efdd-4252-9124-c50a585f1de5\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-dxlsk" Apr 17 
17:31:06.620222 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:06.620191 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sxfpd\" (UniqueName: \"kubernetes.io/projected/547d3209-efdd-4252-9124-c50a585f1de5-kube-api-access-sxfpd\") pod \"kserve-controller-manager-85dd7cfb4d-dxlsk\" (UID: \"547d3209-efdd-4252-9124-c50a585f1de5\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-dxlsk" Apr 17 17:31:06.620330 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:06.620229 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/547d3209-efdd-4252-9124-c50a585f1de5-cert\") pod \"kserve-controller-manager-85dd7cfb4d-dxlsk\" (UID: \"547d3209-efdd-4252-9124-c50a585f1de5\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-dxlsk" Apr 17 17:31:06.620330 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:31:06.620320 2575 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 17 17:31:06.620396 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:31:06.620380 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/547d3209-efdd-4252-9124-c50a585f1de5-cert podName:547d3209-efdd-4252-9124-c50a585f1de5 nodeName:}" failed. No retries permitted until 2026-04-17 17:31:07.120363153 +0000 UTC m=+436.646043749 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/547d3209-efdd-4252-9124-c50a585f1de5-cert") pod "kserve-controller-manager-85dd7cfb4d-dxlsk" (UID: "547d3209-efdd-4252-9124-c50a585f1de5") : secret "kserve-webhook-server-cert" not found Apr 17 17:31:06.631019 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:06.630990 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxfpd\" (UniqueName: \"kubernetes.io/projected/547d3209-efdd-4252-9124-c50a585f1de5-kube-api-access-sxfpd\") pod \"kserve-controller-manager-85dd7cfb4d-dxlsk\" (UID: \"547d3209-efdd-4252-9124-c50a585f1de5\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-dxlsk" Apr 17 17:31:07.124867 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:07.124836 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/547d3209-efdd-4252-9124-c50a585f1de5-cert\") pod \"kserve-controller-manager-85dd7cfb4d-dxlsk\" (UID: \"547d3209-efdd-4252-9124-c50a585f1de5\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-dxlsk" Apr 17 17:31:07.127370 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:07.127339 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/547d3209-efdd-4252-9124-c50a585f1de5-cert\") pod \"kserve-controller-manager-85dd7cfb4d-dxlsk\" (UID: \"547d3209-efdd-4252-9124-c50a585f1de5\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-dxlsk" Apr 17 17:31:07.378625 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:07.378545 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-85dd7cfb4d-dxlsk" Apr 17 17:31:07.495800 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:07.495766 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-dxlsk"] Apr 17 17:31:07.498796 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:31:07.498768 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod547d3209_efdd_4252_9124_c50a585f1de5.slice/crio-58481cf31947b396478cab875e808b25357730e41d43ebc0d5e6ab35c515e0da WatchSource:0}: Error finding container 58481cf31947b396478cab875e808b25357730e41d43ebc0d5e6ab35c515e0da: Status 404 returned error can't find the container with id 58481cf31947b396478cab875e808b25357730e41d43ebc0d5e6ab35c515e0da Apr 17 17:31:07.509616 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:07.509588 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85dd7cfb4d-dxlsk" event={"ID":"547d3209-efdd-4252-9124-c50a585f1de5","Type":"ContainerStarted","Data":"58481cf31947b396478cab875e808b25357730e41d43ebc0d5e6ab35c515e0da"} Apr 17 17:31:10.520745 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:10.520709 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85dd7cfb4d-dxlsk" event={"ID":"547d3209-efdd-4252-9124-c50a585f1de5","Type":"ContainerStarted","Data":"1ddae8cbd5fc87432197741d05219c09c9a129c65b62894bdd70578f05478600"} Apr 17 17:31:10.521125 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:10.520816 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-85dd7cfb4d-dxlsk" Apr 17 17:31:10.537157 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:10.537105 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-85dd7cfb4d-dxlsk" podStartSLOduration=1.96728251 
podStartE2EDuration="4.537093227s" podCreationTimestamp="2026-04-17 17:31:06 +0000 UTC" firstStartedPulling="2026-04-17 17:31:07.500365026 +0000 UTC m=+437.026045623" lastFinishedPulling="2026-04-17 17:31:10.070175743 +0000 UTC m=+439.595856340" observedRunningTime="2026-04-17 17:31:10.535574798 +0000 UTC m=+440.061255416" watchObservedRunningTime="2026-04-17 17:31:10.537093227 +0000 UTC m=+440.062773845" Apr 17 17:31:41.196872 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:41.196840 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-dxlsk"] Apr 17 17:31:41.197492 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:41.197076 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-85dd7cfb4d-dxlsk" podUID="547d3209-efdd-4252-9124-c50a585f1de5" containerName="manager" containerID="cri-o://1ddae8cbd5fc87432197741d05219c09c9a129c65b62894bdd70578f05478600" gracePeriod=10 Apr 17 17:31:41.202018 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:41.201995 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-85dd7cfb4d-dxlsk" Apr 17 17:31:41.224186 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:41.224161 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-pzkh6"] Apr 17 17:31:41.227219 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:41.227199 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-85dd7cfb4d-pzkh6" Apr 17 17:31:41.237315 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:41.237293 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-pzkh6"] Apr 17 17:31:41.293299 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:41.293275 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kxdr\" (UniqueName: \"kubernetes.io/projected/4efd0ded-0386-4ccb-9635-7fa20c9a0364-kube-api-access-7kxdr\") pod \"kserve-controller-manager-85dd7cfb4d-pzkh6\" (UID: \"4efd0ded-0386-4ccb-9635-7fa20c9a0364\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-pzkh6" Apr 17 17:31:41.293409 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:41.293312 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4efd0ded-0386-4ccb-9635-7fa20c9a0364-cert\") pod \"kserve-controller-manager-85dd7cfb4d-pzkh6\" (UID: \"4efd0ded-0386-4ccb-9635-7fa20c9a0364\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-pzkh6" Apr 17 17:31:41.394161 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:41.394126 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7kxdr\" (UniqueName: \"kubernetes.io/projected/4efd0ded-0386-4ccb-9635-7fa20c9a0364-kube-api-access-7kxdr\") pod \"kserve-controller-manager-85dd7cfb4d-pzkh6\" (UID: \"4efd0ded-0386-4ccb-9635-7fa20c9a0364\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-pzkh6" Apr 17 17:31:41.394287 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:41.394183 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4efd0ded-0386-4ccb-9635-7fa20c9a0364-cert\") pod \"kserve-controller-manager-85dd7cfb4d-pzkh6\" (UID: \"4efd0ded-0386-4ccb-9635-7fa20c9a0364\") " 
pod="kserve/kserve-controller-manager-85dd7cfb4d-pzkh6" Apr 17 17:31:41.396969 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:41.396944 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4efd0ded-0386-4ccb-9635-7fa20c9a0364-cert\") pod \"kserve-controller-manager-85dd7cfb4d-pzkh6\" (UID: \"4efd0ded-0386-4ccb-9635-7fa20c9a0364\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-pzkh6" Apr 17 17:31:41.402150 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:41.402124 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kxdr\" (UniqueName: \"kubernetes.io/projected/4efd0ded-0386-4ccb-9635-7fa20c9a0364-kube-api-access-7kxdr\") pod \"kserve-controller-manager-85dd7cfb4d-pzkh6\" (UID: \"4efd0ded-0386-4ccb-9635-7fa20c9a0364\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-pzkh6" Apr 17 17:31:41.431754 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:41.431734 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-85dd7cfb4d-dxlsk" Apr 17 17:31:41.495502 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:41.495433 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/547d3209-efdd-4252-9124-c50a585f1de5-cert\") pod \"547d3209-efdd-4252-9124-c50a585f1de5\" (UID: \"547d3209-efdd-4252-9124-c50a585f1de5\") " Apr 17 17:31:41.495502 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:41.495466 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxfpd\" (UniqueName: \"kubernetes.io/projected/547d3209-efdd-4252-9124-c50a585f1de5-kube-api-access-sxfpd\") pod \"547d3209-efdd-4252-9124-c50a585f1de5\" (UID: \"547d3209-efdd-4252-9124-c50a585f1de5\") " Apr 17 17:31:41.497481 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:41.497452 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/547d3209-efdd-4252-9124-c50a585f1de5-kube-api-access-sxfpd" (OuterVolumeSpecName: "kube-api-access-sxfpd") pod "547d3209-efdd-4252-9124-c50a585f1de5" (UID: "547d3209-efdd-4252-9124-c50a585f1de5"). InnerVolumeSpecName "kube-api-access-sxfpd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:31:41.497481 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:41.497470 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/547d3209-efdd-4252-9124-c50a585f1de5-cert" (OuterVolumeSpecName: "cert") pod "547d3209-efdd-4252-9124-c50a585f1de5" (UID: "547d3209-efdd-4252-9124-c50a585f1de5"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:31:41.567963 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:41.567934 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-85dd7cfb4d-pzkh6" Apr 17 17:31:41.596595 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:41.596571 2575 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/547d3209-efdd-4252-9124-c50a585f1de5-cert\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:31:41.596702 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:41.596597 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sxfpd\" (UniqueName: \"kubernetes.io/projected/547d3209-efdd-4252-9124-c50a585f1de5-kube-api-access-sxfpd\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:31:41.627690 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:41.627655 2575 generic.go:358] "Generic (PLEG): container finished" podID="547d3209-efdd-4252-9124-c50a585f1de5" containerID="1ddae8cbd5fc87432197741d05219c09c9a129c65b62894bdd70578f05478600" exitCode=0 Apr 17 17:31:41.627857 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:41.627730 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-85dd7cfb4d-dxlsk" Apr 17 17:31:41.627857 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:41.627745 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85dd7cfb4d-dxlsk" event={"ID":"547d3209-efdd-4252-9124-c50a585f1de5","Type":"ContainerDied","Data":"1ddae8cbd5fc87432197741d05219c09c9a129c65b62894bdd70578f05478600"} Apr 17 17:31:41.627857 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:41.627773 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85dd7cfb4d-dxlsk" event={"ID":"547d3209-efdd-4252-9124-c50a585f1de5","Type":"ContainerDied","Data":"58481cf31947b396478cab875e808b25357730e41d43ebc0d5e6ab35c515e0da"} Apr 17 17:31:41.627857 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:41.627800 2575 scope.go:117] "RemoveContainer" containerID="1ddae8cbd5fc87432197741d05219c09c9a129c65b62894bdd70578f05478600" Apr 17 17:31:41.636235 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:41.636216 2575 scope.go:117] "RemoveContainer" containerID="1ddae8cbd5fc87432197741d05219c09c9a129c65b62894bdd70578f05478600" Apr 17 17:31:41.636491 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:31:41.636470 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ddae8cbd5fc87432197741d05219c09c9a129c65b62894bdd70578f05478600\": container with ID starting with 1ddae8cbd5fc87432197741d05219c09c9a129c65b62894bdd70578f05478600 not found: ID does not exist" containerID="1ddae8cbd5fc87432197741d05219c09c9a129c65b62894bdd70578f05478600" Apr 17 17:31:41.636553 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:41.636504 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ddae8cbd5fc87432197741d05219c09c9a129c65b62894bdd70578f05478600"} err="failed to get container status \"1ddae8cbd5fc87432197741d05219c09c9a129c65b62894bdd70578f05478600\": rpc 
error: code = NotFound desc = could not find container \"1ddae8cbd5fc87432197741d05219c09c9a129c65b62894bdd70578f05478600\": container with ID starting with 1ddae8cbd5fc87432197741d05219c09c9a129c65b62894bdd70578f05478600 not found: ID does not exist" Apr 17 17:31:41.650439 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:41.650414 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-dxlsk"] Apr 17 17:31:41.654012 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:41.653991 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-dxlsk"] Apr 17 17:31:41.689333 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:41.689277 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-pzkh6"] Apr 17 17:31:41.692022 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:31:41.691995 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4efd0ded_0386_4ccb_9635_7fa20c9a0364.slice/crio-83edc143de672d54e8c6c85af4eb6b413abac3b51780595b1b4fc8489afdad88 WatchSource:0}: Error finding container 83edc143de672d54e8c6c85af4eb6b413abac3b51780595b1b4fc8489afdad88: Status 404 returned error can't find the container with id 83edc143de672d54e8c6c85af4eb6b413abac3b51780595b1b4fc8489afdad88 Apr 17 17:31:42.632853 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:42.632801 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85dd7cfb4d-pzkh6" event={"ID":"4efd0ded-0386-4ccb-9635-7fa20c9a0364","Type":"ContainerStarted","Data":"3fd6af7517a0074add85f9db0a35baac4b99895ed964ac7df6cf4740b4390e71"} Apr 17 17:31:42.632853 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:42.632857 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-85dd7cfb4d-pzkh6" Apr 17 17:31:42.633305 ip-10-0-140-33 
kubenswrapper[2575]: I0417 17:31:42.632871 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85dd7cfb4d-pzkh6" event={"ID":"4efd0ded-0386-4ccb-9635-7fa20c9a0364","Type":"ContainerStarted","Data":"83edc143de672d54e8c6c85af4eb6b413abac3b51780595b1b4fc8489afdad88"} Apr 17 17:31:42.650210 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:42.650167 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-85dd7cfb4d-pzkh6" podStartSLOduration=1.303191928 podStartE2EDuration="1.650153339s" podCreationTimestamp="2026-04-17 17:31:41 +0000 UTC" firstStartedPulling="2026-04-17 17:31:41.693214408 +0000 UTC m=+471.218895004" lastFinishedPulling="2026-04-17 17:31:42.040175809 +0000 UTC m=+471.565856415" observedRunningTime="2026-04-17 17:31:42.648255442 +0000 UTC m=+472.173936060" watchObservedRunningTime="2026-04-17 17:31:42.650153339 +0000 UTC m=+472.175833957" Apr 17 17:31:43.093635 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:31:43.093604 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="547d3209-efdd-4252-9124-c50a585f1de5" path="/var/lib/kubelet/pods/547d3209-efdd-4252-9124-c50a585f1de5/volumes" Apr 17 17:32:02.227118 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:02.227082 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68b44597bb-6s4j7"] Apr 17 17:32:13.642394 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:13.642361 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-85dd7cfb4d-pzkh6" Apr 17 17:32:14.517237 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:14.517205 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-5mvbh"] Apr 17 17:32:14.517537 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:14.517524 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="547d3209-efdd-4252-9124-c50a585f1de5" containerName="manager" Apr 17 17:32:14.517578 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:14.517539 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="547d3209-efdd-4252-9124-c50a585f1de5" containerName="manager" Apr 17 17:32:14.517611 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:14.517603 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="547d3209-efdd-4252-9124-c50a585f1de5" containerName="manager" Apr 17 17:32:14.520434 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:14.520419 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-5mvbh" Apr 17 17:32:14.523945 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:14.523919 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 17 17:32:14.524076 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:14.523956 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-9sgfb\"" Apr 17 17:32:14.536972 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:14.536947 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-5mvbh"] Apr 17 17:32:14.538184 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:14.538166 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-cg49l"] Apr 17 17:32:14.541378 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:14.541357 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-cg49l" Apr 17 17:32:14.543585 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:14.543564 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 17 17:32:14.543585 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:14.543578 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-fdrcp\"" Apr 17 17:32:14.549782 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:14.549759 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-cg49l"] Apr 17 17:32:14.645177 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:14.645147 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f7dcf894-ebef-4a1d-a53a-ba8d755b8497-tls-certs\") pod \"model-serving-api-86f7b4b499-5mvbh\" (UID: \"f7dcf894-ebef-4a1d-a53a-ba8d755b8497\") " pod="kserve/model-serving-api-86f7b4b499-5mvbh" Apr 17 17:32:14.645550 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:14.645199 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgclq\" (UniqueName: \"kubernetes.io/projected/c4b47ad4-4b7e-46b1-b529-319ee4352e17-kube-api-access-sgclq\") pod \"odh-model-controller-696fc77849-cg49l\" (UID: \"c4b47ad4-4b7e-46b1-b529-319ee4352e17\") " pod="kserve/odh-model-controller-696fc77849-cg49l" Apr 17 17:32:14.645550 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:14.645283 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttmhq\" (UniqueName: \"kubernetes.io/projected/f7dcf894-ebef-4a1d-a53a-ba8d755b8497-kube-api-access-ttmhq\") pod \"model-serving-api-86f7b4b499-5mvbh\" (UID: \"f7dcf894-ebef-4a1d-a53a-ba8d755b8497\") " 
pod="kserve/model-serving-api-86f7b4b499-5mvbh" Apr 17 17:32:14.645550 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:14.645320 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c4b47ad4-4b7e-46b1-b529-319ee4352e17-cert\") pod \"odh-model-controller-696fc77849-cg49l\" (UID: \"c4b47ad4-4b7e-46b1-b529-319ee4352e17\") " pod="kserve/odh-model-controller-696fc77849-cg49l" Apr 17 17:32:14.746650 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:14.746618 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ttmhq\" (UniqueName: \"kubernetes.io/projected/f7dcf894-ebef-4a1d-a53a-ba8d755b8497-kube-api-access-ttmhq\") pod \"model-serving-api-86f7b4b499-5mvbh\" (UID: \"f7dcf894-ebef-4a1d-a53a-ba8d755b8497\") " pod="kserve/model-serving-api-86f7b4b499-5mvbh" Apr 17 17:32:14.746650 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:14.746652 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c4b47ad4-4b7e-46b1-b529-319ee4352e17-cert\") pod \"odh-model-controller-696fc77849-cg49l\" (UID: \"c4b47ad4-4b7e-46b1-b529-319ee4352e17\") " pod="kserve/odh-model-controller-696fc77849-cg49l" Apr 17 17:32:14.746923 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:14.746692 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f7dcf894-ebef-4a1d-a53a-ba8d755b8497-tls-certs\") pod \"model-serving-api-86f7b4b499-5mvbh\" (UID: \"f7dcf894-ebef-4a1d-a53a-ba8d755b8497\") " pod="kserve/model-serving-api-86f7b4b499-5mvbh" Apr 17 17:32:14.746923 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:14.746744 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgclq\" (UniqueName: \"kubernetes.io/projected/c4b47ad4-4b7e-46b1-b529-319ee4352e17-kube-api-access-sgclq\") pod 
\"odh-model-controller-696fc77849-cg49l\" (UID: \"c4b47ad4-4b7e-46b1-b529-319ee4352e17\") " pod="kserve/odh-model-controller-696fc77849-cg49l" Apr 17 17:32:14.746923 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:32:14.746799 2575 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 17 17:32:14.746923 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:32:14.746907 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c4b47ad4-4b7e-46b1-b529-319ee4352e17-cert podName:c4b47ad4-4b7e-46b1-b529-319ee4352e17 nodeName:}" failed. No retries permitted until 2026-04-17 17:32:15.246884683 +0000 UTC m=+504.772565284 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c4b47ad4-4b7e-46b1-b529-319ee4352e17-cert") pod "odh-model-controller-696fc77849-cg49l" (UID: "c4b47ad4-4b7e-46b1-b529-319ee4352e17") : secret "odh-model-controller-webhook-cert" not found Apr 17 17:32:14.749163 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:14.749140 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f7dcf894-ebef-4a1d-a53a-ba8d755b8497-tls-certs\") pod \"model-serving-api-86f7b4b499-5mvbh\" (UID: \"f7dcf894-ebef-4a1d-a53a-ba8d755b8497\") " pod="kserve/model-serving-api-86f7b4b499-5mvbh" Apr 17 17:32:14.758121 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:14.758094 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttmhq\" (UniqueName: \"kubernetes.io/projected/f7dcf894-ebef-4a1d-a53a-ba8d755b8497-kube-api-access-ttmhq\") pod \"model-serving-api-86f7b4b499-5mvbh\" (UID: \"f7dcf894-ebef-4a1d-a53a-ba8d755b8497\") " pod="kserve/model-serving-api-86f7b4b499-5mvbh" Apr 17 17:32:14.758243 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:14.758226 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-sgclq\" (UniqueName: \"kubernetes.io/projected/c4b47ad4-4b7e-46b1-b529-319ee4352e17-kube-api-access-sgclq\") pod \"odh-model-controller-696fc77849-cg49l\" (UID: \"c4b47ad4-4b7e-46b1-b529-319ee4352e17\") " pod="kserve/odh-model-controller-696fc77849-cg49l" Apr 17 17:32:14.831520 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:14.831460 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-5mvbh" Apr 17 17:32:14.957354 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:14.957323 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-5mvbh"] Apr 17 17:32:14.959923 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:32:14.959887 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7dcf894_ebef_4a1d_a53a_ba8d755b8497.slice/crio-f035c94e43cb1d3c95720003ba137b655c7db45db8c86fb76e9701a091463a10 WatchSource:0}: Error finding container f035c94e43cb1d3c95720003ba137b655c7db45db8c86fb76e9701a091463a10: Status 404 returned error can't find the container with id f035c94e43cb1d3c95720003ba137b655c7db45db8c86fb76e9701a091463a10 Apr 17 17:32:15.251212 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:15.251183 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c4b47ad4-4b7e-46b1-b529-319ee4352e17-cert\") pod \"odh-model-controller-696fc77849-cg49l\" (UID: \"c4b47ad4-4b7e-46b1-b529-319ee4352e17\") " pod="kserve/odh-model-controller-696fc77849-cg49l" Apr 17 17:32:15.253723 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:15.253694 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c4b47ad4-4b7e-46b1-b529-319ee4352e17-cert\") pod \"odh-model-controller-696fc77849-cg49l\" (UID: \"c4b47ad4-4b7e-46b1-b529-319ee4352e17\") " 
pod="kserve/odh-model-controller-696fc77849-cg49l" Apr 17 17:32:15.452299 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:15.452257 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-cg49l" Apr 17 17:32:15.593011 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:15.592974 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-cg49l"] Apr 17 17:32:15.620627 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:32:15.620593 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4b47ad4_4b7e_46b1_b529_319ee4352e17.slice/crio-d81748823ef4b651dafa7255d3a9aafb611944c98d68b4004a205199464dae98 WatchSource:0}: Error finding container d81748823ef4b651dafa7255d3a9aafb611944c98d68b4004a205199464dae98: Status 404 returned error can't find the container with id d81748823ef4b651dafa7255d3a9aafb611944c98d68b4004a205199464dae98 Apr 17 17:32:15.744246 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:15.744209 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-cg49l" event={"ID":"c4b47ad4-4b7e-46b1-b529-319ee4352e17","Type":"ContainerStarted","Data":"d81748823ef4b651dafa7255d3a9aafb611944c98d68b4004a205199464dae98"} Apr 17 17:32:15.745435 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:15.745406 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-5mvbh" event={"ID":"f7dcf894-ebef-4a1d-a53a-ba8d755b8497","Type":"ContainerStarted","Data":"f035c94e43cb1d3c95720003ba137b655c7db45db8c86fb76e9701a091463a10"} Apr 17 17:32:16.752162 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:16.752118 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-5mvbh" 
event={"ID":"f7dcf894-ebef-4a1d-a53a-ba8d755b8497","Type":"ContainerStarted","Data":"c70e29ccf780b1ed18a86c7b2a77cba7461aa36be7e85fffa3685af150ef4755"} Apr 17 17:32:16.752535 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:16.752194 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-5mvbh" Apr 17 17:32:16.773761 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:16.773690 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-5mvbh" podStartSLOduration=1.178981727 podStartE2EDuration="2.77367323s" podCreationTimestamp="2026-04-17 17:32:14 +0000 UTC" firstStartedPulling="2026-04-17 17:32:14.961621925 +0000 UTC m=+504.487302522" lastFinishedPulling="2026-04-17 17:32:16.556313413 +0000 UTC m=+506.081994025" observedRunningTime="2026-04-17 17:32:16.77124083 +0000 UTC m=+506.296921450" watchObservedRunningTime="2026-04-17 17:32:16.77367323 +0000 UTC m=+506.299353849" Apr 17 17:32:19.767617 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:19.767578 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-cg49l" event={"ID":"c4b47ad4-4b7e-46b1-b529-319ee4352e17","Type":"ContainerStarted","Data":"082a160d077e5d6ed8e27d3c30235cb03b075c7ffceb23e477522531a0948fd8"} Apr 17 17:32:19.768002 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:19.767638 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-cg49l" Apr 17 17:32:19.795728 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:19.795679 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-cg49l" podStartSLOduration=2.480363885 podStartE2EDuration="5.795664961s" podCreationTimestamp="2026-04-17 17:32:14 +0000 UTC" firstStartedPulling="2026-04-17 17:32:15.622151166 +0000 UTC m=+505.147831763" lastFinishedPulling="2026-04-17 
17:32:18.937452241 +0000 UTC m=+508.463132839" observedRunningTime="2026-04-17 17:32:19.792852258 +0000 UTC m=+509.318532867" watchObservedRunningTime="2026-04-17 17:32:19.795664961 +0000 UTC m=+509.321345580" Apr 17 17:32:27.248039 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.247980 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-68b44597bb-6s4j7" podUID="3122264d-c697-4bd6-92fd-fa3f7076f5b4" containerName="console" containerID="cri-o://56e30b98e06df2e0315b69c097fbe43bbac035cfda9cdef8d7890e73857361aa" gracePeriod=15 Apr 17 17:32:27.490596 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.490576 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68b44597bb-6s4j7_3122264d-c697-4bd6-92fd-fa3f7076f5b4/console/0.log" Apr 17 17:32:27.490746 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.490638 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68b44597bb-6s4j7" Apr 17 17:32:27.550028 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.549963 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3122264d-c697-4bd6-92fd-fa3f7076f5b4-service-ca\") pod \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\" (UID: \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\") " Apr 17 17:32:27.550028 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.549997 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3122264d-c697-4bd6-92fd-fa3f7076f5b4-console-config\") pod \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\" (UID: \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\") " Apr 17 17:32:27.550028 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.550017 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/3122264d-c697-4bd6-92fd-fa3f7076f5b4-console-oauth-config\") pod \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\" (UID: \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\") " Apr 17 17:32:27.550232 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.550041 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3122264d-c697-4bd6-92fd-fa3f7076f5b4-console-serving-cert\") pod \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\" (UID: \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\") " Apr 17 17:32:27.550232 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.550057 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3122264d-c697-4bd6-92fd-fa3f7076f5b4-trusted-ca-bundle\") pod \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\" (UID: \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\") " Apr 17 17:32:27.550232 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.550087 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3122264d-c697-4bd6-92fd-fa3f7076f5b4-oauth-serving-cert\") pod \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\" (UID: \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\") " Apr 17 17:32:27.550232 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.550108 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqqg6\" (UniqueName: \"kubernetes.io/projected/3122264d-c697-4bd6-92fd-fa3f7076f5b4-kube-api-access-nqqg6\") pod \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\" (UID: \"3122264d-c697-4bd6-92fd-fa3f7076f5b4\") " Apr 17 17:32:27.552089 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.550665 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3122264d-c697-4bd6-92fd-fa3f7076f5b4-service-ca" (OuterVolumeSpecName: "service-ca") pod 
"3122264d-c697-4bd6-92fd-fa3f7076f5b4" (UID: "3122264d-c697-4bd6-92fd-fa3f7076f5b4"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:32:27.552089 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.550706 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3122264d-c697-4bd6-92fd-fa3f7076f5b4-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "3122264d-c697-4bd6-92fd-fa3f7076f5b4" (UID: "3122264d-c697-4bd6-92fd-fa3f7076f5b4"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:32:27.552089 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.551073 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3122264d-c697-4bd6-92fd-fa3f7076f5b4-console-config" (OuterVolumeSpecName: "console-config") pod "3122264d-c697-4bd6-92fd-fa3f7076f5b4" (UID: "3122264d-c697-4bd6-92fd-fa3f7076f5b4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:32:27.552089 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.551126 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3122264d-c697-4bd6-92fd-fa3f7076f5b4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "3122264d-c697-4bd6-92fd-fa3f7076f5b4" (UID: "3122264d-c697-4bd6-92fd-fa3f7076f5b4"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:32:27.554893 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.553147 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3122264d-c697-4bd6-92fd-fa3f7076f5b4-kube-api-access-nqqg6" (OuterVolumeSpecName: "kube-api-access-nqqg6") pod "3122264d-c697-4bd6-92fd-fa3f7076f5b4" (UID: "3122264d-c697-4bd6-92fd-fa3f7076f5b4"). 
InnerVolumeSpecName "kube-api-access-nqqg6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:32:27.557773 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.557743 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3122264d-c697-4bd6-92fd-fa3f7076f5b4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "3122264d-c697-4bd6-92fd-fa3f7076f5b4" (UID: "3122264d-c697-4bd6-92fd-fa3f7076f5b4"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:32:27.558188 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.558168 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3122264d-c697-4bd6-92fd-fa3f7076f5b4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "3122264d-c697-4bd6-92fd-fa3f7076f5b4" (UID: "3122264d-c697-4bd6-92fd-fa3f7076f5b4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:32:27.651545 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.651514 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3122264d-c697-4bd6-92fd-fa3f7076f5b4-service-ca\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:32:27.651545 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.651544 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3122264d-c697-4bd6-92fd-fa3f7076f5b4-console-config\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:32:27.651727 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.651554 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3122264d-c697-4bd6-92fd-fa3f7076f5b4-console-oauth-config\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:32:27.651727 
ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.651566 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3122264d-c697-4bd6-92fd-fa3f7076f5b4-console-serving-cert\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:32:27.651727 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.651575 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3122264d-c697-4bd6-92fd-fa3f7076f5b4-trusted-ca-bundle\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:32:27.651727 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.651583 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3122264d-c697-4bd6-92fd-fa3f7076f5b4-oauth-serving-cert\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:32:27.651727 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.651592 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nqqg6\" (UniqueName: \"kubernetes.io/projected/3122264d-c697-4bd6-92fd-fa3f7076f5b4-kube-api-access-nqqg6\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:32:27.760180 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.760152 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-5mvbh" Apr 17 17:32:27.795994 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.795967 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-68b44597bb-6s4j7_3122264d-c697-4bd6-92fd-fa3f7076f5b4/console/0.log" Apr 17 17:32:27.796152 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.796008 2575 generic.go:358] "Generic (PLEG): container finished" podID="3122264d-c697-4bd6-92fd-fa3f7076f5b4" containerID="56e30b98e06df2e0315b69c097fbe43bbac035cfda9cdef8d7890e73857361aa" exitCode=2 Apr 17 17:32:27.796152 
ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.796089 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68b44597bb-6s4j7" event={"ID":"3122264d-c697-4bd6-92fd-fa3f7076f5b4","Type":"ContainerDied","Data":"56e30b98e06df2e0315b69c097fbe43bbac035cfda9cdef8d7890e73857361aa"} Apr 17 17:32:27.796152 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.796112 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-68b44597bb-6s4j7" Apr 17 17:32:27.796152 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.796125 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-68b44597bb-6s4j7" event={"ID":"3122264d-c697-4bd6-92fd-fa3f7076f5b4","Type":"ContainerDied","Data":"70be57e0f3eeb8f497f93ea5c8176aa4e01a7b08df1906c7a25006f5c4d30c06"} Apr 17 17:32:27.796152 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.796145 2575 scope.go:117] "RemoveContainer" containerID="56e30b98e06df2e0315b69c097fbe43bbac035cfda9cdef8d7890e73857361aa" Apr 17 17:32:27.805597 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.805574 2575 scope.go:117] "RemoveContainer" containerID="56e30b98e06df2e0315b69c097fbe43bbac035cfda9cdef8d7890e73857361aa" Apr 17 17:32:27.805947 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:32:27.805918 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56e30b98e06df2e0315b69c097fbe43bbac035cfda9cdef8d7890e73857361aa\": container with ID starting with 56e30b98e06df2e0315b69c097fbe43bbac035cfda9cdef8d7890e73857361aa not found: ID does not exist" containerID="56e30b98e06df2e0315b69c097fbe43bbac035cfda9cdef8d7890e73857361aa" Apr 17 17:32:27.806053 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.805957 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56e30b98e06df2e0315b69c097fbe43bbac035cfda9cdef8d7890e73857361aa"} err="failed to 
get container status \"56e30b98e06df2e0315b69c097fbe43bbac035cfda9cdef8d7890e73857361aa\": rpc error: code = NotFound desc = could not find container \"56e30b98e06df2e0315b69c097fbe43bbac035cfda9cdef8d7890e73857361aa\": container with ID starting with 56e30b98e06df2e0315b69c097fbe43bbac035cfda9cdef8d7890e73857361aa not found: ID does not exist" Apr 17 17:32:27.822215 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.822190 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-68b44597bb-6s4j7"] Apr 17 17:32:27.829453 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:27.829432 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-68b44597bb-6s4j7"] Apr 17 17:32:29.094298 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:29.094268 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3122264d-c697-4bd6-92fd-fa3f7076f5b4" path="/var/lib/kubelet/pods/3122264d-c697-4bd6-92fd-fa3f7076f5b4/volumes" Apr 17 17:32:30.773657 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:30.773623 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-cg49l" Apr 17 17:32:51.266974 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.266939 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286"] Apr 17 17:32:51.267395 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.267377 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3122264d-c697-4bd6-92fd-fa3f7076f5b4" containerName="console" Apr 17 17:32:51.267442 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.267399 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3122264d-c697-4bd6-92fd-fa3f7076f5b4" containerName="console" Apr 17 17:32:51.267485 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.267463 2575 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="3122264d-c697-4bd6-92fd-fa3f7076f5b4" containerName="console" Apr 17 17:32:51.275006 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.274941 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" Apr 17 17:32:51.278282 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.278039 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 17:32:51.278282 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.278090 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-fbgtt\"" Apr 17 17:32:51.278282 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.278087 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-1-predictor-serving-cert\"" Apr 17 17:32:51.278531 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.278360 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 17 17:32:51.278531 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.278388 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\"" Apr 17 17:32:51.280421 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.280398 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286"] Apr 17 17:32:51.336213 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.336179 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zphll\" (UniqueName: \"kubernetes.io/projected/5c934c46-580b-4625-b33b-262e329e8b7a-kube-api-access-zphll\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-97286\" (UID: 
\"5c934c46-580b-4625-b33b-262e329e8b7a\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" Apr 17 17:32:51.336364 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.336257 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c934c46-580b-4625-b33b-262e329e8b7a-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-97286\" (UID: \"5c934c46-580b-4625-b33b-262e329e8b7a\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" Apr 17 17:32:51.336364 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.336333 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5c934c46-580b-4625-b33b-262e329e8b7a-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-97286\" (UID: \"5c934c46-580b-4625-b33b-262e329e8b7a\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" Apr 17 17:32:51.336467 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.336373 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5c934c46-580b-4625-b33b-262e329e8b7a-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-97286\" (UID: \"5c934c46-580b-4625-b33b-262e329e8b7a\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" Apr 17 17:32:51.436975 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.436942 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zphll\" (UniqueName: \"kubernetes.io/projected/5c934c46-580b-4625-b33b-262e329e8b7a-kube-api-access-zphll\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-97286\" (UID: 
\"5c934c46-580b-4625-b33b-262e329e8b7a\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" Apr 17 17:32:51.437151 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.437004 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c934c46-580b-4625-b33b-262e329e8b7a-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-97286\" (UID: \"5c934c46-580b-4625-b33b-262e329e8b7a\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" Apr 17 17:32:51.437151 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.437063 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5c934c46-580b-4625-b33b-262e329e8b7a-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-97286\" (UID: \"5c934c46-580b-4625-b33b-262e329e8b7a\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" Apr 17 17:32:51.437151 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.437097 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5c934c46-580b-4625-b33b-262e329e8b7a-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-97286\" (UID: \"5c934c46-580b-4625-b33b-262e329e8b7a\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" Apr 17 17:32:51.437498 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.437476 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c934c46-580b-4625-b33b-262e329e8b7a-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-97286\" (UID: \"5c934c46-580b-4625-b33b-262e329e8b7a\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" Apr 17 17:32:51.438030 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.438004 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5c934c46-580b-4625-b33b-262e329e8b7a-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-97286\" (UID: \"5c934c46-580b-4625-b33b-262e329e8b7a\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" Apr 17 17:32:51.439797 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.439773 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5c934c46-580b-4625-b33b-262e329e8b7a-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-97286\" (UID: \"5c934c46-580b-4625-b33b-262e329e8b7a\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" Apr 17 17:32:51.447571 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.447548 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zphll\" (UniqueName: \"kubernetes.io/projected/5c934c46-580b-4625-b33b-262e329e8b7a-kube-api-access-zphll\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-97286\" (UID: \"5c934c46-580b-4625-b33b-262e329e8b7a\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" Apr 17 17:32:51.462137 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.462115 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw"] Apr 17 17:32:51.465797 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.465782 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw" Apr 17 17:32:51.467912 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.467888 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-235f3-predictor-serving-cert\"" Apr 17 17:32:51.468072 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.468055 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-235f3-kube-rbac-proxy-sar-config\"" Apr 17 17:32:51.476332 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.476311 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw"] Apr 17 17:32:51.538228 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.538146 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-235f3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1759c02c-89ff-4873-a196-cd59e3cc1f1c-success-200-isvc-235f3-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-235f3-predictor-5d88b4c684-j25qw\" (UID: \"1759c02c-89ff-4873-a196-cd59e3cc1f1c\") " pod="kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw" Apr 17 17:32:51.538228 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.538188 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1759c02c-89ff-4873-a196-cd59e3cc1f1c-proxy-tls\") pod \"success-200-isvc-235f3-predictor-5d88b4c684-j25qw\" (UID: \"1759c02c-89ff-4873-a196-cd59e3cc1f1c\") " pod="kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw" Apr 17 17:32:51.538419 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.538317 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-8rkdw\" (UniqueName: \"kubernetes.io/projected/1759c02c-89ff-4873-a196-cd59e3cc1f1c-kube-api-access-8rkdw\") pod \"success-200-isvc-235f3-predictor-5d88b4c684-j25qw\" (UID: \"1759c02c-89ff-4873-a196-cd59e3cc1f1c\") " pod="kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw" Apr 17 17:32:51.587091 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.587066 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" Apr 17 17:32:51.639535 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.639438 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rkdw\" (UniqueName: \"kubernetes.io/projected/1759c02c-89ff-4873-a196-cd59e3cc1f1c-kube-api-access-8rkdw\") pod \"success-200-isvc-235f3-predictor-5d88b4c684-j25qw\" (UID: \"1759c02c-89ff-4873-a196-cd59e3cc1f1c\") " pod="kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw" Apr 17 17:32:51.639535 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.639527 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-235f3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1759c02c-89ff-4873-a196-cd59e3cc1f1c-success-200-isvc-235f3-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-235f3-predictor-5d88b4c684-j25qw\" (UID: \"1759c02c-89ff-4873-a196-cd59e3cc1f1c\") " pod="kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw" Apr 17 17:32:51.639711 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.639569 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1759c02c-89ff-4873-a196-cd59e3cc1f1c-proxy-tls\") pod \"success-200-isvc-235f3-predictor-5d88b4c684-j25qw\" (UID: \"1759c02c-89ff-4873-a196-cd59e3cc1f1c\") " pod="kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw" Apr 17 
17:32:51.641210 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.641160 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-235f3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1759c02c-89ff-4873-a196-cd59e3cc1f1c-success-200-isvc-235f3-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-235f3-predictor-5d88b4c684-j25qw\" (UID: \"1759c02c-89ff-4873-a196-cd59e3cc1f1c\") " pod="kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw" Apr 17 17:32:51.654867 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.647342 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1759c02c-89ff-4873-a196-cd59e3cc1f1c-proxy-tls\") pod \"success-200-isvc-235f3-predictor-5d88b4c684-j25qw\" (UID: \"1759c02c-89ff-4873-a196-cd59e3cc1f1c\") " pod="kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw" Apr 17 17:32:51.654867 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.651785 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rkdw\" (UniqueName: \"kubernetes.io/projected/1759c02c-89ff-4873-a196-cd59e3cc1f1c-kube-api-access-8rkdw\") pod \"success-200-isvc-235f3-predictor-5d88b4c684-j25qw\" (UID: \"1759c02c-89ff-4873-a196-cd59e3cc1f1c\") " pod="kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw" Apr 17 17:32:51.715377 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.715342 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286"] Apr 17 17:32:51.718389 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:32:51.718364 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c934c46_580b_4625_b33b_262e329e8b7a.slice/crio-c7d6c2fabeb20a963741d97494671e3cd32c5c1079cc842e211c8598305c9518 WatchSource:0}: Error finding container 
c7d6c2fabeb20a963741d97494671e3cd32c5c1079cc842e211c8598305c9518: Status 404 returned error can't find the container with id c7d6c2fabeb20a963741d97494671e3cd32c5c1079cc842e211c8598305c9518 Apr 17 17:32:51.775889 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.775862 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw" Apr 17 17:32:51.883450 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.883407 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" event={"ID":"5c934c46-580b-4625-b33b-262e329e8b7a","Type":"ContainerStarted","Data":"c7d6c2fabeb20a963741d97494671e3cd32c5c1079cc842e211c8598305c9518"} Apr 17 17:32:51.901581 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:51.901558 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw"] Apr 17 17:32:51.903906 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:32:51.903879 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1759c02c_89ff_4873_a196_cd59e3cc1f1c.slice/crio-8dbc59a3693ef3d55ed39924e3c740833a399fcb9b551864b7a5c8318283fd87 WatchSource:0}: Error finding container 8dbc59a3693ef3d55ed39924e3c740833a399fcb9b551864b7a5c8318283fd87: Status 404 returned error can't find the container with id 8dbc59a3693ef3d55ed39924e3c740833a399fcb9b551864b7a5c8318283fd87 Apr 17 17:32:52.893670 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:32:52.893596 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw" event={"ID":"1759c02c-89ff-4873-a196-cd59e3cc1f1c","Type":"ContainerStarted","Data":"8dbc59a3693ef3d55ed39924e3c740833a399fcb9b551864b7a5c8318283fd87"} Apr 17 17:33:02.167877 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:33:02.167788 
2575 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: copying system image from manifest list: parsing image configuration: fetching blob: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" image="quay.io/opendatahub/kserve-storage-initializer@sha256:4cb151dc63436df7d13d4e5ee0cc2d7fdd1ff90cc80417c51a70ddeb283661c4" Apr 17 17:33:02.168376 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:33:02.168010 2575 kuberuntime_manager.go:1358] "Unhandled Error" err="init container &Container{Name:storage-initializer,Image:quay.io/opendatahub/kserve-storage-initializer@sha256:4cb151dc63436df7d13d4e5ee0cc2d7fdd1ff90cc80417c51a70ddeb283661c4,Command:[],Args:[gs://kfserving-examples/models/sklearn/1.0/model /mnt/models],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:HF_HUB_ENABLE_HF_TRANSFER,Value:1,ValueFrom:nil,},EnvVar{Name:HF_XET_HIGH_PERFORMANCE,Value:1,ValueFrom:nil,},EnvVar{Name:HF_XET_NUM_CONCURRENT_RANGE_GETS,Value:8,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{1 0} {} 1 DecimalSI},memory: {{25769803776 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{104857600 0} {} 100Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kserve-provision-location,ReadOnly:false,MountPath:/mnt/models,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zphll,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod isvc-sklearn-graph-1-predictor-5b497dcd98-97286_kserve-ci-e2e-test(5c934c46-580b-4625-b33b-262e329e8b7a): ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: parsing image configuration: fetching blob: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image" logger="UnhandledError" Apr 17 17:33:02.169204 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:33:02.169165 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: copying system image from manifest list: parsing image configuration: fetching blob: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" 
podUID="5c934c46-580b-4625-b33b-262e329e8b7a" Apr 17 17:33:02.934201 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:33:02.934168 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/opendatahub/kserve-storage-initializer@sha256:4cb151dc63436df7d13d4e5ee0cc2d7fdd1ff90cc80417c51a70ddeb283661c4\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: parsing image configuration: fetching blob: received unexpected HTTP status: 504 Gateway Time-out; artifact err: provided artifact is a container image\"" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" podUID="5c934c46-580b-4625-b33b-262e329e8b7a" Apr 17 17:33:03.944562 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:33:03.944516 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw" event={"ID":"1759c02c-89ff-4873-a196-cd59e3cc1f1c","Type":"ContainerStarted","Data":"64940f49adfaa7366b9b57b680f4843de53317fc3fde05ddfe5697da6c73e3b5"} Apr 17 17:33:06.956639 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:33:06.956596 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw" event={"ID":"1759c02c-89ff-4873-a196-cd59e3cc1f1c","Type":"ContainerStarted","Data":"70380340d325f10ba73407111374e8ad12348b5ab20d2320002ca0858c17f654"} Apr 17 17:33:06.957089 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:33:06.956836 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw" Apr 17 17:33:06.957089 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:33:06.956867 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw" 
Apr 17 17:33:06.958198 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:33:06.958173 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw" podUID="1759c02c-89ff-4873-a196-cd59e3cc1f1c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 17 17:33:06.975521 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:33:06.975472 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw" podStartSLOduration=1.798039673 podStartE2EDuration="15.975457904s" podCreationTimestamp="2026-04-17 17:32:51 +0000 UTC" firstStartedPulling="2026-04-17 17:32:51.905500251 +0000 UTC m=+541.431180848" lastFinishedPulling="2026-04-17 17:33:06.08291847 +0000 UTC m=+555.608599079" observedRunningTime="2026-04-17 17:33:06.974406055 +0000 UTC m=+556.500086673" watchObservedRunningTime="2026-04-17 17:33:06.975457904 +0000 UTC m=+556.501138523" Apr 17 17:33:07.959805 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:33:07.959767 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw" podUID="1759c02c-89ff-4873-a196-cd59e3cc1f1c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 17 17:33:12.964350 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:33:12.964322 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw" Apr 17 17:33:12.964816 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:33:12.964789 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw" podUID="1759c02c-89ff-4873-a196-cd59e3cc1f1c" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.132.0.31:8080: connect: connection refused" Apr 17 17:33:20.005550 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:33:20.005518 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" event={"ID":"5c934c46-580b-4625-b33b-262e329e8b7a","Type":"ContainerStarted","Data":"fadeed1a6af8b8ed10e31a9fe859cf08fc846e7e05f4c0f58707a8a579b1f989"} Apr 17 17:33:22.965535 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:33:22.965491 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw" podUID="1759c02c-89ff-4873-a196-cd59e3cc1f1c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 17 17:33:24.020101 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:33:24.020067 2575 generic.go:358] "Generic (PLEG): container finished" podID="5c934c46-580b-4625-b33b-262e329e8b7a" containerID="fadeed1a6af8b8ed10e31a9fe859cf08fc846e7e05f4c0f58707a8a579b1f989" exitCode=0 Apr 17 17:33:24.020432 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:33:24.020117 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" event={"ID":"5c934c46-580b-4625-b33b-262e329e8b7a","Type":"ContainerDied","Data":"fadeed1a6af8b8ed10e31a9fe859cf08fc846e7e05f4c0f58707a8a579b1f989"} Apr 17 17:33:31.047632 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:33:31.047595 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" event={"ID":"5c934c46-580b-4625-b33b-262e329e8b7a","Type":"ContainerStarted","Data":"c71b52cc2b5be0600f94d5a62f959fce1885bbca4aff325e8b9da40d93560b34"} Apr 17 17:33:31.048154 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:33:31.047645 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" 
event={"ID":"5c934c46-580b-4625-b33b-262e329e8b7a","Type":"ContainerStarted","Data":"5588eae34f0940508395705b2acb26c066d69baecfa8c031f49426e54507f99c"} Apr 17 17:33:31.048154 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:33:31.048016 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" Apr 17 17:33:31.048154 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:33:31.048145 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" Apr 17 17:33:31.049438 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:33:31.049410 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" podUID="5c934c46-580b-4625-b33b-262e329e8b7a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 17 17:33:31.085690 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:33:31.085651 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" podStartSLOduration=1.374159986 podStartE2EDuration="40.085635511s" podCreationTimestamp="2026-04-17 17:32:51 +0000 UTC" firstStartedPulling="2026-04-17 17:32:51.720204021 +0000 UTC m=+541.245884623" lastFinishedPulling="2026-04-17 17:33:30.431679548 +0000 UTC m=+579.957360148" observedRunningTime="2026-04-17 17:33:31.083895976 +0000 UTC m=+580.609576610" watchObservedRunningTime="2026-04-17 17:33:31.085635511 +0000 UTC m=+580.611316130" Apr 17 17:33:32.051354 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:33:32.051316 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" podUID="5c934c46-580b-4625-b33b-262e329e8b7a" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.132.0.30:8080: connect: connection refused" Apr 17 17:33:32.965365 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:33:32.965322 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw" podUID="1759c02c-89ff-4873-a196-cd59e3cc1f1c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 17 17:33:37.056790 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:33:37.056758 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" Apr 17 17:33:37.057296 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:33:37.057268 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" podUID="5c934c46-580b-4625-b33b-262e329e8b7a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 17 17:33:42.965818 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:33:42.965781 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw" podUID="1759c02c-89ff-4873-a196-cd59e3cc1f1c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 17 17:33:47.057351 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:33:47.057314 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" podUID="5c934c46-580b-4625-b33b-262e329e8b7a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 17 17:33:51.010908 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:33:51.010877 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctj42_a9e0a184-477b-45d6-835a-1606a973a5cf/console-operator/1.log" Apr 17 17:33:51.011345 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:33:51.011045 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctj42_a9e0a184-477b-45d6-835a-1606a973a5cf/console-operator/1.log" Apr 17 17:33:52.965958 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:33:52.965931 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw" Apr 17 17:33:57.057965 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:33:57.057925 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" podUID="5c934c46-580b-4625-b33b-262e329e8b7a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 17 17:34:07.057753 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:07.057671 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" podUID="5c934c46-580b-4625-b33b-262e329e8b7a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 17 17:34:11.254129 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:11.254092 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-235f3-669999df9c-fr57z"] Apr 17 17:34:11.257206 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:11.257188 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-235f3-669999df9c-fr57z" Apr 17 17:34:11.259381 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:11.259360 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-235f3-kube-rbac-proxy-sar-config\"" Apr 17 17:34:11.259461 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:11.259369 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-235f3-serving-cert\"" Apr 17 17:34:11.264627 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:11.264422 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-235f3-669999df9c-fr57z"] Apr 17 17:34:11.411854 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:11.411811 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e998f7a-f6a9-4c7b-86fc-62266b83449b-openshift-service-ca-bundle\") pod \"switch-graph-235f3-669999df9c-fr57z\" (UID: \"5e998f7a-f6a9-4c7b-86fc-62266b83449b\") " pod="kserve-ci-e2e-test/switch-graph-235f3-669999df9c-fr57z" Apr 17 17:34:11.412024 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:11.411896 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e998f7a-f6a9-4c7b-86fc-62266b83449b-proxy-tls\") pod \"switch-graph-235f3-669999df9c-fr57z\" (UID: \"5e998f7a-f6a9-4c7b-86fc-62266b83449b\") " pod="kserve-ci-e2e-test/switch-graph-235f3-669999df9c-fr57z" Apr 17 17:34:11.512430 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:11.512356 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e998f7a-f6a9-4c7b-86fc-62266b83449b-openshift-service-ca-bundle\") pod \"switch-graph-235f3-669999df9c-fr57z\" 
(UID: \"5e998f7a-f6a9-4c7b-86fc-62266b83449b\") " pod="kserve-ci-e2e-test/switch-graph-235f3-669999df9c-fr57z" Apr 17 17:34:11.512430 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:11.512416 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e998f7a-f6a9-4c7b-86fc-62266b83449b-proxy-tls\") pod \"switch-graph-235f3-669999df9c-fr57z\" (UID: \"5e998f7a-f6a9-4c7b-86fc-62266b83449b\") " pod="kserve-ci-e2e-test/switch-graph-235f3-669999df9c-fr57z" Apr 17 17:34:11.512997 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:11.512972 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e998f7a-f6a9-4c7b-86fc-62266b83449b-openshift-service-ca-bundle\") pod \"switch-graph-235f3-669999df9c-fr57z\" (UID: \"5e998f7a-f6a9-4c7b-86fc-62266b83449b\") " pod="kserve-ci-e2e-test/switch-graph-235f3-669999df9c-fr57z" Apr 17 17:34:11.514671 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:11.514647 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e998f7a-f6a9-4c7b-86fc-62266b83449b-proxy-tls\") pod \"switch-graph-235f3-669999df9c-fr57z\" (UID: \"5e998f7a-f6a9-4c7b-86fc-62266b83449b\") " pod="kserve-ci-e2e-test/switch-graph-235f3-669999df9c-fr57z" Apr 17 17:34:11.568758 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:11.568732 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-235f3-669999df9c-fr57z" Apr 17 17:34:11.686255 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:11.686224 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-235f3-669999df9c-fr57z"] Apr 17 17:34:11.689179 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:34:11.689152 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e998f7a_f6a9_4c7b_86fc_62266b83449b.slice/crio-b542e8307a43387c2903e0237f137d88f94189f35603d7384b74a955a291a2f6 WatchSource:0}: Error finding container b542e8307a43387c2903e0237f137d88f94189f35603d7384b74a955a291a2f6: Status 404 returned error can't find the container with id b542e8307a43387c2903e0237f137d88f94189f35603d7384b74a955a291a2f6 Apr 17 17:34:12.180232 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:12.180195 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-235f3-669999df9c-fr57z" event={"ID":"5e998f7a-f6a9-4c7b-86fc-62266b83449b","Type":"ContainerStarted","Data":"b542e8307a43387c2903e0237f137d88f94189f35603d7384b74a955a291a2f6"} Apr 17 17:34:15.190597 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:15.190558 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-235f3-669999df9c-fr57z" event={"ID":"5e998f7a-f6a9-4c7b-86fc-62266b83449b","Type":"ContainerStarted","Data":"a909f2004dd4ea04166ae3eba4133edb287b861362310563d676242bbc51a37a"} Apr 17 17:34:15.191006 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:15.190741 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-235f3-669999df9c-fr57z" Apr 17 17:34:15.206775 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:15.206728 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-235f3-669999df9c-fr57z" 
podStartSLOduration=1.408482593 podStartE2EDuration="4.206716264s" podCreationTimestamp="2026-04-17 17:34:11 +0000 UTC" firstStartedPulling="2026-04-17 17:34:11.690944345 +0000 UTC m=+621.216624942" lastFinishedPulling="2026-04-17 17:34:14.489178013 +0000 UTC m=+624.014858613" observedRunningTime="2026-04-17 17:34:15.205896776 +0000 UTC m=+624.731577394" watchObservedRunningTime="2026-04-17 17:34:15.206716264 +0000 UTC m=+624.732396884" Apr 17 17:34:17.057302 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:17.057265 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" podUID="5c934c46-580b-4625-b33b-262e329e8b7a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 17 17:34:21.199881 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:21.199849 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-235f3-669999df9c-fr57z" Apr 17 17:34:21.461660 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:21.461579 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-235f3-669999df9c-fr57z"] Apr 17 17:34:21.461872 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:21.461841 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-235f3-669999df9c-fr57z" podUID="5e998f7a-f6a9-4c7b-86fc-62266b83449b" containerName="switch-graph-235f3" containerID="cri-o://a909f2004dd4ea04166ae3eba4133edb287b861362310563d676242bbc51a37a" gracePeriod=30 Apr 17 17:34:21.570001 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:21.569961 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw"] Apr 17 17:34:21.570323 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:21.570292 2575 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw" podUID="1759c02c-89ff-4873-a196-cd59e3cc1f1c" containerName="kserve-container" containerID="cri-o://64940f49adfaa7366b9b57b680f4843de53317fc3fde05ddfe5697da6c73e3b5" gracePeriod=30 Apr 17 17:34:21.570649 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:21.570537 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw" podUID="1759c02c-89ff-4873-a196-cd59e3cc1f1c" containerName="kube-rbac-proxy" containerID="cri-o://70380340d325f10ba73407111374e8ad12348b5ab20d2320002ca0858c17f654" gracePeriod=30 Apr 17 17:34:21.626923 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:21.626894 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm"] Apr 17 17:34:21.630582 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:21.630543 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" Apr 17 17:34:21.632681 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:21.632660 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-d987b-kube-rbac-proxy-sar-config\"" Apr 17 17:34:21.632769 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:21.632662 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-d987b-predictor-serving-cert\"" Apr 17 17:34:21.639930 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:21.639906 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm"] Apr 17 17:34:21.691771 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:21.691740 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7501e586-6f33-437c-a1de-8e37d3a78b56-proxy-tls\") pod \"success-200-isvc-d987b-predictor-7847c5c48b-q65fm\" (UID: \"7501e586-6f33-437c-a1de-8e37d3a78b56\") " pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" Apr 17 17:34:21.691942 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:21.691779 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bhlr\" (UniqueName: \"kubernetes.io/projected/7501e586-6f33-437c-a1de-8e37d3a78b56-kube-api-access-8bhlr\") pod \"success-200-isvc-d987b-predictor-7847c5c48b-q65fm\" (UID: \"7501e586-6f33-437c-a1de-8e37d3a78b56\") " pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" Apr 17 17:34:21.691942 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:21.691842 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-d987b-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/7501e586-6f33-437c-a1de-8e37d3a78b56-success-200-isvc-d987b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-d987b-predictor-7847c5c48b-q65fm\" (UID: \"7501e586-6f33-437c-a1de-8e37d3a78b56\") " pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" Apr 17 17:34:21.792446 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:21.792406 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7501e586-6f33-437c-a1de-8e37d3a78b56-proxy-tls\") pod \"success-200-isvc-d987b-predictor-7847c5c48b-q65fm\" (UID: \"7501e586-6f33-437c-a1de-8e37d3a78b56\") " pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" Apr 17 17:34:21.792446 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:21.792449 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8bhlr\" (UniqueName: \"kubernetes.io/projected/7501e586-6f33-437c-a1de-8e37d3a78b56-kube-api-access-8bhlr\") pod \"success-200-isvc-d987b-predictor-7847c5c48b-q65fm\" (UID: \"7501e586-6f33-437c-a1de-8e37d3a78b56\") " pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" Apr 17 17:34:21.792695 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:34:21.792500 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-d987b-predictor-serving-cert: secret "success-200-isvc-d987b-predictor-serving-cert" not found Apr 17 17:34:21.792695 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:21.792503 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-d987b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7501e586-6f33-437c-a1de-8e37d3a78b56-success-200-isvc-d987b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-d987b-predictor-7847c5c48b-q65fm\" (UID: \"7501e586-6f33-437c-a1de-8e37d3a78b56\") " pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" Apr 
17 17:34:21.792695 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:34:21.792562 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7501e586-6f33-437c-a1de-8e37d3a78b56-proxy-tls podName:7501e586-6f33-437c-a1de-8e37d3a78b56 nodeName:}" failed. No retries permitted until 2026-04-17 17:34:22.292542858 +0000 UTC m=+631.818223460 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/7501e586-6f33-437c-a1de-8e37d3a78b56-proxy-tls") pod "success-200-isvc-d987b-predictor-7847c5c48b-q65fm" (UID: "7501e586-6f33-437c-a1de-8e37d3a78b56") : secret "success-200-isvc-d987b-predictor-serving-cert" not found Apr 17 17:34:21.793173 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:21.793150 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-d987b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7501e586-6f33-437c-a1de-8e37d3a78b56-success-200-isvc-d987b-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-d987b-predictor-7847c5c48b-q65fm\" (UID: \"7501e586-6f33-437c-a1de-8e37d3a78b56\") " pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" Apr 17 17:34:21.801331 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:21.801311 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bhlr\" (UniqueName: \"kubernetes.io/projected/7501e586-6f33-437c-a1de-8e37d3a78b56-kube-api-access-8bhlr\") pod \"success-200-isvc-d987b-predictor-7847c5c48b-q65fm\" (UID: \"7501e586-6f33-437c-a1de-8e37d3a78b56\") " pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" Apr 17 17:34:22.214606 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:22.214522 2575 generic.go:358] "Generic (PLEG): container finished" podID="1759c02c-89ff-4873-a196-cd59e3cc1f1c" containerID="70380340d325f10ba73407111374e8ad12348b5ab20d2320002ca0858c17f654" exitCode=2 Apr 17 17:34:22.214606 ip-10-0-140-33 
kubenswrapper[2575]: I0417 17:34:22.214566 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw" event={"ID":"1759c02c-89ff-4873-a196-cd59e3cc1f1c","Type":"ContainerDied","Data":"70380340d325f10ba73407111374e8ad12348b5ab20d2320002ca0858c17f654"} Apr 17 17:34:22.297149 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:22.297108 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7501e586-6f33-437c-a1de-8e37d3a78b56-proxy-tls\") pod \"success-200-isvc-d987b-predictor-7847c5c48b-q65fm\" (UID: \"7501e586-6f33-437c-a1de-8e37d3a78b56\") " pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" Apr 17 17:34:22.299540 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:22.299519 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7501e586-6f33-437c-a1de-8e37d3a78b56-proxy-tls\") pod \"success-200-isvc-d987b-predictor-7847c5c48b-q65fm\" (UID: \"7501e586-6f33-437c-a1de-8e37d3a78b56\") " pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" Apr 17 17:34:22.543362 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:22.543324 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" Apr 17 17:34:22.667882 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:22.667858 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm"] Apr 17 17:34:22.669909 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:34:22.669880 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7501e586_6f33_437c_a1de_8e37d3a78b56.slice/crio-df8196e17efd25a609bde4934a64c114309635bbd455907ffb0f4ec1bffe95e9 WatchSource:0}: Error finding container df8196e17efd25a609bde4934a64c114309635bbd455907ffb0f4ec1bffe95e9: Status 404 returned error can't find the container with id df8196e17efd25a609bde4934a64c114309635bbd455907ffb0f4ec1bffe95e9 Apr 17 17:34:22.960814 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:22.960765 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw" podUID="1759c02c-89ff-4873-a196-cd59e3cc1f1c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.31:8643/healthz\": dial tcp 10.132.0.31:8643: connect: connection refused" Apr 17 17:34:22.965359 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:22.965332 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw" podUID="1759c02c-89ff-4873-a196-cd59e3cc1f1c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 17 17:34:23.219261 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:23.219173 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" 
event={"ID":"7501e586-6f33-437c-a1de-8e37d3a78b56","Type":"ContainerStarted","Data":"4f6eb69aa495e81884721d42d61bf9d772323c7e038303abfed0be45d09ab334"} Apr 17 17:34:23.219261 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:23.219215 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" event={"ID":"7501e586-6f33-437c-a1de-8e37d3a78b56","Type":"ContainerStarted","Data":"dfbc79a1238808a1602a551bf6d823d6b6ead410232523c78946edb08d36c34f"} Apr 17 17:34:23.219261 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:23.219225 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" event={"ID":"7501e586-6f33-437c-a1de-8e37d3a78b56","Type":"ContainerStarted","Data":"df8196e17efd25a609bde4934a64c114309635bbd455907ffb0f4ec1bffe95e9"} Apr 17 17:34:23.219760 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:23.219327 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" Apr 17 17:34:23.219760 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:23.219358 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" Apr 17 17:34:23.220479 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:23.220453 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" podUID="7501e586-6f33-437c-a1de-8e37d3a78b56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 17 17:34:23.237202 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:23.237165 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" podStartSLOduration=2.237151799 
podStartE2EDuration="2.237151799s" podCreationTimestamp="2026-04-17 17:34:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:34:23.234795647 +0000 UTC m=+632.760476268" watchObservedRunningTime="2026-04-17 17:34:23.237151799 +0000 UTC m=+632.762832417" Apr 17 17:34:24.223404 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:24.223363 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" podUID="7501e586-6f33-437c-a1de-8e37d3a78b56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 17 17:34:24.619713 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:24.619689 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw" Apr 17 17:34:24.713438 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:24.713406 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1759c02c-89ff-4873-a196-cd59e3cc1f1c-proxy-tls\") pod \"1759c02c-89ff-4873-a196-cd59e3cc1f1c\" (UID: \"1759c02c-89ff-4873-a196-cd59e3cc1f1c\") " Apr 17 17:34:24.713601 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:24.713466 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-235f3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1759c02c-89ff-4873-a196-cd59e3cc1f1c-success-200-isvc-235f3-kube-rbac-proxy-sar-config\") pod \"1759c02c-89ff-4873-a196-cd59e3cc1f1c\" (UID: \"1759c02c-89ff-4873-a196-cd59e3cc1f1c\") " Apr 17 17:34:24.713601 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:24.713511 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rkdw\" (UniqueName: 
\"kubernetes.io/projected/1759c02c-89ff-4873-a196-cd59e3cc1f1c-kube-api-access-8rkdw\") pod \"1759c02c-89ff-4873-a196-cd59e3cc1f1c\" (UID: \"1759c02c-89ff-4873-a196-cd59e3cc1f1c\") " Apr 17 17:34:24.713890 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:24.713857 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1759c02c-89ff-4873-a196-cd59e3cc1f1c-success-200-isvc-235f3-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-235f3-kube-rbac-proxy-sar-config") pod "1759c02c-89ff-4873-a196-cd59e3cc1f1c" (UID: "1759c02c-89ff-4873-a196-cd59e3cc1f1c"). InnerVolumeSpecName "success-200-isvc-235f3-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:34:24.715724 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:24.715694 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1759c02c-89ff-4873-a196-cd59e3cc1f1c-kube-api-access-8rkdw" (OuterVolumeSpecName: "kube-api-access-8rkdw") pod "1759c02c-89ff-4873-a196-cd59e3cc1f1c" (UID: "1759c02c-89ff-4873-a196-cd59e3cc1f1c"). InnerVolumeSpecName "kube-api-access-8rkdw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:34:24.715724 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:24.715709 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1759c02c-89ff-4873-a196-cd59e3cc1f1c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1759c02c-89ff-4873-a196-cd59e3cc1f1c" (UID: "1759c02c-89ff-4873-a196-cd59e3cc1f1c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:34:24.814041 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:24.814013 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1759c02c-89ff-4873-a196-cd59e3cc1f1c-proxy-tls\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:34:24.814041 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:24.814039 2575 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-235f3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1759c02c-89ff-4873-a196-cd59e3cc1f1c-success-200-isvc-235f3-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:34:24.814208 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:24.814049 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8rkdw\" (UniqueName: \"kubernetes.io/projected/1759c02c-89ff-4873-a196-cd59e3cc1f1c-kube-api-access-8rkdw\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:34:25.228463 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:25.228420 2575 generic.go:358] "Generic (PLEG): container finished" podID="1759c02c-89ff-4873-a196-cd59e3cc1f1c" containerID="64940f49adfaa7366b9b57b680f4843de53317fc3fde05ddfe5697da6c73e3b5" exitCode=0 Apr 17 17:34:25.228931 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:25.228543 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw" event={"ID":"1759c02c-89ff-4873-a196-cd59e3cc1f1c","Type":"ContainerDied","Data":"64940f49adfaa7366b9b57b680f4843de53317fc3fde05ddfe5697da6c73e3b5"} Apr 17 17:34:25.228931 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:25.228561 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw" Apr 17 17:34:25.228931 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:25.228584 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw" event={"ID":"1759c02c-89ff-4873-a196-cd59e3cc1f1c","Type":"ContainerDied","Data":"8dbc59a3693ef3d55ed39924e3c740833a399fcb9b551864b7a5c8318283fd87"} Apr 17 17:34:25.228931 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:25.228600 2575 scope.go:117] "RemoveContainer" containerID="70380340d325f10ba73407111374e8ad12348b5ab20d2320002ca0858c17f654" Apr 17 17:34:25.237361 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:25.237346 2575 scope.go:117] "RemoveContainer" containerID="64940f49adfaa7366b9b57b680f4843de53317fc3fde05ddfe5697da6c73e3b5" Apr 17 17:34:25.244713 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:25.244692 2575 scope.go:117] "RemoveContainer" containerID="70380340d325f10ba73407111374e8ad12348b5ab20d2320002ca0858c17f654" Apr 17 17:34:25.245021 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:34:25.244997 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70380340d325f10ba73407111374e8ad12348b5ab20d2320002ca0858c17f654\": container with ID starting with 70380340d325f10ba73407111374e8ad12348b5ab20d2320002ca0858c17f654 not found: ID does not exist" containerID="70380340d325f10ba73407111374e8ad12348b5ab20d2320002ca0858c17f654" Apr 17 17:34:25.245089 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:25.245030 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70380340d325f10ba73407111374e8ad12348b5ab20d2320002ca0858c17f654"} err="failed to get container status \"70380340d325f10ba73407111374e8ad12348b5ab20d2320002ca0858c17f654\": rpc error: code = NotFound desc = could not find container 
\"70380340d325f10ba73407111374e8ad12348b5ab20d2320002ca0858c17f654\": container with ID starting with 70380340d325f10ba73407111374e8ad12348b5ab20d2320002ca0858c17f654 not found: ID does not exist" Apr 17 17:34:25.245089 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:25.245057 2575 scope.go:117] "RemoveContainer" containerID="64940f49adfaa7366b9b57b680f4843de53317fc3fde05ddfe5697da6c73e3b5" Apr 17 17:34:25.245290 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:34:25.245265 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64940f49adfaa7366b9b57b680f4843de53317fc3fde05ddfe5697da6c73e3b5\": container with ID starting with 64940f49adfaa7366b9b57b680f4843de53317fc3fde05ddfe5697da6c73e3b5 not found: ID does not exist" containerID="64940f49adfaa7366b9b57b680f4843de53317fc3fde05ddfe5697da6c73e3b5" Apr 17 17:34:25.245365 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:25.245302 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64940f49adfaa7366b9b57b680f4843de53317fc3fde05ddfe5697da6c73e3b5"} err="failed to get container status \"64940f49adfaa7366b9b57b680f4843de53317fc3fde05ddfe5697da6c73e3b5\": rpc error: code = NotFound desc = could not find container \"64940f49adfaa7366b9b57b680f4843de53317fc3fde05ddfe5697da6c73e3b5\": container with ID starting with 64940f49adfaa7366b9b57b680f4843de53317fc3fde05ddfe5697da6c73e3b5 not found: ID does not exist" Apr 17 17:34:25.246360 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:25.246341 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw"] Apr 17 17:34:25.249712 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:25.249692 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-235f3-predictor-5d88b4c684-j25qw"] Apr 17 17:34:26.197998 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:26.197964 2575 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-235f3-669999df9c-fr57z" podUID="5e998f7a-f6a9-4c7b-86fc-62266b83449b" containerName="switch-graph-235f3" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:34:27.058081 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:27.058041 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" podUID="5c934c46-580b-4625-b33b-262e329e8b7a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 17 17:34:27.095388 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:27.095355 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1759c02c-89ff-4873-a196-cd59e3cc1f1c" path="/var/lib/kubelet/pods/1759c02c-89ff-4873-a196-cd59e3cc1f1c/volumes" Apr 17 17:34:29.227504 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:29.227472 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" Apr 17 17:34:29.228073 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:29.228046 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" podUID="7501e586-6f33-437c-a1de-8e37d3a78b56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 17 17:34:31.198338 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:31.198295 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-235f3-669999df9c-fr57z" podUID="5e998f7a-f6a9-4c7b-86fc-62266b83449b" containerName="switch-graph-235f3" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:34:36.198297 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:36.198259 2575 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-235f3-669999df9c-fr57z" podUID="5e998f7a-f6a9-4c7b-86fc-62266b83449b" containerName="switch-graph-235f3" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:34:36.198670 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:36.198352 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-235f3-669999df9c-fr57z" Apr 17 17:34:37.057977 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:37.057949 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" Apr 17 17:34:39.228166 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:39.228122 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" podUID="7501e586-6f33-437c-a1de-8e37d3a78b56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 17 17:34:41.198282 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:41.198241 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-235f3-669999df9c-fr57z" podUID="5e998f7a-f6a9-4c7b-86fc-62266b83449b" containerName="switch-graph-235f3" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:34:46.197893 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:46.197853 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-235f3-669999df9c-fr57z" podUID="5e998f7a-f6a9-4c7b-86fc-62266b83449b" containerName="switch-graph-235f3" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:34:49.228922 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:49.228885 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" 
podUID="7501e586-6f33-437c-a1de-8e37d3a78b56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 17 17:34:51.198677 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:51.198642 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-235f3-669999df9c-fr57z" podUID="5e998f7a-f6a9-4c7b-86fc-62266b83449b" containerName="switch-graph-235f3" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:34:52.105735 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:52.105715 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-235f3-669999df9c-fr57z" Apr 17 17:34:52.214295 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:52.214268 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e998f7a-f6a9-4c7b-86fc-62266b83449b-openshift-service-ca-bundle\") pod \"5e998f7a-f6a9-4c7b-86fc-62266b83449b\" (UID: \"5e998f7a-f6a9-4c7b-86fc-62266b83449b\") " Apr 17 17:34:52.214743 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:52.214309 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e998f7a-f6a9-4c7b-86fc-62266b83449b-proxy-tls\") pod \"5e998f7a-f6a9-4c7b-86fc-62266b83449b\" (UID: \"5e998f7a-f6a9-4c7b-86fc-62266b83449b\") " Apr 17 17:34:52.214743 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:52.214679 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e998f7a-f6a9-4c7b-86fc-62266b83449b-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "5e998f7a-f6a9-4c7b-86fc-62266b83449b" (UID: "5e998f7a-f6a9-4c7b-86fc-62266b83449b"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:34:52.216284 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:52.216255 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e998f7a-f6a9-4c7b-86fc-62266b83449b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5e998f7a-f6a9-4c7b-86fc-62266b83449b" (UID: "5e998f7a-f6a9-4c7b-86fc-62266b83449b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:34:52.315542 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:52.315510 2575 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e998f7a-f6a9-4c7b-86fc-62266b83449b-openshift-service-ca-bundle\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:34:52.315542 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:52.315538 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e998f7a-f6a9-4c7b-86fc-62266b83449b-proxy-tls\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:34:52.323089 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:52.323061 2575 generic.go:358] "Generic (PLEG): container finished" podID="5e998f7a-f6a9-4c7b-86fc-62266b83449b" containerID="a909f2004dd4ea04166ae3eba4133edb287b861362310563d676242bbc51a37a" exitCode=0 Apr 17 17:34:52.323199 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:52.323127 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-235f3-669999df9c-fr57z" Apr 17 17:34:52.323199 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:52.323147 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-235f3-669999df9c-fr57z" event={"ID":"5e998f7a-f6a9-4c7b-86fc-62266b83449b","Type":"ContainerDied","Data":"a909f2004dd4ea04166ae3eba4133edb287b861362310563d676242bbc51a37a"} Apr 17 17:34:52.323199 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:52.323184 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-235f3-669999df9c-fr57z" event={"ID":"5e998f7a-f6a9-4c7b-86fc-62266b83449b","Type":"ContainerDied","Data":"b542e8307a43387c2903e0237f137d88f94189f35603d7384b74a955a291a2f6"} Apr 17 17:34:52.323304 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:52.323204 2575 scope.go:117] "RemoveContainer" containerID="a909f2004dd4ea04166ae3eba4133edb287b861362310563d676242bbc51a37a" Apr 17 17:34:52.332453 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:52.332433 2575 scope.go:117] "RemoveContainer" containerID="a909f2004dd4ea04166ae3eba4133edb287b861362310563d676242bbc51a37a" Apr 17 17:34:52.332702 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:34:52.332682 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a909f2004dd4ea04166ae3eba4133edb287b861362310563d676242bbc51a37a\": container with ID starting with a909f2004dd4ea04166ae3eba4133edb287b861362310563d676242bbc51a37a not found: ID does not exist" containerID="a909f2004dd4ea04166ae3eba4133edb287b861362310563d676242bbc51a37a" Apr 17 17:34:52.332755 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:52.332709 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a909f2004dd4ea04166ae3eba4133edb287b861362310563d676242bbc51a37a"} err="failed to get container status 
\"a909f2004dd4ea04166ae3eba4133edb287b861362310563d676242bbc51a37a\": rpc error: code = NotFound desc = could not find container \"a909f2004dd4ea04166ae3eba4133edb287b861362310563d676242bbc51a37a\": container with ID starting with a909f2004dd4ea04166ae3eba4133edb287b861362310563d676242bbc51a37a not found: ID does not exist" Apr 17 17:34:52.344529 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:52.344508 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-235f3-669999df9c-fr57z"] Apr 17 17:34:52.347367 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:52.347349 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-235f3-669999df9c-fr57z"] Apr 17 17:34:53.094928 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:53.094894 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e998f7a-f6a9-4c7b-86fc-62266b83449b" path="/var/lib/kubelet/pods/5e998f7a-f6a9-4c7b-86fc-62266b83449b/volumes" Apr 17 17:34:59.228064 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:34:59.228019 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" podUID="7501e586-6f33-437c-a1de-8e37d3a78b56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 17 17:35:01.225587 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:01.225557 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-579966fbd6-9m9hb"] Apr 17 17:35:01.225975 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:01.225947 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1759c02c-89ff-4873-a196-cd59e3cc1f1c" containerName="kube-rbac-proxy" Apr 17 17:35:01.225975 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:01.225959 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1759c02c-89ff-4873-a196-cd59e3cc1f1c" 
containerName="kube-rbac-proxy" Apr 17 17:35:01.225975 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:01.225970 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1759c02c-89ff-4873-a196-cd59e3cc1f1c" containerName="kserve-container" Apr 17 17:35:01.225975 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:01.225975 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1759c02c-89ff-4873-a196-cd59e3cc1f1c" containerName="kserve-container" Apr 17 17:35:01.226180 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:01.225991 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e998f7a-f6a9-4c7b-86fc-62266b83449b" containerName="switch-graph-235f3" Apr 17 17:35:01.226180 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:01.225997 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e998f7a-f6a9-4c7b-86fc-62266b83449b" containerName="switch-graph-235f3" Apr 17 17:35:01.226180 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:01.226049 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1759c02c-89ff-4873-a196-cd59e3cc1f1c" containerName="kube-rbac-proxy" Apr 17 17:35:01.226180 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:01.226058 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e998f7a-f6a9-4c7b-86fc-62266b83449b" containerName="switch-graph-235f3" Apr 17 17:35:01.226180 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:01.226068 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1759c02c-89ff-4873-a196-cd59e3cc1f1c" containerName="kserve-container" Apr 17 17:35:01.230503 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:01.230480 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-579966fbd6-9m9hb" Apr 17 17:35:01.232565 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:01.232538 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-kube-rbac-proxy-sar-config\"" Apr 17 17:35:01.232665 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:01.232627 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-serving-cert\"" Apr 17 17:35:01.237922 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:01.237677 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-579966fbd6-9m9hb"] Apr 17 17:35:01.284396 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:01.284374 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96980502-64fc-41d2-9ae9-1e7eefe8f6d5-openshift-service-ca-bundle\") pod \"model-chainer-579966fbd6-9m9hb\" (UID: \"96980502-64fc-41d2-9ae9-1e7eefe8f6d5\") " pod="kserve-ci-e2e-test/model-chainer-579966fbd6-9m9hb" Apr 17 17:35:01.284530 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:01.284449 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/96980502-64fc-41d2-9ae9-1e7eefe8f6d5-proxy-tls\") pod \"model-chainer-579966fbd6-9m9hb\" (UID: \"96980502-64fc-41d2-9ae9-1e7eefe8f6d5\") " pod="kserve-ci-e2e-test/model-chainer-579966fbd6-9m9hb" Apr 17 17:35:01.385194 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:01.385164 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/96980502-64fc-41d2-9ae9-1e7eefe8f6d5-proxy-tls\") pod \"model-chainer-579966fbd6-9m9hb\" (UID: \"96980502-64fc-41d2-9ae9-1e7eefe8f6d5\") " 
pod="kserve-ci-e2e-test/model-chainer-579966fbd6-9m9hb" Apr 17 17:35:01.385361 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:01.385207 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96980502-64fc-41d2-9ae9-1e7eefe8f6d5-openshift-service-ca-bundle\") pod \"model-chainer-579966fbd6-9m9hb\" (UID: \"96980502-64fc-41d2-9ae9-1e7eefe8f6d5\") " pod="kserve-ci-e2e-test/model-chainer-579966fbd6-9m9hb" Apr 17 17:35:01.385361 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:35:01.385323 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-serving-cert: secret "model-chainer-serving-cert" not found Apr 17 17:35:01.385449 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:35:01.385389 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/96980502-64fc-41d2-9ae9-1e7eefe8f6d5-proxy-tls podName:96980502-64fc-41d2-9ae9-1e7eefe8f6d5 nodeName:}" failed. No retries permitted until 2026-04-17 17:35:01.885371152 +0000 UTC m=+671.411051761 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/96980502-64fc-41d2-9ae9-1e7eefe8f6d5-proxy-tls") pod "model-chainer-579966fbd6-9m9hb" (UID: "96980502-64fc-41d2-9ae9-1e7eefe8f6d5") : secret "model-chainer-serving-cert" not found Apr 17 17:35:01.385917 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:01.385898 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96980502-64fc-41d2-9ae9-1e7eefe8f6d5-openshift-service-ca-bundle\") pod \"model-chainer-579966fbd6-9m9hb\" (UID: \"96980502-64fc-41d2-9ae9-1e7eefe8f6d5\") " pod="kserve-ci-e2e-test/model-chainer-579966fbd6-9m9hb" Apr 17 17:35:01.889264 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:01.889226 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/96980502-64fc-41d2-9ae9-1e7eefe8f6d5-proxy-tls\") pod \"model-chainer-579966fbd6-9m9hb\" (UID: \"96980502-64fc-41d2-9ae9-1e7eefe8f6d5\") " pod="kserve-ci-e2e-test/model-chainer-579966fbd6-9m9hb" Apr 17 17:35:01.891638 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:01.891604 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/96980502-64fc-41d2-9ae9-1e7eefe8f6d5-proxy-tls\") pod \"model-chainer-579966fbd6-9m9hb\" (UID: \"96980502-64fc-41d2-9ae9-1e7eefe8f6d5\") " pod="kserve-ci-e2e-test/model-chainer-579966fbd6-9m9hb" Apr 17 17:35:02.142192 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:02.142106 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-579966fbd6-9m9hb" Apr 17 17:35:02.269035 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:02.269009 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-579966fbd6-9m9hb"] Apr 17 17:35:02.271020 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:35:02.270993 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96980502_64fc_41d2_9ae9_1e7eefe8f6d5.slice/crio-7768d211e424d6f5e55078722106cffc6316818c0d16fd1034e6cf5380a52d97 WatchSource:0}: Error finding container 7768d211e424d6f5e55078722106cffc6316818c0d16fd1034e6cf5380a52d97: Status 404 returned error can't find the container with id 7768d211e424d6f5e55078722106cffc6316818c0d16fd1034e6cf5380a52d97 Apr 17 17:35:02.272710 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:02.272691 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:35:02.362283 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:02.362247 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-579966fbd6-9m9hb" event={"ID":"96980502-64fc-41d2-9ae9-1e7eefe8f6d5","Type":"ContainerStarted","Data":"2d4ceb01895fcd0378643e66f8b328ab97c1481758ded61abcaf45c34095bcd0"} Apr 17 17:35:02.362283 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:02.362283 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-579966fbd6-9m9hb" event={"ID":"96980502-64fc-41d2-9ae9-1e7eefe8f6d5","Type":"ContainerStarted","Data":"7768d211e424d6f5e55078722106cffc6316818c0d16fd1034e6cf5380a52d97"} Apr 17 17:35:02.362459 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:02.362365 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-579966fbd6-9m9hb" Apr 17 17:35:02.379323 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:02.379275 
2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-579966fbd6-9m9hb" podStartSLOduration=1.379260261 podStartE2EDuration="1.379260261s" podCreationTimestamp="2026-04-17 17:35:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:35:02.377682461 +0000 UTC m=+671.903363071" watchObservedRunningTime="2026-04-17 17:35:02.379260261 +0000 UTC m=+671.904940900" Apr 17 17:35:08.371373 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:08.371342 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-579966fbd6-9m9hb" Apr 17 17:35:09.229497 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:09.229466 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" Apr 17 17:35:11.323010 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:11.322976 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-579966fbd6-9m9hb"] Apr 17 17:35:11.323482 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:11.323207 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-579966fbd6-9m9hb" podUID="96980502-64fc-41d2-9ae9-1e7eefe8f6d5" containerName="model-chainer" containerID="cri-o://2d4ceb01895fcd0378643e66f8b328ab97c1481758ded61abcaf45c34095bcd0" gracePeriod=30 Apr 17 17:35:11.449455 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:11.449417 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286"] Apr 17 17:35:11.449856 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:11.449769 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" 
podUID="5c934c46-580b-4625-b33b-262e329e8b7a" containerName="kserve-container" containerID="cri-o://5588eae34f0940508395705b2acb26c066d69baecfa8c031f49426e54507f99c" gracePeriod=30 Apr 17 17:35:11.449995 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:11.449817 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" podUID="5c934c46-580b-4625-b33b-262e329e8b7a" containerName="kube-rbac-proxy" containerID="cri-o://c71b52cc2b5be0600f94d5a62f959fce1885bbca4aff325e8b9da40d93560b34" gracePeriod=30 Apr 17 17:35:11.471454 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:11.471430 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb"] Apr 17 17:35:11.475215 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:11.475197 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb" Apr 17 17:35:11.477276 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:11.477256 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-fa93d-predictor-serving-cert\"" Apr 17 17:35:11.477374 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:11.477319 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-fa93d-kube-rbac-proxy-sar-config\"" Apr 17 17:35:11.485562 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:11.485539 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb"] Apr 17 17:35:11.565762 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:11.565726 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ca08d1c-47e9-4764-9175-f10bfaf98bd4-proxy-tls\") 
pod \"success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb\" (UID: \"4ca08d1c-47e9-4764-9175-f10bfaf98bd4\") " pod="kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb" Apr 17 17:35:11.565762 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:11.565766 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwxmn\" (UniqueName: \"kubernetes.io/projected/4ca08d1c-47e9-4764-9175-f10bfaf98bd4-kube-api-access-fwxmn\") pod \"success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb\" (UID: \"4ca08d1c-47e9-4764-9175-f10bfaf98bd4\") " pod="kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb" Apr 17 17:35:11.566045 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:11.565841 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-fa93d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ca08d1c-47e9-4764-9175-f10bfaf98bd4-success-200-isvc-fa93d-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb\" (UID: \"4ca08d1c-47e9-4764-9175-f10bfaf98bd4\") " pod="kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb" Apr 17 17:35:11.667206 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:11.667128 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ca08d1c-47e9-4764-9175-f10bfaf98bd4-proxy-tls\") pod \"success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb\" (UID: \"4ca08d1c-47e9-4764-9175-f10bfaf98bd4\") " pod="kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb" Apr 17 17:35:11.667206 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:11.667166 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fwxmn\" (UniqueName: \"kubernetes.io/projected/4ca08d1c-47e9-4764-9175-f10bfaf98bd4-kube-api-access-fwxmn\") pod 
\"success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb\" (UID: \"4ca08d1c-47e9-4764-9175-f10bfaf98bd4\") " pod="kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb" Apr 17 17:35:11.667413 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:11.667210 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-fa93d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ca08d1c-47e9-4764-9175-f10bfaf98bd4-success-200-isvc-fa93d-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb\" (UID: \"4ca08d1c-47e9-4764-9175-f10bfaf98bd4\") " pod="kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb" Apr 17 17:35:11.667413 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:35:11.667282 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-serving-cert: secret "success-200-isvc-fa93d-predictor-serving-cert" not found Apr 17 17:35:11.667413 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:35:11.667335 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ca08d1c-47e9-4764-9175-f10bfaf98bd4-proxy-tls podName:4ca08d1c-47e9-4764-9175-f10bfaf98bd4 nodeName:}" failed. No retries permitted until 2026-04-17 17:35:12.167320077 +0000 UTC m=+681.693000679 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/4ca08d1c-47e9-4764-9175-f10bfaf98bd4-proxy-tls") pod "success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb" (UID: "4ca08d1c-47e9-4764-9175-f10bfaf98bd4") : secret "success-200-isvc-fa93d-predictor-serving-cert" not found Apr 17 17:35:11.667929 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:11.667909 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-fa93d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ca08d1c-47e9-4764-9175-f10bfaf98bd4-success-200-isvc-fa93d-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb\" (UID: \"4ca08d1c-47e9-4764-9175-f10bfaf98bd4\") " pod="kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb" Apr 17 17:35:11.677551 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:11.677531 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwxmn\" (UniqueName: \"kubernetes.io/projected/4ca08d1c-47e9-4764-9175-f10bfaf98bd4-kube-api-access-fwxmn\") pod \"success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb\" (UID: \"4ca08d1c-47e9-4764-9175-f10bfaf98bd4\") " pod="kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb" Apr 17 17:35:12.051602 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:12.051561 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" podUID="5c934c46-580b-4625-b33b-262e329e8b7a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.30:8643/healthz\": dial tcp 10.132.0.30:8643: connect: connection refused" Apr 17 17:35:12.171004 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:12.170969 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ca08d1c-47e9-4764-9175-f10bfaf98bd4-proxy-tls\") pod 
\"success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb\" (UID: \"4ca08d1c-47e9-4764-9175-f10bfaf98bd4\") " pod="kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb" Apr 17 17:35:12.173348 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:12.173321 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ca08d1c-47e9-4764-9175-f10bfaf98bd4-proxy-tls\") pod \"success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb\" (UID: \"4ca08d1c-47e9-4764-9175-f10bfaf98bd4\") " pod="kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb" Apr 17 17:35:12.386547 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:12.386465 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb" Apr 17 17:35:12.398349 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:12.398322 2575 generic.go:358] "Generic (PLEG): container finished" podID="5c934c46-580b-4625-b33b-262e329e8b7a" containerID="c71b52cc2b5be0600f94d5a62f959fce1885bbca4aff325e8b9da40d93560b34" exitCode=2 Apr 17 17:35:12.398466 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:12.398388 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" event={"ID":"5c934c46-580b-4625-b33b-262e329e8b7a","Type":"ContainerDied","Data":"c71b52cc2b5be0600f94d5a62f959fce1885bbca4aff325e8b9da40d93560b34"} Apr 17 17:35:12.512106 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:12.512074 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb"] Apr 17 17:35:12.515355 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:35:12.515323 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ca08d1c_47e9_4764_9175_f10bfaf98bd4.slice/crio-9cd582323401fea061bc6d37545309bc724e836ad13286d6597b082878dd2924 WatchSource:0}: Error finding container 9cd582323401fea061bc6d37545309bc724e836ad13286d6597b082878dd2924: Status 404 returned error can't find the container with id 9cd582323401fea061bc6d37545309bc724e836ad13286d6597b082878dd2924 Apr 17 17:35:13.371425 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:13.371375 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-579966fbd6-9m9hb" podUID="96980502-64fc-41d2-9ae9-1e7eefe8f6d5" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:35:13.403435 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:13.403403 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb" event={"ID":"4ca08d1c-47e9-4764-9175-f10bfaf98bd4","Type":"ContainerStarted","Data":"3b6776348c9933373e773a84b9faeb50fa2a4ffd2e965096d644b1b6a147285e"} Apr 17 17:35:13.403435 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:13.403440 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb" event={"ID":"4ca08d1c-47e9-4764-9175-f10bfaf98bd4","Type":"ContainerStarted","Data":"28bafaba377235a728d826f8a59b33963181da0b047fd049e667ae8adebe5704"} Apr 17 17:35:13.403868 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:13.403451 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb" event={"ID":"4ca08d1c-47e9-4764-9175-f10bfaf98bd4","Type":"ContainerStarted","Data":"9cd582323401fea061bc6d37545309bc724e836ad13286d6597b082878dd2924"} Apr 17 17:35:13.403868 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:13.403530 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb" Apr 17 17:35:13.422440 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:13.422386 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb" podStartSLOduration=2.422368049 podStartE2EDuration="2.422368049s" podCreationTimestamp="2026-04-17 17:35:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:35:13.420364664 +0000 UTC m=+682.946045295" watchObservedRunningTime="2026-04-17 17:35:13.422368049 +0000 UTC m=+682.948048670" Apr 17 17:35:14.406881 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:14.406849 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb" Apr 17 17:35:14.408041 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:14.408018 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb" podUID="4ca08d1c-47e9-4764-9175-f10bfaf98bd4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 17 17:35:15.410681 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:15.410640 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb" podUID="4ca08d1c-47e9-4764-9175-f10bfaf98bd4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 17 17:35:15.593761 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:15.593739 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" Apr 17 17:35:15.701871 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:15.701766 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c934c46-580b-4625-b33b-262e329e8b7a-kserve-provision-location\") pod \"5c934c46-580b-4625-b33b-262e329e8b7a\" (UID: \"5c934c46-580b-4625-b33b-262e329e8b7a\") " Apr 17 17:35:15.701871 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:15.701813 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zphll\" (UniqueName: \"kubernetes.io/projected/5c934c46-580b-4625-b33b-262e329e8b7a-kube-api-access-zphll\") pod \"5c934c46-580b-4625-b33b-262e329e8b7a\" (UID: \"5c934c46-580b-4625-b33b-262e329e8b7a\") " Apr 17 17:35:15.701871 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:15.701852 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5c934c46-580b-4625-b33b-262e329e8b7a-proxy-tls\") pod \"5c934c46-580b-4625-b33b-262e329e8b7a\" (UID: \"5c934c46-580b-4625-b33b-262e329e8b7a\") " Apr 17 17:35:15.702141 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:15.701882 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5c934c46-580b-4625-b33b-262e329e8b7a-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"5c934c46-580b-4625-b33b-262e329e8b7a\" (UID: \"5c934c46-580b-4625-b33b-262e329e8b7a\") " Apr 17 17:35:15.702203 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:15.702131 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c934c46-580b-4625-b33b-262e329e8b7a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"5c934c46-580b-4625-b33b-262e329e8b7a" (UID: "5c934c46-580b-4625-b33b-262e329e8b7a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:35:15.702302 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:15.702282 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c934c46-580b-4625-b33b-262e329e8b7a-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-1-kube-rbac-proxy-sar-config") pod "5c934c46-580b-4625-b33b-262e329e8b7a" (UID: "5c934c46-580b-4625-b33b-262e329e8b7a"). InnerVolumeSpecName "isvc-sklearn-graph-1-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:35:15.703965 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:15.703933 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c934c46-580b-4625-b33b-262e329e8b7a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5c934c46-580b-4625-b33b-262e329e8b7a" (UID: "5c934c46-580b-4625-b33b-262e329e8b7a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:35:15.703965 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:15.703946 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c934c46-580b-4625-b33b-262e329e8b7a-kube-api-access-zphll" (OuterVolumeSpecName: "kube-api-access-zphll") pod "5c934c46-580b-4625-b33b-262e329e8b7a" (UID: "5c934c46-580b-4625-b33b-262e329e8b7a"). InnerVolumeSpecName "kube-api-access-zphll". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:35:15.803288 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:15.803248 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5c934c46-580b-4625-b33b-262e329e8b7a-kserve-provision-location\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:35:15.803288 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:15.803286 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zphll\" (UniqueName: \"kubernetes.io/projected/5c934c46-580b-4625-b33b-262e329e8b7a-kube-api-access-zphll\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:35:15.803288 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:15.803295 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5c934c46-580b-4625-b33b-262e329e8b7a-proxy-tls\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:35:15.803485 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:15.803307 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5c934c46-580b-4625-b33b-262e329e8b7a-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:35:16.414838 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:16.414797 2575 generic.go:358] "Generic (PLEG): container finished" podID="5c934c46-580b-4625-b33b-262e329e8b7a" containerID="5588eae34f0940508395705b2acb26c066d69baecfa8c031f49426e54507f99c" exitCode=0 Apr 17 17:35:16.415245 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:16.414858 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" 
event={"ID":"5c934c46-580b-4625-b33b-262e329e8b7a","Type":"ContainerDied","Data":"5588eae34f0940508395705b2acb26c066d69baecfa8c031f49426e54507f99c"} Apr 17 17:35:16.415245 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:16.414893 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" event={"ID":"5c934c46-580b-4625-b33b-262e329e8b7a","Type":"ContainerDied","Data":"c7d6c2fabeb20a963741d97494671e3cd32c5c1079cc842e211c8598305c9518"} Apr 17 17:35:16.415245 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:16.414913 2575 scope.go:117] "RemoveContainer" containerID="c71b52cc2b5be0600f94d5a62f959fce1885bbca4aff325e8b9da40d93560b34" Apr 17 17:35:16.415245 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:16.414915 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286" Apr 17 17:35:16.423035 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:16.422933 2575 scope.go:117] "RemoveContainer" containerID="5588eae34f0940508395705b2acb26c066d69baecfa8c031f49426e54507f99c" Apr 17 17:35:16.430344 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:16.430323 2575 scope.go:117] "RemoveContainer" containerID="fadeed1a6af8b8ed10e31a9fe859cf08fc846e7e05f4c0f58707a8a579b1f989" Apr 17 17:35:16.437715 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:16.437593 2575 scope.go:117] "RemoveContainer" containerID="c71b52cc2b5be0600f94d5a62f959fce1885bbca4aff325e8b9da40d93560b34" Apr 17 17:35:16.437946 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:35:16.437926 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c71b52cc2b5be0600f94d5a62f959fce1885bbca4aff325e8b9da40d93560b34\": container with ID starting with c71b52cc2b5be0600f94d5a62f959fce1885bbca4aff325e8b9da40d93560b34 not found: ID does not exist" 
containerID="c71b52cc2b5be0600f94d5a62f959fce1885bbca4aff325e8b9da40d93560b34" Apr 17 17:35:16.438015 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:16.437957 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c71b52cc2b5be0600f94d5a62f959fce1885bbca4aff325e8b9da40d93560b34"} err="failed to get container status \"c71b52cc2b5be0600f94d5a62f959fce1885bbca4aff325e8b9da40d93560b34\": rpc error: code = NotFound desc = could not find container \"c71b52cc2b5be0600f94d5a62f959fce1885bbca4aff325e8b9da40d93560b34\": container with ID starting with c71b52cc2b5be0600f94d5a62f959fce1885bbca4aff325e8b9da40d93560b34 not found: ID does not exist" Apr 17 17:35:16.438015 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:16.437976 2575 scope.go:117] "RemoveContainer" containerID="5588eae34f0940508395705b2acb26c066d69baecfa8c031f49426e54507f99c" Apr 17 17:35:16.438293 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:35:16.438269 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5588eae34f0940508395705b2acb26c066d69baecfa8c031f49426e54507f99c\": container with ID starting with 5588eae34f0940508395705b2acb26c066d69baecfa8c031f49426e54507f99c not found: ID does not exist" containerID="5588eae34f0940508395705b2acb26c066d69baecfa8c031f49426e54507f99c" Apr 17 17:35:16.438348 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:16.438302 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5588eae34f0940508395705b2acb26c066d69baecfa8c031f49426e54507f99c"} err="failed to get container status \"5588eae34f0940508395705b2acb26c066d69baecfa8c031f49426e54507f99c\": rpc error: code = NotFound desc = could not find container \"5588eae34f0940508395705b2acb26c066d69baecfa8c031f49426e54507f99c\": container with ID starting with 5588eae34f0940508395705b2acb26c066d69baecfa8c031f49426e54507f99c not found: ID does not exist" Apr 17 
17:35:16.438348 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:16.438326 2575 scope.go:117] "RemoveContainer" containerID="fadeed1a6af8b8ed10e31a9fe859cf08fc846e7e05f4c0f58707a8a579b1f989" Apr 17 17:35:16.438575 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:35:16.438554 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fadeed1a6af8b8ed10e31a9fe859cf08fc846e7e05f4c0f58707a8a579b1f989\": container with ID starting with fadeed1a6af8b8ed10e31a9fe859cf08fc846e7e05f4c0f58707a8a579b1f989 not found: ID does not exist" containerID="fadeed1a6af8b8ed10e31a9fe859cf08fc846e7e05f4c0f58707a8a579b1f989" Apr 17 17:35:16.438617 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:16.438583 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fadeed1a6af8b8ed10e31a9fe859cf08fc846e7e05f4c0f58707a8a579b1f989"} err="failed to get container status \"fadeed1a6af8b8ed10e31a9fe859cf08fc846e7e05f4c0f58707a8a579b1f989\": rpc error: code = NotFound desc = could not find container \"fadeed1a6af8b8ed10e31a9fe859cf08fc846e7e05f4c0f58707a8a579b1f989\": container with ID starting with fadeed1a6af8b8ed10e31a9fe859cf08fc846e7e05f4c0f58707a8a579b1f989 not found: ID does not exist" Apr 17 17:35:16.439334 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:16.439317 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286"] Apr 17 17:35:16.445439 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:16.445416 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-97286"] Apr 17 17:35:17.095343 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:17.095313 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c934c46-580b-4625-b33b-262e329e8b7a" path="/var/lib/kubelet/pods/5c934c46-580b-4625-b33b-262e329e8b7a/volumes" Apr 17 17:35:18.370564 
ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:18.370529 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-579966fbd6-9m9hb" podUID="96980502-64fc-41d2-9ae9-1e7eefe8f6d5" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:35:20.414878 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:20.414847 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb" Apr 17 17:35:20.415318 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:20.415295 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb" podUID="4ca08d1c-47e9-4764-9175-f10bfaf98bd4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 17 17:35:21.677813 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:21.677769 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-d987b-6595588756-qw7ws"] Apr 17 17:35:21.678204 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:21.678142 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c934c46-580b-4625-b33b-262e329e8b7a" containerName="storage-initializer" Apr 17 17:35:21.678204 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:21.678153 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c934c46-580b-4625-b33b-262e329e8b7a" containerName="storage-initializer" Apr 17 17:35:21.678204 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:21.678166 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c934c46-580b-4625-b33b-262e329e8b7a" containerName="kserve-container" Apr 17 17:35:21.678204 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:21.678171 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c934c46-580b-4625-b33b-262e329e8b7a" 
containerName="kserve-container" Apr 17 17:35:21.678204 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:21.678180 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c934c46-580b-4625-b33b-262e329e8b7a" containerName="kube-rbac-proxy" Apr 17 17:35:21.678204 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:21.678186 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c934c46-580b-4625-b33b-262e329e8b7a" containerName="kube-rbac-proxy" Apr 17 17:35:21.678381 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:21.678257 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5c934c46-580b-4625-b33b-262e329e8b7a" containerName="kserve-container" Apr 17 17:35:21.678381 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:21.678266 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5c934c46-580b-4625-b33b-262e329e8b7a" containerName="kube-rbac-proxy" Apr 17 17:35:21.682812 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:21.682792 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-d987b-6595588756-qw7ws" Apr 17 17:35:21.684735 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:21.684714 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-d987b-serving-cert\"" Apr 17 17:35:21.684901 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:21.684880 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-d987b-kube-rbac-proxy-sar-config\"" Apr 17 17:35:21.688092 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:21.688073 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-d987b-6595588756-qw7ws"] Apr 17 17:35:21.852797 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:21.852761 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa2a2239-31a2-4a3b-8325-c38b645c4b7e-openshift-service-ca-bundle\") pod \"switch-graph-d987b-6595588756-qw7ws\" (UID: \"fa2a2239-31a2-4a3b-8325-c38b645c4b7e\") " pod="kserve-ci-e2e-test/switch-graph-d987b-6595588756-qw7ws" Apr 17 17:35:21.852980 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:21.852888 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fa2a2239-31a2-4a3b-8325-c38b645c4b7e-proxy-tls\") pod \"switch-graph-d987b-6595588756-qw7ws\" (UID: \"fa2a2239-31a2-4a3b-8325-c38b645c4b7e\") " pod="kserve-ci-e2e-test/switch-graph-d987b-6595588756-qw7ws" Apr 17 17:35:21.954367 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:21.954281 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fa2a2239-31a2-4a3b-8325-c38b645c4b7e-proxy-tls\") pod \"switch-graph-d987b-6595588756-qw7ws\" (UID: 
\"fa2a2239-31a2-4a3b-8325-c38b645c4b7e\") " pod="kserve-ci-e2e-test/switch-graph-d987b-6595588756-qw7ws" Apr 17 17:35:21.954367 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:21.954343 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa2a2239-31a2-4a3b-8325-c38b645c4b7e-openshift-service-ca-bundle\") pod \"switch-graph-d987b-6595588756-qw7ws\" (UID: \"fa2a2239-31a2-4a3b-8325-c38b645c4b7e\") " pod="kserve-ci-e2e-test/switch-graph-d987b-6595588756-qw7ws" Apr 17 17:35:21.954942 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:21.954918 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa2a2239-31a2-4a3b-8325-c38b645c4b7e-openshift-service-ca-bundle\") pod \"switch-graph-d987b-6595588756-qw7ws\" (UID: \"fa2a2239-31a2-4a3b-8325-c38b645c4b7e\") " pod="kserve-ci-e2e-test/switch-graph-d987b-6595588756-qw7ws" Apr 17 17:35:21.956743 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:21.956714 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fa2a2239-31a2-4a3b-8325-c38b645c4b7e-proxy-tls\") pod \"switch-graph-d987b-6595588756-qw7ws\" (UID: \"fa2a2239-31a2-4a3b-8325-c38b645c4b7e\") " pod="kserve-ci-e2e-test/switch-graph-d987b-6595588756-qw7ws" Apr 17 17:35:21.993880 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:21.993849 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-d987b-6595588756-qw7ws" Apr 17 17:35:22.116368 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:22.116336 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-d987b-6595588756-qw7ws"] Apr 17 17:35:22.119419 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:35:22.119390 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa2a2239_31a2_4a3b_8325_c38b645c4b7e.slice/crio-41240b12898698ff6e8e09225befc0d246c5bed68dba8b3dff72a005646e8c8c WatchSource:0}: Error finding container 41240b12898698ff6e8e09225befc0d246c5bed68dba8b3dff72a005646e8c8c: Status 404 returned error can't find the container with id 41240b12898698ff6e8e09225befc0d246c5bed68dba8b3dff72a005646e8c8c Apr 17 17:35:22.436800 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:22.436762 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-d987b-6595588756-qw7ws" event={"ID":"fa2a2239-31a2-4a3b-8325-c38b645c4b7e","Type":"ContainerStarted","Data":"45bcf2d1570d06c2eff5136c0341dcd9aeaa0f29d2cf17241944a6cb049f8a78"} Apr 17 17:35:22.436800 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:22.436800 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-d987b-6595588756-qw7ws" event={"ID":"fa2a2239-31a2-4a3b-8325-c38b645c4b7e","Type":"ContainerStarted","Data":"41240b12898698ff6e8e09225befc0d246c5bed68dba8b3dff72a005646e8c8c"} Apr 17 17:35:22.437074 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:22.436871 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-d987b-6595588756-qw7ws" Apr 17 17:35:22.453868 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:22.453805 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-d987b-6595588756-qw7ws" 
podStartSLOduration=1.4537835860000001 podStartE2EDuration="1.453783586s" podCreationTimestamp="2026-04-17 17:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:35:22.451881929 +0000 UTC m=+691.977562548" watchObservedRunningTime="2026-04-17 17:35:22.453783586 +0000 UTC m=+691.979464204" Apr 17 17:35:23.369932 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:23.369894 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-579966fbd6-9m9hb" podUID="96980502-64fc-41d2-9ae9-1e7eefe8f6d5" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:35:23.370307 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:23.370005 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-579966fbd6-9m9hb" Apr 17 17:35:28.370174 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:28.370129 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-579966fbd6-9m9hb" podUID="96980502-64fc-41d2-9ae9-1e7eefe8f6d5" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:35:28.446376 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:28.446350 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-d987b-6595588756-qw7ws" Apr 17 17:35:30.415799 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:30.415760 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb" podUID="4ca08d1c-47e9-4764-9175-f10bfaf98bd4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 17 17:35:33.370493 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:33.370447 2575 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-579966fbd6-9m9hb" podUID="96980502-64fc-41d2-9ae9-1e7eefe8f6d5" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:35:38.369940 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:38.369853 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-579966fbd6-9m9hb" podUID="96980502-64fc-41d2-9ae9-1e7eefe8f6d5" containerName="model-chainer" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:35:40.415959 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:40.415918 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb" podUID="4ca08d1c-47e9-4764-9175-f10bfaf98bd4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 17 17:35:41.458632 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:41.458610 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-579966fbd6-9m9hb" Apr 17 17:35:41.505033 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:41.505003 2575 generic.go:358] "Generic (PLEG): container finished" podID="96980502-64fc-41d2-9ae9-1e7eefe8f6d5" containerID="2d4ceb01895fcd0378643e66f8b328ab97c1481758ded61abcaf45c34095bcd0" exitCode=0 Apr 17 17:35:41.505182 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:41.505058 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-579966fbd6-9m9hb" Apr 17 17:35:41.505182 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:41.505066 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-579966fbd6-9m9hb" event={"ID":"96980502-64fc-41d2-9ae9-1e7eefe8f6d5","Type":"ContainerDied","Data":"2d4ceb01895fcd0378643e66f8b328ab97c1481758ded61abcaf45c34095bcd0"} Apr 17 17:35:41.505182 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:41.505122 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-579966fbd6-9m9hb" event={"ID":"96980502-64fc-41d2-9ae9-1e7eefe8f6d5","Type":"ContainerDied","Data":"7768d211e424d6f5e55078722106cffc6316818c0d16fd1034e6cf5380a52d97"} Apr 17 17:35:41.505182 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:41.505143 2575 scope.go:117] "RemoveContainer" containerID="2d4ceb01895fcd0378643e66f8b328ab97c1481758ded61abcaf45c34095bcd0" Apr 17 17:35:41.507246 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:41.507229 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96980502-64fc-41d2-9ae9-1e7eefe8f6d5-openshift-service-ca-bundle\") pod \"96980502-64fc-41d2-9ae9-1e7eefe8f6d5\" (UID: \"96980502-64fc-41d2-9ae9-1e7eefe8f6d5\") " Apr 17 17:35:41.507312 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:41.507267 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/96980502-64fc-41d2-9ae9-1e7eefe8f6d5-proxy-tls\") pod \"96980502-64fc-41d2-9ae9-1e7eefe8f6d5\" (UID: \"96980502-64fc-41d2-9ae9-1e7eefe8f6d5\") " Apr 17 17:35:41.507650 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:41.507614 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96980502-64fc-41d2-9ae9-1e7eefe8f6d5-openshift-service-ca-bundle" (OuterVolumeSpecName: 
"openshift-service-ca-bundle") pod "96980502-64fc-41d2-9ae9-1e7eefe8f6d5" (UID: "96980502-64fc-41d2-9ae9-1e7eefe8f6d5"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:35:41.509247 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:41.509225 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96980502-64fc-41d2-9ae9-1e7eefe8f6d5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "96980502-64fc-41d2-9ae9-1e7eefe8f6d5" (UID: "96980502-64fc-41d2-9ae9-1e7eefe8f6d5"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:35:41.512961 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:41.512942 2575 scope.go:117] "RemoveContainer" containerID="2d4ceb01895fcd0378643e66f8b328ab97c1481758ded61abcaf45c34095bcd0" Apr 17 17:35:41.513226 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:35:41.513198 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d4ceb01895fcd0378643e66f8b328ab97c1481758ded61abcaf45c34095bcd0\": container with ID starting with 2d4ceb01895fcd0378643e66f8b328ab97c1481758ded61abcaf45c34095bcd0 not found: ID does not exist" containerID="2d4ceb01895fcd0378643e66f8b328ab97c1481758ded61abcaf45c34095bcd0" Apr 17 17:35:41.513291 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:41.513227 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d4ceb01895fcd0378643e66f8b328ab97c1481758ded61abcaf45c34095bcd0"} err="failed to get container status \"2d4ceb01895fcd0378643e66f8b328ab97c1481758ded61abcaf45c34095bcd0\": rpc error: code = NotFound desc = could not find container \"2d4ceb01895fcd0378643e66f8b328ab97c1481758ded61abcaf45c34095bcd0\": container with ID starting with 2d4ceb01895fcd0378643e66f8b328ab97c1481758ded61abcaf45c34095bcd0 not found: ID does not exist" Apr 17 17:35:41.608393 ip-10-0-140-33 
kubenswrapper[2575]: I0417 17:35:41.608330 2575 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96980502-64fc-41d2-9ae9-1e7eefe8f6d5-openshift-service-ca-bundle\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:35:41.608393 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:41.608353 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/96980502-64fc-41d2-9ae9-1e7eefe8f6d5-proxy-tls\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:35:41.826002 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:41.825975 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-579966fbd6-9m9hb"] Apr 17 17:35:41.831345 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:41.831314 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-579966fbd6-9m9hb"] Apr 17 17:35:43.094519 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:43.094485 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96980502-64fc-41d2-9ae9-1e7eefe8f6d5" path="/var/lib/kubelet/pods/96980502-64fc-41d2-9ae9-1e7eefe8f6d5/volumes" Apr 17 17:35:50.415411 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:35:50.415377 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb" podUID="4ca08d1c-47e9-4764-9175-f10bfaf98bd4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 17 17:36:00.415869 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:36:00.415814 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb" Apr 17 17:36:11.480738 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:36:11.480708 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/sequence-graph-fa93d-58f6759648-qxwxn"] Apr 17 17:36:11.481124 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:36:11.481112 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96980502-64fc-41d2-9ae9-1e7eefe8f6d5" containerName="model-chainer" Apr 17 17:36:11.481124 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:36:11.481125 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="96980502-64fc-41d2-9ae9-1e7eefe8f6d5" containerName="model-chainer" Apr 17 17:36:11.481198 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:36:11.481190 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="96980502-64fc-41d2-9ae9-1e7eefe8f6d5" containerName="model-chainer" Apr 17 17:36:11.483919 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:36:11.483900 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-fa93d-58f6759648-qxwxn" Apr 17 17:36:11.485909 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:36:11.485890 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-fa93d-kube-rbac-proxy-sar-config\"" Apr 17 17:36:11.486021 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:36:11.486002 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-fa93d-serving-cert\"" Apr 17 17:36:11.493424 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:36:11.493397 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-fa93d-58f6759648-qxwxn"] Apr 17 17:36:11.546587 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:36:11.546557 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9-openshift-service-ca-bundle\") pod \"sequence-graph-fa93d-58f6759648-qxwxn\" (UID: 
\"06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9\") " pod="kserve-ci-e2e-test/sequence-graph-fa93d-58f6759648-qxwxn" Apr 17 17:36:11.546587 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:36:11.546591 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9-proxy-tls\") pod \"sequence-graph-fa93d-58f6759648-qxwxn\" (UID: \"06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9\") " pod="kserve-ci-e2e-test/sequence-graph-fa93d-58f6759648-qxwxn" Apr 17 17:36:11.647391 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:36:11.647359 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9-openshift-service-ca-bundle\") pod \"sequence-graph-fa93d-58f6759648-qxwxn\" (UID: \"06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9\") " pod="kserve-ci-e2e-test/sequence-graph-fa93d-58f6759648-qxwxn" Apr 17 17:36:11.647391 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:36:11.647395 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9-proxy-tls\") pod \"sequence-graph-fa93d-58f6759648-qxwxn\" (UID: \"06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9\") " pod="kserve-ci-e2e-test/sequence-graph-fa93d-58f6759648-qxwxn" Apr 17 17:36:11.647996 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:36:11.647977 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9-openshift-service-ca-bundle\") pod \"sequence-graph-fa93d-58f6759648-qxwxn\" (UID: \"06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9\") " pod="kserve-ci-e2e-test/sequence-graph-fa93d-58f6759648-qxwxn" Apr 17 17:36:11.650039 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:36:11.650016 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9-proxy-tls\") pod \"sequence-graph-fa93d-58f6759648-qxwxn\" (UID: \"06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9\") " pod="kserve-ci-e2e-test/sequence-graph-fa93d-58f6759648-qxwxn" Apr 17 17:36:11.795298 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:36:11.795270 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-fa93d-58f6759648-qxwxn" Apr 17 17:36:11.938448 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:36:11.938417 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-fa93d-58f6759648-qxwxn"] Apr 17 17:36:11.941792 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:36:11.941767 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06cf717f_8ff8_48d5_bb9d_a6ce56b0bbc9.slice/crio-254dc9369179e842092a1463f124e45cd6ce089d0f21b40970d972341c51e35b WatchSource:0}: Error finding container 254dc9369179e842092a1463f124e45cd6ce089d0f21b40970d972341c51e35b: Status 404 returned error can't find the container with id 254dc9369179e842092a1463f124e45cd6ce089d0f21b40970d972341c51e35b Apr 17 17:36:12.604391 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:36:12.604351 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-fa93d-58f6759648-qxwxn" event={"ID":"06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9","Type":"ContainerStarted","Data":"0c323abe4280252d1e9ad69741bb4b3e8d7ee586c2918d3d13f4a25dcbda6288"} Apr 17 17:36:12.604391 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:36:12.604397 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-fa93d-58f6759648-qxwxn" event={"ID":"06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9","Type":"ContainerStarted","Data":"254dc9369179e842092a1463f124e45cd6ce089d0f21b40970d972341c51e35b"} Apr 17 
17:36:12.604835 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:36:12.604425 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-fa93d-58f6759648-qxwxn" Apr 17 17:36:12.620445 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:36:12.620399 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-fa93d-58f6759648-qxwxn" podStartSLOduration=1.620386593 podStartE2EDuration="1.620386593s" podCreationTimestamp="2026-04-17 17:36:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:36:12.618019455 +0000 UTC m=+742.143700074" watchObservedRunningTime="2026-04-17 17:36:12.620386593 +0000 UTC m=+742.146067212" Apr 17 17:36:18.613695 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:36:18.613664 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-fa93d-58f6759648-qxwxn" Apr 17 17:38:51.033651 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:38:51.033569 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctj42_a9e0a184-477b-45d6-835a-1606a973a5cf/console-operator/1.log" Apr 17 17:38:51.035044 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:38:51.035020 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctj42_a9e0a184-477b-45d6-835a-1606a973a5cf/console-operator/1.log" Apr 17 17:43:36.497290 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:36.497253 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-d987b-6595588756-qw7ws"] Apr 17 17:43:36.497748 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:36.497561 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-d987b-6595588756-qw7ws" 
podUID="fa2a2239-31a2-4a3b-8325-c38b645c4b7e" containerName="switch-graph-d987b" containerID="cri-o://45bcf2d1570d06c2eff5136c0341dcd9aeaa0f29d2cf17241944a6cb049f8a78" gracePeriod=30 Apr 17 17:43:36.604018 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:36.603984 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm"] Apr 17 17:43:36.604276 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:36.604247 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" podUID="7501e586-6f33-437c-a1de-8e37d3a78b56" containerName="kserve-container" containerID="cri-o://dfbc79a1238808a1602a551bf6d823d6b6ead410232523c78946edb08d36c34f" gracePeriod=30 Apr 17 17:43:36.604355 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:36.604299 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" podUID="7501e586-6f33-437c-a1de-8e37d3a78b56" containerName="kube-rbac-proxy" containerID="cri-o://4f6eb69aa495e81884721d42d61bf9d772323c7e038303abfed0be45d09ab334" gracePeriod=30 Apr 17 17:43:36.699437 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:36.699402 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk"] Apr 17 17:43:36.702952 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:36.702930 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk" Apr 17 17:43:36.705013 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:36.704989 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-d1a8d-predictor-serving-cert\"" Apr 17 17:43:36.705119 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:36.705057 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-d1a8d-kube-rbac-proxy-sar-config\"" Apr 17 17:43:36.720382 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:36.720355 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk"] Apr 17 17:43:36.803215 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:36.803186 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9lwb\" (UniqueName: \"kubernetes.io/projected/2e9c6d8c-eb46-4b9d-88bd-6f402b235473-kube-api-access-b9lwb\") pod \"success-200-isvc-d1a8d-predictor-656694f686-jhmhk\" (UID: \"2e9c6d8c-eb46-4b9d-88bd-6f402b235473\") " pod="kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk" Apr 17 17:43:36.803364 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:36.803236 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-d1a8d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2e9c6d8c-eb46-4b9d-88bd-6f402b235473-success-200-isvc-d1a8d-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-d1a8d-predictor-656694f686-jhmhk\" (UID: \"2e9c6d8c-eb46-4b9d-88bd-6f402b235473\") " pod="kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk" Apr 17 17:43:36.803364 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:36.803302 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e9c6d8c-eb46-4b9d-88bd-6f402b235473-proxy-tls\") pod \"success-200-isvc-d1a8d-predictor-656694f686-jhmhk\" (UID: \"2e9c6d8c-eb46-4b9d-88bd-6f402b235473\") " pod="kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk" Apr 17 17:43:36.904491 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:36.904453 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e9c6d8c-eb46-4b9d-88bd-6f402b235473-proxy-tls\") pod \"success-200-isvc-d1a8d-predictor-656694f686-jhmhk\" (UID: \"2e9c6d8c-eb46-4b9d-88bd-6f402b235473\") " pod="kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk" Apr 17 17:43:36.904657 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:36.904531 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b9lwb\" (UniqueName: \"kubernetes.io/projected/2e9c6d8c-eb46-4b9d-88bd-6f402b235473-kube-api-access-b9lwb\") pod \"success-200-isvc-d1a8d-predictor-656694f686-jhmhk\" (UID: \"2e9c6d8c-eb46-4b9d-88bd-6f402b235473\") " pod="kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk" Apr 17 17:43:36.904657 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:36.904567 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-d1a8d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2e9c6d8c-eb46-4b9d-88bd-6f402b235473-success-200-isvc-d1a8d-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-d1a8d-predictor-656694f686-jhmhk\" (UID: \"2e9c6d8c-eb46-4b9d-88bd-6f402b235473\") " pod="kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk" Apr 17 17:43:36.904657 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:43:36.904593 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-serving-cert: secret "success-200-isvc-d1a8d-predictor-serving-cert" not found Apr 17 
17:43:36.904779 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:43:36.904668 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e9c6d8c-eb46-4b9d-88bd-6f402b235473-proxy-tls podName:2e9c6d8c-eb46-4b9d-88bd-6f402b235473 nodeName:}" failed. No retries permitted until 2026-04-17 17:43:37.404653042 +0000 UTC m=+1186.930333639 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/2e9c6d8c-eb46-4b9d-88bd-6f402b235473-proxy-tls") pod "success-200-isvc-d1a8d-predictor-656694f686-jhmhk" (UID: "2e9c6d8c-eb46-4b9d-88bd-6f402b235473") : secret "success-200-isvc-d1a8d-predictor-serving-cert" not found Apr 17 17:43:36.905244 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:36.905225 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-d1a8d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2e9c6d8c-eb46-4b9d-88bd-6f402b235473-success-200-isvc-d1a8d-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-d1a8d-predictor-656694f686-jhmhk\" (UID: \"2e9c6d8c-eb46-4b9d-88bd-6f402b235473\") " pod="kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk" Apr 17 17:43:36.913005 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:36.912977 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9lwb\" (UniqueName: \"kubernetes.io/projected/2e9c6d8c-eb46-4b9d-88bd-6f402b235473-kube-api-access-b9lwb\") pod \"success-200-isvc-d1a8d-predictor-656694f686-jhmhk\" (UID: \"2e9c6d8c-eb46-4b9d-88bd-6f402b235473\") " pod="kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk" Apr 17 17:43:37.096982 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:37.096900 2575 generic.go:358] "Generic (PLEG): container finished" podID="7501e586-6f33-437c-a1de-8e37d3a78b56" containerID="4f6eb69aa495e81884721d42d61bf9d772323c7e038303abfed0be45d09ab334" exitCode=2 Apr 17 17:43:37.096982 ip-10-0-140-33 
kubenswrapper[2575]: I0417 17:43:37.096968 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" event={"ID":"7501e586-6f33-437c-a1de-8e37d3a78b56","Type":"ContainerDied","Data":"4f6eb69aa495e81884721d42d61bf9d772323c7e038303abfed0be45d09ab334"} Apr 17 17:43:37.407067 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:37.406986 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e9c6d8c-eb46-4b9d-88bd-6f402b235473-proxy-tls\") pod \"success-200-isvc-d1a8d-predictor-656694f686-jhmhk\" (UID: \"2e9c6d8c-eb46-4b9d-88bd-6f402b235473\") " pod="kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk" Apr 17 17:43:37.409470 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:37.409443 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e9c6d8c-eb46-4b9d-88bd-6f402b235473-proxy-tls\") pod \"success-200-isvc-d1a8d-predictor-656694f686-jhmhk\" (UID: \"2e9c6d8c-eb46-4b9d-88bd-6f402b235473\") " pod="kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk" Apr 17 17:43:37.614497 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:37.614461 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk" Apr 17 17:43:37.743492 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:37.743353 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk"] Apr 17 17:43:37.746305 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:43:37.746279 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e9c6d8c_eb46_4b9d_88bd_6f402b235473.slice/crio-bd10153b0a5e45b03afacb9e0563bb4bbf3b3ad0c73bf32301c6a99f87d35973 WatchSource:0}: Error finding container bd10153b0a5e45b03afacb9e0563bb4bbf3b3ad0c73bf32301c6a99f87d35973: Status 404 returned error can't find the container with id bd10153b0a5e45b03afacb9e0563bb4bbf3b3ad0c73bf32301c6a99f87d35973 Apr 17 17:43:37.748083 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:37.748068 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:43:38.101242 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:38.101202 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk" event={"ID":"2e9c6d8c-eb46-4b9d-88bd-6f402b235473","Type":"ContainerStarted","Data":"0ba7d7169b8a1876e21ae584ccc91f9fc516ceda351a049c7c2492e865e921fc"} Apr 17 17:43:38.101242 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:38.101239 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk" event={"ID":"2e9c6d8c-eb46-4b9d-88bd-6f402b235473","Type":"ContainerStarted","Data":"1091338ae83361785e3932eea04b5f811749ba855321a17dcd0f0e4da8291fd1"} Apr 17 17:43:38.101242 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:38.101248 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk" 
event={"ID":"2e9c6d8c-eb46-4b9d-88bd-6f402b235473","Type":"ContainerStarted","Data":"bd10153b0a5e45b03afacb9e0563bb4bbf3b3ad0c73bf32301c6a99f87d35973"} Apr 17 17:43:38.101502 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:38.101340 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk" Apr 17 17:43:38.101502 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:38.101445 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk" Apr 17 17:43:38.102762 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:38.102742 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk" podUID="2e9c6d8c-eb46-4b9d-88bd-6f402b235473" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 17 17:43:38.119939 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:38.119896 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk" podStartSLOduration=2.119884038 podStartE2EDuration="2.119884038s" podCreationTimestamp="2026-04-17 17:43:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:43:38.118383701 +0000 UTC m=+1187.644064336" watchObservedRunningTime="2026-04-17 17:43:38.119884038 +0000 UTC m=+1187.645564681" Apr 17 17:43:38.444529 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:38.444446 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-d987b-6595588756-qw7ws" podUID="fa2a2239-31a2-4a3b-8325-c38b645c4b7e" containerName="switch-graph-d987b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:43:39.104229 
ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:39.104196 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk" podUID="2e9c6d8c-eb46-4b9d-88bd-6f402b235473" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 17 17:43:39.224491 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:39.224452 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" podUID="7501e586-6f33-437c-a1de-8e37d3a78b56" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.33:8643/healthz\": dial tcp 10.132.0.33:8643: connect: connection refused" Apr 17 17:43:39.228812 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:39.228782 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" podUID="7501e586-6f33-437c-a1de-8e37d3a78b56" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 17 17:43:39.758406 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:39.758374 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" Apr 17 17:43:39.825583 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:39.825497 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bhlr\" (UniqueName: \"kubernetes.io/projected/7501e586-6f33-437c-a1de-8e37d3a78b56-kube-api-access-8bhlr\") pod \"7501e586-6f33-437c-a1de-8e37d3a78b56\" (UID: \"7501e586-6f33-437c-a1de-8e37d3a78b56\") " Apr 17 17:43:39.825741 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:39.825588 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7501e586-6f33-437c-a1de-8e37d3a78b56-proxy-tls\") pod \"7501e586-6f33-437c-a1de-8e37d3a78b56\" (UID: \"7501e586-6f33-437c-a1de-8e37d3a78b56\") " Apr 17 17:43:39.825741 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:39.825620 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-d987b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7501e586-6f33-437c-a1de-8e37d3a78b56-success-200-isvc-d987b-kube-rbac-proxy-sar-config\") pod \"7501e586-6f33-437c-a1de-8e37d3a78b56\" (UID: \"7501e586-6f33-437c-a1de-8e37d3a78b56\") " Apr 17 17:43:39.826016 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:39.825992 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7501e586-6f33-437c-a1de-8e37d3a78b56-success-200-isvc-d987b-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-d987b-kube-rbac-proxy-sar-config") pod "7501e586-6f33-437c-a1de-8e37d3a78b56" (UID: "7501e586-6f33-437c-a1de-8e37d3a78b56"). InnerVolumeSpecName "success-200-isvc-d987b-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:43:39.827566 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:39.827534 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7501e586-6f33-437c-a1de-8e37d3a78b56-kube-api-access-8bhlr" (OuterVolumeSpecName: "kube-api-access-8bhlr") pod "7501e586-6f33-437c-a1de-8e37d3a78b56" (UID: "7501e586-6f33-437c-a1de-8e37d3a78b56"). InnerVolumeSpecName "kube-api-access-8bhlr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:43:39.827653 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:39.827619 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7501e586-6f33-437c-a1de-8e37d3a78b56-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7501e586-6f33-437c-a1de-8e37d3a78b56" (UID: "7501e586-6f33-437c-a1de-8e37d3a78b56"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:43:39.926619 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:39.926591 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8bhlr\" (UniqueName: \"kubernetes.io/projected/7501e586-6f33-437c-a1de-8e37d3a78b56-kube-api-access-8bhlr\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:43:39.926619 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:39.926618 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7501e586-6f33-437c-a1de-8e37d3a78b56-proxy-tls\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:43:39.926782 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:39.926635 2575 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-d987b-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7501e586-6f33-437c-a1de-8e37d3a78b56-success-200-isvc-d987b-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:43:40.109315 
ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:40.109222 2575 generic.go:358] "Generic (PLEG): container finished" podID="7501e586-6f33-437c-a1de-8e37d3a78b56" containerID="dfbc79a1238808a1602a551bf6d823d6b6ead410232523c78946edb08d36c34f" exitCode=0 Apr 17 17:43:40.109315 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:40.109295 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" Apr 17 17:43:40.109315 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:40.109305 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" event={"ID":"7501e586-6f33-437c-a1de-8e37d3a78b56","Type":"ContainerDied","Data":"dfbc79a1238808a1602a551bf6d823d6b6ead410232523c78946edb08d36c34f"} Apr 17 17:43:40.109798 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:40.109341 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm" event={"ID":"7501e586-6f33-437c-a1de-8e37d3a78b56","Type":"ContainerDied","Data":"df8196e17efd25a609bde4934a64c114309635bbd455907ffb0f4ec1bffe95e9"} Apr 17 17:43:40.109798 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:40.109357 2575 scope.go:117] "RemoveContainer" containerID="4f6eb69aa495e81884721d42d61bf9d772323c7e038303abfed0be45d09ab334" Apr 17 17:43:40.118359 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:40.118341 2575 scope.go:117] "RemoveContainer" containerID="dfbc79a1238808a1602a551bf6d823d6b6ead410232523c78946edb08d36c34f" Apr 17 17:43:40.125360 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:40.125345 2575 scope.go:117] "RemoveContainer" containerID="4f6eb69aa495e81884721d42d61bf9d772323c7e038303abfed0be45d09ab334" Apr 17 17:43:40.125601 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:43:40.125583 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"4f6eb69aa495e81884721d42d61bf9d772323c7e038303abfed0be45d09ab334\": container with ID starting with 4f6eb69aa495e81884721d42d61bf9d772323c7e038303abfed0be45d09ab334 not found: ID does not exist" containerID="4f6eb69aa495e81884721d42d61bf9d772323c7e038303abfed0be45d09ab334" Apr 17 17:43:40.125664 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:40.125612 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f6eb69aa495e81884721d42d61bf9d772323c7e038303abfed0be45d09ab334"} err="failed to get container status \"4f6eb69aa495e81884721d42d61bf9d772323c7e038303abfed0be45d09ab334\": rpc error: code = NotFound desc = could not find container \"4f6eb69aa495e81884721d42d61bf9d772323c7e038303abfed0be45d09ab334\": container with ID starting with 4f6eb69aa495e81884721d42d61bf9d772323c7e038303abfed0be45d09ab334 not found: ID does not exist" Apr 17 17:43:40.125664 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:40.125635 2575 scope.go:117] "RemoveContainer" containerID="dfbc79a1238808a1602a551bf6d823d6b6ead410232523c78946edb08d36c34f" Apr 17 17:43:40.125864 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:43:40.125846 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfbc79a1238808a1602a551bf6d823d6b6ead410232523c78946edb08d36c34f\": container with ID starting with dfbc79a1238808a1602a551bf6d823d6b6ead410232523c78946edb08d36c34f not found: ID does not exist" containerID="dfbc79a1238808a1602a551bf6d823d6b6ead410232523c78946edb08d36c34f" Apr 17 17:43:40.125937 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:40.125872 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfbc79a1238808a1602a551bf6d823d6b6ead410232523c78946edb08d36c34f"} err="failed to get container status \"dfbc79a1238808a1602a551bf6d823d6b6ead410232523c78946edb08d36c34f\": rpc error: code = NotFound desc = could not find container 
\"dfbc79a1238808a1602a551bf6d823d6b6ead410232523c78946edb08d36c34f\": container with ID starting with dfbc79a1238808a1602a551bf6d823d6b6ead410232523c78946edb08d36c34f not found: ID does not exist" Apr 17 17:43:40.132686 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:40.132666 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm"] Apr 17 17:43:40.136337 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:40.136318 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d987b-predictor-7847c5c48b-q65fm"] Apr 17 17:43:41.094541 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:41.094507 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7501e586-6f33-437c-a1de-8e37d3a78b56" path="/var/lib/kubelet/pods/7501e586-6f33-437c-a1de-8e37d3a78b56/volumes" Apr 17 17:43:43.444069 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:43.444033 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-d987b-6595588756-qw7ws" podUID="fa2a2239-31a2-4a3b-8325-c38b645c4b7e" containerName="switch-graph-d987b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:43:44.108585 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:44.108558 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk" Apr 17 17:43:44.109180 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:44.109149 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk" podUID="2e9c6d8c-eb46-4b9d-88bd-6f402b235473" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 17 17:43:48.444478 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:48.444443 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/switch-graph-d987b-6595588756-qw7ws" podUID="fa2a2239-31a2-4a3b-8325-c38b645c4b7e" containerName="switch-graph-d987b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:43:48.444872 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:48.444537 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-d987b-6595588756-qw7ws" Apr 17 17:43:51.057280 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:51.057250 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctj42_a9e0a184-477b-45d6-835a-1606a973a5cf/console-operator/1.log" Apr 17 17:43:51.058981 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:51.058956 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctj42_a9e0a184-477b-45d6-835a-1606a973a5cf/console-operator/1.log" Apr 17 17:43:53.444514 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:53.444469 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-d987b-6595588756-qw7ws" podUID="fa2a2239-31a2-4a3b-8325-c38b645c4b7e" containerName="switch-graph-d987b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:43:54.109585 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:54.109546 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk" podUID="2e9c6d8c-eb46-4b9d-88bd-6f402b235473" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 17 17:43:58.443964 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:43:58.443923 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-d987b-6595588756-qw7ws" podUID="fa2a2239-31a2-4a3b-8325-c38b645c4b7e" containerName="switch-graph-d987b" probeResult="failure" 
output="HTTP probe failed with statuscode: 503" Apr 17 17:44:03.444652 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:03.444614 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-d987b-6595588756-qw7ws" podUID="fa2a2239-31a2-4a3b-8325-c38b645c4b7e" containerName="switch-graph-d987b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:44:04.109160 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:04.109112 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk" podUID="2e9c6d8c-eb46-4b9d-88bd-6f402b235473" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 17 17:44:06.682777 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:06.682756 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-d987b-6595588756-qw7ws" Apr 17 17:44:06.740736 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:06.740707 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fa2a2239-31a2-4a3b-8325-c38b645c4b7e-proxy-tls\") pod \"fa2a2239-31a2-4a3b-8325-c38b645c4b7e\" (UID: \"fa2a2239-31a2-4a3b-8325-c38b645c4b7e\") " Apr 17 17:44:06.740919 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:06.740744 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa2a2239-31a2-4a3b-8325-c38b645c4b7e-openshift-service-ca-bundle\") pod \"fa2a2239-31a2-4a3b-8325-c38b645c4b7e\" (UID: \"fa2a2239-31a2-4a3b-8325-c38b645c4b7e\") " Apr 17 17:44:06.741174 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:06.741145 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/fa2a2239-31a2-4a3b-8325-c38b645c4b7e-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "fa2a2239-31a2-4a3b-8325-c38b645c4b7e" (UID: "fa2a2239-31a2-4a3b-8325-c38b645c4b7e"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:44:06.742761 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:06.742741 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa2a2239-31a2-4a3b-8325-c38b645c4b7e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fa2a2239-31a2-4a3b-8325-c38b645c4b7e" (UID: "fa2a2239-31a2-4a3b-8325-c38b645c4b7e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:44:06.842235 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:06.842152 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fa2a2239-31a2-4a3b-8325-c38b645c4b7e-proxy-tls\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:44:06.842235 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:06.842182 2575 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa2a2239-31a2-4a3b-8325-c38b645c4b7e-openshift-service-ca-bundle\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:44:07.199507 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:07.199474 2575 generic.go:358] "Generic (PLEG): container finished" podID="fa2a2239-31a2-4a3b-8325-c38b645c4b7e" containerID="45bcf2d1570d06c2eff5136c0341dcd9aeaa0f29d2cf17241944a6cb049f8a78" exitCode=0 Apr 17 17:44:07.199670 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:07.199542 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-d987b-6595588756-qw7ws" Apr 17 17:44:07.199670 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:07.199555 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-d987b-6595588756-qw7ws" event={"ID":"fa2a2239-31a2-4a3b-8325-c38b645c4b7e","Type":"ContainerDied","Data":"45bcf2d1570d06c2eff5136c0341dcd9aeaa0f29d2cf17241944a6cb049f8a78"} Apr 17 17:44:07.199670 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:07.199594 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-d987b-6595588756-qw7ws" event={"ID":"fa2a2239-31a2-4a3b-8325-c38b645c4b7e","Type":"ContainerDied","Data":"41240b12898698ff6e8e09225befc0d246c5bed68dba8b3dff72a005646e8c8c"} Apr 17 17:44:07.199670 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:07.199613 2575 scope.go:117] "RemoveContainer" containerID="45bcf2d1570d06c2eff5136c0341dcd9aeaa0f29d2cf17241944a6cb049f8a78" Apr 17 17:44:07.207313 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:07.207291 2575 scope.go:117] "RemoveContainer" containerID="45bcf2d1570d06c2eff5136c0341dcd9aeaa0f29d2cf17241944a6cb049f8a78" Apr 17 17:44:07.207533 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:44:07.207517 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45bcf2d1570d06c2eff5136c0341dcd9aeaa0f29d2cf17241944a6cb049f8a78\": container with ID starting with 45bcf2d1570d06c2eff5136c0341dcd9aeaa0f29d2cf17241944a6cb049f8a78 not found: ID does not exist" containerID="45bcf2d1570d06c2eff5136c0341dcd9aeaa0f29d2cf17241944a6cb049f8a78" Apr 17 17:44:07.207577 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:07.207540 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45bcf2d1570d06c2eff5136c0341dcd9aeaa0f29d2cf17241944a6cb049f8a78"} err="failed to get container status 
\"45bcf2d1570d06c2eff5136c0341dcd9aeaa0f29d2cf17241944a6cb049f8a78\": rpc error: code = NotFound desc = could not find container \"45bcf2d1570d06c2eff5136c0341dcd9aeaa0f29d2cf17241944a6cb049f8a78\": container with ID starting with 45bcf2d1570d06c2eff5136c0341dcd9aeaa0f29d2cf17241944a6cb049f8a78 not found: ID does not exist" Apr 17 17:44:07.214609 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:07.214586 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-d987b-6595588756-qw7ws"] Apr 17 17:44:07.216725 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:07.216706 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-d987b-6595588756-qw7ws"] Apr 17 17:44:09.095358 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:09.095325 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa2a2239-31a2-4a3b-8325-c38b645c4b7e" path="/var/lib/kubelet/pods/fa2a2239-31a2-4a3b-8325-c38b645c4b7e/volumes" Apr 17 17:44:14.109988 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:14.109950 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk" podUID="2e9c6d8c-eb46-4b9d-88bd-6f402b235473" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 17 17:44:24.109964 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:24.109934 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk" Apr 17 17:44:26.145286 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:26.145248 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-fa93d-58f6759648-qxwxn"] Apr 17 17:44:26.145754 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:26.145605 2575 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/sequence-graph-fa93d-58f6759648-qxwxn" podUID="06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9" containerName="sequence-graph-fa93d" containerID="cri-o://0c323abe4280252d1e9ad69741bb4b3e8d7ee586c2918d3d13f4a25dcbda6288" gracePeriod=30 Apr 17 17:44:26.311483 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:26.311449 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb"] Apr 17 17:44:26.311805 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:26.311771 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb" podUID="4ca08d1c-47e9-4764-9175-f10bfaf98bd4" containerName="kserve-container" containerID="cri-o://28bafaba377235a728d826f8a59b33963181da0b047fd049e667ae8adebe5704" gracePeriod=30 Apr 17 17:44:26.311944 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:26.311878 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb" podUID="4ca08d1c-47e9-4764-9175-f10bfaf98bd4" containerName="kube-rbac-proxy" containerID="cri-o://3b6776348c9933373e773a84b9faeb50fa2a4ffd2e965096d644b1b6a147285e" gracePeriod=30 Apr 17 17:44:26.359427 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:26.359392 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl"] Apr 17 17:44:26.359852 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:26.359808 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa2a2239-31a2-4a3b-8325-c38b645c4b7e" containerName="switch-graph-d987b" Apr 17 17:44:26.359852 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:26.359842 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa2a2239-31a2-4a3b-8325-c38b645c4b7e" containerName="switch-graph-d987b" Apr 17 17:44:26.359997 ip-10-0-140-33 
kubenswrapper[2575]: I0417 17:44:26.359863 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7501e586-6f33-437c-a1de-8e37d3a78b56" containerName="kube-rbac-proxy" Apr 17 17:44:26.359997 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:26.359871 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="7501e586-6f33-437c-a1de-8e37d3a78b56" containerName="kube-rbac-proxy" Apr 17 17:44:26.359997 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:26.359882 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7501e586-6f33-437c-a1de-8e37d3a78b56" containerName="kserve-container" Apr 17 17:44:26.359997 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:26.359889 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="7501e586-6f33-437c-a1de-8e37d3a78b56" containerName="kserve-container" Apr 17 17:44:26.359997 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:26.359976 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="7501e586-6f33-437c-a1de-8e37d3a78b56" containerName="kube-rbac-proxy" Apr 17 17:44:26.359997 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:26.359992 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="fa2a2239-31a2-4a3b-8325-c38b645c4b7e" containerName="switch-graph-d987b" Apr 17 17:44:26.360274 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:26.360002 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="7501e586-6f33-437c-a1de-8e37d3a78b56" containerName="kserve-container" Apr 17 17:44:26.364835 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:26.364799 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl" Apr 17 17:44:26.366973 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:26.366948 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-a1277-predictor-serving-cert\"" Apr 17 17:44:26.367104 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:26.367004 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-a1277-kube-rbac-proxy-sar-config\"" Apr 17 17:44:26.376026 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:26.375994 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl"] Apr 17 17:44:26.506078 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:26.506041 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwvl8\" (UniqueName: \"kubernetes.io/projected/1aa5f184-67fd-454e-a1b7-11e64456aebe-kube-api-access-nwvl8\") pod \"success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl\" (UID: \"1aa5f184-67fd-454e-a1b7-11e64456aebe\") " pod="kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl" Apr 17 17:44:26.506248 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:26.506131 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1aa5f184-67fd-454e-a1b7-11e64456aebe-proxy-tls\") pod \"success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl\" (UID: \"1aa5f184-67fd-454e-a1b7-11e64456aebe\") " pod="kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl" Apr 17 17:44:26.506248 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:26.506165 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-a1277-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/1aa5f184-67fd-454e-a1b7-11e64456aebe-success-200-isvc-a1277-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl\" (UID: \"1aa5f184-67fd-454e-a1b7-11e64456aebe\") " pod="kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl" Apr 17 17:44:26.607442 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:26.607404 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nwvl8\" (UniqueName: \"kubernetes.io/projected/1aa5f184-67fd-454e-a1b7-11e64456aebe-kube-api-access-nwvl8\") pod \"success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl\" (UID: \"1aa5f184-67fd-454e-a1b7-11e64456aebe\") " pod="kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl" Apr 17 17:44:26.607617 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:26.607451 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1aa5f184-67fd-454e-a1b7-11e64456aebe-proxy-tls\") pod \"success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl\" (UID: \"1aa5f184-67fd-454e-a1b7-11e64456aebe\") " pod="kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl" Apr 17 17:44:26.607617 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:26.607475 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-a1277-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1aa5f184-67fd-454e-a1b7-11e64456aebe-success-200-isvc-a1277-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl\" (UID: \"1aa5f184-67fd-454e-a1b7-11e64456aebe\") " pod="kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl" Apr 17 17:44:26.608151 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:26.608130 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-a1277-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/1aa5f184-67fd-454e-a1b7-11e64456aebe-success-200-isvc-a1277-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl\" (UID: \"1aa5f184-67fd-454e-a1b7-11e64456aebe\") " pod="kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl" Apr 17 17:44:26.609705 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:26.609674 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1aa5f184-67fd-454e-a1b7-11e64456aebe-proxy-tls\") pod \"success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl\" (UID: \"1aa5f184-67fd-454e-a1b7-11e64456aebe\") " pod="kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl" Apr 17 17:44:26.617194 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:26.617174 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwvl8\" (UniqueName: \"kubernetes.io/projected/1aa5f184-67fd-454e-a1b7-11e64456aebe-kube-api-access-nwvl8\") pod \"success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl\" (UID: \"1aa5f184-67fd-454e-a1b7-11e64456aebe\") " pod="kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl" Apr 17 17:44:26.679166 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:26.679102 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl" Apr 17 17:44:26.802509 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:26.802477 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl"] Apr 17 17:44:26.804283 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:44:26.804257 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1aa5f184_67fd_454e_a1b7_11e64456aebe.slice/crio-1d175d17d08b56daa812ff9bc07835a1735baddd6fd3d5b96d6143759e0a50ef WatchSource:0}: Error finding container 1d175d17d08b56daa812ff9bc07835a1735baddd6fd3d5b96d6143759e0a50ef: Status 404 returned error can't find the container with id 1d175d17d08b56daa812ff9bc07835a1735baddd6fd3d5b96d6143759e0a50ef Apr 17 17:44:27.272108 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:27.272067 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl" event={"ID":"1aa5f184-67fd-454e-a1b7-11e64456aebe","Type":"ContainerStarted","Data":"bbafa73134dc22f2764a0dc1929bbcc2fc083ab1fde2594c6c3ddbdc463a79bc"} Apr 17 17:44:27.272108 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:27.272115 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl" event={"ID":"1aa5f184-67fd-454e-a1b7-11e64456aebe","Type":"ContainerStarted","Data":"833dbbad1c46b7d04ecb204b96e3cc2571d658951075b424f68468774c93631b"} Apr 17 17:44:27.272631 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:27.272133 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl" event={"ID":"1aa5f184-67fd-454e-a1b7-11e64456aebe","Type":"ContainerStarted","Data":"1d175d17d08b56daa812ff9bc07835a1735baddd6fd3d5b96d6143759e0a50ef"} Apr 17 17:44:27.272631 ip-10-0-140-33 
kubenswrapper[2575]: I0417 17:44:27.272151 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl" Apr 17 17:44:27.273753 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:27.273689 2575 generic.go:358] "Generic (PLEG): container finished" podID="4ca08d1c-47e9-4764-9175-f10bfaf98bd4" containerID="3b6776348c9933373e773a84b9faeb50fa2a4ffd2e965096d644b1b6a147285e" exitCode=2 Apr 17 17:44:27.273930 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:27.273754 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb" event={"ID":"4ca08d1c-47e9-4764-9175-f10bfaf98bd4","Type":"ContainerDied","Data":"3b6776348c9933373e773a84b9faeb50fa2a4ffd2e965096d644b1b6a147285e"} Apr 17 17:44:27.291496 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:27.291456 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl" podStartSLOduration=1.291443138 podStartE2EDuration="1.291443138s" podCreationTimestamp="2026-04-17 17:44:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:44:27.288929526 +0000 UTC m=+1236.814610179" watchObservedRunningTime="2026-04-17 17:44:27.291443138 +0000 UTC m=+1236.817123758" Apr 17 17:44:28.278002 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:28.277968 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl" Apr 17 17:44:28.279089 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:28.279061 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl" podUID="1aa5f184-67fd-454e-a1b7-11e64456aebe" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 17 17:44:28.611368 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:28.611282 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-fa93d-58f6759648-qxwxn" podUID="06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9" containerName="sequence-graph-fa93d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:44:29.282703 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:29.282675 2575 generic.go:358] "Generic (PLEG): container finished" podID="4ca08d1c-47e9-4764-9175-f10bfaf98bd4" containerID="28bafaba377235a728d826f8a59b33963181da0b047fd049e667ae8adebe5704" exitCode=0 Apr 17 17:44:29.283086 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:29.282757 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb" event={"ID":"4ca08d1c-47e9-4764-9175-f10bfaf98bd4","Type":"ContainerDied","Data":"28bafaba377235a728d826f8a59b33963181da0b047fd049e667ae8adebe5704"} Apr 17 17:44:29.283162 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:29.283077 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl" podUID="1aa5f184-67fd-454e-a1b7-11e64456aebe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 17 17:44:29.367209 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:29.367186 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb" Apr 17 17:44:29.531282 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:29.531249 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-fa93d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ca08d1c-47e9-4764-9175-f10bfaf98bd4-success-200-isvc-fa93d-kube-rbac-proxy-sar-config\") pod \"4ca08d1c-47e9-4764-9175-f10bfaf98bd4\" (UID: \"4ca08d1c-47e9-4764-9175-f10bfaf98bd4\") " Apr 17 17:44:29.531466 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:29.531291 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ca08d1c-47e9-4764-9175-f10bfaf98bd4-proxy-tls\") pod \"4ca08d1c-47e9-4764-9175-f10bfaf98bd4\" (UID: \"4ca08d1c-47e9-4764-9175-f10bfaf98bd4\") " Apr 17 17:44:29.531466 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:29.531375 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwxmn\" (UniqueName: \"kubernetes.io/projected/4ca08d1c-47e9-4764-9175-f10bfaf98bd4-kube-api-access-fwxmn\") pod \"4ca08d1c-47e9-4764-9175-f10bfaf98bd4\" (UID: \"4ca08d1c-47e9-4764-9175-f10bfaf98bd4\") " Apr 17 17:44:29.531742 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:29.531705 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ca08d1c-47e9-4764-9175-f10bfaf98bd4-success-200-isvc-fa93d-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-fa93d-kube-rbac-proxy-sar-config") pod "4ca08d1c-47e9-4764-9175-f10bfaf98bd4" (UID: "4ca08d1c-47e9-4764-9175-f10bfaf98bd4"). InnerVolumeSpecName "success-200-isvc-fa93d-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:44:29.533508 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:29.533476 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ca08d1c-47e9-4764-9175-f10bfaf98bd4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4ca08d1c-47e9-4764-9175-f10bfaf98bd4" (UID: "4ca08d1c-47e9-4764-9175-f10bfaf98bd4"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:44:29.533508 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:29.533491 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ca08d1c-47e9-4764-9175-f10bfaf98bd4-kube-api-access-fwxmn" (OuterVolumeSpecName: "kube-api-access-fwxmn") pod "4ca08d1c-47e9-4764-9175-f10bfaf98bd4" (UID: "4ca08d1c-47e9-4764-9175-f10bfaf98bd4"). InnerVolumeSpecName "kube-api-access-fwxmn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:44:29.632021 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:29.631985 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fwxmn\" (UniqueName: \"kubernetes.io/projected/4ca08d1c-47e9-4764-9175-f10bfaf98bd4-kube-api-access-fwxmn\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:44:29.632021 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:29.632016 2575 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-fa93d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ca08d1c-47e9-4764-9175-f10bfaf98bd4-success-200-isvc-fa93d-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:44:29.632021 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:29.632027 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ca08d1c-47e9-4764-9175-f10bfaf98bd4-proxy-tls\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:44:30.287030 
ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:30.287000 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb" event={"ID":"4ca08d1c-47e9-4764-9175-f10bfaf98bd4","Type":"ContainerDied","Data":"9cd582323401fea061bc6d37545309bc724e836ad13286d6597b082878dd2924"} Apr 17 17:44:30.287030 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:30.287032 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb" Apr 17 17:44:30.287514 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:30.287044 2575 scope.go:117] "RemoveContainer" containerID="3b6776348c9933373e773a84b9faeb50fa2a4ffd2e965096d644b1b6a147285e" Apr 17 17:44:30.297241 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:30.296724 2575 scope.go:117] "RemoveContainer" containerID="28bafaba377235a728d826f8a59b33963181da0b047fd049e667ae8adebe5704" Apr 17 17:44:30.309463 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:30.309440 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb"] Apr 17 17:44:30.312853 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:30.312813 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-fa93d-predictor-6cc78c6bf6-t78gb"] Apr 17 17:44:31.094515 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:31.094483 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ca08d1c-47e9-4764-9175-f10bfaf98bd4" path="/var/lib/kubelet/pods/4ca08d1c-47e9-4764-9175-f10bfaf98bd4/volumes" Apr 17 17:44:33.612208 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:33.612170 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-fa93d-58f6759648-qxwxn" podUID="06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9" containerName="sequence-graph-fa93d" probeResult="failure" output="HTTP probe failed with 
statuscode: 503" Apr 17 17:44:34.287991 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:34.287962 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl" Apr 17 17:44:34.288621 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:34.288598 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl" podUID="1aa5f184-67fd-454e-a1b7-11e64456aebe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 17 17:44:36.751093 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:36.751021 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-d1a8d-548898d46-886wj"] Apr 17 17:44:36.751436 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:36.751380 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ca08d1c-47e9-4764-9175-f10bfaf98bd4" containerName="kserve-container" Apr 17 17:44:36.751436 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:36.751390 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ca08d1c-47e9-4764-9175-f10bfaf98bd4" containerName="kserve-container" Apr 17 17:44:36.751436 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:36.751400 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ca08d1c-47e9-4764-9175-f10bfaf98bd4" containerName="kube-rbac-proxy" Apr 17 17:44:36.751436 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:36.751405 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ca08d1c-47e9-4764-9175-f10bfaf98bd4" containerName="kube-rbac-proxy" Apr 17 17:44:36.751562 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:36.751458 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ca08d1c-47e9-4764-9175-f10bfaf98bd4" containerName="kube-rbac-proxy" Apr 17 17:44:36.751562 ip-10-0-140-33 kubenswrapper[2575]: 
I0417 17:44:36.751469 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ca08d1c-47e9-4764-9175-f10bfaf98bd4" containerName="kserve-container" Apr 17 17:44:36.755885 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:36.755867 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-d1a8d-548898d46-886wj" Apr 17 17:44:36.758437 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:36.758414 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-d1a8d-serving-cert\"" Apr 17 17:44:36.758597 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:36.758582 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-d1a8d-kube-rbac-proxy-sar-config\"" Apr 17 17:44:36.762064 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:36.762043 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-d1a8d-548898d46-886wj"] Apr 17 17:44:36.896255 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:36.896224 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/443d72a2-0912-4fb1-9fbf-1146f11b002f-openshift-service-ca-bundle\") pod \"ensemble-graph-d1a8d-548898d46-886wj\" (UID: \"443d72a2-0912-4fb1-9fbf-1146f11b002f\") " pod="kserve-ci-e2e-test/ensemble-graph-d1a8d-548898d46-886wj" Apr 17 17:44:36.896414 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:36.896301 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/443d72a2-0912-4fb1-9fbf-1146f11b002f-proxy-tls\") pod \"ensemble-graph-d1a8d-548898d46-886wj\" (UID: \"443d72a2-0912-4fb1-9fbf-1146f11b002f\") " pod="kserve-ci-e2e-test/ensemble-graph-d1a8d-548898d46-886wj" Apr 17 17:44:36.996920 ip-10-0-140-33 
kubenswrapper[2575]: I0417 17:44:36.996878 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/443d72a2-0912-4fb1-9fbf-1146f11b002f-openshift-service-ca-bundle\") pod \"ensemble-graph-d1a8d-548898d46-886wj\" (UID: \"443d72a2-0912-4fb1-9fbf-1146f11b002f\") " pod="kserve-ci-e2e-test/ensemble-graph-d1a8d-548898d46-886wj" Apr 17 17:44:36.997109 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:36.996975 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/443d72a2-0912-4fb1-9fbf-1146f11b002f-proxy-tls\") pod \"ensemble-graph-d1a8d-548898d46-886wj\" (UID: \"443d72a2-0912-4fb1-9fbf-1146f11b002f\") " pod="kserve-ci-e2e-test/ensemble-graph-d1a8d-548898d46-886wj" Apr 17 17:44:36.997526 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:36.997506 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/443d72a2-0912-4fb1-9fbf-1146f11b002f-openshift-service-ca-bundle\") pod \"ensemble-graph-d1a8d-548898d46-886wj\" (UID: \"443d72a2-0912-4fb1-9fbf-1146f11b002f\") " pod="kserve-ci-e2e-test/ensemble-graph-d1a8d-548898d46-886wj" Apr 17 17:44:36.999366 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:36.999342 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/443d72a2-0912-4fb1-9fbf-1146f11b002f-proxy-tls\") pod \"ensemble-graph-d1a8d-548898d46-886wj\" (UID: \"443d72a2-0912-4fb1-9fbf-1146f11b002f\") " pod="kserve-ci-e2e-test/ensemble-graph-d1a8d-548898d46-886wj" Apr 17 17:44:37.066512 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:37.066413 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-d1a8d-548898d46-886wj" Apr 17 17:44:37.192640 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:37.192616 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-d1a8d-548898d46-886wj"] Apr 17 17:44:37.194924 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:44:37.194884 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod443d72a2_0912_4fb1_9fbf_1146f11b002f.slice/crio-17fe15dde51a089c716e9da55f95d4a95ef5698692c8da4104676b3f14333308 WatchSource:0}: Error finding container 17fe15dde51a089c716e9da55f95d4a95ef5698692c8da4104676b3f14333308: Status 404 returned error can't find the container with id 17fe15dde51a089c716e9da55f95d4a95ef5698692c8da4104676b3f14333308 Apr 17 17:44:37.311637 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:37.311605 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-d1a8d-548898d46-886wj" event={"ID":"443d72a2-0912-4fb1-9fbf-1146f11b002f","Type":"ContainerStarted","Data":"7369c530941dbfb5e32cb1d8e53dd94af880dfc432fba2f956ed922e34b75797"} Apr 17 17:44:37.311637 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:37.311641 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-d1a8d-548898d46-886wj" event={"ID":"443d72a2-0912-4fb1-9fbf-1146f11b002f","Type":"ContainerStarted","Data":"17fe15dde51a089c716e9da55f95d4a95ef5698692c8da4104676b3f14333308"} Apr 17 17:44:37.311848 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:37.311685 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-d1a8d-548898d46-886wj" Apr 17 17:44:37.327924 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:37.327837 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-d1a8d-548898d46-886wj" 
podStartSLOduration=1.327808407 podStartE2EDuration="1.327808407s" podCreationTimestamp="2026-04-17 17:44:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:44:37.325970025 +0000 UTC m=+1246.851650643" watchObservedRunningTime="2026-04-17 17:44:37.327808407 +0000 UTC m=+1246.853489026" Apr 17 17:44:38.612088 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:38.612044 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-fa93d-58f6759648-qxwxn" podUID="06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9" containerName="sequence-graph-fa93d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:44:38.612469 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:38.612170 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-fa93d-58f6759648-qxwxn" Apr 17 17:44:43.320479 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:43.320446 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-d1a8d-548898d46-886wj" Apr 17 17:44:43.611997 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:43.611907 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-fa93d-58f6759648-qxwxn" podUID="06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9" containerName="sequence-graph-fa93d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:44:44.288977 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:44.288933 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl" podUID="1aa5f184-67fd-454e-a1b7-11e64456aebe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 17 17:44:46.803463 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:46.803429 2575 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-d1a8d-548898d46-886wj"] Apr 17 17:44:46.803852 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:46.803672 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-d1a8d-548898d46-886wj" podUID="443d72a2-0912-4fb1-9fbf-1146f11b002f" containerName="ensemble-graph-d1a8d" containerID="cri-o://7369c530941dbfb5e32cb1d8e53dd94af880dfc432fba2f956ed922e34b75797" gracePeriod=30 Apr 17 17:44:46.921571 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:46.921502 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk"] Apr 17 17:44:46.921855 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:46.921810 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk" podUID="2e9c6d8c-eb46-4b9d-88bd-6f402b235473" containerName="kserve-container" containerID="cri-o://1091338ae83361785e3932eea04b5f811749ba855321a17dcd0f0e4da8291fd1" gracePeriod=30 Apr 17 17:44:46.922105 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:46.921874 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk" podUID="2e9c6d8c-eb46-4b9d-88bd-6f402b235473" containerName="kube-rbac-proxy" containerID="cri-o://0ba7d7169b8a1876e21ae584ccc91f9fc516ceda351a049c7c2492e865e921fc" gracePeriod=30 Apr 17 17:44:46.945071 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:46.945045 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65"] Apr 17 17:44:46.948787 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:46.948769 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65" Apr 17 17:44:46.951461 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:46.951437 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-313c7-predictor-serving-cert\"" Apr 17 17:44:46.951618 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:46.951593 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-313c7-kube-rbac-proxy-sar-config\"" Apr 17 17:44:46.959381 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:46.959358 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65"] Apr 17 17:44:46.973218 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:46.973185 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-313c7-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/33b7e886-f7fd-4627-9183-0751e15881f5-success-200-isvc-313c7-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-313c7-predictor-5bddb7798d-j8c65\" (UID: \"33b7e886-f7fd-4627-9183-0751e15881f5\") " pod="kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65" Apr 17 17:44:46.973329 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:46.973229 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnrj2\" (UniqueName: \"kubernetes.io/projected/33b7e886-f7fd-4627-9183-0751e15881f5-kube-api-access-pnrj2\") pod \"success-200-isvc-313c7-predictor-5bddb7798d-j8c65\" (UID: \"33b7e886-f7fd-4627-9183-0751e15881f5\") " pod="kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65" Apr 17 17:44:46.973329 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:46.973298 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33b7e886-f7fd-4627-9183-0751e15881f5-proxy-tls\") pod \"success-200-isvc-313c7-predictor-5bddb7798d-j8c65\" (UID: \"33b7e886-f7fd-4627-9183-0751e15881f5\") " pod="kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65" Apr 17 17:44:47.074212 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:47.074139 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-313c7-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/33b7e886-f7fd-4627-9183-0751e15881f5-success-200-isvc-313c7-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-313c7-predictor-5bddb7798d-j8c65\" (UID: \"33b7e886-f7fd-4627-9183-0751e15881f5\") " pod="kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65" Apr 17 17:44:47.074212 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:47.074175 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pnrj2\" (UniqueName: \"kubernetes.io/projected/33b7e886-f7fd-4627-9183-0751e15881f5-kube-api-access-pnrj2\") pod \"success-200-isvc-313c7-predictor-5bddb7798d-j8c65\" (UID: \"33b7e886-f7fd-4627-9183-0751e15881f5\") " pod="kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65" Apr 17 17:44:47.074389 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:47.074214 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33b7e886-f7fd-4627-9183-0751e15881f5-proxy-tls\") pod \"success-200-isvc-313c7-predictor-5bddb7798d-j8c65\" (UID: \"33b7e886-f7fd-4627-9183-0751e15881f5\") " pod="kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65" Apr 17 17:44:47.074866 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:47.074844 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-313c7-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/33b7e886-f7fd-4627-9183-0751e15881f5-success-200-isvc-313c7-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-313c7-predictor-5bddb7798d-j8c65\" (UID: \"33b7e886-f7fd-4627-9183-0751e15881f5\") " pod="kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65" Apr 17 17:44:47.076577 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:47.076556 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33b7e886-f7fd-4627-9183-0751e15881f5-proxy-tls\") pod \"success-200-isvc-313c7-predictor-5bddb7798d-j8c65\" (UID: \"33b7e886-f7fd-4627-9183-0751e15881f5\") " pod="kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65" Apr 17 17:44:47.093183 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:47.093156 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnrj2\" (UniqueName: \"kubernetes.io/projected/33b7e886-f7fd-4627-9183-0751e15881f5-kube-api-access-pnrj2\") pod \"success-200-isvc-313c7-predictor-5bddb7798d-j8c65\" (UID: \"33b7e886-f7fd-4627-9183-0751e15881f5\") " pod="kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65" Apr 17 17:44:47.260175 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:47.260136 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65" Apr 17 17:44:47.348152 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:47.348117 2575 generic.go:358] "Generic (PLEG): container finished" podID="2e9c6d8c-eb46-4b9d-88bd-6f402b235473" containerID="0ba7d7169b8a1876e21ae584ccc91f9fc516ceda351a049c7c2492e865e921fc" exitCode=2 Apr 17 17:44:47.348271 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:47.348155 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk" event={"ID":"2e9c6d8c-eb46-4b9d-88bd-6f402b235473","Type":"ContainerDied","Data":"0ba7d7169b8a1876e21ae584ccc91f9fc516ceda351a049c7c2492e865e921fc"} Apr 17 17:44:47.390078 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:47.390043 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65"] Apr 17 17:44:47.393254 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:44:47.393223 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33b7e886_f7fd_4627_9183_0751e15881f5.slice/crio-cfc5a0d516054c5318671ffab6c9e0de18bf274a9209f7191fbcab9fd96fa6e7 WatchSource:0}: Error finding container cfc5a0d516054c5318671ffab6c9e0de18bf274a9209f7191fbcab9fd96fa6e7: Status 404 returned error can't find the container with id cfc5a0d516054c5318671ffab6c9e0de18bf274a9209f7191fbcab9fd96fa6e7 Apr 17 17:44:48.319286 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:48.319251 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-d1a8d-548898d46-886wj" podUID="443d72a2-0912-4fb1-9fbf-1146f11b002f" containerName="ensemble-graph-d1a8d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:44:48.353131 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:48.353098 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65" event={"ID":"33b7e886-f7fd-4627-9183-0751e15881f5","Type":"ContainerStarted","Data":"e56c23105a60f2dae8a8e06b52ead7f8036d397dbd83f2d3fcdb7b3a48bf6638"} Apr 17 17:44:48.353131 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:48.353133 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65" event={"ID":"33b7e886-f7fd-4627-9183-0751e15881f5","Type":"ContainerStarted","Data":"f21eae66f4a6e92fcd98fb17c90ec5f6c6bbc134277a75fbfd579f1900bb5a4d"} Apr 17 17:44:48.353395 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:48.353144 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65" event={"ID":"33b7e886-f7fd-4627-9183-0751e15881f5","Type":"ContainerStarted","Data":"cfc5a0d516054c5318671ffab6c9e0de18bf274a9209f7191fbcab9fd96fa6e7"} Apr 17 17:44:48.353395 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:48.353258 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65" Apr 17 17:44:48.374687 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:48.374637 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65" podStartSLOduration=2.374623152 podStartE2EDuration="2.374623152s" podCreationTimestamp="2026-04-17 17:44:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:44:48.373361369 +0000 UTC m=+1257.899041988" watchObservedRunningTime="2026-04-17 17:44:48.374623152 +0000 UTC m=+1257.900303771" Apr 17 17:44:48.611579 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:48.611495 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/sequence-graph-fa93d-58f6759648-qxwxn" podUID="06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9" containerName="sequence-graph-fa93d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:44:49.104730 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:49.104690 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk" podUID="2e9c6d8c-eb46-4b9d-88bd-6f402b235473" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.38:8643/healthz\": dial tcp 10.132.0.38:8643: connect: connection refused" Apr 17 17:44:49.359450 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:49.358883 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65" Apr 17 17:44:49.362442 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:49.362405 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65" podUID="33b7e886-f7fd-4627-9183-0751e15881f5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 17 17:44:50.061532 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:50.061507 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk" Apr 17 17:44:50.096351 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:50.096287 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9lwb\" (UniqueName: \"kubernetes.io/projected/2e9c6d8c-eb46-4b9d-88bd-6f402b235473-kube-api-access-b9lwb\") pod \"2e9c6d8c-eb46-4b9d-88bd-6f402b235473\" (UID: \"2e9c6d8c-eb46-4b9d-88bd-6f402b235473\") " Apr 17 17:44:50.096351 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:50.096316 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e9c6d8c-eb46-4b9d-88bd-6f402b235473-proxy-tls\") pod \"2e9c6d8c-eb46-4b9d-88bd-6f402b235473\" (UID: \"2e9c6d8c-eb46-4b9d-88bd-6f402b235473\") " Apr 17 17:44:50.096491 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:50.096351 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-d1a8d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2e9c6d8c-eb46-4b9d-88bd-6f402b235473-success-200-isvc-d1a8d-kube-rbac-proxy-sar-config\") pod \"2e9c6d8c-eb46-4b9d-88bd-6f402b235473\" (UID: \"2e9c6d8c-eb46-4b9d-88bd-6f402b235473\") " Apr 17 17:44:50.096799 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:50.096766 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e9c6d8c-eb46-4b9d-88bd-6f402b235473-success-200-isvc-d1a8d-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-d1a8d-kube-rbac-proxy-sar-config") pod "2e9c6d8c-eb46-4b9d-88bd-6f402b235473" (UID: "2e9c6d8c-eb46-4b9d-88bd-6f402b235473"). InnerVolumeSpecName "success-200-isvc-d1a8d-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:44:50.098478 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:50.098450 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e9c6d8c-eb46-4b9d-88bd-6f402b235473-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2e9c6d8c-eb46-4b9d-88bd-6f402b235473" (UID: "2e9c6d8c-eb46-4b9d-88bd-6f402b235473"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:44:50.098592 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:50.098491 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e9c6d8c-eb46-4b9d-88bd-6f402b235473-kube-api-access-b9lwb" (OuterVolumeSpecName: "kube-api-access-b9lwb") pod "2e9c6d8c-eb46-4b9d-88bd-6f402b235473" (UID: "2e9c6d8c-eb46-4b9d-88bd-6f402b235473"). InnerVolumeSpecName "kube-api-access-b9lwb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:44:50.197454 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:50.197421 2575 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-d1a8d-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2e9c6d8c-eb46-4b9d-88bd-6f402b235473-success-200-isvc-d1a8d-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:44:50.197454 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:50.197448 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b9lwb\" (UniqueName: \"kubernetes.io/projected/2e9c6d8c-eb46-4b9d-88bd-6f402b235473-kube-api-access-b9lwb\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:44:50.197454 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:50.197458 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e9c6d8c-eb46-4b9d-88bd-6f402b235473-proxy-tls\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:44:50.361890 
ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:50.361786 2575 generic.go:358] "Generic (PLEG): container finished" podID="2e9c6d8c-eb46-4b9d-88bd-6f402b235473" containerID="1091338ae83361785e3932eea04b5f811749ba855321a17dcd0f0e4da8291fd1" exitCode=0 Apr 17 17:44:50.361890 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:50.361870 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk" event={"ID":"2e9c6d8c-eb46-4b9d-88bd-6f402b235473","Type":"ContainerDied","Data":"1091338ae83361785e3932eea04b5f811749ba855321a17dcd0f0e4da8291fd1"} Apr 17 17:44:50.361890 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:50.361886 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk" Apr 17 17:44:50.362434 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:50.361919 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk" event={"ID":"2e9c6d8c-eb46-4b9d-88bd-6f402b235473","Type":"ContainerDied","Data":"bd10153b0a5e45b03afacb9e0563bb4bbf3b3ad0c73bf32301c6a99f87d35973"} Apr 17 17:44:50.362434 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:50.361946 2575 scope.go:117] "RemoveContainer" containerID="0ba7d7169b8a1876e21ae584ccc91f9fc516ceda351a049c7c2492e865e921fc" Apr 17 17:44:50.362434 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:50.362294 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65" podUID="33b7e886-f7fd-4627-9183-0751e15881f5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 17 17:44:50.370894 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:50.370879 2575 scope.go:117] "RemoveContainer" containerID="1091338ae83361785e3932eea04b5f811749ba855321a17dcd0f0e4da8291fd1" Apr 17 
17:44:50.378536 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:50.378513 2575 scope.go:117] "RemoveContainer" containerID="0ba7d7169b8a1876e21ae584ccc91f9fc516ceda351a049c7c2492e865e921fc" Apr 17 17:44:50.378781 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:44:50.378758 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ba7d7169b8a1876e21ae584ccc91f9fc516ceda351a049c7c2492e865e921fc\": container with ID starting with 0ba7d7169b8a1876e21ae584ccc91f9fc516ceda351a049c7c2492e865e921fc not found: ID does not exist" containerID="0ba7d7169b8a1876e21ae584ccc91f9fc516ceda351a049c7c2492e865e921fc" Apr 17 17:44:50.378867 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:50.378790 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ba7d7169b8a1876e21ae584ccc91f9fc516ceda351a049c7c2492e865e921fc"} err="failed to get container status \"0ba7d7169b8a1876e21ae584ccc91f9fc516ceda351a049c7c2492e865e921fc\": rpc error: code = NotFound desc = could not find container \"0ba7d7169b8a1876e21ae584ccc91f9fc516ceda351a049c7c2492e865e921fc\": container with ID starting with 0ba7d7169b8a1876e21ae584ccc91f9fc516ceda351a049c7c2492e865e921fc not found: ID does not exist" Apr 17 17:44:50.378867 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:50.378809 2575 scope.go:117] "RemoveContainer" containerID="1091338ae83361785e3932eea04b5f811749ba855321a17dcd0f0e4da8291fd1" Apr 17 17:44:50.379053 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:44:50.379036 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1091338ae83361785e3932eea04b5f811749ba855321a17dcd0f0e4da8291fd1\": container with ID starting with 1091338ae83361785e3932eea04b5f811749ba855321a17dcd0f0e4da8291fd1 not found: ID does not exist" containerID="1091338ae83361785e3932eea04b5f811749ba855321a17dcd0f0e4da8291fd1" Apr 17 17:44:50.379097 
ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:50.379057 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1091338ae83361785e3932eea04b5f811749ba855321a17dcd0f0e4da8291fd1"} err="failed to get container status \"1091338ae83361785e3932eea04b5f811749ba855321a17dcd0f0e4da8291fd1\": rpc error: code = NotFound desc = could not find container \"1091338ae83361785e3932eea04b5f811749ba855321a17dcd0f0e4da8291fd1\": container with ID starting with 1091338ae83361785e3932eea04b5f811749ba855321a17dcd0f0e4da8291fd1 not found: ID does not exist" Apr 17 17:44:50.383325 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:50.383300 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk"] Apr 17 17:44:50.387856 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:50.387818 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-d1a8d-predictor-656694f686-jhmhk"] Apr 17 17:44:51.094959 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:51.094931 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e9c6d8c-eb46-4b9d-88bd-6f402b235473" path="/var/lib/kubelet/pods/2e9c6d8c-eb46-4b9d-88bd-6f402b235473/volumes" Apr 17 17:44:53.319044 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:53.319006 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-d1a8d-548898d46-886wj" podUID="443d72a2-0912-4fb1-9fbf-1146f11b002f" containerName="ensemble-graph-d1a8d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:44:53.611795 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:53.611704 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-fa93d-58f6759648-qxwxn" podUID="06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9" containerName="sequence-graph-fa93d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 
17:44:54.289362 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:54.289317 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl" podUID="1aa5f184-67fd-454e-a1b7-11e64456aebe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 17 17:44:55.366619 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:55.366590 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65" Apr 17 17:44:55.367054 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:55.366999 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65" podUID="33b7e886-f7fd-4627-9183-0751e15881f5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 17 17:44:56.290951 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:56.290926 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-fa93d-58f6759648-qxwxn" Apr 17 17:44:56.347746 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:56.347715 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9-openshift-service-ca-bundle\") pod \"06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9\" (UID: \"06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9\") " Apr 17 17:44:56.347906 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:56.347756 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9-proxy-tls\") pod \"06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9\" (UID: \"06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9\") " Apr 17 17:44:56.348045 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:56.348022 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9" (UID: "06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:44:56.349703 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:56.349675 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9" (UID: "06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:44:56.383486 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:56.383456 2575 generic.go:358] "Generic (PLEG): container finished" podID="06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9" containerID="0c323abe4280252d1e9ad69741bb4b3e8d7ee586c2918d3d13f4a25dcbda6288" exitCode=0 Apr 17 17:44:56.383875 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:56.383501 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-fa93d-58f6759648-qxwxn" event={"ID":"06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9","Type":"ContainerDied","Data":"0c323abe4280252d1e9ad69741bb4b3e8d7ee586c2918d3d13f4a25dcbda6288"} Apr 17 17:44:56.383875 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:56.383523 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-fa93d-58f6759648-qxwxn" Apr 17 17:44:56.383875 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:56.383531 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-fa93d-58f6759648-qxwxn" event={"ID":"06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9","Type":"ContainerDied","Data":"254dc9369179e842092a1463f124e45cd6ce089d0f21b40970d972341c51e35b"} Apr 17 17:44:56.383875 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:56.383553 2575 scope.go:117] "RemoveContainer" containerID="0c323abe4280252d1e9ad69741bb4b3e8d7ee586c2918d3d13f4a25dcbda6288" Apr 17 17:44:56.391952 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:56.391929 2575 scope.go:117] "RemoveContainer" containerID="0c323abe4280252d1e9ad69741bb4b3e8d7ee586c2918d3d13f4a25dcbda6288" Apr 17 17:44:56.392187 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:44:56.392165 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c323abe4280252d1e9ad69741bb4b3e8d7ee586c2918d3d13f4a25dcbda6288\": container with ID starting with 
0c323abe4280252d1e9ad69741bb4b3e8d7ee586c2918d3d13f4a25dcbda6288 not found: ID does not exist" containerID="0c323abe4280252d1e9ad69741bb4b3e8d7ee586c2918d3d13f4a25dcbda6288" Apr 17 17:44:56.392253 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:56.392198 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c323abe4280252d1e9ad69741bb4b3e8d7ee586c2918d3d13f4a25dcbda6288"} err="failed to get container status \"0c323abe4280252d1e9ad69741bb4b3e8d7ee586c2918d3d13f4a25dcbda6288\": rpc error: code = NotFound desc = could not find container \"0c323abe4280252d1e9ad69741bb4b3e8d7ee586c2918d3d13f4a25dcbda6288\": container with ID starting with 0c323abe4280252d1e9ad69741bb4b3e8d7ee586c2918d3d13f4a25dcbda6288 not found: ID does not exist" Apr 17 17:44:56.403931 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:56.403881 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-fa93d-58f6759648-qxwxn"] Apr 17 17:44:56.407114 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:56.407094 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-fa93d-58f6759648-qxwxn"] Apr 17 17:44:56.448629 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:56.448605 2575 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9-openshift-service-ca-bundle\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:44:56.448629 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:56.448626 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9-proxy-tls\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:44:57.094745 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:57.094712 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9" path="/var/lib/kubelet/pods/06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9/volumes" Apr 17 17:44:58.319402 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:58.319367 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-d1a8d-548898d46-886wj" podUID="443d72a2-0912-4fb1-9fbf-1146f11b002f" containerName="ensemble-graph-d1a8d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:44:58.319886 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:44:58.319464 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-d1a8d-548898d46-886wj" Apr 17 17:45:03.318028 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:03.317989 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-d1a8d-548898d46-886wj" podUID="443d72a2-0912-4fb1-9fbf-1146f11b002f" containerName="ensemble-graph-d1a8d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:45:04.288666 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:04.288629 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl" podUID="1aa5f184-67fd-454e-a1b7-11e64456aebe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 17 17:45:05.367948 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:05.367909 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65" podUID="33b7e886-f7fd-4627-9183-0751e15881f5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 17 17:45:08.319073 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:08.319035 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/ensemble-graph-d1a8d-548898d46-886wj" podUID="443d72a2-0912-4fb1-9fbf-1146f11b002f" containerName="ensemble-graph-d1a8d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:45:11.992819 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:45:11.992778 2575 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/daae974a900914ff451c0e87d3fa86b9fbf6c8fb559f14ae0421a17c8a51c742/diff" to get inode usage: stat /var/lib/containers/storage/overlay/daae974a900914ff451c0e87d3fa86b9fbf6c8fb559f14ae0421a17c8a51c742/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/kserve-ci-e2e-test_sequence-graph-fa93d-58f6759648-qxwxn_06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9/sequence-graph-fa93d/0.log" to get inode usage: stat /var/log/pods/kserve-ci-e2e-test_sequence-graph-fa93d-58f6759648-qxwxn_06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9/sequence-graph-fa93d/0.log: no such file or directory Apr 17 17:45:13.318587 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:13.318549 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-d1a8d-548898d46-886wj" podUID="443d72a2-0912-4fb1-9fbf-1146f11b002f" containerName="ensemble-graph-d1a8d" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:45:14.289661 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:14.289632 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl" Apr 17 17:45:15.367064 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:15.367028 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65" podUID="33b7e886-f7fd-4627-9183-0751e15881f5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 17 17:45:16.850694 
ip-10-0-140-33 kubenswrapper[2575]: E0417 17:45:16.850644 2575 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod443d72a2_0912_4fb1_9fbf_1146f11b002f.slice/crio-17fe15dde51a089c716e9da55f95d4a95ef5698692c8da4104676b3f14333308\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06cf717f_8ff8_48d5_bb9d_a6ce56b0bbc9.slice/crio-254dc9369179e842092a1463f124e45cd6ce089d0f21b40970d972341c51e35b\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06cf717f_8ff8_48d5_bb9d_a6ce56b0bbc9.slice/crio-conmon-0c323abe4280252d1e9ad69741bb4b3e8d7ee586c2918d3d13f4a25dcbda6288.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06cf717f_8ff8_48d5_bb9d_a6ce56b0bbc9.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06cf717f_8ff8_48d5_bb9d_a6ce56b0bbc9.slice/crio-0c323abe4280252d1e9ad69741bb4b3e8d7ee586c2918d3d13f4a25dcbda6288.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod443d72a2_0912_4fb1_9fbf_1146f11b002f.slice/crio-conmon-7369c530941dbfb5e32cb1d8e53dd94af880dfc432fba2f956ed922e34b75797.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod443d72a2_0912_4fb1_9fbf_1146f11b002f.slice/crio-7369c530941dbfb5e32cb1d8e53dd94af880dfc432fba2f956ed922e34b75797.scope\": RecentStats: unable to find data in memory cache]" Apr 17 17:45:16.851113 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:45:16.850695 2575 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod443d72a2_0912_4fb1_9fbf_1146f11b002f.slice/crio-17fe15dde51a089c716e9da55f95d4a95ef5698692c8da4104676b3f14333308\": RecentStats: unable to find data in memory cache]" Apr 17 17:45:16.851113 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:45:16.850686 2575 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06cf717f_8ff8_48d5_bb9d_a6ce56b0bbc9.slice/crio-254dc9369179e842092a1463f124e45cd6ce089d0f21b40970d972341c51e35b\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06cf717f_8ff8_48d5_bb9d_a6ce56b0bbc9.slice/crio-0c323abe4280252d1e9ad69741bb4b3e8d7ee586c2918d3d13f4a25dcbda6288.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06cf717f_8ff8_48d5_bb9d_a6ce56b0bbc9.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod443d72a2_0912_4fb1_9fbf_1146f11b002f.slice/crio-conmon-7369c530941dbfb5e32cb1d8e53dd94af880dfc432fba2f956ed922e34b75797.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod443d72a2_0912_4fb1_9fbf_1146f11b002f.slice/crio-7369c530941dbfb5e32cb1d8e53dd94af880dfc432fba2f956ed922e34b75797.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06cf717f_8ff8_48d5_bb9d_a6ce56b0bbc9.slice/crio-conmon-0c323abe4280252d1e9ad69741bb4b3e8d7ee586c2918d3d13f4a25dcbda6288.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod443d72a2_0912_4fb1_9fbf_1146f11b002f.slice/crio-17fe15dde51a089c716e9da55f95d4a95ef5698692c8da4104676b3f14333308\": RecentStats: unable to 
find data in memory cache]" Apr 17 17:45:16.851113 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:45:16.850713 2575 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06cf717f_8ff8_48d5_bb9d_a6ce56b0bbc9.slice/crio-0c323abe4280252d1e9ad69741bb4b3e8d7ee586c2918d3d13f4a25dcbda6288.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod443d72a2_0912_4fb1_9fbf_1146f11b002f.slice/crio-17fe15dde51a089c716e9da55f95d4a95ef5698692c8da4104676b3f14333308\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod443d72a2_0912_4fb1_9fbf_1146f11b002f.slice/crio-7369c530941dbfb5e32cb1d8e53dd94af880dfc432fba2f956ed922e34b75797.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06cf717f_8ff8_48d5_bb9d_a6ce56b0bbc9.slice/crio-conmon-0c323abe4280252d1e9ad69741bb4b3e8d7ee586c2918d3d13f4a25dcbda6288.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod443d72a2_0912_4fb1_9fbf_1146f11b002f.slice/crio-conmon-7369c530941dbfb5e32cb1d8e53dd94af880dfc432fba2f956ed922e34b75797.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06cf717f_8ff8_48d5_bb9d_a6ce56b0bbc9.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06cf717f_8ff8_48d5_bb9d_a6ce56b0bbc9.slice/crio-254dc9369179e842092a1463f124e45cd6ce089d0f21b40970d972341c51e35b\": RecentStats: unable to find data in memory cache]" Apr 17 17:45:16.851113 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:45:16.850691 2575 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" 
err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06cf717f_8ff8_48d5_bb9d_a6ce56b0bbc9.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod443d72a2_0912_4fb1_9fbf_1146f11b002f.slice/crio-17fe15dde51a089c716e9da55f95d4a95ef5698692c8da4104676b3f14333308\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06cf717f_8ff8_48d5_bb9d_a6ce56b0bbc9.slice/crio-0c323abe4280252d1e9ad69741bb4b3e8d7ee586c2918d3d13f4a25dcbda6288.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06cf717f_8ff8_48d5_bb9d_a6ce56b0bbc9.slice/crio-conmon-0c323abe4280252d1e9ad69741bb4b3e8d7ee586c2918d3d13f4a25dcbda6288.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod443d72a2_0912_4fb1_9fbf_1146f11b002f.slice/crio-7369c530941dbfb5e32cb1d8e53dd94af880dfc432fba2f956ed922e34b75797.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06cf717f_8ff8_48d5_bb9d_a6ce56b0bbc9.slice/crio-254dc9369179e842092a1463f124e45cd6ce089d0f21b40970d972341c51e35b\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod443d72a2_0912_4fb1_9fbf_1146f11b002f.slice/crio-conmon-7369c530941dbfb5e32cb1d8e53dd94af880dfc432fba2f956ed922e34b75797.scope\": RecentStats: unable to find data in memory cache]"
Apr 17 17:45:16.998868 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:16.998843 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-d1a8d-548898d46-886wj"
Apr 17 17:45:17.116167 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:17.116081 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/443d72a2-0912-4fb1-9fbf-1146f11b002f-openshift-service-ca-bundle\") pod \"443d72a2-0912-4fb1-9fbf-1146f11b002f\" (UID: \"443d72a2-0912-4fb1-9fbf-1146f11b002f\") "
Apr 17 17:45:17.116167 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:17.116171 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/443d72a2-0912-4fb1-9fbf-1146f11b002f-proxy-tls\") pod \"443d72a2-0912-4fb1-9fbf-1146f11b002f\" (UID: \"443d72a2-0912-4fb1-9fbf-1146f11b002f\") "
Apr 17 17:45:17.116411 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:17.116387 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/443d72a2-0912-4fb1-9fbf-1146f11b002f-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "443d72a2-0912-4fb1-9fbf-1146f11b002f" (UID: "443d72a2-0912-4fb1-9fbf-1146f11b002f"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:45:17.118156 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:17.118134 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/443d72a2-0912-4fb1-9fbf-1146f11b002f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "443d72a2-0912-4fb1-9fbf-1146f11b002f" (UID: "443d72a2-0912-4fb1-9fbf-1146f11b002f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:45:17.216819 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:17.216768 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/443d72a2-0912-4fb1-9fbf-1146f11b002f-proxy-tls\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\""
Apr 17 17:45:17.216819 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:17.216818 2575 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/443d72a2-0912-4fb1-9fbf-1146f11b002f-openshift-service-ca-bundle\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\""
Apr 17 17:45:17.452651 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:17.452556 2575 generic.go:358] "Generic (PLEG): container finished" podID="443d72a2-0912-4fb1-9fbf-1146f11b002f" containerID="7369c530941dbfb5e32cb1d8e53dd94af880dfc432fba2f956ed922e34b75797" exitCode=137
Apr 17 17:45:17.452651 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:17.452617 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-d1a8d-548898d46-886wj" event={"ID":"443d72a2-0912-4fb1-9fbf-1146f11b002f","Type":"ContainerDied","Data":"7369c530941dbfb5e32cb1d8e53dd94af880dfc432fba2f956ed922e34b75797"}
Apr 17 17:45:17.452651 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:17.452627 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-d1a8d-548898d46-886wj"
Apr 17 17:45:17.452651 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:17.452642 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-d1a8d-548898d46-886wj" event={"ID":"443d72a2-0912-4fb1-9fbf-1146f11b002f","Type":"ContainerDied","Data":"17fe15dde51a089c716e9da55f95d4a95ef5698692c8da4104676b3f14333308"}
Apr 17 17:45:17.453003 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:17.452659 2575 scope.go:117] "RemoveContainer" containerID="7369c530941dbfb5e32cb1d8e53dd94af880dfc432fba2f956ed922e34b75797"
Apr 17 17:45:17.461492 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:17.461474 2575 scope.go:117] "RemoveContainer" containerID="7369c530941dbfb5e32cb1d8e53dd94af880dfc432fba2f956ed922e34b75797"
Apr 17 17:45:17.461737 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:45:17.461716 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7369c530941dbfb5e32cb1d8e53dd94af880dfc432fba2f956ed922e34b75797\": container with ID starting with 7369c530941dbfb5e32cb1d8e53dd94af880dfc432fba2f956ed922e34b75797 not found: ID does not exist" containerID="7369c530941dbfb5e32cb1d8e53dd94af880dfc432fba2f956ed922e34b75797"
Apr 17 17:45:17.461809 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:17.461751 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7369c530941dbfb5e32cb1d8e53dd94af880dfc432fba2f956ed922e34b75797"} err="failed to get container status \"7369c530941dbfb5e32cb1d8e53dd94af880dfc432fba2f956ed922e34b75797\": rpc error: code = NotFound desc = could not find container \"7369c530941dbfb5e32cb1d8e53dd94af880dfc432fba2f956ed922e34b75797\": container with ID starting with 7369c530941dbfb5e32cb1d8e53dd94af880dfc432fba2f956ed922e34b75797 not found: ID does not exist"
Apr 17 17:45:17.475249 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:17.475220 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-d1a8d-548898d46-886wj"]
Apr 17 17:45:17.478935 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:17.478910 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-d1a8d-548898d46-886wj"]
Apr 17 17:45:19.094784 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:19.094751 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="443d72a2-0912-4fb1-9fbf-1146f11b002f" path="/var/lib/kubelet/pods/443d72a2-0912-4fb1-9fbf-1146f11b002f/volumes"
Apr 17 17:45:25.367799 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:25.367758 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65" podUID="33b7e886-f7fd-4627-9183-0751e15881f5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 17 17:45:26.346757 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:26.346717 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-a1277-6d9f5c6dd8-sqgk9"]
Apr 17 17:45:26.347202 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:26.347182 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9" containerName="sequence-graph-fa93d"
Apr 17 17:45:26.347301 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:26.347204 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9" containerName="sequence-graph-fa93d"
Apr 17 17:45:26.347301 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:26.347226 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e9c6d8c-eb46-4b9d-88bd-6f402b235473" containerName="kube-rbac-proxy"
Apr 17 17:45:26.347301 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:26.347234 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e9c6d8c-eb46-4b9d-88bd-6f402b235473" containerName="kube-rbac-proxy"
Apr 17 17:45:26.347301 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:26.347257 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="443d72a2-0912-4fb1-9fbf-1146f11b002f" containerName="ensemble-graph-d1a8d"
Apr 17 17:45:26.347301 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:26.347266 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="443d72a2-0912-4fb1-9fbf-1146f11b002f" containerName="ensemble-graph-d1a8d"
Apr 17 17:45:26.347301 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:26.347282 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e9c6d8c-eb46-4b9d-88bd-6f402b235473" containerName="kserve-container"
Apr 17 17:45:26.347301 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:26.347290 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e9c6d8c-eb46-4b9d-88bd-6f402b235473" containerName="kserve-container"
Apr 17 17:45:26.347622 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:26.347368 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="443d72a2-0912-4fb1-9fbf-1146f11b002f" containerName="ensemble-graph-d1a8d"
Apr 17 17:45:26.347622 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:26.347381 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="2e9c6d8c-eb46-4b9d-88bd-6f402b235473" containerName="kserve-container"
Apr 17 17:45:26.347622 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:26.347392 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="06cf717f-8ff8-48d5-bb9d-a6ce56b0bbc9" containerName="sequence-graph-fa93d"
Apr 17 17:45:26.347622 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:26.347403 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="2e9c6d8c-eb46-4b9d-88bd-6f402b235473" containerName="kube-rbac-proxy"
Apr 17 17:45:26.351855 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:26.351810 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-a1277-6d9f5c6dd8-sqgk9"
Apr 17 17:45:26.353963 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:26.353940 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-a1277-serving-cert\""
Apr 17 17:45:26.354065 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:26.353944 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-a1277-kube-rbac-proxy-sar-config\""
Apr 17 17:45:26.360671 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:26.360650 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-a1277-6d9f5c6dd8-sqgk9"]
Apr 17 17:45:26.493330 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:26.493298 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53213afd-bfbe-44e9-a0ce-5136a80871b2-openshift-service-ca-bundle\") pod \"sequence-graph-a1277-6d9f5c6dd8-sqgk9\" (UID: \"53213afd-bfbe-44e9-a0ce-5136a80871b2\") " pod="kserve-ci-e2e-test/sequence-graph-a1277-6d9f5c6dd8-sqgk9"
Apr 17 17:45:26.493672 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:26.493378 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53213afd-bfbe-44e9-a0ce-5136a80871b2-proxy-tls\") pod \"sequence-graph-a1277-6d9f5c6dd8-sqgk9\" (UID: \"53213afd-bfbe-44e9-a0ce-5136a80871b2\") " pod="kserve-ci-e2e-test/sequence-graph-a1277-6d9f5c6dd8-sqgk9"
Apr 17 17:45:26.594135 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:26.594102 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53213afd-bfbe-44e9-a0ce-5136a80871b2-proxy-tls\") pod \"sequence-graph-a1277-6d9f5c6dd8-sqgk9\" (UID: \"53213afd-bfbe-44e9-a0ce-5136a80871b2\") " pod="kserve-ci-e2e-test/sequence-graph-a1277-6d9f5c6dd8-sqgk9"
Apr 17 17:45:26.594311 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:26.594156 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53213afd-bfbe-44e9-a0ce-5136a80871b2-openshift-service-ca-bundle\") pod \"sequence-graph-a1277-6d9f5c6dd8-sqgk9\" (UID: \"53213afd-bfbe-44e9-a0ce-5136a80871b2\") " pod="kserve-ci-e2e-test/sequence-graph-a1277-6d9f5c6dd8-sqgk9"
Apr 17 17:45:26.594761 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:26.594744 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53213afd-bfbe-44e9-a0ce-5136a80871b2-openshift-service-ca-bundle\") pod \"sequence-graph-a1277-6d9f5c6dd8-sqgk9\" (UID: \"53213afd-bfbe-44e9-a0ce-5136a80871b2\") " pod="kserve-ci-e2e-test/sequence-graph-a1277-6d9f5c6dd8-sqgk9"
Apr 17 17:45:26.596452 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:26.596432 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53213afd-bfbe-44e9-a0ce-5136a80871b2-proxy-tls\") pod \"sequence-graph-a1277-6d9f5c6dd8-sqgk9\" (UID: \"53213afd-bfbe-44e9-a0ce-5136a80871b2\") " pod="kserve-ci-e2e-test/sequence-graph-a1277-6d9f5c6dd8-sqgk9"
Apr 17 17:45:26.662696 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:26.662621 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-a1277-6d9f5c6dd8-sqgk9"
Apr 17 17:45:26.785468 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:26.785393 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-a1277-6d9f5c6dd8-sqgk9"]
Apr 17 17:45:26.788158 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:45:26.788123 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53213afd_bfbe_44e9_a0ce_5136a80871b2.slice/crio-b96cb17fe1102a61f8120fc9c57b78ba3b770a5144a189ed64ecdf22cdcb4468 WatchSource:0}: Error finding container b96cb17fe1102a61f8120fc9c57b78ba3b770a5144a189ed64ecdf22cdcb4468: Status 404 returned error can't find the container with id b96cb17fe1102a61f8120fc9c57b78ba3b770a5144a189ed64ecdf22cdcb4468
Apr 17 17:45:27.489266 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:27.489228 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-a1277-6d9f5c6dd8-sqgk9" event={"ID":"53213afd-bfbe-44e9-a0ce-5136a80871b2","Type":"ContainerStarted","Data":"c70f2106d8523259338690f63f4a73388c56eebf22c387a94ca9530a9206e148"}
Apr 17 17:45:27.489266 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:27.489273 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-a1277-6d9f5c6dd8-sqgk9" event={"ID":"53213afd-bfbe-44e9-a0ce-5136a80871b2","Type":"ContainerStarted","Data":"b96cb17fe1102a61f8120fc9c57b78ba3b770a5144a189ed64ecdf22cdcb4468"}
Apr 17 17:45:27.489527 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:27.489299 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-a1277-6d9f5c6dd8-sqgk9"
Apr 17 17:45:27.507881 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:27.507810 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-a1277-6d9f5c6dd8-sqgk9" podStartSLOduration=1.507794699 podStartE2EDuration="1.507794699s" podCreationTimestamp="2026-04-17 17:45:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:45:27.505563642 +0000 UTC m=+1297.031244261" watchObservedRunningTime="2026-04-17 17:45:27.507794699 +0000 UTC m=+1297.033475322"
Apr 17 17:45:33.499227 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:33.499197 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-a1277-6d9f5c6dd8-sqgk9"
Apr 17 17:45:35.367984 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:35.367955 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65"
Apr 17 17:45:36.389862 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:36.389814 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-a1277-6d9f5c6dd8-sqgk9"]
Apr 17 17:45:36.390308 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:36.390098 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-a1277-6d9f5c6dd8-sqgk9" podUID="53213afd-bfbe-44e9-a0ce-5136a80871b2" containerName="sequence-graph-a1277" containerID="cri-o://c70f2106d8523259338690f63f4a73388c56eebf22c387a94ca9530a9206e148" gracePeriod=30
Apr 17 17:45:36.508871 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:36.508815 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl"]
Apr 17 17:45:36.509436 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:36.509403 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl" podUID="1aa5f184-67fd-454e-a1b7-11e64456aebe" containerName="kserve-container" containerID="cri-o://833dbbad1c46b7d04ecb204b96e3cc2571d658951075b424f68468774c93631b" gracePeriod=30
Apr 17 17:45:36.513145 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:36.513115 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl" podUID="1aa5f184-67fd-454e-a1b7-11e64456aebe" containerName="kube-rbac-proxy" containerID="cri-o://bbafa73134dc22f2764a0dc1929bbcc2fc083ab1fde2594c6c3ddbdc463a79bc" gracePeriod=30
Apr 17 17:45:36.533328 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:36.533304 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj"]
Apr 17 17:45:36.537909 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:36.537891 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj"
Apr 17 17:45:36.540066 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:36.540040 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-dd512-predictor-serving-cert\""
Apr 17 17:45:36.540169 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:36.540043 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-dd512-kube-rbac-proxy-sar-config\""
Apr 17 17:45:36.549068 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:36.549046 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj"]
Apr 17 17:45:36.680992 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:36.680920 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4a954bf-fc87-4e85-a790-dc1e57f4e5a5-proxy-tls\") pod \"success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj\" (UID: \"b4a954bf-fc87-4e85-a790-dc1e57f4e5a5\") " pod="kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj"
Apr 17 17:45:36.680992 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:36.680967 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-dd512-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b4a954bf-fc87-4e85-a790-dc1e57f4e5a5-success-200-isvc-dd512-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj\" (UID: \"b4a954bf-fc87-4e85-a790-dc1e57f4e5a5\") " pod="kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj"
Apr 17 17:45:36.681160 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:36.681034 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjwkn\" (UniqueName: \"kubernetes.io/projected/b4a954bf-fc87-4e85-a790-dc1e57f4e5a5-kube-api-access-xjwkn\") pod \"success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj\" (UID: \"b4a954bf-fc87-4e85-a790-dc1e57f4e5a5\") " pod="kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj"
Apr 17 17:45:36.781759 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:36.781724 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4a954bf-fc87-4e85-a790-dc1e57f4e5a5-proxy-tls\") pod \"success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj\" (UID: \"b4a954bf-fc87-4e85-a790-dc1e57f4e5a5\") " pod="kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj"
Apr 17 17:45:36.781969 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:36.781767 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-dd512-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b4a954bf-fc87-4e85-a790-dc1e57f4e5a5-success-200-isvc-dd512-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj\" (UID: \"b4a954bf-fc87-4e85-a790-dc1e57f4e5a5\") " pod="kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj"
Apr 17 17:45:36.781969 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:36.781818 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xjwkn\" (UniqueName: \"kubernetes.io/projected/b4a954bf-fc87-4e85-a790-dc1e57f4e5a5-kube-api-access-xjwkn\") pod \"success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj\" (UID: \"b4a954bf-fc87-4e85-a790-dc1e57f4e5a5\") " pod="kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj"
Apr 17 17:45:36.782475 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:36.782454 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-dd512-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b4a954bf-fc87-4e85-a790-dc1e57f4e5a5-success-200-isvc-dd512-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj\" (UID: \"b4a954bf-fc87-4e85-a790-dc1e57f4e5a5\") " pod="kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj"
Apr 17 17:45:36.784200 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:36.784180 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4a954bf-fc87-4e85-a790-dc1e57f4e5a5-proxy-tls\") pod \"success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj\" (UID: \"b4a954bf-fc87-4e85-a790-dc1e57f4e5a5\") " pod="kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj"
Apr 17 17:45:36.789636 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:36.789612 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjwkn\" (UniqueName: \"kubernetes.io/projected/b4a954bf-fc87-4e85-a790-dc1e57f4e5a5-kube-api-access-xjwkn\") pod \"success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj\" (UID: \"b4a954bf-fc87-4e85-a790-dc1e57f4e5a5\") " pod="kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj"
Apr 17 17:45:36.848993 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:36.848954 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj"
Apr 17 17:45:36.979577 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:36.979498 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj"]
Apr 17 17:45:36.982796 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:45:36.982766 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4a954bf_fc87_4e85_a790_dc1e57f4e5a5.slice/crio-c3511a4e03e1414fc7ffbd9b32927602de9769a42465b61f17272883e7ee3ba1 WatchSource:0}: Error finding container c3511a4e03e1414fc7ffbd9b32927602de9769a42465b61f17272883e7ee3ba1: Status 404 returned error can't find the container with id c3511a4e03e1414fc7ffbd9b32927602de9769a42465b61f17272883e7ee3ba1
Apr 17 17:45:37.523663 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:37.523623 2575 generic.go:358] "Generic (PLEG): container finished" podID="1aa5f184-67fd-454e-a1b7-11e64456aebe" containerID="bbafa73134dc22f2764a0dc1929bbcc2fc083ab1fde2594c6c3ddbdc463a79bc" exitCode=2
Apr 17 17:45:37.524093 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:37.523684 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl" event={"ID":"1aa5f184-67fd-454e-a1b7-11e64456aebe","Type":"ContainerDied","Data":"bbafa73134dc22f2764a0dc1929bbcc2fc083ab1fde2594c6c3ddbdc463a79bc"}
Apr 17 17:45:37.525340 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:37.525309 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj" event={"ID":"b4a954bf-fc87-4e85-a790-dc1e57f4e5a5","Type":"ContainerStarted","Data":"93710dd0b47725c5d9487c1fd1f10c3acb374a608da756933cff1bbf27929b86"}
Apr 17 17:45:37.525340 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:37.525337 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj" event={"ID":"b4a954bf-fc87-4e85-a790-dc1e57f4e5a5","Type":"ContainerStarted","Data":"51fc8ea6d6fa647eda365c136ba5bc3ccaeab457ab3abb7159660c0063bd6ab7"}
Apr 17 17:45:37.525541 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:37.525347 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj" event={"ID":"b4a954bf-fc87-4e85-a790-dc1e57f4e5a5","Type":"ContainerStarted","Data":"c3511a4e03e1414fc7ffbd9b32927602de9769a42465b61f17272883e7ee3ba1"}
Apr 17 17:45:37.525541 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:37.525432 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj"
Apr 17 17:45:37.545682 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:37.545640 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj" podStartSLOduration=1.54562652 podStartE2EDuration="1.54562652s" podCreationTimestamp="2026-04-17 17:45:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:45:37.543330461 +0000 UTC m=+1307.069011081" watchObservedRunningTime="2026-04-17 17:45:37.54562652 +0000 UTC m=+1307.071307140"
Apr 17 17:45:38.497114 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:38.497073 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-a1277-6d9f5c6dd8-sqgk9" podUID="53213afd-bfbe-44e9-a0ce-5136a80871b2" containerName="sequence-graph-a1277" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:45:38.529711 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:38.529666 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj"
Apr 17 17:45:38.531041 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:38.531006 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj" podUID="b4a954bf-fc87-4e85-a790-dc1e57f4e5a5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 17 17:45:39.283623 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:39.283577 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl" podUID="1aa5f184-67fd-454e-a1b7-11e64456aebe" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.39:8643/healthz\": dial tcp 10.132.0.39:8643: connect: connection refused"
Apr 17 17:45:39.534391 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:39.534324 2575 generic.go:358] "Generic (PLEG): container finished" podID="1aa5f184-67fd-454e-a1b7-11e64456aebe" containerID="833dbbad1c46b7d04ecb204b96e3cc2571d658951075b424f68468774c93631b" exitCode=0
Apr 17 17:45:39.534391 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:39.534389 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl" event={"ID":"1aa5f184-67fd-454e-a1b7-11e64456aebe","Type":"ContainerDied","Data":"833dbbad1c46b7d04ecb204b96e3cc2571d658951075b424f68468774c93631b"}
Apr 17 17:45:39.534999 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:39.534975 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj" podUID="b4a954bf-fc87-4e85-a790-dc1e57f4e5a5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 17 17:45:39.656321 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:39.656298 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl"
Apr 17 17:45:39.810779 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:39.810696 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-a1277-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1aa5f184-67fd-454e-a1b7-11e64456aebe-success-200-isvc-a1277-kube-rbac-proxy-sar-config\") pod \"1aa5f184-67fd-454e-a1b7-11e64456aebe\" (UID: \"1aa5f184-67fd-454e-a1b7-11e64456aebe\") "
Apr 17 17:45:39.810779 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:39.810750 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwvl8\" (UniqueName: \"kubernetes.io/projected/1aa5f184-67fd-454e-a1b7-11e64456aebe-kube-api-access-nwvl8\") pod \"1aa5f184-67fd-454e-a1b7-11e64456aebe\" (UID: \"1aa5f184-67fd-454e-a1b7-11e64456aebe\") "
Apr 17 17:45:39.810779 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:39.810778 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1aa5f184-67fd-454e-a1b7-11e64456aebe-proxy-tls\") pod \"1aa5f184-67fd-454e-a1b7-11e64456aebe\" (UID: \"1aa5f184-67fd-454e-a1b7-11e64456aebe\") "
Apr 17 17:45:39.811149 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:39.811106 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1aa5f184-67fd-454e-a1b7-11e64456aebe-success-200-isvc-a1277-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-a1277-kube-rbac-proxy-sar-config") pod "1aa5f184-67fd-454e-a1b7-11e64456aebe" (UID: "1aa5f184-67fd-454e-a1b7-11e64456aebe"). InnerVolumeSpecName "success-200-isvc-a1277-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:45:39.812950 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:39.812927 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aa5f184-67fd-454e-a1b7-11e64456aebe-kube-api-access-nwvl8" (OuterVolumeSpecName: "kube-api-access-nwvl8") pod "1aa5f184-67fd-454e-a1b7-11e64456aebe" (UID: "1aa5f184-67fd-454e-a1b7-11e64456aebe"). InnerVolumeSpecName "kube-api-access-nwvl8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:45:39.812950 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:39.812939 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aa5f184-67fd-454e-a1b7-11e64456aebe-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1aa5f184-67fd-454e-a1b7-11e64456aebe" (UID: "1aa5f184-67fd-454e-a1b7-11e64456aebe"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:45:39.912223 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:39.912191 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nwvl8\" (UniqueName: \"kubernetes.io/projected/1aa5f184-67fd-454e-a1b7-11e64456aebe-kube-api-access-nwvl8\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\""
Apr 17 17:45:39.912223 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:39.912220 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1aa5f184-67fd-454e-a1b7-11e64456aebe-proxy-tls\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\""
Apr 17 17:45:39.912396 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:39.912236 2575 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-a1277-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1aa5f184-67fd-454e-a1b7-11e64456aebe-success-200-isvc-a1277-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\""
Apr 17 17:45:40.540577 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:40.540541 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl" event={"ID":"1aa5f184-67fd-454e-a1b7-11e64456aebe","Type":"ContainerDied","Data":"1d175d17d08b56daa812ff9bc07835a1735baddd6fd3d5b96d6143759e0a50ef"}
Apr 17 17:45:40.540577 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:40.540582 2575 scope.go:117] "RemoveContainer" containerID="bbafa73134dc22f2764a0dc1929bbcc2fc083ab1fde2594c6c3ddbdc463a79bc"
Apr 17 17:45:40.541066 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:40.540585 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl"
Apr 17 17:45:40.548883 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:40.548868 2575 scope.go:117] "RemoveContainer" containerID="833dbbad1c46b7d04ecb204b96e3cc2571d658951075b424f68468774c93631b"
Apr 17 17:45:40.562488 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:40.562466 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl"]
Apr 17 17:45:40.566314 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:40.566293 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a1277-predictor-79b86f6bfd-4wtvl"]
Apr 17 17:45:41.097993 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:41.097959 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aa5f184-67fd-454e-a1b7-11e64456aebe" path="/var/lib/kubelet/pods/1aa5f184-67fd-454e-a1b7-11e64456aebe/volumes"
Apr 17 17:45:43.497054 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:43.497016 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-a1277-6d9f5c6dd8-sqgk9" podUID="53213afd-bfbe-44e9-a0ce-5136a80871b2" containerName="sequence-graph-a1277" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:45:44.538779 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:44.538752 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj"
Apr 17 17:45:44.539222 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:44.539197 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj" podUID="b4a954bf-fc87-4e85-a790-dc1e57f4e5a5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 17 17:45:46.997804 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:46.997763 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-313c7-6b795f969b-wmjtk"]
Apr 17 17:45:46.998253 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:46.998146 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1aa5f184-67fd-454e-a1b7-11e64456aebe" containerName="kserve-container"
Apr 17 17:45:46.998253 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:46.998157 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa5f184-67fd-454e-a1b7-11e64456aebe" containerName="kserve-container"
Apr 17 17:45:46.998253 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:46.998168 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1aa5f184-67fd-454e-a1b7-11e64456aebe" containerName="kube-rbac-proxy"
Apr 17 17:45:46.998253 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:46.998173 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa5f184-67fd-454e-a1b7-11e64456aebe" containerName="kube-rbac-proxy"
Apr 17 17:45:46.998253 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:46.998230 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1aa5f184-67fd-454e-a1b7-11e64456aebe" containerName="kube-rbac-proxy"
Apr 17 17:45:46.998253 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:46.998240 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1aa5f184-67fd-454e-a1b7-11e64456aebe" containerName="kserve-container"
Apr 17 17:45:47.001365 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:47.001348 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-313c7-6b795f969b-wmjtk"
Apr 17 17:45:47.003407 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:47.003378 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-313c7-kube-rbac-proxy-sar-config\""
Apr 17 17:45:47.003407 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:47.003381 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"ensemble-graph-313c7-serving-cert\""
Apr 17 17:45:47.009900 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:47.009872 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-313c7-6b795f969b-wmjtk"]
Apr 17 17:45:47.172622 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:47.172590 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4af22d05-2d05-48f9-baef-808a639508da-proxy-tls\") pod \"ensemble-graph-313c7-6b795f969b-wmjtk\" (UID: \"4af22d05-2d05-48f9-baef-808a639508da\") " pod="kserve-ci-e2e-test/ensemble-graph-313c7-6b795f969b-wmjtk"
Apr 17 17:45:47.172783 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:47.172647 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4af22d05-2d05-48f9-baef-808a639508da-openshift-service-ca-bundle\") pod \"ensemble-graph-313c7-6b795f969b-wmjtk\" (UID: \"4af22d05-2d05-48f9-baef-808a639508da\") " pod="kserve-ci-e2e-test/ensemble-graph-313c7-6b795f969b-wmjtk"
Apr 17 17:45:47.273310 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:47.273229 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4af22d05-2d05-48f9-baef-808a639508da-proxy-tls\") pod \"ensemble-graph-313c7-6b795f969b-wmjtk\" (UID: 
\"4af22d05-2d05-48f9-baef-808a639508da\") " pod="kserve-ci-e2e-test/ensemble-graph-313c7-6b795f969b-wmjtk" Apr 17 17:45:47.273310 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:47.273285 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4af22d05-2d05-48f9-baef-808a639508da-openshift-service-ca-bundle\") pod \"ensemble-graph-313c7-6b795f969b-wmjtk\" (UID: \"4af22d05-2d05-48f9-baef-808a639508da\") " pod="kserve-ci-e2e-test/ensemble-graph-313c7-6b795f969b-wmjtk" Apr 17 17:45:47.273547 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:45:47.273360 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/ensemble-graph-313c7-serving-cert: secret "ensemble-graph-313c7-serving-cert" not found Apr 17 17:45:47.273547 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:45:47.273417 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4af22d05-2d05-48f9-baef-808a639508da-proxy-tls podName:4af22d05-2d05-48f9-baef-808a639508da nodeName:}" failed. No retries permitted until 2026-04-17 17:45:47.773402269 +0000 UTC m=+1317.299082866 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/4af22d05-2d05-48f9-baef-808a639508da-proxy-tls") pod "ensemble-graph-313c7-6b795f969b-wmjtk" (UID: "4af22d05-2d05-48f9-baef-808a639508da") : secret "ensemble-graph-313c7-serving-cert" not found Apr 17 17:45:47.273998 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:47.273974 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4af22d05-2d05-48f9-baef-808a639508da-openshift-service-ca-bundle\") pod \"ensemble-graph-313c7-6b795f969b-wmjtk\" (UID: \"4af22d05-2d05-48f9-baef-808a639508da\") " pod="kserve-ci-e2e-test/ensemble-graph-313c7-6b795f969b-wmjtk" Apr 17 17:45:47.777156 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:47.777107 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4af22d05-2d05-48f9-baef-808a639508da-proxy-tls\") pod \"ensemble-graph-313c7-6b795f969b-wmjtk\" (UID: \"4af22d05-2d05-48f9-baef-808a639508da\") " pod="kserve-ci-e2e-test/ensemble-graph-313c7-6b795f969b-wmjtk" Apr 17 17:45:47.779529 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:47.779506 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4af22d05-2d05-48f9-baef-808a639508da-proxy-tls\") pod \"ensemble-graph-313c7-6b795f969b-wmjtk\" (UID: \"4af22d05-2d05-48f9-baef-808a639508da\") " pod="kserve-ci-e2e-test/ensemble-graph-313c7-6b795f969b-wmjtk" Apr 17 17:45:47.912314 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:47.912281 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-313c7-6b795f969b-wmjtk" Apr 17 17:45:48.032560 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:48.032472 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-313c7-6b795f969b-wmjtk"] Apr 17 17:45:48.036884 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:45:48.036850 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4af22d05_2d05_48f9_baef_808a639508da.slice/crio-1cc3c2dd077eda9f2aa176e1d7c9191a1338337357a2c2096aabe295a7bc4bf1 WatchSource:0}: Error finding container 1cc3c2dd077eda9f2aa176e1d7c9191a1338337357a2c2096aabe295a7bc4bf1: Status 404 returned error can't find the container with id 1cc3c2dd077eda9f2aa176e1d7c9191a1338337357a2c2096aabe295a7bc4bf1 Apr 17 17:45:48.497786 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:48.497752 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-a1277-6d9f5c6dd8-sqgk9" podUID="53213afd-bfbe-44e9-a0ce-5136a80871b2" containerName="sequence-graph-a1277" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:45:48.497995 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:48.497900 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-a1277-6d9f5c6dd8-sqgk9" Apr 17 17:45:48.570117 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:48.570082 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-313c7-6b795f969b-wmjtk" event={"ID":"4af22d05-2d05-48f9-baef-808a639508da","Type":"ContainerStarted","Data":"2a1ed509dccbe33cf4e9795f1980cea2c5e0e31dcaab573486e0e475669e3e49"} Apr 17 17:45:48.570117 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:48.570121 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-313c7-6b795f969b-wmjtk" 
event={"ID":"4af22d05-2d05-48f9-baef-808a639508da","Type":"ContainerStarted","Data":"1cc3c2dd077eda9f2aa176e1d7c9191a1338337357a2c2096aabe295a7bc4bf1"} Apr 17 17:45:48.570319 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:48.570215 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-313c7-6b795f969b-wmjtk" Apr 17 17:45:48.587856 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:48.587790 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/ensemble-graph-313c7-6b795f969b-wmjtk" podStartSLOduration=2.587775206 podStartE2EDuration="2.587775206s" podCreationTimestamp="2026-04-17 17:45:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:45:48.585640716 +0000 UTC m=+1318.111321336" watchObservedRunningTime="2026-04-17 17:45:48.587775206 +0000 UTC m=+1318.113455824" Apr 17 17:45:53.497353 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:53.497308 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-a1277-6d9f5c6dd8-sqgk9" podUID="53213afd-bfbe-44e9-a0ce-5136a80871b2" containerName="sequence-graph-a1277" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:45:54.539195 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:54.539157 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj" podUID="b4a954bf-fc87-4e85-a790-dc1e57f4e5a5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 17 17:45:54.582477 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:45:54.582450 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/ensemble-graph-313c7-6b795f969b-wmjtk" Apr 17 17:45:58.497689 ip-10-0-140-33 kubenswrapper[2575]: I0417 
17:45:58.497647 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-a1277-6d9f5c6dd8-sqgk9" podUID="53213afd-bfbe-44e9-a0ce-5136a80871b2" containerName="sequence-graph-a1277" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:46:03.497223 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:03.497180 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-a1277-6d9f5c6dd8-sqgk9" podUID="53213afd-bfbe-44e9-a0ce-5136a80871b2" containerName="sequence-graph-a1277" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:46:04.539225 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:04.539186 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj" podUID="b4a954bf-fc87-4e85-a790-dc1e57f4e5a5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 17 17:46:06.422519 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:46:06.422481 2575 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53213afd_bfbe_44e9_a0ce_5136a80871b2.slice/crio-b96cb17fe1102a61f8120fc9c57b78ba3b770a5144a189ed64ecdf22cdcb4468\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53213afd_bfbe_44e9_a0ce_5136a80871b2.slice/crio-c70f2106d8523259338690f63f4a73388c56eebf22c387a94ca9530a9206e148.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53213afd_bfbe_44e9_a0ce_5136a80871b2.slice/crio-conmon-c70f2106d8523259338690f63f4a73388c56eebf22c387a94ca9530a9206e148.scope\": RecentStats: unable to find data in memory cache]" Apr 17 17:46:06.550375 ip-10-0-140-33 kubenswrapper[2575]: I0417 
17:46:06.550353 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-a1277-6d9f5c6dd8-sqgk9" Apr 17 17:46:06.624028 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:06.623997 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53213afd-bfbe-44e9-a0ce-5136a80871b2-proxy-tls\") pod \"53213afd-bfbe-44e9-a0ce-5136a80871b2\" (UID: \"53213afd-bfbe-44e9-a0ce-5136a80871b2\") " Apr 17 17:46:06.624202 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:06.624090 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53213afd-bfbe-44e9-a0ce-5136a80871b2-openshift-service-ca-bundle\") pod \"53213afd-bfbe-44e9-a0ce-5136a80871b2\" (UID: \"53213afd-bfbe-44e9-a0ce-5136a80871b2\") " Apr 17 17:46:06.624441 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:06.624418 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53213afd-bfbe-44e9-a0ce-5136a80871b2-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "53213afd-bfbe-44e9-a0ce-5136a80871b2" (UID: "53213afd-bfbe-44e9-a0ce-5136a80871b2"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:46:06.626157 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:06.626129 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53213afd-bfbe-44e9-a0ce-5136a80871b2-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "53213afd-bfbe-44e9-a0ce-5136a80871b2" (UID: "53213afd-bfbe-44e9-a0ce-5136a80871b2"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:46:06.633999 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:06.633967 2575 generic.go:358] "Generic (PLEG): container finished" podID="53213afd-bfbe-44e9-a0ce-5136a80871b2" containerID="c70f2106d8523259338690f63f4a73388c56eebf22c387a94ca9530a9206e148" exitCode=0 Apr 17 17:46:06.634111 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:06.634029 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-a1277-6d9f5c6dd8-sqgk9" Apr 17 17:46:06.634111 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:06.634043 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-a1277-6d9f5c6dd8-sqgk9" event={"ID":"53213afd-bfbe-44e9-a0ce-5136a80871b2","Type":"ContainerDied","Data":"c70f2106d8523259338690f63f4a73388c56eebf22c387a94ca9530a9206e148"} Apr 17 17:46:06.634111 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:06.634086 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-a1277-6d9f5c6dd8-sqgk9" event={"ID":"53213afd-bfbe-44e9-a0ce-5136a80871b2","Type":"ContainerDied","Data":"b96cb17fe1102a61f8120fc9c57b78ba3b770a5144a189ed64ecdf22cdcb4468"} Apr 17 17:46:06.634111 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:06.634103 2575 scope.go:117] "RemoveContainer" containerID="c70f2106d8523259338690f63f4a73388c56eebf22c387a94ca9530a9206e148" Apr 17 17:46:06.642810 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:06.642795 2575 scope.go:117] "RemoveContainer" containerID="c70f2106d8523259338690f63f4a73388c56eebf22c387a94ca9530a9206e148" Apr 17 17:46:06.643123 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:46:06.643081 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c70f2106d8523259338690f63f4a73388c56eebf22c387a94ca9530a9206e148\": container with ID starting with 
c70f2106d8523259338690f63f4a73388c56eebf22c387a94ca9530a9206e148 not found: ID does not exist" containerID="c70f2106d8523259338690f63f4a73388c56eebf22c387a94ca9530a9206e148" Apr 17 17:46:06.643176 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:06.643115 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c70f2106d8523259338690f63f4a73388c56eebf22c387a94ca9530a9206e148"} err="failed to get container status \"c70f2106d8523259338690f63f4a73388c56eebf22c387a94ca9530a9206e148\": rpc error: code = NotFound desc = could not find container \"c70f2106d8523259338690f63f4a73388c56eebf22c387a94ca9530a9206e148\": container with ID starting with c70f2106d8523259338690f63f4a73388c56eebf22c387a94ca9530a9206e148 not found: ID does not exist" Apr 17 17:46:06.654910 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:06.654884 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-a1277-6d9f5c6dd8-sqgk9"] Apr 17 17:46:06.659179 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:06.659160 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-a1277-6d9f5c6dd8-sqgk9"] Apr 17 17:46:06.724962 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:06.724935 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53213afd-bfbe-44e9-a0ce-5136a80871b2-proxy-tls\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:46:06.724962 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:06.724959 2575 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53213afd-bfbe-44e9-a0ce-5136a80871b2-openshift-service-ca-bundle\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:46:07.094816 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:07.094782 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="53213afd-bfbe-44e9-a0ce-5136a80871b2" path="/var/lib/kubelet/pods/53213afd-bfbe-44e9-a0ce-5136a80871b2/volumes" Apr 17 17:46:14.539274 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:14.539236 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj" podUID="b4a954bf-fc87-4e85-a790-dc1e57f4e5a5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 17 17:46:24.539695 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:24.539664 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj" Apr 17 17:46:36.608484 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:36.608447 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sequence-graph-dd512-7648f6b85c-9h6m9"] Apr 17 17:46:36.608988 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:36.608832 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53213afd-bfbe-44e9-a0ce-5136a80871b2" containerName="sequence-graph-a1277" Apr 17 17:46:36.608988 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:36.608845 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="53213afd-bfbe-44e9-a0ce-5136a80871b2" containerName="sequence-graph-a1277" Apr 17 17:46:36.608988 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:36.608909 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="53213afd-bfbe-44e9-a0ce-5136a80871b2" containerName="sequence-graph-a1277" Apr 17 17:46:36.612050 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:36.612029 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-dd512-7648f6b85c-9h6m9" Apr 17 17:46:36.614265 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:36.614242 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-dd512-kube-rbac-proxy-sar-config\"" Apr 17 17:46:36.614363 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:36.614295 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sequence-graph-dd512-serving-cert\"" Apr 17 17:46:36.619663 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:36.619642 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-dd512-7648f6b85c-9h6m9"] Apr 17 17:46:36.779923 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:36.779890 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99521847-f803-41bd-9c3b-2a7dcfe5411e-proxy-tls\") pod \"sequence-graph-dd512-7648f6b85c-9h6m9\" (UID: \"99521847-f803-41bd-9c3b-2a7dcfe5411e\") " pod="kserve-ci-e2e-test/sequence-graph-dd512-7648f6b85c-9h6m9" Apr 17 17:46:36.779923 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:36.779928 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99521847-f803-41bd-9c3b-2a7dcfe5411e-openshift-service-ca-bundle\") pod \"sequence-graph-dd512-7648f6b85c-9h6m9\" (UID: \"99521847-f803-41bd-9c3b-2a7dcfe5411e\") " pod="kserve-ci-e2e-test/sequence-graph-dd512-7648f6b85c-9h6m9" Apr 17 17:46:36.881187 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:36.881099 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99521847-f803-41bd-9c3b-2a7dcfe5411e-proxy-tls\") pod \"sequence-graph-dd512-7648f6b85c-9h6m9\" (UID: 
\"99521847-f803-41bd-9c3b-2a7dcfe5411e\") " pod="kserve-ci-e2e-test/sequence-graph-dd512-7648f6b85c-9h6m9" Apr 17 17:46:36.881187 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:36.881142 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99521847-f803-41bd-9c3b-2a7dcfe5411e-openshift-service-ca-bundle\") pod \"sequence-graph-dd512-7648f6b85c-9h6m9\" (UID: \"99521847-f803-41bd-9c3b-2a7dcfe5411e\") " pod="kserve-ci-e2e-test/sequence-graph-dd512-7648f6b85c-9h6m9" Apr 17 17:46:36.881804 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:36.881772 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99521847-f803-41bd-9c3b-2a7dcfe5411e-openshift-service-ca-bundle\") pod \"sequence-graph-dd512-7648f6b85c-9h6m9\" (UID: \"99521847-f803-41bd-9c3b-2a7dcfe5411e\") " pod="kserve-ci-e2e-test/sequence-graph-dd512-7648f6b85c-9h6m9" Apr 17 17:46:36.883422 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:36.883404 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99521847-f803-41bd-9c3b-2a7dcfe5411e-proxy-tls\") pod \"sequence-graph-dd512-7648f6b85c-9h6m9\" (UID: \"99521847-f803-41bd-9c3b-2a7dcfe5411e\") " pod="kserve-ci-e2e-test/sequence-graph-dd512-7648f6b85c-9h6m9" Apr 17 17:46:36.923061 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:36.923030 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-dd512-7648f6b85c-9h6m9" Apr 17 17:46:37.046134 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:37.046104 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-dd512-7648f6b85c-9h6m9"] Apr 17 17:46:37.049205 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:46:37.049174 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99521847_f803_41bd_9c3b_2a7dcfe5411e.slice/crio-f0f2e3e46ed825f59bb9de4bb18f61c3cc0eb5f6eacc0eeda34406b2177fcb59 WatchSource:0}: Error finding container f0f2e3e46ed825f59bb9de4bb18f61c3cc0eb5f6eacc0eeda34406b2177fcb59: Status 404 returned error can't find the container with id f0f2e3e46ed825f59bb9de4bb18f61c3cc0eb5f6eacc0eeda34406b2177fcb59 Apr 17 17:46:37.742442 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:37.742401 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-dd512-7648f6b85c-9h6m9" event={"ID":"99521847-f803-41bd-9c3b-2a7dcfe5411e","Type":"ContainerStarted","Data":"8476f7a523dc433893be0f414ebe059ff3e10704aacd99a81b786a5f9c5db18e"} Apr 17 17:46:37.742442 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:37.742443 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-dd512-7648f6b85c-9h6m9" event={"ID":"99521847-f803-41bd-9c3b-2a7dcfe5411e","Type":"ContainerStarted","Data":"f0f2e3e46ed825f59bb9de4bb18f61c3cc0eb5f6eacc0eeda34406b2177fcb59"} Apr 17 17:46:37.742884 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:37.742511 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-dd512-7648f6b85c-9h6m9" Apr 17 17:46:37.759643 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:37.759595 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sequence-graph-dd512-7648f6b85c-9h6m9" 
podStartSLOduration=1.759581795 podStartE2EDuration="1.759581795s" podCreationTimestamp="2026-04-17 17:46:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:46:37.757818555 +0000 UTC m=+1367.283499174" watchObservedRunningTime="2026-04-17 17:46:37.759581795 +0000 UTC m=+1367.285262413" Apr 17 17:46:43.750788 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:46:43.750756 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sequence-graph-dd512-7648f6b85c-9h6m9" Apr 17 17:48:51.080185 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:48:51.080161 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctj42_a9e0a184-477b-45d6-835a-1606a973a5cf/console-operator/1.log" Apr 17 17:48:51.082557 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:48:51.082535 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctj42_a9e0a184-477b-45d6-835a-1606a973a5cf/console-operator/1.log" Apr 17 17:53:51.112409 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:53:51.112379 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctj42_a9e0a184-477b-45d6-835a-1606a973a5cf/console-operator/1.log" Apr 17 17:53:51.116671 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:53:51.116640 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctj42_a9e0a184-477b-45d6-835a-1606a973a5cf/console-operator/1.log" Apr 17 17:54:01.658334 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:01.658299 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-313c7-6b795f969b-wmjtk"] Apr 17 17:54:01.658694 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:01.658578 2575 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="kserve-ci-e2e-test/ensemble-graph-313c7-6b795f969b-wmjtk" podUID="4af22d05-2d05-48f9-baef-808a639508da" containerName="ensemble-graph-313c7" containerID="cri-o://2a1ed509dccbe33cf4e9795f1980cea2c5e0e31dcaab573486e0e475669e3e49" gracePeriod=30 Apr 17 17:54:01.765142 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:01.765107 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65"] Apr 17 17:54:01.765473 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:01.765418 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65" podUID="33b7e886-f7fd-4627-9183-0751e15881f5" containerName="kserve-container" containerID="cri-o://f21eae66f4a6e92fcd98fb17c90ec5f6c6bbc134277a75fbfd579f1900bb5a4d" gracePeriod=30 Apr 17 17:54:01.765882 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:01.765487 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65" podUID="33b7e886-f7fd-4627-9183-0751e15881f5" containerName="kube-rbac-proxy" containerID="cri-o://e56c23105a60f2dae8a8e06b52ead7f8036d397dbd83f2d3fcdb7b3a48bf6638" gracePeriod=30 Apr 17 17:54:01.855888 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:01.855849 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj"] Apr 17 17:54:01.859525 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:01.859505 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj" Apr 17 17:54:01.861929 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:01.861896 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-e4a99-kube-rbac-proxy-sar-config\"" Apr 17 17:54:01.862058 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:01.861896 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-e4a99-predictor-serving-cert\"" Apr 17 17:54:01.877500 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:01.877473 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj"] Apr 17 17:54:01.942480 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:01.942404 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/531417bf-dd53-4252-8b1b-81c2027ff5c6-proxy-tls\") pod \"success-200-isvc-e4a99-predictor-58f48cbb54-d9szj\" (UID: \"531417bf-dd53-4252-8b1b-81c2027ff5c6\") " pod="kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj" Apr 17 17:54:01.942480 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:01.942440 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n6vx\" (UniqueName: \"kubernetes.io/projected/531417bf-dd53-4252-8b1b-81c2027ff5c6-kube-api-access-9n6vx\") pod \"success-200-isvc-e4a99-predictor-58f48cbb54-d9szj\" (UID: \"531417bf-dd53-4252-8b1b-81c2027ff5c6\") " pod="kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj" Apr 17 17:54:01.942659 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:01.942532 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-e4a99-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/531417bf-dd53-4252-8b1b-81c2027ff5c6-success-200-isvc-e4a99-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-e4a99-predictor-58f48cbb54-d9szj\" (UID: \"531417bf-dd53-4252-8b1b-81c2027ff5c6\") " pod="kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj" Apr 17 17:54:02.043641 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:02.043604 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/531417bf-dd53-4252-8b1b-81c2027ff5c6-proxy-tls\") pod \"success-200-isvc-e4a99-predictor-58f48cbb54-d9szj\" (UID: \"531417bf-dd53-4252-8b1b-81c2027ff5c6\") " pod="kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj" Apr 17 17:54:02.043641 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:02.043640 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9n6vx\" (UniqueName: \"kubernetes.io/projected/531417bf-dd53-4252-8b1b-81c2027ff5c6-kube-api-access-9n6vx\") pod \"success-200-isvc-e4a99-predictor-58f48cbb54-d9szj\" (UID: \"531417bf-dd53-4252-8b1b-81c2027ff5c6\") " pod="kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj" Apr 17 17:54:02.043905 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:02.043673 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-e4a99-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/531417bf-dd53-4252-8b1b-81c2027ff5c6-success-200-isvc-e4a99-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-e4a99-predictor-58f48cbb54-d9szj\" (UID: \"531417bf-dd53-4252-8b1b-81c2027ff5c6\") " pod="kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj" Apr 17 17:54:02.043905 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:54:02.043771 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-serving-cert: secret "success-200-isvc-e4a99-predictor-serving-cert" not found Apr 
17 17:54:02.043905 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:54:02.043871 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/531417bf-dd53-4252-8b1b-81c2027ff5c6-proxy-tls podName:531417bf-dd53-4252-8b1b-81c2027ff5c6 nodeName:}" failed. No retries permitted until 2026-04-17 17:54:02.54384839 +0000 UTC m=+1812.069528991 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/531417bf-dd53-4252-8b1b-81c2027ff5c6-proxy-tls") pod "success-200-isvc-e4a99-predictor-58f48cbb54-d9szj" (UID: "531417bf-dd53-4252-8b1b-81c2027ff5c6") : secret "success-200-isvc-e4a99-predictor-serving-cert" not found Apr 17 17:54:02.044377 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:02.044351 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-e4a99-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/531417bf-dd53-4252-8b1b-81c2027ff5c6-success-200-isvc-e4a99-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-e4a99-predictor-58f48cbb54-d9szj\" (UID: \"531417bf-dd53-4252-8b1b-81c2027ff5c6\") " pod="kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj" Apr 17 17:54:02.054419 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:02.054400 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n6vx\" (UniqueName: \"kubernetes.io/projected/531417bf-dd53-4252-8b1b-81c2027ff5c6-kube-api-access-9n6vx\") pod \"success-200-isvc-e4a99-predictor-58f48cbb54-d9szj\" (UID: \"531417bf-dd53-4252-8b1b-81c2027ff5c6\") " pod="kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj" Apr 17 17:54:02.208129 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:02.208043 2575 generic.go:358] "Generic (PLEG): container finished" podID="33b7e886-f7fd-4627-9183-0751e15881f5" containerID="e56c23105a60f2dae8a8e06b52ead7f8036d397dbd83f2d3fcdb7b3a48bf6638" exitCode=2 Apr 17 17:54:02.208129 ip-10-0-140-33 
kubenswrapper[2575]: I0417 17:54:02.208099 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65" event={"ID":"33b7e886-f7fd-4627-9183-0751e15881f5","Type":"ContainerDied","Data":"e56c23105a60f2dae8a8e06b52ead7f8036d397dbd83f2d3fcdb7b3a48bf6638"} Apr 17 17:54:02.547397 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:02.547359 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/531417bf-dd53-4252-8b1b-81c2027ff5c6-proxy-tls\") pod \"success-200-isvc-e4a99-predictor-58f48cbb54-d9szj\" (UID: \"531417bf-dd53-4252-8b1b-81c2027ff5c6\") " pod="kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj" Apr 17 17:54:02.549737 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:02.549712 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/531417bf-dd53-4252-8b1b-81c2027ff5c6-proxy-tls\") pod \"success-200-isvc-e4a99-predictor-58f48cbb54-d9szj\" (UID: \"531417bf-dd53-4252-8b1b-81c2027ff5c6\") " pod="kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj" Apr 17 17:54:02.772447 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:02.772405 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj" Apr 17 17:54:02.900605 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:02.900568 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj"] Apr 17 17:54:02.903386 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:54:02.903359 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod531417bf_dd53_4252_8b1b_81c2027ff5c6.slice/crio-b684644d5e226e37b14eae0cadda1ebb2a302fdf3e0ab0a7d9506c424f62e1fa WatchSource:0}: Error finding container b684644d5e226e37b14eae0cadda1ebb2a302fdf3e0ab0a7d9506c424f62e1fa: Status 404 returned error can't find the container with id b684644d5e226e37b14eae0cadda1ebb2a302fdf3e0ab0a7d9506c424f62e1fa Apr 17 17:54:02.905087 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:02.905071 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:54:03.213121 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:03.213032 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj" event={"ID":"531417bf-dd53-4252-8b1b-81c2027ff5c6","Type":"ContainerStarted","Data":"f681ad65906074020a66c0d004b28a33715eddecdec6c5594f7736baa45de146"} Apr 17 17:54:03.213121 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:03.213067 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj" event={"ID":"531417bf-dd53-4252-8b1b-81c2027ff5c6","Type":"ContainerStarted","Data":"e28ae3a4e73a2ca03f8aaf0dde2c487ce47d50eee2b58187958228ce6d000cd9"} Apr 17 17:54:03.213121 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:03.213078 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj" 
event={"ID":"531417bf-dd53-4252-8b1b-81c2027ff5c6","Type":"ContainerStarted","Data":"b684644d5e226e37b14eae0cadda1ebb2a302fdf3e0ab0a7d9506c424f62e1fa"} Apr 17 17:54:03.213121 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:03.213091 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj" Apr 17 17:54:03.213121 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:03.213116 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj" Apr 17 17:54:03.214585 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:03.214556 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj" podUID="531417bf-dd53-4252-8b1b-81c2027ff5c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 17 17:54:03.232388 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:03.232336 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj" podStartSLOduration=2.232293759 podStartE2EDuration="2.232293759s" podCreationTimestamp="2026-04-17 17:54:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:54:03.23169259 +0000 UTC m=+1812.757373222" watchObservedRunningTime="2026-04-17 17:54:03.232293759 +0000 UTC m=+1812.757974567" Apr 17 17:54:04.216671 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:04.216633 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj" podUID="531417bf-dd53-4252-8b1b-81c2027ff5c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 17 
17:54:04.581225 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:04.581185 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-313c7-6b795f969b-wmjtk" podUID="4af22d05-2d05-48f9-baef-808a639508da" containerName="ensemble-graph-313c7" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:54:05.115185 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:05.115163 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65" Apr 17 17:54:05.170335 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:05.170307 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-313c7-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/33b7e886-f7fd-4627-9183-0751e15881f5-success-200-isvc-313c7-kube-rbac-proxy-sar-config\") pod \"33b7e886-f7fd-4627-9183-0751e15881f5\" (UID: \"33b7e886-f7fd-4627-9183-0751e15881f5\") " Apr 17 17:54:05.170462 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:05.170380 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnrj2\" (UniqueName: \"kubernetes.io/projected/33b7e886-f7fd-4627-9183-0751e15881f5-kube-api-access-pnrj2\") pod \"33b7e886-f7fd-4627-9183-0751e15881f5\" (UID: \"33b7e886-f7fd-4627-9183-0751e15881f5\") " Apr 17 17:54:05.170462 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:05.170407 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33b7e886-f7fd-4627-9183-0751e15881f5-proxy-tls\") pod \"33b7e886-f7fd-4627-9183-0751e15881f5\" (UID: \"33b7e886-f7fd-4627-9183-0751e15881f5\") " Apr 17 17:54:05.170705 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:05.170662 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/33b7e886-f7fd-4627-9183-0751e15881f5-success-200-isvc-313c7-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-313c7-kube-rbac-proxy-sar-config") pod "33b7e886-f7fd-4627-9183-0751e15881f5" (UID: "33b7e886-f7fd-4627-9183-0751e15881f5"). InnerVolumeSpecName "success-200-isvc-313c7-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:54:05.170815 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:05.170792 2575 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-313c7-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/33b7e886-f7fd-4627-9183-0751e15881f5-success-200-isvc-313c7-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:54:05.172720 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:05.172703 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33b7e886-f7fd-4627-9183-0751e15881f5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "33b7e886-f7fd-4627-9183-0751e15881f5" (UID: "33b7e886-f7fd-4627-9183-0751e15881f5"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:54:05.173144 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:05.173118 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33b7e886-f7fd-4627-9183-0751e15881f5-kube-api-access-pnrj2" (OuterVolumeSpecName: "kube-api-access-pnrj2") pod "33b7e886-f7fd-4627-9183-0751e15881f5" (UID: "33b7e886-f7fd-4627-9183-0751e15881f5"). InnerVolumeSpecName "kube-api-access-pnrj2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:54:05.222424 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:05.222350 2575 generic.go:358] "Generic (PLEG): container finished" podID="33b7e886-f7fd-4627-9183-0751e15881f5" containerID="f21eae66f4a6e92fcd98fb17c90ec5f6c6bbc134277a75fbfd579f1900bb5a4d" exitCode=0 Apr 17 17:54:05.222424 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:05.222410 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65" event={"ID":"33b7e886-f7fd-4627-9183-0751e15881f5","Type":"ContainerDied","Data":"f21eae66f4a6e92fcd98fb17c90ec5f6c6bbc134277a75fbfd579f1900bb5a4d"} Apr 17 17:54:05.222424 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:05.222420 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65" Apr 17 17:54:05.222923 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:05.222435 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65" event={"ID":"33b7e886-f7fd-4627-9183-0751e15881f5","Type":"ContainerDied","Data":"cfc5a0d516054c5318671ffab6c9e0de18bf274a9209f7191fbcab9fd96fa6e7"} Apr 17 17:54:05.222923 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:05.222451 2575 scope.go:117] "RemoveContainer" containerID="e56c23105a60f2dae8a8e06b52ead7f8036d397dbd83f2d3fcdb7b3a48bf6638" Apr 17 17:54:05.232349 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:05.232332 2575 scope.go:117] "RemoveContainer" containerID="f21eae66f4a6e92fcd98fb17c90ec5f6c6bbc134277a75fbfd579f1900bb5a4d" Apr 17 17:54:05.239438 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:05.239420 2575 scope.go:117] "RemoveContainer" containerID="e56c23105a60f2dae8a8e06b52ead7f8036d397dbd83f2d3fcdb7b3a48bf6638" Apr 17 17:54:05.239673 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:54:05.239655 2575 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e56c23105a60f2dae8a8e06b52ead7f8036d397dbd83f2d3fcdb7b3a48bf6638\": container with ID starting with e56c23105a60f2dae8a8e06b52ead7f8036d397dbd83f2d3fcdb7b3a48bf6638 not found: ID does not exist" containerID="e56c23105a60f2dae8a8e06b52ead7f8036d397dbd83f2d3fcdb7b3a48bf6638" Apr 17 17:54:05.239736 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:05.239680 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e56c23105a60f2dae8a8e06b52ead7f8036d397dbd83f2d3fcdb7b3a48bf6638"} err="failed to get container status \"e56c23105a60f2dae8a8e06b52ead7f8036d397dbd83f2d3fcdb7b3a48bf6638\": rpc error: code = NotFound desc = could not find container \"e56c23105a60f2dae8a8e06b52ead7f8036d397dbd83f2d3fcdb7b3a48bf6638\": container with ID starting with e56c23105a60f2dae8a8e06b52ead7f8036d397dbd83f2d3fcdb7b3a48bf6638 not found: ID does not exist" Apr 17 17:54:05.239736 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:05.239698 2575 scope.go:117] "RemoveContainer" containerID="f21eae66f4a6e92fcd98fb17c90ec5f6c6bbc134277a75fbfd579f1900bb5a4d" Apr 17 17:54:05.239929 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:54:05.239912 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f21eae66f4a6e92fcd98fb17c90ec5f6c6bbc134277a75fbfd579f1900bb5a4d\": container with ID starting with f21eae66f4a6e92fcd98fb17c90ec5f6c6bbc134277a75fbfd579f1900bb5a4d not found: ID does not exist" containerID="f21eae66f4a6e92fcd98fb17c90ec5f6c6bbc134277a75fbfd579f1900bb5a4d" Apr 17 17:54:05.239976 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:05.239936 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f21eae66f4a6e92fcd98fb17c90ec5f6c6bbc134277a75fbfd579f1900bb5a4d"} err="failed to get container status 
\"f21eae66f4a6e92fcd98fb17c90ec5f6c6bbc134277a75fbfd579f1900bb5a4d\": rpc error: code = NotFound desc = could not find container \"f21eae66f4a6e92fcd98fb17c90ec5f6c6bbc134277a75fbfd579f1900bb5a4d\": container with ID starting with f21eae66f4a6e92fcd98fb17c90ec5f6c6bbc134277a75fbfd579f1900bb5a4d not found: ID does not exist" Apr 17 17:54:05.245030 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:05.245008 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65"] Apr 17 17:54:05.246526 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:05.246506 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-313c7-predictor-5bddb7798d-j8c65"] Apr 17 17:54:05.272076 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:05.272058 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pnrj2\" (UniqueName: \"kubernetes.io/projected/33b7e886-f7fd-4627-9183-0751e15881f5-kube-api-access-pnrj2\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:54:05.272153 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:05.272078 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/33b7e886-f7fd-4627-9183-0751e15881f5-proxy-tls\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:54:07.099360 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:07.099323 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33b7e886-f7fd-4627-9183-0751e15881f5" path="/var/lib/kubelet/pods/33b7e886-f7fd-4627-9183-0751e15881f5/volumes" Apr 17 17:54:09.221080 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:09.221049 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj" Apr 17 17:54:09.221603 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:09.221573 2575 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj" podUID="531417bf-dd53-4252-8b1b-81c2027ff5c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 17 17:54:09.580857 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:09.580795 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-313c7-6b795f969b-wmjtk" podUID="4af22d05-2d05-48f9-baef-808a639508da" containerName="ensemble-graph-313c7" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:54:14.581039 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:14.581001 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-313c7-6b795f969b-wmjtk" podUID="4af22d05-2d05-48f9-baef-808a639508da" containerName="ensemble-graph-313c7" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:54:14.581497 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:14.581110 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/ensemble-graph-313c7-6b795f969b-wmjtk" Apr 17 17:54:19.221908 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:19.221869 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj" podUID="531417bf-dd53-4252-8b1b-81c2027ff5c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 17 17:54:19.580338 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:19.580299 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-313c7-6b795f969b-wmjtk" podUID="4af22d05-2d05-48f9-baef-808a639508da" containerName="ensemble-graph-313c7" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:54:24.580846 ip-10-0-140-33 kubenswrapper[2575]: I0417 
17:54:24.580785 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-313c7-6b795f969b-wmjtk" podUID="4af22d05-2d05-48f9-baef-808a639508da" containerName="ensemble-graph-313c7" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:54:29.222222 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:29.222179 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj" podUID="531417bf-dd53-4252-8b1b-81c2027ff5c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 17 17:54:29.580471 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:29.580429 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/ensemble-graph-313c7-6b795f969b-wmjtk" podUID="4af22d05-2d05-48f9-baef-808a639508da" containerName="ensemble-graph-313c7" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:54:31.799412 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:31.799389 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-313c7-6b795f969b-wmjtk" Apr 17 17:54:31.888459 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:31.888431 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4af22d05-2d05-48f9-baef-808a639508da-proxy-tls\") pod \"4af22d05-2d05-48f9-baef-808a639508da\" (UID: \"4af22d05-2d05-48f9-baef-808a639508da\") " Apr 17 17:54:31.888623 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:31.888479 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4af22d05-2d05-48f9-baef-808a639508da-openshift-service-ca-bundle\") pod \"4af22d05-2d05-48f9-baef-808a639508da\" (UID: \"4af22d05-2d05-48f9-baef-808a639508da\") " Apr 17 17:54:31.888873 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:31.888852 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4af22d05-2d05-48f9-baef-808a639508da-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "4af22d05-2d05-48f9-baef-808a639508da" (UID: "4af22d05-2d05-48f9-baef-808a639508da"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:54:31.890499 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:31.890477 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4af22d05-2d05-48f9-baef-808a639508da-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4af22d05-2d05-48f9-baef-808a639508da" (UID: "4af22d05-2d05-48f9-baef-808a639508da"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:54:31.990007 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:31.989933 2575 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4af22d05-2d05-48f9-baef-808a639508da-openshift-service-ca-bundle\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:54:31.990007 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:31.989957 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4af22d05-2d05-48f9-baef-808a639508da-proxy-tls\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:54:32.312722 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:32.312684 2575 generic.go:358] "Generic (PLEG): container finished" podID="4af22d05-2d05-48f9-baef-808a639508da" containerID="2a1ed509dccbe33cf4e9795f1980cea2c5e0e31dcaab573486e0e475669e3e49" exitCode=0 Apr 17 17:54:32.312902 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:32.312745 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-313c7-6b795f969b-wmjtk" event={"ID":"4af22d05-2d05-48f9-baef-808a639508da","Type":"ContainerDied","Data":"2a1ed509dccbe33cf4e9795f1980cea2c5e0e31dcaab573486e0e475669e3e49"} Apr 17 17:54:32.312902 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:32.312774 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/ensemble-graph-313c7-6b795f969b-wmjtk" event={"ID":"4af22d05-2d05-48f9-baef-808a639508da","Type":"ContainerDied","Data":"1cc3c2dd077eda9f2aa176e1d7c9191a1338337357a2c2096aabe295a7bc4bf1"} Apr 17 17:54:32.312902 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:32.312790 2575 scope.go:117] "RemoveContainer" containerID="2a1ed509dccbe33cf4e9795f1980cea2c5e0e31dcaab573486e0e475669e3e49" Apr 17 17:54:32.312902 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:32.312752 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/ensemble-graph-313c7-6b795f969b-wmjtk" Apr 17 17:54:32.321046 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:32.321027 2575 scope.go:117] "RemoveContainer" containerID="2a1ed509dccbe33cf4e9795f1980cea2c5e0e31dcaab573486e0e475669e3e49" Apr 17 17:54:32.321300 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:54:32.321280 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a1ed509dccbe33cf4e9795f1980cea2c5e0e31dcaab573486e0e475669e3e49\": container with ID starting with 2a1ed509dccbe33cf4e9795f1980cea2c5e0e31dcaab573486e0e475669e3e49 not found: ID does not exist" containerID="2a1ed509dccbe33cf4e9795f1980cea2c5e0e31dcaab573486e0e475669e3e49" Apr 17 17:54:32.321353 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:32.321310 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a1ed509dccbe33cf4e9795f1980cea2c5e0e31dcaab573486e0e475669e3e49"} err="failed to get container status \"2a1ed509dccbe33cf4e9795f1980cea2c5e0e31dcaab573486e0e475669e3e49\": rpc error: code = NotFound desc = could not find container \"2a1ed509dccbe33cf4e9795f1980cea2c5e0e31dcaab573486e0e475669e3e49\": container with ID starting with 2a1ed509dccbe33cf4e9795f1980cea2c5e0e31dcaab573486e0e475669e3e49 not found: ID does not exist" Apr 17 17:54:32.332729 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:32.332705 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-313c7-6b795f969b-wmjtk"] Apr 17 17:54:32.336553 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:32.336527 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/ensemble-graph-313c7-6b795f969b-wmjtk"] Apr 17 17:54:33.093943 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:33.093908 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4af22d05-2d05-48f9-baef-808a639508da" 
path="/var/lib/kubelet/pods/4af22d05-2d05-48f9-baef-808a639508da/volumes" Apr 17 17:54:39.221758 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:39.221718 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj" podUID="531417bf-dd53-4252-8b1b-81c2027ff5c6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 17 17:54:49.222160 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:49.222127 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj" Apr 17 17:54:51.191183 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:51.191155 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-dd512-7648f6b85c-9h6m9"] Apr 17 17:54:51.191483 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:51.191397 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sequence-graph-dd512-7648f6b85c-9h6m9" podUID="99521847-f803-41bd-9c3b-2a7dcfe5411e" containerName="sequence-graph-dd512" containerID="cri-o://8476f7a523dc433893be0f414ebe059ff3e10704aacd99a81b786a5f9c5db18e" gracePeriod=30 Apr 17 17:54:51.307522 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:51.307491 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj"] Apr 17 17:54:51.307803 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:51.307776 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj" podUID="b4a954bf-fc87-4e85-a790-dc1e57f4e5a5" containerName="kserve-container" containerID="cri-o://51fc8ea6d6fa647eda365c136ba5bc3ccaeab457ab3abb7159660c0063bd6ab7" gracePeriod=30 Apr 17 17:54:51.307907 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:51.307794 2575 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj" podUID="b4a954bf-fc87-4e85-a790-dc1e57f4e5a5" containerName="kube-rbac-proxy" containerID="cri-o://93710dd0b47725c5d9487c1fd1f10c3acb374a608da756933cff1bbf27929b86" gracePeriod=30 Apr 17 17:54:51.365254 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:51.365221 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27"] Apr 17 17:54:51.365762 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:51.365745 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4af22d05-2d05-48f9-baef-808a639508da" containerName="ensemble-graph-313c7" Apr 17 17:54:51.365762 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:51.365763 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4af22d05-2d05-48f9-baef-808a639508da" containerName="ensemble-graph-313c7" Apr 17 17:54:51.365966 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:51.365804 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33b7e886-f7fd-4627-9183-0751e15881f5" containerName="kserve-container" Apr 17 17:54:51.365966 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:51.365814 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="33b7e886-f7fd-4627-9183-0751e15881f5" containerName="kserve-container" Apr 17 17:54:51.365966 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:51.365841 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33b7e886-f7fd-4627-9183-0751e15881f5" containerName="kube-rbac-proxy" Apr 17 17:54:51.365966 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:51.365851 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="33b7e886-f7fd-4627-9183-0751e15881f5" containerName="kube-rbac-proxy" Apr 17 17:54:51.365966 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:51.365940 2575 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="33b7e886-f7fd-4627-9183-0751e15881f5" containerName="kube-rbac-proxy" Apr 17 17:54:51.365966 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:51.365954 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="33b7e886-f7fd-4627-9183-0751e15881f5" containerName="kserve-container" Apr 17 17:54:51.365966 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:51.365968 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4af22d05-2d05-48f9-baef-808a639508da" containerName="ensemble-graph-313c7" Apr 17 17:54:51.371779 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:51.371739 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27" Apr 17 17:54:51.374078 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:51.374051 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-0283f-kube-rbac-proxy-sar-config\"" Apr 17 17:54:51.374194 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:51.374102 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-0283f-predictor-serving-cert\"" Apr 17 17:54:51.379653 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:51.379609 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27"] Apr 17 17:54:51.548973 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:51.548939 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtrx4\" (UniqueName: \"kubernetes.io/projected/d97c20bd-af97-487f-8815-e0497431b94d-kube-api-access-vtrx4\") pod \"success-200-isvc-0283f-predictor-bfc486968-gvj27\" (UID: \"d97c20bd-af97-487f-8815-e0497431b94d\") " pod="kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27" Apr 17 17:54:51.549153 
ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:51.548998 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d97c20bd-af97-487f-8815-e0497431b94d-proxy-tls\") pod \"success-200-isvc-0283f-predictor-bfc486968-gvj27\" (UID: \"d97c20bd-af97-487f-8815-e0497431b94d\") " pod="kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27" Apr 17 17:54:51.549153 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:51.549094 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-0283f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d97c20bd-af97-487f-8815-e0497431b94d-success-200-isvc-0283f-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-0283f-predictor-bfc486968-gvj27\" (UID: \"d97c20bd-af97-487f-8815-e0497431b94d\") " pod="kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27" Apr 17 17:54:51.649755 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:51.649723 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d97c20bd-af97-487f-8815-e0497431b94d-proxy-tls\") pod \"success-200-isvc-0283f-predictor-bfc486968-gvj27\" (UID: \"d97c20bd-af97-487f-8815-e0497431b94d\") " pod="kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27" Apr 17 17:54:51.649964 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:51.649871 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-0283f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d97c20bd-af97-487f-8815-e0497431b94d-success-200-isvc-0283f-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-0283f-predictor-bfc486968-gvj27\" (UID: \"d97c20bd-af97-487f-8815-e0497431b94d\") " pod="kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27" Apr 17 17:54:51.649964 ip-10-0-140-33 
kubenswrapper[2575]: E0417 17:54:51.649900 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/success-200-isvc-0283f-predictor-serving-cert: secret "success-200-isvc-0283f-predictor-serving-cert" not found Apr 17 17:54:51.649964 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:51.649956 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtrx4\" (UniqueName: \"kubernetes.io/projected/d97c20bd-af97-487f-8815-e0497431b94d-kube-api-access-vtrx4\") pod \"success-200-isvc-0283f-predictor-bfc486968-gvj27\" (UID: \"d97c20bd-af97-487f-8815-e0497431b94d\") " pod="kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27" Apr 17 17:54:51.650152 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:54:51.649969 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d97c20bd-af97-487f-8815-e0497431b94d-proxy-tls podName:d97c20bd-af97-487f-8815-e0497431b94d nodeName:}" failed. No retries permitted until 2026-04-17 17:54:52.149952 +0000 UTC m=+1861.675632602 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/d97c20bd-af97-487f-8815-e0497431b94d-proxy-tls") pod "success-200-isvc-0283f-predictor-bfc486968-gvj27" (UID: "d97c20bd-af97-487f-8815-e0497431b94d") : secret "success-200-isvc-0283f-predictor-serving-cert" not found Apr 17 17:54:51.650521 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:51.650497 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-0283f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d97c20bd-af97-487f-8815-e0497431b94d-success-200-isvc-0283f-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-0283f-predictor-bfc486968-gvj27\" (UID: \"d97c20bd-af97-487f-8815-e0497431b94d\") " pod="kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27" Apr 17 17:54:51.660674 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:51.660655 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtrx4\" (UniqueName: \"kubernetes.io/projected/d97c20bd-af97-487f-8815-e0497431b94d-kube-api-access-vtrx4\") pod \"success-200-isvc-0283f-predictor-bfc486968-gvj27\" (UID: \"d97c20bd-af97-487f-8815-e0497431b94d\") " pod="kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27" Apr 17 17:54:52.153996 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:52.153965 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d97c20bd-af97-487f-8815-e0497431b94d-proxy-tls\") pod \"success-200-isvc-0283f-predictor-bfc486968-gvj27\" (UID: \"d97c20bd-af97-487f-8815-e0497431b94d\") " pod="kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27" Apr 17 17:54:52.156443 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:52.156417 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d97c20bd-af97-487f-8815-e0497431b94d-proxy-tls\") pod 
\"success-200-isvc-0283f-predictor-bfc486968-gvj27\" (UID: \"d97c20bd-af97-487f-8815-e0497431b94d\") " pod="kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27" Apr 17 17:54:52.286365 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:52.286330 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27" Apr 17 17:54:52.385963 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:52.385933 2575 generic.go:358] "Generic (PLEG): container finished" podID="b4a954bf-fc87-4e85-a790-dc1e57f4e5a5" containerID="93710dd0b47725c5d9487c1fd1f10c3acb374a608da756933cff1bbf27929b86" exitCode=2 Apr 17 17:54:52.386102 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:52.385972 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj" event={"ID":"b4a954bf-fc87-4e85-a790-dc1e57f4e5a5","Type":"ContainerDied","Data":"93710dd0b47725c5d9487c1fd1f10c3acb374a608da756933cff1bbf27929b86"} Apr 17 17:54:52.409015 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:52.408962 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27"] Apr 17 17:54:52.411558 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:54:52.411530 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd97c20bd_af97_487f_8815_e0497431b94d.slice/crio-7c305b77c2b5d4ddd64fa770cc69843a0cac9930d7a466b07cb686628c7c5106 WatchSource:0}: Error finding container 7c305b77c2b5d4ddd64fa770cc69843a0cac9930d7a466b07cb686628c7c5106: Status 404 returned error can't find the container with id 7c305b77c2b5d4ddd64fa770cc69843a0cac9930d7a466b07cb686628c7c5106 Apr 17 17:54:53.390975 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:53.390934 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27" event={"ID":"d97c20bd-af97-487f-8815-e0497431b94d","Type":"ContainerStarted","Data":"0518db7470d82f5a892a5d4a5a3e2d2e9a48cbe7e15c5a8668ebb9e5bb078db5"} Apr 17 17:54:53.390975 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:53.390974 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27" event={"ID":"d97c20bd-af97-487f-8815-e0497431b94d","Type":"ContainerStarted","Data":"7f689c84389d0451d1746f13b04e1c6f7af4249be976818ffe7321a2341dd0dc"} Apr 17 17:54:53.391491 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:53.390986 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27" event={"ID":"d97c20bd-af97-487f-8815-e0497431b94d","Type":"ContainerStarted","Data":"7c305b77c2b5d4ddd64fa770cc69843a0cac9930d7a466b07cb686628c7c5106"} Apr 17 17:54:53.391491 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:53.391089 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27" Apr 17 17:54:53.411740 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:53.411691 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27" podStartSLOduration=2.411677135 podStartE2EDuration="2.411677135s" podCreationTimestamp="2026-04-17 17:54:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:54:53.409459036 +0000 UTC m=+1862.935139681" watchObservedRunningTime="2026-04-17 17:54:53.411677135 +0000 UTC m=+1862.937357733" Apr 17 17:54:53.749985 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:53.749947 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/sequence-graph-dd512-7648f6b85c-9h6m9" podUID="99521847-f803-41bd-9c3b-2a7dcfe5411e" containerName="sequence-graph-dd512" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:54:54.350469 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:54.350443 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj" Apr 17 17:54:54.395460 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:54.395376 2575 generic.go:358] "Generic (PLEG): container finished" podID="b4a954bf-fc87-4e85-a790-dc1e57f4e5a5" containerID="51fc8ea6d6fa647eda365c136ba5bc3ccaeab457ab3abb7159660c0063bd6ab7" exitCode=0 Apr 17 17:54:54.395460 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:54.395448 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj" Apr 17 17:54:54.395922 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:54.395463 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj" event={"ID":"b4a954bf-fc87-4e85-a790-dc1e57f4e5a5","Type":"ContainerDied","Data":"51fc8ea6d6fa647eda365c136ba5bc3ccaeab457ab3abb7159660c0063bd6ab7"} Apr 17 17:54:54.395922 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:54.395501 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj" event={"ID":"b4a954bf-fc87-4e85-a790-dc1e57f4e5a5","Type":"ContainerDied","Data":"c3511a4e03e1414fc7ffbd9b32927602de9769a42465b61f17272883e7ee3ba1"} Apr 17 17:54:54.395922 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:54.395520 2575 scope.go:117] "RemoveContainer" containerID="93710dd0b47725c5d9487c1fd1f10c3acb374a608da756933cff1bbf27929b86" Apr 17 17:54:54.396090 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:54.395954 2575 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27" Apr 17 17:54:54.397076 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:54.397053 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27" podUID="d97c20bd-af97-487f-8815-e0497431b94d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 17 17:54:54.403504 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:54.403487 2575 scope.go:117] "RemoveContainer" containerID="51fc8ea6d6fa647eda365c136ba5bc3ccaeab457ab3abb7159660c0063bd6ab7" Apr 17 17:54:54.410236 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:54.410220 2575 scope.go:117] "RemoveContainer" containerID="93710dd0b47725c5d9487c1fd1f10c3acb374a608da756933cff1bbf27929b86" Apr 17 17:54:54.410495 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:54:54.410465 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93710dd0b47725c5d9487c1fd1f10c3acb374a608da756933cff1bbf27929b86\": container with ID starting with 93710dd0b47725c5d9487c1fd1f10c3acb374a608da756933cff1bbf27929b86 not found: ID does not exist" containerID="93710dd0b47725c5d9487c1fd1f10c3acb374a608da756933cff1bbf27929b86" Apr 17 17:54:54.410593 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:54.410496 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93710dd0b47725c5d9487c1fd1f10c3acb374a608da756933cff1bbf27929b86"} err="failed to get container status \"93710dd0b47725c5d9487c1fd1f10c3acb374a608da756933cff1bbf27929b86\": rpc error: code = NotFound desc = could not find container \"93710dd0b47725c5d9487c1fd1f10c3acb374a608da756933cff1bbf27929b86\": container with ID starting with 93710dd0b47725c5d9487c1fd1f10c3acb374a608da756933cff1bbf27929b86 not found: ID does not exist" Apr 17 
17:54:54.410593 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:54.410512 2575 scope.go:117] "RemoveContainer" containerID="51fc8ea6d6fa647eda365c136ba5bc3ccaeab457ab3abb7159660c0063bd6ab7" Apr 17 17:54:54.410727 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:54:54.410710 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51fc8ea6d6fa647eda365c136ba5bc3ccaeab457ab3abb7159660c0063bd6ab7\": container with ID starting with 51fc8ea6d6fa647eda365c136ba5bc3ccaeab457ab3abb7159660c0063bd6ab7 not found: ID does not exist" containerID="51fc8ea6d6fa647eda365c136ba5bc3ccaeab457ab3abb7159660c0063bd6ab7" Apr 17 17:54:54.410765 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:54.410733 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51fc8ea6d6fa647eda365c136ba5bc3ccaeab457ab3abb7159660c0063bd6ab7"} err="failed to get container status \"51fc8ea6d6fa647eda365c136ba5bc3ccaeab457ab3abb7159660c0063bd6ab7\": rpc error: code = NotFound desc = could not find container \"51fc8ea6d6fa647eda365c136ba5bc3ccaeab457ab3abb7159660c0063bd6ab7\": container with ID starting with 51fc8ea6d6fa647eda365c136ba5bc3ccaeab457ab3abb7159660c0063bd6ab7 not found: ID does not exist" Apr 17 17:54:54.474209 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:54.474179 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjwkn\" (UniqueName: \"kubernetes.io/projected/b4a954bf-fc87-4e85-a790-dc1e57f4e5a5-kube-api-access-xjwkn\") pod \"b4a954bf-fc87-4e85-a790-dc1e57f4e5a5\" (UID: \"b4a954bf-fc87-4e85-a790-dc1e57f4e5a5\") " Apr 17 17:54:54.474351 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:54.474215 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4a954bf-fc87-4e85-a790-dc1e57f4e5a5-proxy-tls\") pod \"b4a954bf-fc87-4e85-a790-dc1e57f4e5a5\" (UID: 
\"b4a954bf-fc87-4e85-a790-dc1e57f4e5a5\") " Apr 17 17:54:54.474351 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:54.474245 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-dd512-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b4a954bf-fc87-4e85-a790-dc1e57f4e5a5-success-200-isvc-dd512-kube-rbac-proxy-sar-config\") pod \"b4a954bf-fc87-4e85-a790-dc1e57f4e5a5\" (UID: \"b4a954bf-fc87-4e85-a790-dc1e57f4e5a5\") " Apr 17 17:54:54.474629 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:54.474604 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4a954bf-fc87-4e85-a790-dc1e57f4e5a5-success-200-isvc-dd512-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-dd512-kube-rbac-proxy-sar-config") pod "b4a954bf-fc87-4e85-a790-dc1e57f4e5a5" (UID: "b4a954bf-fc87-4e85-a790-dc1e57f4e5a5"). InnerVolumeSpecName "success-200-isvc-dd512-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:54:54.476233 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:54.476204 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4a954bf-fc87-4e85-a790-dc1e57f4e5a5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b4a954bf-fc87-4e85-a790-dc1e57f4e5a5" (UID: "b4a954bf-fc87-4e85-a790-dc1e57f4e5a5"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:54:54.476314 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:54.476239 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4a954bf-fc87-4e85-a790-dc1e57f4e5a5-kube-api-access-xjwkn" (OuterVolumeSpecName: "kube-api-access-xjwkn") pod "b4a954bf-fc87-4e85-a790-dc1e57f4e5a5" (UID: "b4a954bf-fc87-4e85-a790-dc1e57f4e5a5"). InnerVolumeSpecName "kube-api-access-xjwkn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:54:54.574941 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:54.574905 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xjwkn\" (UniqueName: \"kubernetes.io/projected/b4a954bf-fc87-4e85-a790-dc1e57f4e5a5-kube-api-access-xjwkn\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:54:54.574941 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:54.574933 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4a954bf-fc87-4e85-a790-dc1e57f4e5a5-proxy-tls\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:54:54.574941 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:54.574945 2575 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-dd512-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b4a954bf-fc87-4e85-a790-dc1e57f4e5a5-success-200-isvc-dd512-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 17:54:54.722242 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:54.722213 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj"] Apr 17 17:54:54.726218 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:54.726195 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-dd512-predictor-7d56f7b46b-t6mqj"] Apr 17 17:54:55.093941 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:55.093910 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4a954bf-fc87-4e85-a790-dc1e57f4e5a5" path="/var/lib/kubelet/pods/b4a954bf-fc87-4e85-a790-dc1e57f4e5a5/volumes" Apr 17 17:54:55.399952 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:55.399861 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27" 
podUID="d97c20bd-af97-487f-8815-e0497431b94d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 17 17:54:58.751210 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:54:58.751168 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-dd512-7648f6b85c-9h6m9" podUID="99521847-f803-41bd-9c3b-2a7dcfe5411e" containerName="sequence-graph-dd512" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:55:00.404732 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:00.404706 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27" Apr 17 17:55:00.405158 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:00.405132 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27" podUID="d97c20bd-af97-487f-8815-e0497431b94d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 17 17:55:01.873397 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:01.873364 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-e4a99-7d99d74f5f-459vd"] Apr 17 17:55:01.873782 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:01.873742 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4a954bf-fc87-4e85-a790-dc1e57f4e5a5" containerName="kserve-container" Apr 17 17:55:01.873782 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:01.873753 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4a954bf-fc87-4e85-a790-dc1e57f4e5a5" containerName="kserve-container" Apr 17 17:55:01.873782 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:01.873769 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4a954bf-fc87-4e85-a790-dc1e57f4e5a5" 
containerName="kube-rbac-proxy" Apr 17 17:55:01.873782 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:01.873775 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4a954bf-fc87-4e85-a790-dc1e57f4e5a5" containerName="kube-rbac-proxy" Apr 17 17:55:01.873965 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:01.873850 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="b4a954bf-fc87-4e85-a790-dc1e57f4e5a5" containerName="kserve-container" Apr 17 17:55:01.873965 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:01.873864 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="b4a954bf-fc87-4e85-a790-dc1e57f4e5a5" containerName="kube-rbac-proxy" Apr 17 17:55:01.878142 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:01.878126 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-e4a99-7d99d74f5f-459vd" Apr 17 17:55:01.880242 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:01.880217 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-e4a99-kube-rbac-proxy-sar-config\"" Apr 17 17:55:01.880242 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:01.880232 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-e4a99-serving-cert\"" Apr 17 17:55:01.883594 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:01.883572 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-e4a99-7d99d74f5f-459vd"] Apr 17 17:55:02.033721 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:02.033673 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fc5a3cf8-71bb-43ef-8350-472babdebd07-proxy-tls\") pod \"splitter-graph-e4a99-7d99d74f5f-459vd\" (UID: \"fc5a3cf8-71bb-43ef-8350-472babdebd07\") " 
pod="kserve-ci-e2e-test/splitter-graph-e4a99-7d99d74f5f-459vd" Apr 17 17:55:02.033928 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:02.033783 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc5a3cf8-71bb-43ef-8350-472babdebd07-openshift-service-ca-bundle\") pod \"splitter-graph-e4a99-7d99d74f5f-459vd\" (UID: \"fc5a3cf8-71bb-43ef-8350-472babdebd07\") " pod="kserve-ci-e2e-test/splitter-graph-e4a99-7d99d74f5f-459vd" Apr 17 17:55:02.134318 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:02.134230 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc5a3cf8-71bb-43ef-8350-472babdebd07-openshift-service-ca-bundle\") pod \"splitter-graph-e4a99-7d99d74f5f-459vd\" (UID: \"fc5a3cf8-71bb-43ef-8350-472babdebd07\") " pod="kserve-ci-e2e-test/splitter-graph-e4a99-7d99d74f5f-459vd" Apr 17 17:55:02.134318 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:02.134288 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fc5a3cf8-71bb-43ef-8350-472babdebd07-proxy-tls\") pod \"splitter-graph-e4a99-7d99d74f5f-459vd\" (UID: \"fc5a3cf8-71bb-43ef-8350-472babdebd07\") " pod="kserve-ci-e2e-test/splitter-graph-e4a99-7d99d74f5f-459vd" Apr 17 17:55:02.134502 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:55:02.134389 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/splitter-graph-e4a99-serving-cert: secret "splitter-graph-e4a99-serving-cert" not found Apr 17 17:55:02.134502 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:55:02.134446 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc5a3cf8-71bb-43ef-8350-472babdebd07-proxy-tls podName:fc5a3cf8-71bb-43ef-8350-472babdebd07 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:55:02.634424619 +0000 UTC m=+1872.160105215 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/fc5a3cf8-71bb-43ef-8350-472babdebd07-proxy-tls") pod "splitter-graph-e4a99-7d99d74f5f-459vd" (UID: "fc5a3cf8-71bb-43ef-8350-472babdebd07") : secret "splitter-graph-e4a99-serving-cert" not found Apr 17 17:55:02.134834 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:02.134801 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc5a3cf8-71bb-43ef-8350-472babdebd07-openshift-service-ca-bundle\") pod \"splitter-graph-e4a99-7d99d74f5f-459vd\" (UID: \"fc5a3cf8-71bb-43ef-8350-472babdebd07\") " pod="kserve-ci-e2e-test/splitter-graph-e4a99-7d99d74f5f-459vd" Apr 17 17:55:02.639021 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:02.638980 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fc5a3cf8-71bb-43ef-8350-472babdebd07-proxy-tls\") pod \"splitter-graph-e4a99-7d99d74f5f-459vd\" (UID: \"fc5a3cf8-71bb-43ef-8350-472babdebd07\") " pod="kserve-ci-e2e-test/splitter-graph-e4a99-7d99d74f5f-459vd" Apr 17 17:55:02.641415 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:02.641392 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fc5a3cf8-71bb-43ef-8350-472babdebd07-proxy-tls\") pod \"splitter-graph-e4a99-7d99d74f5f-459vd\" (UID: \"fc5a3cf8-71bb-43ef-8350-472babdebd07\") " pod="kserve-ci-e2e-test/splitter-graph-e4a99-7d99d74f5f-459vd" Apr 17 17:55:02.789215 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:02.789183 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-e4a99-7d99d74f5f-459vd" Apr 17 17:55:02.915766 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:02.915483 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-e4a99-7d99d74f5f-459vd"] Apr 17 17:55:03.426661 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:03.426625 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-e4a99-7d99d74f5f-459vd" event={"ID":"fc5a3cf8-71bb-43ef-8350-472babdebd07","Type":"ContainerStarted","Data":"82bc0b8d34543b424418dfea4f629c3368c470d0d93b667f7beee6a154042f4d"} Apr 17 17:55:03.426661 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:03.426663 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-e4a99-7d99d74f5f-459vd" event={"ID":"fc5a3cf8-71bb-43ef-8350-472babdebd07","Type":"ContainerStarted","Data":"f3f86d3503856f360fa5ce9f17321193f8c5c78db9230c08dc80162ac5113121"} Apr 17 17:55:03.426920 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:03.426707 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-e4a99-7d99d74f5f-459vd" Apr 17 17:55:03.447175 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:03.447136 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-e4a99-7d99d74f5f-459vd" podStartSLOduration=2.447124057 podStartE2EDuration="2.447124057s" podCreationTimestamp="2026-04-17 17:55:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:55:03.446559079 +0000 UTC m=+1872.972239699" watchObservedRunningTime="2026-04-17 17:55:03.447124057 +0000 UTC m=+1872.972804675" Apr 17 17:55:03.749593 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:03.749560 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/sequence-graph-dd512-7648f6b85c-9h6m9" podUID="99521847-f803-41bd-9c3b-2a7dcfe5411e" containerName="sequence-graph-dd512" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:55:03.749734 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:03.749679 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sequence-graph-dd512-7648f6b85c-9h6m9" Apr 17 17:55:08.750279 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:08.750196 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-dd512-7648f6b85c-9h6m9" podUID="99521847-f803-41bd-9c3b-2a7dcfe5411e" containerName="sequence-graph-dd512" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:55:09.437692 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:09.437664 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-e4a99-7d99d74f5f-459vd" Apr 17 17:55:10.405406 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:10.405364 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27" podUID="d97c20bd-af97-487f-8815-e0497431b94d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 17 17:55:11.950163 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:11.950089 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-e4a99-7d99d74f5f-459vd"] Apr 17 17:55:11.950561 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:11.950310 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-e4a99-7d99d74f5f-459vd" podUID="fc5a3cf8-71bb-43ef-8350-472babdebd07" containerName="splitter-graph-e4a99" containerID="cri-o://82bc0b8d34543b424418dfea4f629c3368c470d0d93b667f7beee6a154042f4d" gracePeriod=30 Apr 17 
17:55:12.071577 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:12.071544 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj"] Apr 17 17:55:12.071872 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:12.071840 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj" podUID="531417bf-dd53-4252-8b1b-81c2027ff5c6" containerName="kserve-container" containerID="cri-o://e28ae3a4e73a2ca03f8aaf0dde2c487ce47d50eee2b58187958228ce6d000cd9" gracePeriod=30 Apr 17 17:55:12.072015 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:12.071897 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj" podUID="531417bf-dd53-4252-8b1b-81c2027ff5c6" containerName="kube-rbac-proxy" containerID="cri-o://f681ad65906074020a66c0d004b28a33715eddecdec6c5594f7736baa45de146" gracePeriod=30 Apr 17 17:55:12.092190 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:12.092166 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5"] Apr 17 17:55:12.096739 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:12.096722 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5"
Apr 17 17:55:12.098999 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:12.098981 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-a935a-predictor-serving-cert\""
Apr 17 17:55:12.099100 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:12.099006 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"success-200-isvc-a935a-kube-rbac-proxy-sar-config\""
Apr 17 17:55:12.104799 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:12.104779 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5"]
Apr 17 17:55:12.213001 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:12.212926 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"success-200-isvc-a935a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c2d86f51-524c-4513-84e7-f560a56dcde3-success-200-isvc-a935a-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5\" (UID: \"c2d86f51-524c-4513-84e7-f560a56dcde3\") " pod="kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5"
Apr 17 17:55:12.213144 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:12.212997 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2d86f51-524c-4513-84e7-f560a56dcde3-proxy-tls\") pod \"success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5\" (UID: \"c2d86f51-524c-4513-84e7-f560a56dcde3\") " pod="kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5"
Apr 17 17:55:12.213144 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:12.213067 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsbsj\" (UniqueName: \"kubernetes.io/projected/c2d86f51-524c-4513-84e7-f560a56dcde3-kube-api-access-dsbsj\") pod \"success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5\" (UID: \"c2d86f51-524c-4513-84e7-f560a56dcde3\") " pod="kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5"
Apr 17 17:55:12.314395 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:12.314360 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"success-200-isvc-a935a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c2d86f51-524c-4513-84e7-f560a56dcde3-success-200-isvc-a935a-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5\" (UID: \"c2d86f51-524c-4513-84e7-f560a56dcde3\") " pod="kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5"
Apr 17 17:55:12.314556 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:12.314412 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2d86f51-524c-4513-84e7-f560a56dcde3-proxy-tls\") pod \"success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5\" (UID: \"c2d86f51-524c-4513-84e7-f560a56dcde3\") " pod="kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5"
Apr 17 17:55:12.314556 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:12.314443 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dsbsj\" (UniqueName: \"kubernetes.io/projected/c2d86f51-524c-4513-84e7-f560a56dcde3-kube-api-access-dsbsj\") pod \"success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5\" (UID: \"c2d86f51-524c-4513-84e7-f560a56dcde3\") " pod="kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5"
Apr 17 17:55:12.315051 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:12.315028 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"success-200-isvc-a935a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c2d86f51-524c-4513-84e7-f560a56dcde3-success-200-isvc-a935a-kube-rbac-proxy-sar-config\") pod \"success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5\" (UID: \"c2d86f51-524c-4513-84e7-f560a56dcde3\") " pod="kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5"
Apr 17 17:55:12.316730 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:12.316706 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2d86f51-524c-4513-84e7-f560a56dcde3-proxy-tls\") pod \"success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5\" (UID: \"c2d86f51-524c-4513-84e7-f560a56dcde3\") " pod="kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5"
Apr 17 17:55:12.322594 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:12.322570 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsbsj\" (UniqueName: \"kubernetes.io/projected/c2d86f51-524c-4513-84e7-f560a56dcde3-kube-api-access-dsbsj\") pod \"success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5\" (UID: \"c2d86f51-524c-4513-84e7-f560a56dcde3\") " pod="kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5"
Apr 17 17:55:12.407129 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:12.407097 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5"
Apr 17 17:55:12.458262 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:12.458224 2575 generic.go:358] "Generic (PLEG): container finished" podID="531417bf-dd53-4252-8b1b-81c2027ff5c6" containerID="f681ad65906074020a66c0d004b28a33715eddecdec6c5594f7736baa45de146" exitCode=2
Apr 17 17:55:12.458385 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:12.458300 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj" event={"ID":"531417bf-dd53-4252-8b1b-81c2027ff5c6","Type":"ContainerDied","Data":"f681ad65906074020a66c0d004b28a33715eddecdec6c5594f7736baa45de146"}
Apr 17 17:55:12.532438 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:12.532410 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5"]
Apr 17 17:55:12.534465 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:55:12.534434 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2d86f51_524c_4513_84e7_f560a56dcde3.slice/crio-03095707d55f2493fb7a1fb7b4cbffe323f8109d3e35069d4d236bbb10dc5eba WatchSource:0}: Error finding container 03095707d55f2493fb7a1fb7b4cbffe323f8109d3e35069d4d236bbb10dc5eba: Status 404 returned error can't find the container with id 03095707d55f2493fb7a1fb7b4cbffe323f8109d3e35069d4d236bbb10dc5eba
Apr 17 17:55:13.464179 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:13.464144 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5" event={"ID":"c2d86f51-524c-4513-84e7-f560a56dcde3","Type":"ContainerStarted","Data":"80f43a34c9a2373a1d826178c544326a93f6e742c3cf999347568a5d086d653c"}
Apr 17 17:55:13.464179 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:13.464179 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5" event={"ID":"c2d86f51-524c-4513-84e7-f560a56dcde3","Type":"ContainerStarted","Data":"2f8a042f5e7742707fae5f21e3e8afe54dfb338c572d48ed3b594052833482d6"}
Apr 17 17:55:13.464179 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:13.464188 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5" event={"ID":"c2d86f51-524c-4513-84e7-f560a56dcde3","Type":"ContainerStarted","Data":"03095707d55f2493fb7a1fb7b4cbffe323f8109d3e35069d4d236bbb10dc5eba"}
Apr 17 17:55:13.464775 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:13.464283 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5"
Apr 17 17:55:13.484002 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:13.483957 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5" podStartSLOduration=1.48394323 podStartE2EDuration="1.48394323s" podCreationTimestamp="2026-04-17 17:55:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:55:13.481891068 +0000 UTC m=+1883.007571684" watchObservedRunningTime="2026-04-17 17:55:13.48394323 +0000 UTC m=+1883.009623848"
Apr 17 17:55:13.750426 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:13.750386 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-dd512-7648f6b85c-9h6m9" podUID="99521847-f803-41bd-9c3b-2a7dcfe5411e" containerName="sequence-graph-dd512" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:55:14.217614 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:14.217528 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj" podUID="531417bf-dd53-4252-8b1b-81c2027ff5c6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.46:8643/healthz\": dial tcp 10.132.0.46:8643: connect: connection refused"
Apr 17 17:55:14.435977 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:14.435927 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-e4a99-7d99d74f5f-459vd" podUID="fc5a3cf8-71bb-43ef-8350-472babdebd07" containerName="splitter-graph-e4a99" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:55:14.467942 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:14.467864 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5"
Apr 17 17:55:14.469476 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:14.469446 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5" podUID="c2d86f51-524c-4513-84e7-f560a56dcde3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused"
Apr 17 17:55:15.128210 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:15.128185 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj"
Apr 17 17:55:15.239297 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:15.239209 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-e4a99-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/531417bf-dd53-4252-8b1b-81c2027ff5c6-success-200-isvc-e4a99-kube-rbac-proxy-sar-config\") pod \"531417bf-dd53-4252-8b1b-81c2027ff5c6\" (UID: \"531417bf-dd53-4252-8b1b-81c2027ff5c6\") "
Apr 17 17:55:15.239297 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:15.239255 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n6vx\" (UniqueName: \"kubernetes.io/projected/531417bf-dd53-4252-8b1b-81c2027ff5c6-kube-api-access-9n6vx\") pod \"531417bf-dd53-4252-8b1b-81c2027ff5c6\" (UID: \"531417bf-dd53-4252-8b1b-81c2027ff5c6\") "
Apr 17 17:55:15.239297 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:15.239279 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/531417bf-dd53-4252-8b1b-81c2027ff5c6-proxy-tls\") pod \"531417bf-dd53-4252-8b1b-81c2027ff5c6\" (UID: \"531417bf-dd53-4252-8b1b-81c2027ff5c6\") "
Apr 17 17:55:15.239637 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:15.239612 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/531417bf-dd53-4252-8b1b-81c2027ff5c6-success-200-isvc-e4a99-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-e4a99-kube-rbac-proxy-sar-config") pod "531417bf-dd53-4252-8b1b-81c2027ff5c6" (UID: "531417bf-dd53-4252-8b1b-81c2027ff5c6"). InnerVolumeSpecName "success-200-isvc-e4a99-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:55:15.241392 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:15.241364 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/531417bf-dd53-4252-8b1b-81c2027ff5c6-kube-api-access-9n6vx" (OuterVolumeSpecName: "kube-api-access-9n6vx") pod "531417bf-dd53-4252-8b1b-81c2027ff5c6" (UID: "531417bf-dd53-4252-8b1b-81c2027ff5c6"). InnerVolumeSpecName "kube-api-access-9n6vx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:55:15.241392 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:15.241369 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/531417bf-dd53-4252-8b1b-81c2027ff5c6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "531417bf-dd53-4252-8b1b-81c2027ff5c6" (UID: "531417bf-dd53-4252-8b1b-81c2027ff5c6"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:55:15.340065 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:15.340028 2575 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-e4a99-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/531417bf-dd53-4252-8b1b-81c2027ff5c6-success-200-isvc-e4a99-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\""
Apr 17 17:55:15.340065 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:15.340064 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9n6vx\" (UniqueName: \"kubernetes.io/projected/531417bf-dd53-4252-8b1b-81c2027ff5c6-kube-api-access-9n6vx\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\""
Apr 17 17:55:15.340259 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:15.340081 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/531417bf-dd53-4252-8b1b-81c2027ff5c6-proxy-tls\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\""
Apr 17 17:55:15.472463 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:15.472429 2575 generic.go:358] "Generic (PLEG): container finished" podID="531417bf-dd53-4252-8b1b-81c2027ff5c6" containerID="e28ae3a4e73a2ca03f8aaf0dde2c487ce47d50eee2b58187958228ce6d000cd9" exitCode=0
Apr 17 17:55:15.472955 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:15.472493 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj"
Apr 17 17:55:15.472955 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:15.472509 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj" event={"ID":"531417bf-dd53-4252-8b1b-81c2027ff5c6","Type":"ContainerDied","Data":"e28ae3a4e73a2ca03f8aaf0dde2c487ce47d50eee2b58187958228ce6d000cd9"}
Apr 17 17:55:15.472955 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:15.472545 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj" event={"ID":"531417bf-dd53-4252-8b1b-81c2027ff5c6","Type":"ContainerDied","Data":"b684644d5e226e37b14eae0cadda1ebb2a302fdf3e0ab0a7d9506c424f62e1fa"}
Apr 17 17:55:15.472955 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:15.472562 2575 scope.go:117] "RemoveContainer" containerID="f681ad65906074020a66c0d004b28a33715eddecdec6c5594f7736baa45de146"
Apr 17 17:55:15.473186 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:15.473132 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5" podUID="c2d86f51-524c-4513-84e7-f560a56dcde3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused"
Apr 17 17:55:15.481781 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:15.481765 2575 scope.go:117] "RemoveContainer" containerID="e28ae3a4e73a2ca03f8aaf0dde2c487ce47d50eee2b58187958228ce6d000cd9"
Apr 17 17:55:15.491280 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:15.491263 2575 scope.go:117] "RemoveContainer" containerID="f681ad65906074020a66c0d004b28a33715eddecdec6c5594f7736baa45de146"
Apr 17 17:55:15.491498 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:55:15.491480 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f681ad65906074020a66c0d004b28a33715eddecdec6c5594f7736baa45de146\": container with ID starting with f681ad65906074020a66c0d004b28a33715eddecdec6c5594f7736baa45de146 not found: ID does not exist" containerID="f681ad65906074020a66c0d004b28a33715eddecdec6c5594f7736baa45de146"
Apr 17 17:55:15.491557 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:15.491503 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f681ad65906074020a66c0d004b28a33715eddecdec6c5594f7736baa45de146"} err="failed to get container status \"f681ad65906074020a66c0d004b28a33715eddecdec6c5594f7736baa45de146\": rpc error: code = NotFound desc = could not find container \"f681ad65906074020a66c0d004b28a33715eddecdec6c5594f7736baa45de146\": container with ID starting with f681ad65906074020a66c0d004b28a33715eddecdec6c5594f7736baa45de146 not found: ID does not exist"
Apr 17 17:55:15.491557 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:15.491518 2575 scope.go:117] "RemoveContainer" containerID="e28ae3a4e73a2ca03f8aaf0dde2c487ce47d50eee2b58187958228ce6d000cd9"
Apr 17 17:55:15.491727 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:55:15.491713 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e28ae3a4e73a2ca03f8aaf0dde2c487ce47d50eee2b58187958228ce6d000cd9\": container with ID starting with e28ae3a4e73a2ca03f8aaf0dde2c487ce47d50eee2b58187958228ce6d000cd9 not found: ID does not exist" containerID="e28ae3a4e73a2ca03f8aaf0dde2c487ce47d50eee2b58187958228ce6d000cd9"
Apr 17 17:55:15.491765 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:15.491731 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e28ae3a4e73a2ca03f8aaf0dde2c487ce47d50eee2b58187958228ce6d000cd9"} err="failed to get container status \"e28ae3a4e73a2ca03f8aaf0dde2c487ce47d50eee2b58187958228ce6d000cd9\": rpc error: code = NotFound desc = could not find container \"e28ae3a4e73a2ca03f8aaf0dde2c487ce47d50eee2b58187958228ce6d000cd9\": container with ID starting with e28ae3a4e73a2ca03f8aaf0dde2c487ce47d50eee2b58187958228ce6d000cd9 not found: ID does not exist"
Apr 17 17:55:15.496470 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:15.496432 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj"]
Apr 17 17:55:15.500451 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:15.500432 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-e4a99-predictor-58f48cbb54-d9szj"]
Apr 17 17:55:17.094986 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:17.094951 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="531417bf-dd53-4252-8b1b-81c2027ff5c6" path="/var/lib/kubelet/pods/531417bf-dd53-4252-8b1b-81c2027ff5c6/volumes"
Apr 17 17:55:18.751754 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:18.751707 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sequence-graph-dd512-7648f6b85c-9h6m9" podUID="99521847-f803-41bd-9c3b-2a7dcfe5411e" containerName="sequence-graph-dd512" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:55:19.435518 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:19.435481 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-e4a99-7d99d74f5f-459vd" podUID="fc5a3cf8-71bb-43ef-8350-472babdebd07" containerName="splitter-graph-e4a99" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:55:20.406064 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:20.406021 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27" podUID="d97c20bd-af97-487f-8815-e0497431b94d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused"
Apr 17 17:55:20.477625 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:20.477597 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5"
Apr 17 17:55:20.478211 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:20.478178 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5" podUID="c2d86f51-524c-4513-84e7-f560a56dcde3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused"
Apr 17 17:55:21.335131 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:21.335106 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-dd512-7648f6b85c-9h6m9"
Apr 17 17:55:21.390908 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:21.390876 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99521847-f803-41bd-9c3b-2a7dcfe5411e-openshift-service-ca-bundle\") pod \"99521847-f803-41bd-9c3b-2a7dcfe5411e\" (UID: \"99521847-f803-41bd-9c3b-2a7dcfe5411e\") "
Apr 17 17:55:21.391083 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:21.390933 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99521847-f803-41bd-9c3b-2a7dcfe5411e-proxy-tls\") pod \"99521847-f803-41bd-9c3b-2a7dcfe5411e\" (UID: \"99521847-f803-41bd-9c3b-2a7dcfe5411e\") "
Apr 17 17:55:21.391305 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:21.391282 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99521847-f803-41bd-9c3b-2a7dcfe5411e-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "99521847-f803-41bd-9c3b-2a7dcfe5411e" (UID: "99521847-f803-41bd-9c3b-2a7dcfe5411e"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:55:21.393023 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:21.392998 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99521847-f803-41bd-9c3b-2a7dcfe5411e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "99521847-f803-41bd-9c3b-2a7dcfe5411e" (UID: "99521847-f803-41bd-9c3b-2a7dcfe5411e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:55:21.491665 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:21.491570 2575 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99521847-f803-41bd-9c3b-2a7dcfe5411e-openshift-service-ca-bundle\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\""
Apr 17 17:55:21.491665 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:21.491615 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99521847-f803-41bd-9c3b-2a7dcfe5411e-proxy-tls\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\""
Apr 17 17:55:21.493819 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:21.493793 2575 generic.go:358] "Generic (PLEG): container finished" podID="99521847-f803-41bd-9c3b-2a7dcfe5411e" containerID="8476f7a523dc433893be0f414ebe059ff3e10704aacd99a81b786a5f9c5db18e" exitCode=137
Apr 17 17:55:21.493958 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:21.493873 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-dd512-7648f6b85c-9h6m9" event={"ID":"99521847-f803-41bd-9c3b-2a7dcfe5411e","Type":"ContainerDied","Data":"8476f7a523dc433893be0f414ebe059ff3e10704aacd99a81b786a5f9c5db18e"}
Apr 17 17:55:21.493958 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:21.493911 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sequence-graph-dd512-7648f6b85c-9h6m9" event={"ID":"99521847-f803-41bd-9c3b-2a7dcfe5411e","Type":"ContainerDied","Data":"f0f2e3e46ed825f59bb9de4bb18f61c3cc0eb5f6eacc0eeda34406b2177fcb59"}
Apr 17 17:55:21.493958 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:21.493925 2575 scope.go:117] "RemoveContainer" containerID="8476f7a523dc433893be0f414ebe059ff3e10704aacd99a81b786a5f9c5db18e"
Apr 17 17:55:21.494076 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:21.493886 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sequence-graph-dd512-7648f6b85c-9h6m9"
Apr 17 17:55:21.502603 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:21.502584 2575 scope.go:117] "RemoveContainer" containerID="8476f7a523dc433893be0f414ebe059ff3e10704aacd99a81b786a5f9c5db18e"
Apr 17 17:55:21.502857 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:55:21.502815 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8476f7a523dc433893be0f414ebe059ff3e10704aacd99a81b786a5f9c5db18e\": container with ID starting with 8476f7a523dc433893be0f414ebe059ff3e10704aacd99a81b786a5f9c5db18e not found: ID does not exist" containerID="8476f7a523dc433893be0f414ebe059ff3e10704aacd99a81b786a5f9c5db18e"
Apr 17 17:55:21.502962 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:21.502862 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8476f7a523dc433893be0f414ebe059ff3e10704aacd99a81b786a5f9c5db18e"} err="failed to get container status \"8476f7a523dc433893be0f414ebe059ff3e10704aacd99a81b786a5f9c5db18e\": rpc error: code = NotFound desc = could not find container \"8476f7a523dc433893be0f414ebe059ff3e10704aacd99a81b786a5f9c5db18e\": container with ID starting with 8476f7a523dc433893be0f414ebe059ff3e10704aacd99a81b786a5f9c5db18e not found: ID does not exist"
Apr 17 17:55:21.514266 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:21.514245 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-dd512-7648f6b85c-9h6m9"]
Apr 17 17:55:21.518160 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:21.518139 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sequence-graph-dd512-7648f6b85c-9h6m9"]
Apr 17 17:55:23.094775 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:23.094742 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99521847-f803-41bd-9c3b-2a7dcfe5411e" path="/var/lib/kubelet/pods/99521847-f803-41bd-9c3b-2a7dcfe5411e/volumes"
Apr 17 17:55:24.435139 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:24.435094 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-e4a99-7d99d74f5f-459vd" podUID="fc5a3cf8-71bb-43ef-8350-472babdebd07" containerName="splitter-graph-e4a99" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:55:24.435620 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:24.435195 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-e4a99-7d99d74f5f-459vd"
Apr 17 17:55:29.435083 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:29.435042 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-e4a99-7d99d74f5f-459vd" podUID="fc5a3cf8-71bb-43ef-8350-472babdebd07" containerName="splitter-graph-e4a99" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:55:30.405173 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:30.405125 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27" podUID="d97c20bd-af97-487f-8815-e0497431b94d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused"
Apr 17 17:55:30.478443 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:30.478404 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5" podUID="c2d86f51-524c-4513-84e7-f560a56dcde3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused"
Apr 17 17:55:34.436510 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:34.436467 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-e4a99-7d99d74f5f-459vd" podUID="fc5a3cf8-71bb-43ef-8350-472babdebd07" containerName="splitter-graph-e4a99" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:55:39.435307 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:39.435268 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-e4a99-7d99d74f5f-459vd" podUID="fc5a3cf8-71bb-43ef-8350-472babdebd07" containerName="splitter-graph-e4a99" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:55:40.405959 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:40.405925 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27"
Apr 17 17:55:40.478757 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:40.478712 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5" podUID="c2d86f51-524c-4513-84e7-f560a56dcde3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused"
Apr 17 17:55:42.089876 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:42.089854 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-e4a99-7d99d74f5f-459vd"
Apr 17 17:55:42.163495 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:42.163467 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc5a3cf8-71bb-43ef-8350-472babdebd07-openshift-service-ca-bundle\") pod \"fc5a3cf8-71bb-43ef-8350-472babdebd07\" (UID: \"fc5a3cf8-71bb-43ef-8350-472babdebd07\") "
Apr 17 17:55:42.163653 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:42.163506 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fc5a3cf8-71bb-43ef-8350-472babdebd07-proxy-tls\") pod \"fc5a3cf8-71bb-43ef-8350-472babdebd07\" (UID: \"fc5a3cf8-71bb-43ef-8350-472babdebd07\") "
Apr 17 17:55:42.163861 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:42.163840 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc5a3cf8-71bb-43ef-8350-472babdebd07-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "fc5a3cf8-71bb-43ef-8350-472babdebd07" (UID: "fc5a3cf8-71bb-43ef-8350-472babdebd07"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:55:42.165532 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:42.165507 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc5a3cf8-71bb-43ef-8350-472babdebd07-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fc5a3cf8-71bb-43ef-8350-472babdebd07" (UID: "fc5a3cf8-71bb-43ef-8350-472babdebd07"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:55:42.265099 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:42.265071 2575 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc5a3cf8-71bb-43ef-8350-472babdebd07-openshift-service-ca-bundle\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\""
Apr 17 17:55:42.265247 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:42.265101 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fc5a3cf8-71bb-43ef-8350-472babdebd07-proxy-tls\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\""
Apr 17 17:55:42.566148 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:42.566058 2575 generic.go:358] "Generic (PLEG): container finished" podID="fc5a3cf8-71bb-43ef-8350-472babdebd07" containerID="82bc0b8d34543b424418dfea4f629c3368c470d0d93b667f7beee6a154042f4d" exitCode=0
Apr 17 17:55:42.566148 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:42.566123 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-e4a99-7d99d74f5f-459vd"
Apr 17 17:55:42.566372 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:42.566144 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-e4a99-7d99d74f5f-459vd" event={"ID":"fc5a3cf8-71bb-43ef-8350-472babdebd07","Type":"ContainerDied","Data":"82bc0b8d34543b424418dfea4f629c3368c470d0d93b667f7beee6a154042f4d"}
Apr 17 17:55:42.566372 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:42.566189 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-e4a99-7d99d74f5f-459vd" event={"ID":"fc5a3cf8-71bb-43ef-8350-472babdebd07","Type":"ContainerDied","Data":"f3f86d3503856f360fa5ce9f17321193f8c5c78db9230c08dc80162ac5113121"}
Apr 17 17:55:42.566372 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:42.566210 2575 scope.go:117] "RemoveContainer" containerID="82bc0b8d34543b424418dfea4f629c3368c470d0d93b667f7beee6a154042f4d"
Apr 17 17:55:42.574268 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:42.574248 2575 scope.go:117] "RemoveContainer" containerID="82bc0b8d34543b424418dfea4f629c3368c470d0d93b667f7beee6a154042f4d"
Apr 17 17:55:42.574521 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:55:42.574500 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82bc0b8d34543b424418dfea4f629c3368c470d0d93b667f7beee6a154042f4d\": container with ID starting with 82bc0b8d34543b424418dfea4f629c3368c470d0d93b667f7beee6a154042f4d not found: ID does not exist" containerID="82bc0b8d34543b424418dfea4f629c3368c470d0d93b667f7beee6a154042f4d"
Apr 17 17:55:42.574603 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:42.574531 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82bc0b8d34543b424418dfea4f629c3368c470d0d93b667f7beee6a154042f4d"} err="failed to get container status \"82bc0b8d34543b424418dfea4f629c3368c470d0d93b667f7beee6a154042f4d\": rpc error: code = NotFound desc = could not find container \"82bc0b8d34543b424418dfea4f629c3368c470d0d93b667f7beee6a154042f4d\": container with ID starting with 82bc0b8d34543b424418dfea4f629c3368c470d0d93b667f7beee6a154042f4d not found: ID does not exist"
Apr 17 17:55:42.585685 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:42.585640 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-e4a99-7d99d74f5f-459vd"]
Apr 17 17:55:42.591352 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:42.591331 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-e4a99-7d99d74f5f-459vd"]
Apr 17 17:55:43.094073 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:43.094040 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc5a3cf8-71bb-43ef-8350-472babdebd07" path="/var/lib/kubelet/pods/fc5a3cf8-71bb-43ef-8350-472babdebd07/volumes"
Apr 17 17:55:50.478134 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:50.478091 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5" podUID="c2d86f51-524c-4513-84e7-f560a56dcde3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused"
Apr 17 17:55:51.408980 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:51.408949 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/switch-graph-0283f-7dfb866856-9q6fw"]
Apr 17 17:55:51.409326 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:51.409314 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99521847-f803-41bd-9c3b-2a7dcfe5411e" containerName="sequence-graph-dd512"
Apr 17 17:55:51.409375 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:51.409331 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="99521847-f803-41bd-9c3b-2a7dcfe5411e" containerName="sequence-graph-dd512"
Apr 17 17:55:51.409375 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:51.409349 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="531417bf-dd53-4252-8b1b-81c2027ff5c6" containerName="kserve-container"
Apr 17 17:55:51.409375 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:51.409354 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="531417bf-dd53-4252-8b1b-81c2027ff5c6" containerName="kserve-container"
Apr 17 17:55:51.409375 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:51.409361 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fc5a3cf8-71bb-43ef-8350-472babdebd07" containerName="splitter-graph-e4a99"
Apr 17 17:55:51.409375 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:51.409366 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc5a3cf8-71bb-43ef-8350-472babdebd07" containerName="splitter-graph-e4a99"
Apr 17 17:55:51.409522 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:51.409379 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="531417bf-dd53-4252-8b1b-81c2027ff5c6" containerName="kube-rbac-proxy"
Apr 17 17:55:51.409522 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:51.409384 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="531417bf-dd53-4252-8b1b-81c2027ff5c6" containerName="kube-rbac-proxy"
Apr 17 17:55:51.409522 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:51.409439 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="531417bf-dd53-4252-8b1b-81c2027ff5c6" containerName="kserve-container"
Apr 17 17:55:51.409522 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:51.409446 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="531417bf-dd53-4252-8b1b-81c2027ff5c6" containerName="kube-rbac-proxy"
Apr 17 17:55:51.409522 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:51.409451 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="fc5a3cf8-71bb-43ef-8350-472babdebd07" containerName="splitter-graph-e4a99"
Apr 17 17:55:51.409522 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:51.409460 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="99521847-f803-41bd-9c3b-2a7dcfe5411e" containerName="sequence-graph-dd512"
Apr 17 17:55:51.413930 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:51.413900 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-0283f-7dfb866856-9q6fw"
Apr 17 17:55:51.416086 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:51.416066 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-0283f-serving-cert\""
Apr 17 17:55:51.416199 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:51.416084 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"switch-graph-0283f-kube-rbac-proxy-sar-config\""
Apr 17 17:55:51.419779 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:51.419757 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-0283f-7dfb866856-9q6fw"]
Apr 17 17:55:51.542746 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:51.542712 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b1804bb-494c-452d-9ab6-c616587d7f91-openshift-service-ca-bundle\") pod \"switch-graph-0283f-7dfb866856-9q6fw\" (UID: \"7b1804bb-494c-452d-9ab6-c616587d7f91\") " pod="kserve-ci-e2e-test/switch-graph-0283f-7dfb866856-9q6fw"
Apr 17 17:55:51.542746 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:51.542758 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b1804bb-494c-452d-9ab6-c616587d7f91-proxy-tls\") pod \"switch-graph-0283f-7dfb866856-9q6fw\" (UID:
\"7b1804bb-494c-452d-9ab6-c616587d7f91\") " pod="kserve-ci-e2e-test/switch-graph-0283f-7dfb866856-9q6fw" Apr 17 17:55:51.643851 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:51.643795 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b1804bb-494c-452d-9ab6-c616587d7f91-proxy-tls\") pod \"switch-graph-0283f-7dfb866856-9q6fw\" (UID: \"7b1804bb-494c-452d-9ab6-c616587d7f91\") " pod="kserve-ci-e2e-test/switch-graph-0283f-7dfb866856-9q6fw" Apr 17 17:55:51.644044 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:51.643938 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b1804bb-494c-452d-9ab6-c616587d7f91-openshift-service-ca-bundle\") pod \"switch-graph-0283f-7dfb866856-9q6fw\" (UID: \"7b1804bb-494c-452d-9ab6-c616587d7f91\") " pod="kserve-ci-e2e-test/switch-graph-0283f-7dfb866856-9q6fw" Apr 17 17:55:51.644044 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:55:51.643943 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/switch-graph-0283f-serving-cert: secret "switch-graph-0283f-serving-cert" not found Apr 17 17:55:51.644165 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:55:51.644045 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b1804bb-494c-452d-9ab6-c616587d7f91-proxy-tls podName:7b1804bb-494c-452d-9ab6-c616587d7f91 nodeName:}" failed. No retries permitted until 2026-04-17 17:55:52.144024176 +0000 UTC m=+1921.669704780 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/7b1804bb-494c-452d-9ab6-c616587d7f91-proxy-tls") pod "switch-graph-0283f-7dfb866856-9q6fw" (UID: "7b1804bb-494c-452d-9ab6-c616587d7f91") : secret "switch-graph-0283f-serving-cert" not found Apr 17 17:55:51.644463 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:51.644439 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b1804bb-494c-452d-9ab6-c616587d7f91-openshift-service-ca-bundle\") pod \"switch-graph-0283f-7dfb866856-9q6fw\" (UID: \"7b1804bb-494c-452d-9ab6-c616587d7f91\") " pod="kserve-ci-e2e-test/switch-graph-0283f-7dfb866856-9q6fw" Apr 17 17:55:52.147289 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:52.147250 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b1804bb-494c-452d-9ab6-c616587d7f91-proxy-tls\") pod \"switch-graph-0283f-7dfb866856-9q6fw\" (UID: \"7b1804bb-494c-452d-9ab6-c616587d7f91\") " pod="kserve-ci-e2e-test/switch-graph-0283f-7dfb866856-9q6fw" Apr 17 17:55:52.149681 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:52.149651 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b1804bb-494c-452d-9ab6-c616587d7f91-proxy-tls\") pod \"switch-graph-0283f-7dfb866856-9q6fw\" (UID: \"7b1804bb-494c-452d-9ab6-c616587d7f91\") " pod="kserve-ci-e2e-test/switch-graph-0283f-7dfb866856-9q6fw" Apr 17 17:55:52.326388 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:52.326353 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-0283f-7dfb866856-9q6fw" Apr 17 17:55:52.443569 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:52.443500 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/switch-graph-0283f-7dfb866856-9q6fw"] Apr 17 17:55:52.446174 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:55:52.446144 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b1804bb_494c_452d_9ab6_c616587d7f91.slice/crio-a50d48fc2f00aa8037916c59bf4febc253c4ee0c5918ec5c3009149e681fdfa7 WatchSource:0}: Error finding container a50d48fc2f00aa8037916c59bf4febc253c4ee0c5918ec5c3009149e681fdfa7: Status 404 returned error can't find the container with id a50d48fc2f00aa8037916c59bf4febc253c4ee0c5918ec5c3009149e681fdfa7 Apr 17 17:55:52.601033 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:52.601000 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-0283f-7dfb866856-9q6fw" event={"ID":"7b1804bb-494c-452d-9ab6-c616587d7f91","Type":"ContainerStarted","Data":"299650f52becbc993f2b8991ddd86ce6caca4a6948c2a22724519304c692674c"} Apr 17 17:55:52.601033 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:52.601039 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-0283f-7dfb866856-9q6fw" Apr 17 17:55:52.601441 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:52.601054 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-0283f-7dfb866856-9q6fw" event={"ID":"7b1804bb-494c-452d-9ab6-c616587d7f91","Type":"ContainerStarted","Data":"a50d48fc2f00aa8037916c59bf4febc253c4ee0c5918ec5c3009149e681fdfa7"} Apr 17 17:55:52.617216 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:52.617162 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/switch-graph-0283f-7dfb866856-9q6fw" 
podStartSLOduration=1.617148405 podStartE2EDuration="1.617148405s" podCreationTimestamp="2026-04-17 17:55:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:55:52.61534116 +0000 UTC m=+1922.141021779" watchObservedRunningTime="2026-04-17 17:55:52.617148405 +0000 UTC m=+1922.142829025" Apr 17 17:55:58.610272 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:55:58.610237 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/switch-graph-0283f-7dfb866856-9q6fw" Apr 17 17:56:00.478768 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:56:00.478738 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5" Apr 17 17:56:12.266788 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:56:12.266754 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/splitter-graph-a935a-584b858c64-b458q"] Apr 17 17:56:12.271550 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:56:12.271531 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-a935a-584b858c64-b458q" Apr 17 17:56:12.276113 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:56:12.276093 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-a935a-serving-cert\"" Apr 17 17:56:12.276229 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:56:12.276097 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"splitter-graph-a935a-kube-rbac-proxy-sar-config\"" Apr 17 17:56:12.292063 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:56:12.292036 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-a935a-584b858c64-b458q"] Apr 17 17:56:12.307941 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:56:12.307911 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d71cc4b-fcea-41f7-b193-ab0017ab8850-openshift-service-ca-bundle\") pod \"splitter-graph-a935a-584b858c64-b458q\" (UID: \"7d71cc4b-fcea-41f7-b193-ab0017ab8850\") " pod="kserve-ci-e2e-test/splitter-graph-a935a-584b858c64-b458q" Apr 17 17:56:12.308061 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:56:12.308004 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7d71cc4b-fcea-41f7-b193-ab0017ab8850-proxy-tls\") pod \"splitter-graph-a935a-584b858c64-b458q\" (UID: \"7d71cc4b-fcea-41f7-b193-ab0017ab8850\") " pod="kserve-ci-e2e-test/splitter-graph-a935a-584b858c64-b458q" Apr 17 17:56:12.409097 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:56:12.409064 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d71cc4b-fcea-41f7-b193-ab0017ab8850-openshift-service-ca-bundle\") pod 
\"splitter-graph-a935a-584b858c64-b458q\" (UID: \"7d71cc4b-fcea-41f7-b193-ab0017ab8850\") " pod="kserve-ci-e2e-test/splitter-graph-a935a-584b858c64-b458q" Apr 17 17:56:12.409252 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:56:12.409152 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7d71cc4b-fcea-41f7-b193-ab0017ab8850-proxy-tls\") pod \"splitter-graph-a935a-584b858c64-b458q\" (UID: \"7d71cc4b-fcea-41f7-b193-ab0017ab8850\") " pod="kserve-ci-e2e-test/splitter-graph-a935a-584b858c64-b458q" Apr 17 17:56:12.409300 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:56:12.409281 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/splitter-graph-a935a-serving-cert: secret "splitter-graph-a935a-serving-cert" not found Apr 17 17:56:12.409360 ip-10-0-140-33 kubenswrapper[2575]: E0417 17:56:12.409349 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d71cc4b-fcea-41f7-b193-ab0017ab8850-proxy-tls podName:7d71cc4b-fcea-41f7-b193-ab0017ab8850 nodeName:}" failed. No retries permitted until 2026-04-17 17:56:12.909327318 +0000 UTC m=+1942.435007924 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/7d71cc4b-fcea-41f7-b193-ab0017ab8850-proxy-tls") pod "splitter-graph-a935a-584b858c64-b458q" (UID: "7d71cc4b-fcea-41f7-b193-ab0017ab8850") : secret "splitter-graph-a935a-serving-cert" not found Apr 17 17:56:12.409654 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:56:12.409634 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d71cc4b-fcea-41f7-b193-ab0017ab8850-openshift-service-ca-bundle\") pod \"splitter-graph-a935a-584b858c64-b458q\" (UID: \"7d71cc4b-fcea-41f7-b193-ab0017ab8850\") " pod="kserve-ci-e2e-test/splitter-graph-a935a-584b858c64-b458q" Apr 17 17:56:12.913222 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:56:12.913182 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7d71cc4b-fcea-41f7-b193-ab0017ab8850-proxy-tls\") pod \"splitter-graph-a935a-584b858c64-b458q\" (UID: \"7d71cc4b-fcea-41f7-b193-ab0017ab8850\") " pod="kserve-ci-e2e-test/splitter-graph-a935a-584b858c64-b458q" Apr 17 17:56:12.915523 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:56:12.915502 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7d71cc4b-fcea-41f7-b193-ab0017ab8850-proxy-tls\") pod \"splitter-graph-a935a-584b858c64-b458q\" (UID: \"7d71cc4b-fcea-41f7-b193-ab0017ab8850\") " pod="kserve-ci-e2e-test/splitter-graph-a935a-584b858c64-b458q" Apr 17 17:56:13.181718 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:56:13.181635 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-a935a-584b858c64-b458q" Apr 17 17:56:13.312940 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:56:13.312908 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-a935a-584b858c64-b458q"] Apr 17 17:56:13.317370 ip-10-0-140-33 kubenswrapper[2575]: W0417 17:56:13.317345 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d71cc4b_fcea_41f7_b193_ab0017ab8850.slice/crio-15bf3af7ce93c3296e3bad295b9eec9f0cd42f8148643ac3190b9f6e2b66c0fc WatchSource:0}: Error finding container 15bf3af7ce93c3296e3bad295b9eec9f0cd42f8148643ac3190b9f6e2b66c0fc: Status 404 returned error can't find the container with id 15bf3af7ce93c3296e3bad295b9eec9f0cd42f8148643ac3190b9f6e2b66c0fc Apr 17 17:56:13.668970 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:56:13.668914 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-a935a-584b858c64-b458q" event={"ID":"7d71cc4b-fcea-41f7-b193-ab0017ab8850","Type":"ContainerStarted","Data":"d0a40dd45ff9e5176459c3d54d8b41dd739835c4974216995124a985085ed10a"} Apr 17 17:56:13.668970 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:56:13.668971 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-a935a-584b858c64-b458q" event={"ID":"7d71cc4b-fcea-41f7-b193-ab0017ab8850","Type":"ContainerStarted","Data":"15bf3af7ce93c3296e3bad295b9eec9f0cd42f8148643ac3190b9f6e2b66c0fc"} Apr 17 17:56:13.669195 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:56:13.669002 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-a935a-584b858c64-b458q" Apr 17 17:56:13.697100 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:56:13.697059 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/splitter-graph-a935a-584b858c64-b458q" 
podStartSLOduration=1.697046894 podStartE2EDuration="1.697046894s" podCreationTimestamp="2026-04-17 17:56:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:56:13.692051616 +0000 UTC m=+1943.217732234" watchObservedRunningTime="2026-04-17 17:56:13.697046894 +0000 UTC m=+1943.222727513" Apr 17 17:56:19.677363 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:56:19.677333 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/splitter-graph-a935a-584b858c64-b458q" Apr 17 17:58:51.134042 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:58:51.134007 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctj42_a9e0a184-477b-45d6-835a-1606a973a5cf/console-operator/1.log" Apr 17 17:58:51.139285 ip-10-0-140-33 kubenswrapper[2575]: I0417 17:58:51.139266 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctj42_a9e0a184-477b-45d6-835a-1606a973a5cf/console-operator/1.log" Apr 17 18:03:51.155620 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:03:51.155513 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctj42_a9e0a184-477b-45d6-835a-1606a973a5cf/console-operator/1.log" Apr 17 18:03:51.163733 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:03:51.163702 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctj42_a9e0a184-477b-45d6-835a-1606a973a5cf/console-operator/1.log" Apr 17 18:04:26.714499 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:26.714410 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-a935a-584b858c64-b458q"] Apr 17 18:04:26.715096 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:26.714729 2575 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="kserve-ci-e2e-test/splitter-graph-a935a-584b858c64-b458q" podUID="7d71cc4b-fcea-41f7-b193-ab0017ab8850" containerName="splitter-graph-a935a" containerID="cri-o://d0a40dd45ff9e5176459c3d54d8b41dd739835c4974216995124a985085ed10a" gracePeriod=30 Apr 17 18:04:26.802302 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:26.802268 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5"] Apr 17 18:04:26.802577 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:26.802554 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5" podUID="c2d86f51-524c-4513-84e7-f560a56dcde3" containerName="kserve-container" containerID="cri-o://2f8a042f5e7742707fae5f21e3e8afe54dfb338c572d48ed3b594052833482d6" gracePeriod=30 Apr 17 18:04:26.802687 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:26.802611 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5" podUID="c2d86f51-524c-4513-84e7-f560a56dcde3" containerName="kube-rbac-proxy" containerID="cri-o://80f43a34c9a2373a1d826178c544326a93f6e742c3cf999347568a5d086d653c" gracePeriod=30 Apr 17 18:04:27.295190 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:27.295159 2575 generic.go:358] "Generic (PLEG): container finished" podID="c2d86f51-524c-4513-84e7-f560a56dcde3" containerID="80f43a34c9a2373a1d826178c544326a93f6e742c3cf999347568a5d086d653c" exitCode=2 Apr 17 18:04:27.295392 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:27.295229 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5" event={"ID":"c2d86f51-524c-4513-84e7-f560a56dcde3","Type":"ContainerDied","Data":"80f43a34c9a2373a1d826178c544326a93f6e742c3cf999347568a5d086d653c"} Apr 17 18:04:29.676072 ip-10-0-140-33 
kubenswrapper[2575]: I0417 18:04:29.676036 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-a935a-584b858c64-b458q" podUID="7d71cc4b-fcea-41f7-b193-ab0017ab8850" containerName="splitter-graph-a935a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 18:04:29.845860 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:29.845814 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5" Apr 17 18:04:29.928467 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:29.928374 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2d86f51-524c-4513-84e7-f560a56dcde3-proxy-tls\") pod \"c2d86f51-524c-4513-84e7-f560a56dcde3\" (UID: \"c2d86f51-524c-4513-84e7-f560a56dcde3\") " Apr 17 18:04:29.928636 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:29.928499 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsbsj\" (UniqueName: \"kubernetes.io/projected/c2d86f51-524c-4513-84e7-f560a56dcde3-kube-api-access-dsbsj\") pod \"c2d86f51-524c-4513-84e7-f560a56dcde3\" (UID: \"c2d86f51-524c-4513-84e7-f560a56dcde3\") " Apr 17 18:04:29.928636 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:29.928527 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"success-200-isvc-a935a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c2d86f51-524c-4513-84e7-f560a56dcde3-success-200-isvc-a935a-kube-rbac-proxy-sar-config\") pod \"c2d86f51-524c-4513-84e7-f560a56dcde3\" (UID: \"c2d86f51-524c-4513-84e7-f560a56dcde3\") " Apr 17 18:04:29.928955 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:29.928928 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/c2d86f51-524c-4513-84e7-f560a56dcde3-success-200-isvc-a935a-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-a935a-kube-rbac-proxy-sar-config") pod "c2d86f51-524c-4513-84e7-f560a56dcde3" (UID: "c2d86f51-524c-4513-84e7-f560a56dcde3"). InnerVolumeSpecName "success-200-isvc-a935a-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:04:29.930562 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:29.930536 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2d86f51-524c-4513-84e7-f560a56dcde3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c2d86f51-524c-4513-84e7-f560a56dcde3" (UID: "c2d86f51-524c-4513-84e7-f560a56dcde3"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:04:29.930562 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:29.930540 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2d86f51-524c-4513-84e7-f560a56dcde3-kube-api-access-dsbsj" (OuterVolumeSpecName: "kube-api-access-dsbsj") pod "c2d86f51-524c-4513-84e7-f560a56dcde3" (UID: "c2d86f51-524c-4513-84e7-f560a56dcde3"). InnerVolumeSpecName "kube-api-access-dsbsj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:04:30.029129 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:30.029097 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dsbsj\" (UniqueName: \"kubernetes.io/projected/c2d86f51-524c-4513-84e7-f560a56dcde3-kube-api-access-dsbsj\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 18:04:30.029129 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:30.029128 2575 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-a935a-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c2d86f51-524c-4513-84e7-f560a56dcde3-success-200-isvc-a935a-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 18:04:30.029324 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:30.029140 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2d86f51-524c-4513-84e7-f560a56dcde3-proxy-tls\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 18:04:30.307002 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:30.306963 2575 generic.go:358] "Generic (PLEG): container finished" podID="c2d86f51-524c-4513-84e7-f560a56dcde3" containerID="2f8a042f5e7742707fae5f21e3e8afe54dfb338c572d48ed3b594052833482d6" exitCode=0 Apr 17 18:04:30.307174 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:30.307041 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5" event={"ID":"c2d86f51-524c-4513-84e7-f560a56dcde3","Type":"ContainerDied","Data":"2f8a042f5e7742707fae5f21e3e8afe54dfb338c572d48ed3b594052833482d6"} Apr 17 18:04:30.307174 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:30.307062 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5" Apr 17 18:04:30.307174 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:30.307074 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5" event={"ID":"c2d86f51-524c-4513-84e7-f560a56dcde3","Type":"ContainerDied","Data":"03095707d55f2493fb7a1fb7b4cbffe323f8109d3e35069d4d236bbb10dc5eba"} Apr 17 18:04:30.307174 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:30.307092 2575 scope.go:117] "RemoveContainer" containerID="80f43a34c9a2373a1d826178c544326a93f6e742c3cf999347568a5d086d653c" Apr 17 18:04:30.315488 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:30.315471 2575 scope.go:117] "RemoveContainer" containerID="2f8a042f5e7742707fae5f21e3e8afe54dfb338c572d48ed3b594052833482d6" Apr 17 18:04:30.322364 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:30.322343 2575 scope.go:117] "RemoveContainer" containerID="80f43a34c9a2373a1d826178c544326a93f6e742c3cf999347568a5d086d653c" Apr 17 18:04:30.322593 ip-10-0-140-33 kubenswrapper[2575]: E0417 18:04:30.322574 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80f43a34c9a2373a1d826178c544326a93f6e742c3cf999347568a5d086d653c\": container with ID starting with 80f43a34c9a2373a1d826178c544326a93f6e742c3cf999347568a5d086d653c not found: ID does not exist" containerID="80f43a34c9a2373a1d826178c544326a93f6e742c3cf999347568a5d086d653c" Apr 17 18:04:30.322643 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:30.322603 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80f43a34c9a2373a1d826178c544326a93f6e742c3cf999347568a5d086d653c"} err="failed to get container status \"80f43a34c9a2373a1d826178c544326a93f6e742c3cf999347568a5d086d653c\": rpc error: code = NotFound desc = could not find container 
\"80f43a34c9a2373a1d826178c544326a93f6e742c3cf999347568a5d086d653c\": container with ID starting with 80f43a34c9a2373a1d826178c544326a93f6e742c3cf999347568a5d086d653c not found: ID does not exist" Apr 17 18:04:30.322643 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:30.322621 2575 scope.go:117] "RemoveContainer" containerID="2f8a042f5e7742707fae5f21e3e8afe54dfb338c572d48ed3b594052833482d6" Apr 17 18:04:30.322807 ip-10-0-140-33 kubenswrapper[2575]: E0417 18:04:30.322787 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f8a042f5e7742707fae5f21e3e8afe54dfb338c572d48ed3b594052833482d6\": container with ID starting with 2f8a042f5e7742707fae5f21e3e8afe54dfb338c572d48ed3b594052833482d6 not found: ID does not exist" containerID="2f8a042f5e7742707fae5f21e3e8afe54dfb338c572d48ed3b594052833482d6" Apr 17 18:04:30.322871 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:30.322813 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f8a042f5e7742707fae5f21e3e8afe54dfb338c572d48ed3b594052833482d6"} err="failed to get container status \"2f8a042f5e7742707fae5f21e3e8afe54dfb338c572d48ed3b594052833482d6\": rpc error: code = NotFound desc = could not find container \"2f8a042f5e7742707fae5f21e3e8afe54dfb338c572d48ed3b594052833482d6\": container with ID starting with 2f8a042f5e7742707fae5f21e3e8afe54dfb338c572d48ed3b594052833482d6 not found: ID does not exist" Apr 17 18:04:30.327427 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:30.327405 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5"] Apr 17 18:04:30.329736 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:30.329715 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-a935a-predictor-7c5f58bdc6-7m6x5"] Apr 17 18:04:31.094513 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:31.094472 2575 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2d86f51-524c-4513-84e7-f560a56dcde3" path="/var/lib/kubelet/pods/c2d86f51-524c-4513-84e7-f560a56dcde3/volumes" Apr 17 18:04:34.676545 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:34.676504 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-a935a-584b858c64-b458q" podUID="7d71cc4b-fcea-41f7-b193-ab0017ab8850" containerName="splitter-graph-a935a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 18:04:39.675671 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:39.675630 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-a935a-584b858c64-b458q" podUID="7d71cc4b-fcea-41f7-b193-ab0017ab8850" containerName="splitter-graph-a935a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 18:04:39.676059 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:39.675749 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/splitter-graph-a935a-584b858c64-b458q" Apr 17 18:04:44.675677 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:44.675636 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-a935a-584b858c64-b458q" podUID="7d71cc4b-fcea-41f7-b193-ab0017ab8850" containerName="splitter-graph-a935a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 18:04:49.675638 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:49.675596 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/splitter-graph-a935a-584b858c64-b458q" podUID="7d71cc4b-fcea-41f7-b193-ab0017ab8850" containerName="splitter-graph-a935a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 18:04:54.675992 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:54.675951 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/splitter-graph-a935a-584b858c64-b458q" podUID="7d71cc4b-fcea-41f7-b193-ab0017ab8850" containerName="splitter-graph-a935a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 18:04:56.859387 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:56.859362 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-a935a-584b858c64-b458q" Apr 17 18:04:56.952046 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:56.952005 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7d71cc4b-fcea-41f7-b193-ab0017ab8850-proxy-tls\") pod \"7d71cc4b-fcea-41f7-b193-ab0017ab8850\" (UID: \"7d71cc4b-fcea-41f7-b193-ab0017ab8850\") " Apr 17 18:04:56.952219 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:56.952109 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d71cc4b-fcea-41f7-b193-ab0017ab8850-openshift-service-ca-bundle\") pod \"7d71cc4b-fcea-41f7-b193-ab0017ab8850\" (UID: \"7d71cc4b-fcea-41f7-b193-ab0017ab8850\") " Apr 17 18:04:56.952462 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:56.952436 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d71cc4b-fcea-41f7-b193-ab0017ab8850-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "7d71cc4b-fcea-41f7-b193-ab0017ab8850" (UID: "7d71cc4b-fcea-41f7-b193-ab0017ab8850"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:04:56.954120 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:56.954093 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d71cc4b-fcea-41f7-b193-ab0017ab8850-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7d71cc4b-fcea-41f7-b193-ab0017ab8850" (UID: "7d71cc4b-fcea-41f7-b193-ab0017ab8850"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:04:57.053407 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:57.053373 2575 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d71cc4b-fcea-41f7-b193-ab0017ab8850-openshift-service-ca-bundle\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 18:04:57.053407 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:57.053403 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7d71cc4b-fcea-41f7-b193-ab0017ab8850-proxy-tls\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 18:04:57.395653 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:57.395567 2575 generic.go:358] "Generic (PLEG): container finished" podID="7d71cc4b-fcea-41f7-b193-ab0017ab8850" containerID="d0a40dd45ff9e5176459c3d54d8b41dd739835c4974216995124a985085ed10a" exitCode=0 Apr 17 18:04:57.395653 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:57.395636 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/splitter-graph-a935a-584b858c64-b458q" Apr 17 18:04:57.395851 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:57.395626 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-a935a-584b858c64-b458q" event={"ID":"7d71cc4b-fcea-41f7-b193-ab0017ab8850","Type":"ContainerDied","Data":"d0a40dd45ff9e5176459c3d54d8b41dd739835c4974216995124a985085ed10a"} Apr 17 18:04:57.395851 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:57.395752 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/splitter-graph-a935a-584b858c64-b458q" event={"ID":"7d71cc4b-fcea-41f7-b193-ab0017ab8850","Type":"ContainerDied","Data":"15bf3af7ce93c3296e3bad295b9eec9f0cd42f8148643ac3190b9f6e2b66c0fc"} Apr 17 18:04:57.395851 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:57.395772 2575 scope.go:117] "RemoveContainer" containerID="d0a40dd45ff9e5176459c3d54d8b41dd739835c4974216995124a985085ed10a" Apr 17 18:04:57.403752 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:57.403733 2575 scope.go:117] "RemoveContainer" containerID="d0a40dd45ff9e5176459c3d54d8b41dd739835c4974216995124a985085ed10a" Apr 17 18:04:57.403988 ip-10-0-140-33 kubenswrapper[2575]: E0417 18:04:57.403973 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0a40dd45ff9e5176459c3d54d8b41dd739835c4974216995124a985085ed10a\": container with ID starting with d0a40dd45ff9e5176459c3d54d8b41dd739835c4974216995124a985085ed10a not found: ID does not exist" containerID="d0a40dd45ff9e5176459c3d54d8b41dd739835c4974216995124a985085ed10a" Apr 17 18:04:57.404046 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:57.403996 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0a40dd45ff9e5176459c3d54d8b41dd739835c4974216995124a985085ed10a"} err="failed to get container status 
\"d0a40dd45ff9e5176459c3d54d8b41dd739835c4974216995124a985085ed10a\": rpc error: code = NotFound desc = could not find container \"d0a40dd45ff9e5176459c3d54d8b41dd739835c4974216995124a985085ed10a\": container with ID starting with d0a40dd45ff9e5176459c3d54d8b41dd739835c4974216995124a985085ed10a not found: ID does not exist" Apr 17 18:04:57.410933 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:57.410904 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-a935a-584b858c64-b458q"] Apr 17 18:04:57.415635 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:57.415614 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/splitter-graph-a935a-584b858c64-b458q"] Apr 17 18:04:59.093992 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:04:59.093953 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d71cc4b-fcea-41f7-b193-ab0017ab8850" path="/var/lib/kubelet/pods/7d71cc4b-fcea-41f7-b193-ab0017ab8850/volumes" Apr 17 18:08:51.176392 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:08:51.176279 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctj42_a9e0a184-477b-45d6-835a-1606a973a5cf/console-operator/1.log" Apr 17 18:08:51.185624 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:08:51.185605 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctj42_a9e0a184-477b-45d6-835a-1606a973a5cf/console-operator/1.log" Apr 17 18:12:10.800740 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:10.800543 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-0283f-7dfb866856-9q6fw"] Apr 17 18:12:10.801313 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:10.800887 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/switch-graph-0283f-7dfb866856-9q6fw" podUID="7b1804bb-494c-452d-9ab6-c616587d7f91" 
containerName="switch-graph-0283f" containerID="cri-o://299650f52becbc993f2b8991ddd86ce6caca4a6948c2a22724519304c692674c" gracePeriod=30 Apr 17 18:12:10.921470 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:10.921433 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27"] Apr 17 18:12:10.921876 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:10.921790 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27" podUID="d97c20bd-af97-487f-8815-e0497431b94d" containerName="kserve-container" containerID="cri-o://7f689c84389d0451d1746f13b04e1c6f7af4249be976818ffe7321a2341dd0dc" gracePeriod=30 Apr 17 18:12:10.922029 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:10.921856 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27" podUID="d97c20bd-af97-487f-8815-e0497431b94d" containerName="kube-rbac-proxy" containerID="cri-o://0518db7470d82f5a892a5d4a5a3e2d2e9a48cbe7e15c5a8668ebb9e5bb078db5" gracePeriod=30 Apr 17 18:12:11.832201 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:11.832165 2575 generic.go:358] "Generic (PLEG): container finished" podID="d97c20bd-af97-487f-8815-e0497431b94d" containerID="0518db7470d82f5a892a5d4a5a3e2d2e9a48cbe7e15c5a8668ebb9e5bb078db5" exitCode=2 Apr 17 18:12:11.832586 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:11.832233 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27" event={"ID":"d97c20bd-af97-487f-8815-e0497431b94d","Type":"ContainerDied","Data":"0518db7470d82f5a892a5d4a5a3e2d2e9a48cbe7e15c5a8668ebb9e5bb078db5"} Apr 17 18:12:13.607881 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:13.607837 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/switch-graph-0283f-7dfb866856-9q6fw" podUID="7b1804bb-494c-452d-9ab6-c616587d7f91" containerName="switch-graph-0283f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 18:12:13.841930 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:13.841898 2575 generic.go:358] "Generic (PLEG): container finished" podID="d97c20bd-af97-487f-8815-e0497431b94d" containerID="7f689c84389d0451d1746f13b04e1c6f7af4249be976818ffe7321a2341dd0dc" exitCode=0 Apr 17 18:12:13.842074 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:13.841975 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27" event={"ID":"d97c20bd-af97-487f-8815-e0497431b94d","Type":"ContainerDied","Data":"7f689c84389d0451d1746f13b04e1c6f7af4249be976818ffe7321a2341dd0dc"} Apr 17 18:12:13.861768 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:13.861716 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27" Apr 17 18:12:13.907230 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:13.907205 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtrx4\" (UniqueName: \"kubernetes.io/projected/d97c20bd-af97-487f-8815-e0497431b94d-kube-api-access-vtrx4\") pod \"d97c20bd-af97-487f-8815-e0497431b94d\" (UID: \"d97c20bd-af97-487f-8815-e0497431b94d\") " Apr 17 18:12:13.907357 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:13.907239 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d97c20bd-af97-487f-8815-e0497431b94d-proxy-tls\") pod \"d97c20bd-af97-487f-8815-e0497431b94d\" (UID: \"d97c20bd-af97-487f-8815-e0497431b94d\") " Apr 17 18:12:13.907357 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:13.907295 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"success-200-isvc-0283f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d97c20bd-af97-487f-8815-e0497431b94d-success-200-isvc-0283f-kube-rbac-proxy-sar-config\") pod \"d97c20bd-af97-487f-8815-e0497431b94d\" (UID: \"d97c20bd-af97-487f-8815-e0497431b94d\") " Apr 17 18:12:13.907687 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:13.907652 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d97c20bd-af97-487f-8815-e0497431b94d-success-200-isvc-0283f-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "success-200-isvc-0283f-kube-rbac-proxy-sar-config") pod "d97c20bd-af97-487f-8815-e0497431b94d" (UID: "d97c20bd-af97-487f-8815-e0497431b94d"). InnerVolumeSpecName "success-200-isvc-0283f-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:12:13.909312 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:13.909281 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d97c20bd-af97-487f-8815-e0497431b94d-kube-api-access-vtrx4" (OuterVolumeSpecName: "kube-api-access-vtrx4") pod "d97c20bd-af97-487f-8815-e0497431b94d" (UID: "d97c20bd-af97-487f-8815-e0497431b94d"). InnerVolumeSpecName "kube-api-access-vtrx4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:12:13.909312 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:13.909298 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d97c20bd-af97-487f-8815-e0497431b94d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d97c20bd-af97-487f-8815-e0497431b94d" (UID: "d97c20bd-af97-487f-8815-e0497431b94d"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:12:14.007803 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:14.007775 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vtrx4\" (UniqueName: \"kubernetes.io/projected/d97c20bd-af97-487f-8815-e0497431b94d-kube-api-access-vtrx4\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 18:12:14.007803 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:14.007799 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d97c20bd-af97-487f-8815-e0497431b94d-proxy-tls\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 18:12:14.007977 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:14.007811 2575 reconciler_common.go:299] "Volume detached for volume \"success-200-isvc-0283f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d97c20bd-af97-487f-8815-e0497431b94d-success-200-isvc-0283f-kube-rbac-proxy-sar-config\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 18:12:14.846723 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:14.846688 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27" event={"ID":"d97c20bd-af97-487f-8815-e0497431b94d","Type":"ContainerDied","Data":"7c305b77c2b5d4ddd64fa770cc69843a0cac9930d7a466b07cb686628c7c5106"} Apr 17 18:12:14.847139 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:14.846737 2575 scope.go:117] "RemoveContainer" containerID="0518db7470d82f5a892a5d4a5a3e2d2e9a48cbe7e15c5a8668ebb9e5bb078db5" Apr 17 18:12:14.847139 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:14.846761 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27" Apr 17 18:12:14.855723 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:14.855705 2575 scope.go:117] "RemoveContainer" containerID="7f689c84389d0451d1746f13b04e1c6f7af4249be976818ffe7321a2341dd0dc" Apr 17 18:12:14.867488 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:14.867466 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27"] Apr 17 18:12:14.870579 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:14.870556 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/success-200-isvc-0283f-predictor-bfc486968-gvj27"] Apr 17 18:12:15.094665 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:15.094632 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d97c20bd-af97-487f-8815-e0497431b94d" path="/var/lib/kubelet/pods/d97c20bd-af97-487f-8815-e0497431b94d/volumes" Apr 17 18:12:18.607977 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:18.607934 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-0283f-7dfb866856-9q6fw" podUID="7b1804bb-494c-452d-9ab6-c616587d7f91" containerName="switch-graph-0283f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 18:12:23.608575 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:23.608535 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-0283f-7dfb866856-9q6fw" podUID="7b1804bb-494c-452d-9ab6-c616587d7f91" containerName="switch-graph-0283f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 18:12:23.609047 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:23.608650 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/switch-graph-0283f-7dfb866856-9q6fw" Apr 17 18:12:26.450089 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:26.450060 2575 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-0283f-7dfb866856-9q6fw_7b1804bb-494c-452d-9ab6-c616587d7f91/switch-graph-0283f/0.log" Apr 17 18:12:27.231332 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:27.231300 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-0283f-7dfb866856-9q6fw_7b1804bb-494c-452d-9ab6-c616587d7f91/switch-graph-0283f/0.log" Apr 17 18:12:28.004512 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:28.004478 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-0283f-7dfb866856-9q6fw_7b1804bb-494c-452d-9ab6-c616587d7f91/switch-graph-0283f/0.log" Apr 17 18:12:28.608066 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:28.608023 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-0283f-7dfb866856-9q6fw" podUID="7b1804bb-494c-452d-9ab6-c616587d7f91" containerName="switch-graph-0283f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 18:12:28.784007 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:28.783963 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-0283f-7dfb866856-9q6fw_7b1804bb-494c-452d-9ab6-c616587d7f91/switch-graph-0283f/0.log" Apr 17 18:12:29.548189 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:29.548156 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-0283f-7dfb866856-9q6fw_7b1804bb-494c-452d-9ab6-c616587d7f91/switch-graph-0283f/0.log" Apr 17 18:12:30.305939 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:30.305904 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-0283f-7dfb866856-9q6fw_7b1804bb-494c-452d-9ab6-c616587d7f91/switch-graph-0283f/0.log" Apr 17 18:12:31.098513 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:31.098472 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_switch-graph-0283f-7dfb866856-9q6fw_7b1804bb-494c-452d-9ab6-c616587d7f91/switch-graph-0283f/0.log" Apr 17 18:12:31.877642 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:31.877612 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-0283f-7dfb866856-9q6fw_7b1804bb-494c-452d-9ab6-c616587d7f91/switch-graph-0283f/0.log" Apr 17 18:12:32.647118 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:32.647089 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-0283f-7dfb866856-9q6fw_7b1804bb-494c-452d-9ab6-c616587d7f91/switch-graph-0283f/0.log" Apr 17 18:12:33.426416 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:33.426382 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-0283f-7dfb866856-9q6fw_7b1804bb-494c-452d-9ab6-c616587d7f91/switch-graph-0283f/0.log" Apr 17 18:12:33.608342 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:33.608300 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/switch-graph-0283f-7dfb866856-9q6fw" podUID="7b1804bb-494c-452d-9ab6-c616587d7f91" containerName="switch-graph-0283f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 18:12:34.249945 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:34.249912 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-0283f-7dfb866856-9q6fw_7b1804bb-494c-452d-9ab6-c616587d7f91/switch-graph-0283f/0.log" Apr 17 18:12:35.056659 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:35.056624 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_switch-graph-0283f-7dfb866856-9q6fw_7b1804bb-494c-452d-9ab6-c616587d7f91/switch-graph-0283f/0.log" Apr 17 18:12:38.608052 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:38.608014 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/switch-graph-0283f-7dfb866856-9q6fw" podUID="7b1804bb-494c-452d-9ab6-c616587d7f91" containerName="switch-graph-0283f" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 18:12:39.981339 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:39.981305 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-xkj74_909ae085-82dd-4695-a517-7bf565c61c39/global-pull-secret-syncer/0.log" Apr 17 18:12:40.057972 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:40.057945 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-nhs5m_ccb0409d-b96e-4a69-8ff3-e260b869ecdf/konnectivity-agent/0.log" Apr 17 18:12:40.167273 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:40.167243 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-140-33.ec2.internal_f04a4241565e19b6ce163e5a36620c34/haproxy/0.log" Apr 17 18:12:40.935671 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:40.935641 2575 generic.go:358] "Generic (PLEG): container finished" podID="7b1804bb-494c-452d-9ab6-c616587d7f91" containerID="299650f52becbc993f2b8991ddd86ce6caca4a6948c2a22724519304c692674c" exitCode=0 Apr 17 18:12:40.935855 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:40.935716 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-0283f-7dfb866856-9q6fw" event={"ID":"7b1804bb-494c-452d-9ab6-c616587d7f91","Type":"ContainerDied","Data":"299650f52becbc993f2b8991ddd86ce6caca4a6948c2a22724519304c692674c"} Apr 17 18:12:40.935855 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:40.935748 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/switch-graph-0283f-7dfb866856-9q6fw" event={"ID":"7b1804bb-494c-452d-9ab6-c616587d7f91","Type":"ContainerDied","Data":"a50d48fc2f00aa8037916c59bf4febc253c4ee0c5918ec5c3009149e681fdfa7"} Apr 17 18:12:40.935855 ip-10-0-140-33 kubenswrapper[2575]: 
I0417 18:12:40.935758 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a50d48fc2f00aa8037916c59bf4febc253c4ee0c5918ec5c3009149e681fdfa7" Apr 17 18:12:40.948939 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:40.948915 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-0283f-7dfb866856-9q6fw" Apr 17 18:12:41.018529 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:41.018501 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b1804bb-494c-452d-9ab6-c616587d7f91-openshift-service-ca-bundle\") pod \"7b1804bb-494c-452d-9ab6-c616587d7f91\" (UID: \"7b1804bb-494c-452d-9ab6-c616587d7f91\") " Apr 17 18:12:41.018969 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:41.018552 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b1804bb-494c-452d-9ab6-c616587d7f91-proxy-tls\") pod \"7b1804bb-494c-452d-9ab6-c616587d7f91\" (UID: \"7b1804bb-494c-452d-9ab6-c616587d7f91\") " Apr 17 18:12:41.018969 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:41.018860 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b1804bb-494c-452d-9ab6-c616587d7f91-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "7b1804bb-494c-452d-9ab6-c616587d7f91" (UID: "7b1804bb-494c-452d-9ab6-c616587d7f91"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:12:41.020585 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:41.020564 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b1804bb-494c-452d-9ab6-c616587d7f91-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7b1804bb-494c-452d-9ab6-c616587d7f91" (UID: "7b1804bb-494c-452d-9ab6-c616587d7f91"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:12:41.119877 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:41.119817 2575 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b1804bb-494c-452d-9ab6-c616587d7f91-openshift-service-ca-bundle\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 18:12:41.119877 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:41.119871 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7b1804bb-494c-452d-9ab6-c616587d7f91-proxy-tls\") on node \"ip-10-0-140-33.ec2.internal\" DevicePath \"\"" Apr 17 18:12:41.939199 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:41.939164 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/switch-graph-0283f-7dfb866856-9q6fw" Apr 17 18:12:41.953390 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:41.953355 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/switch-graph-0283f-7dfb866856-9q6fw"] Apr 17 18:12:41.958994 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:41.958971 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/switch-graph-0283f-7dfb866856-9q6fw"] Apr 17 18:12:43.094369 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:43.094336 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b1804bb-494c-452d-9ab6-c616587d7f91" path="/var/lib/kubelet/pods/7b1804bb-494c-452d-9ab6-c616587d7f91/volumes" Apr 17 18:12:43.683619 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:43.683586 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-hclv4_3d9c3135-c5a6-4882-8da7-486b724f3469/cluster-monitoring-operator/0.log" Apr 17 18:12:43.913587 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:43.913557 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-x2x5j_3af1ae5b-3d27-42ae-84c6-15885f57a08c/node-exporter/0.log" Apr 17 18:12:43.933131 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:43.933103 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-x2x5j_3af1ae5b-3d27-42ae-84c6-15885f57a08c/kube-rbac-proxy/0.log" Apr 17 18:12:43.956417 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:43.956355 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-x2x5j_3af1ae5b-3d27-42ae-84c6-15885f57a08c/init-textfile/0.log" Apr 17 18:12:44.262316 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:44.262292 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-m7vkk_15d3f0c8-e1a6-41dc-b07d-e2ec68fa4a9c/prometheus-operator/0.log" Apr 17 18:12:44.280354 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:44.280332 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-m7vkk_15d3f0c8-e1a6-41dc-b07d-e2ec68fa4a9c/kube-rbac-proxy/0.log" Apr 17 18:12:44.408268 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:44.408244 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-dcc87df79-scgw5_f5901f9d-b096-438d-a602-d00595295e12/thanos-query/0.log" Apr 17 18:12:44.426052 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:44.426030 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-dcc87df79-scgw5_f5901f9d-b096-438d-a602-d00595295e12/kube-rbac-proxy-web/0.log" Apr 17 18:12:44.447135 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:44.447114 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-dcc87df79-scgw5_f5901f9d-b096-438d-a602-d00595295e12/kube-rbac-proxy/0.log" Apr 17 18:12:44.468222 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:44.468194 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-dcc87df79-scgw5_f5901f9d-b096-438d-a602-d00595295e12/prom-label-proxy/0.log" Apr 17 18:12:44.488059 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:44.488036 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-dcc87df79-scgw5_f5901f9d-b096-438d-a602-d00595295e12/kube-rbac-proxy-rules/0.log" Apr 17 18:12:44.509684 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:44.509659 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-dcc87df79-scgw5_f5901f9d-b096-438d-a602-d00595295e12/kube-rbac-proxy-metrics/0.log" Apr 17 18:12:45.691605 
ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:45.691573 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-5kx2x_663900d4-af09-4c17-bf7a-cf4e08cc616a/networking-console-plugin/0.log" Apr 17 18:12:46.107891 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:46.107860 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctj42_a9e0a184-477b-45d6-835a-1606a973a5cf/console-operator/1.log" Apr 17 18:12:46.113247 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:46.113227 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ctj42_a9e0a184-477b-45d6-835a-1606a973a5cf/console-operator/2.log" Apr 17 18:12:46.495968 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:46.495897 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-gp2f2_6e03d97b-c9c0-40c2-bba9-99c7c0c1b298/download-server/0.log" Apr 17 18:12:46.868671 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:46.868637 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-nk6g8_fc8eeee7-5da1-42b9-bbb1-9eaca3ec2a13/volume-data-source-validator/0.log" Apr 17 18:12:47.177786 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.177685 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kkzt9/perf-node-gather-daemonset-xst64"] Apr 17 18:12:47.178407 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.178384 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2d86f51-524c-4513-84e7-f560a56dcde3" containerName="kube-rbac-proxy" Apr 17 18:12:47.178502 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.178416 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2d86f51-524c-4513-84e7-f560a56dcde3" containerName="kube-rbac-proxy" Apr 17 
Apr 17 18:12:47.178502 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.178448 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d97c20bd-af97-487f-8815-e0497431b94d" containerName="kube-rbac-proxy"
Apr 17 18:12:47.178502 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.178458 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d97c20bd-af97-487f-8815-e0497431b94d" containerName="kube-rbac-proxy"
Apr 17 18:12:47.178502 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.178469 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d71cc4b-fcea-41f7-b193-ab0017ab8850" containerName="splitter-graph-a935a"
Apr 17 18:12:47.178502 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.178477 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d71cc4b-fcea-41f7-b193-ab0017ab8850" containerName="splitter-graph-a935a"
Apr 17 18:12:47.178502 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.178501 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d97c20bd-af97-487f-8815-e0497431b94d" containerName="kserve-container"
Apr 17 18:12:47.178791 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.178510 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d97c20bd-af97-487f-8815-e0497431b94d" containerName="kserve-container"
Apr 17 18:12:47.178791 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.178532 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2d86f51-524c-4513-84e7-f560a56dcde3" containerName="kserve-container"
Apr 17 18:12:47.178791 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.178540 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2d86f51-524c-4513-84e7-f560a56dcde3" containerName="kserve-container"
Apr 17 18:12:47.178791 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.178549 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b1804bb-494c-452d-9ab6-c616587d7f91" containerName="switch-graph-0283f"
Apr 17 18:12:47.178791 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.178557 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b1804bb-494c-452d-9ab6-c616587d7f91" containerName="switch-graph-0283f"
Apr 17 18:12:47.178791 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.178710 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="7b1804bb-494c-452d-9ab6-c616587d7f91" containerName="switch-graph-0283f"
Apr 17 18:12:47.178791 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.178724 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d97c20bd-af97-487f-8815-e0497431b94d" containerName="kube-rbac-proxy"
Apr 17 18:12:47.178791 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.178742 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c2d86f51-524c-4513-84e7-f560a56dcde3" containerName="kube-rbac-proxy"
Apr 17 18:12:47.178791 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.178752 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c2d86f51-524c-4513-84e7-f560a56dcde3" containerName="kserve-container"
Apr 17 18:12:47.178791 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.178771 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="7d71cc4b-fcea-41f7-b193-ab0017ab8850" containerName="splitter-graph-a935a"
Apr 17 18:12:47.178791 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.178783 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d97c20bd-af97-487f-8815-e0497431b94d" containerName="kserve-container"
Apr 17 18:12:47.183657 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.183635 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-xst64"
Apr 17 18:12:47.186938 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.186919 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kkzt9\"/\"kube-root-ca.crt\""
Apr 17 18:12:47.187468 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.187453 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-kkzt9\"/\"default-dockercfg-jtmx4\""
Apr 17 18:12:47.187729 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.187712 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kkzt9\"/\"openshift-service-ca.crt\""
Apr 17 18:12:47.191000 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.190977 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kkzt9/perf-node-gather-daemonset-xst64"]
Apr 17 18:12:47.267472 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.267443 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t662\" (UniqueName: \"kubernetes.io/projected/21ccd6dc-c49b-4c72-8a6e-1e233ba101cc-kube-api-access-7t662\") pod \"perf-node-gather-daemonset-xst64\" (UID: \"21ccd6dc-c49b-4c72-8a6e-1e233ba101cc\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-xst64"
Apr 17 18:12:47.267656 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.267498 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/21ccd6dc-c49b-4c72-8a6e-1e233ba101cc-sys\") pod \"perf-node-gather-daemonset-xst64\" (UID: \"21ccd6dc-c49b-4c72-8a6e-1e233ba101cc\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-xst64"
Apr 17 18:12:47.267656 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.267602 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/21ccd6dc-c49b-4c72-8a6e-1e233ba101cc-proc\") pod \"perf-node-gather-daemonset-xst64\" (UID: \"21ccd6dc-c49b-4c72-8a6e-1e233ba101cc\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-xst64"
Apr 17 18:12:47.267656 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.267642 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/21ccd6dc-c49b-4c72-8a6e-1e233ba101cc-lib-modules\") pod \"perf-node-gather-daemonset-xst64\" (UID: \"21ccd6dc-c49b-4c72-8a6e-1e233ba101cc\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-xst64"
Apr 17 18:12:47.267806 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.267690 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/21ccd6dc-c49b-4c72-8a6e-1e233ba101cc-podres\") pod \"perf-node-gather-daemonset-xst64\" (UID: \"21ccd6dc-c49b-4c72-8a6e-1e233ba101cc\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-xst64"
Apr 17 18:12:47.368544 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.368511 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/21ccd6dc-c49b-4c72-8a6e-1e233ba101cc-podres\") pod \"perf-node-gather-daemonset-xst64\" (UID: \"21ccd6dc-c49b-4c72-8a6e-1e233ba101cc\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-xst64"
Apr 17 18:12:47.368737 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.368552 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7t662\" (UniqueName: \"kubernetes.io/projected/21ccd6dc-c49b-4c72-8a6e-1e233ba101cc-kube-api-access-7t662\") pod \"perf-node-gather-daemonset-xst64\" (UID: \"21ccd6dc-c49b-4c72-8a6e-1e233ba101cc\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-xst64"
Apr 17 18:12:47.368737 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.368594 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/21ccd6dc-c49b-4c72-8a6e-1e233ba101cc-sys\") pod \"perf-node-gather-daemonset-xst64\" (UID: \"21ccd6dc-c49b-4c72-8a6e-1e233ba101cc\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-xst64"
Apr 17 18:12:47.368737 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.368636 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/21ccd6dc-c49b-4c72-8a6e-1e233ba101cc-proc\") pod \"perf-node-gather-daemonset-xst64\" (UID: \"21ccd6dc-c49b-4c72-8a6e-1e233ba101cc\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-xst64"
Apr 17 18:12:47.368737 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.368662 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/21ccd6dc-c49b-4c72-8a6e-1e233ba101cc-lib-modules\") pod \"perf-node-gather-daemonset-xst64\" (UID: \"21ccd6dc-c49b-4c72-8a6e-1e233ba101cc\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-xst64"
Apr 17 18:12:47.368737 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.368689 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/21ccd6dc-c49b-4c72-8a6e-1e233ba101cc-podres\") pod \"perf-node-gather-daemonset-xst64\" (UID: \"21ccd6dc-c49b-4c72-8a6e-1e233ba101cc\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-xst64"
Apr 17 18:12:47.368737 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.368713 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/21ccd6dc-c49b-4c72-8a6e-1e233ba101cc-sys\") pod \"perf-node-gather-daemonset-xst64\" (UID: \"21ccd6dc-c49b-4c72-8a6e-1e233ba101cc\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-xst64"
Apr 17 18:12:47.368737 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.368730 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/21ccd6dc-c49b-4c72-8a6e-1e233ba101cc-proc\") pod \"perf-node-gather-daemonset-xst64\" (UID: \"21ccd6dc-c49b-4c72-8a6e-1e233ba101cc\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-xst64"
Apr 17 18:12:47.369048 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.368774 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/21ccd6dc-c49b-4c72-8a6e-1e233ba101cc-lib-modules\") pod \"perf-node-gather-daemonset-xst64\" (UID: \"21ccd6dc-c49b-4c72-8a6e-1e233ba101cc\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-xst64"
Apr 17 18:12:47.377729 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.377700 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t662\" (UniqueName: \"kubernetes.io/projected/21ccd6dc-c49b-4c72-8a6e-1e233ba101cc-kube-api-access-7t662\") pod \"perf-node-gather-daemonset-xst64\" (UID: \"21ccd6dc-c49b-4c72-8a6e-1e233ba101cc\") " pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-xst64"
Apr 17 18:12:47.494298 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.494205 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-xst64"
Apr 17 18:12:47.614677 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.614610 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kkzt9/perf-node-gather-daemonset-xst64"]
Apr 17 18:12:47.618555 ip-10-0-140-33 kubenswrapper[2575]: W0417 18:12:47.618528 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod21ccd6dc_c49b_4c72_8a6e_1e233ba101cc.slice/crio-bd4dff6afe0964649e69d86a133cead1702eb6a2e67d9a286f52d603dcd322f5 WatchSource:0}: Error finding container bd4dff6afe0964649e69d86a133cead1702eb6a2e67d9a286f52d603dcd322f5: Status 404 returned error can't find the container with id bd4dff6afe0964649e69d86a133cead1702eb6a2e67d9a286f52d603dcd322f5
Apr 17 18:12:47.620313 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.620295 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 18:12:47.705608 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.705583 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wqv6n_cb895382-4679-4cac-97c0-92e3122b7ba0/dns/0.log"
Apr 17 18:12:47.734464 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.734435 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wqv6n_cb895382-4679-4cac-97c0-92e3122b7ba0/kube-rbac-proxy/0.log"
Apr 17 18:12:47.812529 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.812500 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qd2qr_adfae8ba-8d02-4f3c-85a7-b2ae828b0579/dns-node-resolver/0.log"
Apr 17 18:12:47.956917 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.956885 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-xst64" event={"ID":"21ccd6dc-c49b-4c72-8a6e-1e233ba101cc","Type":"ContainerStarted","Data":"174b5cef03e413977b4c6efb67ff028940a217dd7a3641bbbfb40e67e87f867e"}
Apr 17 18:12:47.956917 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.956922 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-xst64" event={"ID":"21ccd6dc-c49b-4c72-8a6e-1e233ba101cc","Type":"ContainerStarted","Data":"bd4dff6afe0964649e69d86a133cead1702eb6a2e67d9a286f52d603dcd322f5"}
Apr 17 18:12:47.957321 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.956949 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-xst64"
Apr 17 18:12:47.974150 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:47.974106 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-xst64" podStartSLOduration=0.974093598 podStartE2EDuration="974.093598ms" podCreationTimestamp="2026-04-17 18:12:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:12:47.972222901 +0000 UTC m=+2937.497903520" watchObservedRunningTime="2026-04-17 18:12:47.974093598 +0000 UTC m=+2937.499774217"
Apr 17 18:12:48.276841 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:48.276797 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fxfbb_70cf02df-8c16-47aa-8b0e-1b1ee895fe07/node-ca/0.log"
Apr 17 18:12:48.977570 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:48.977544 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-6c56c99d76-p95sx_e9a77eac-7075-491c-a2b9-080151e1cac9/router/0.log"
Apr 17 18:12:49.276535 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:49.276490 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-6r887_2d29d8c9-146b-4e7e-988e-c3d984ff39e7/serve-healthcheck-canary/0.log"
Apr 17 18:12:49.665974 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:49.665901 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-9dmmr_eb7ea451-8e84-4c6c-9ca4-85e14c54d30a/insights-operator/0.log"
Apr 17 18:12:49.666765 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:49.666749 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-9dmmr_eb7ea451-8e84-4c6c-9ca4-85e14c54d30a/insights-operator/1.log"
Apr 17 18:12:49.812681 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:49.812655 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-js8tl_75caf319-8363-49ec-8384-f25af95fb9d1/kube-rbac-proxy/0.log"
Apr 17 18:12:49.834141 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:49.834113 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-js8tl_75caf319-8363-49ec-8384-f25af95fb9d1/exporter/0.log"
Apr 17 18:12:49.854564 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:49.854546 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-js8tl_75caf319-8363-49ec-8384-f25af95fb9d1/extractor/0.log"
Apr 17 18:12:51.286353 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:51.286318 2575 scope.go:117] "RemoveContainer" containerID="299650f52becbc993f2b8991ddd86ce6caca4a6948c2a22724519304c692674c"
Apr 17 18:12:51.731497 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:51.731422 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-85dd7cfb4d-pzkh6_4efd0ded-0386-4ccb-9635-7fa20c9a0364/manager/0.log"
Apr 17 18:12:51.772277 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:51.772251 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-5mvbh_f7dcf894-ebef-4a1d-a53a-ba8d755b8497/server/0.log"
Apr 17 18:12:52.003426 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:52.003395 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-cg49l_c4b47ad4-4b7e-46b1-b529-319ee4352e17/manager/0.log"
Apr 17 18:12:53.969997 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:53.969967 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-kkzt9/perf-node-gather-daemonset-xst64"
Apr 17 18:12:55.980690 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:55.980613 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-qrgfm_f3cccdeb-674a-4c4a-882c-679c52c9c0a9/kube-storage-version-migrator-operator/1.log"
Apr 17 18:12:55.981449 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:55.981424 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-qrgfm_f3cccdeb-674a-4c4a-882c-679c52c9c0a9/kube-storage-version-migrator-operator/0.log"
Apr 17 18:12:57.173100 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:57.173071 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hprdr_adeb035c-390a-4439-9413-491fc20cac69/kube-multus-additional-cni-plugins/0.log"
Apr 17 18:12:57.192200 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:57.192174 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hprdr_adeb035c-390a-4439-9413-491fc20cac69/egress-router-binary-copy/0.log"
Apr 17 18:12:57.210040 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:57.210019 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hprdr_adeb035c-390a-4439-9413-491fc20cac69/cni-plugins/0.log"
Apr 17 18:12:57.228156 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:57.228135 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hprdr_adeb035c-390a-4439-9413-491fc20cac69/bond-cni-plugin/0.log"
Apr 17 18:12:57.247873 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:57.247851 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hprdr_adeb035c-390a-4439-9413-491fc20cac69/routeoverride-cni/0.log"
Apr 17 18:12:57.267107 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:57.267085 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hprdr_adeb035c-390a-4439-9413-491fc20cac69/whereabouts-cni-bincopy/0.log"
Apr 17 18:12:57.289030 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:57.289009 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hprdr_adeb035c-390a-4439-9413-491fc20cac69/whereabouts-cni/0.log"
Apr 17 18:12:57.452759 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:57.452685 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wgh2j_3f7324f4-55c2-40da-9869-47ec0880aec3/kube-multus/0.log"
Apr 17 18:12:57.533857 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:57.533812 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-mqlsd_41eeea20-b1c0-4cb6-8da8-a4a26a60423d/network-metrics-daemon/0.log"
Apr 17 18:12:57.550398 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:57.550376 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-mqlsd_41eeea20-b1c0-4cb6-8da8-a4a26a60423d/kube-rbac-proxy/0.log"
Apr 17 18:12:58.694867 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:58.694819 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2pbl_3ff73a5e-853e-4f01-b8d7-e995977da39f/ovn-controller/0.log"
Apr 17 18:12:58.737927 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:58.737902 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2pbl_3ff73a5e-853e-4f01-b8d7-e995977da39f/ovn-acl-logging/0.log"
Apr 17 18:12:58.763397 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:58.763372 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2pbl_3ff73a5e-853e-4f01-b8d7-e995977da39f/kube-rbac-proxy-node/0.log"
Apr 17 18:12:58.804159 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:58.804134 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2pbl_3ff73a5e-853e-4f01-b8d7-e995977da39f/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 18:12:58.844385 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:58.844318 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2pbl_3ff73a5e-853e-4f01-b8d7-e995977da39f/northd/0.log"
Apr 17 18:12:58.876883 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:58.876866 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2pbl_3ff73a5e-853e-4f01-b8d7-e995977da39f/nbdb/0.log"
Apr 17 18:12:58.917619 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:58.917595 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2pbl_3ff73a5e-853e-4f01-b8d7-e995977da39f/sbdb/0.log"
Apr 17 18:12:59.032948 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:12:59.032920 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2pbl_3ff73a5e-853e-4f01-b8d7-e995977da39f/ovnkube-controller/0.log"
Apr 17 18:13:00.287176 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:13:00.287131 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-gbxth_931285f5-ffa4-47a1-9453-a80716fcd1e5/check-endpoints/0.log"
Apr 17 18:13:00.328912 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:13:00.328881 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-tlwhr_e4fd5db6-98ca-462f-b950-10c0fe775718/network-check-target-container/0.log"
Apr 17 18:13:01.216098 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:13:01.216071 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-rj22k_06cbfbab-93e8-4744-be0c-f8d8adfb094d/iptables-alerter/0.log"
Apr 17 18:13:01.847355 ip-10-0-140-33 kubenswrapper[2575]: I0417 18:13:01.847327 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-cx8bv_ae488321-502d-450d-a483-234f0aff8bb3/tuned/0.log"