Apr 22 19:57:55.636298 ip-10-0-128-239 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 19:57:56.067974 ip-10-0-128-239 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:57:56.067974 ip-10-0-128-239 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 19:57:56.067974 ip-10-0-128-239 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:57:56.067974 ip-10-0-128-239 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 19:57:56.067974 ip-10-0-128-239 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:57:56.069284 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.069199 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 19:57:56.073523 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073508 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:57:56.073554 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073524 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:57:56.073554 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073528 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:57:56.073554 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073531 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:57:56.073554 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073534 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:57:56.073554 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073537 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:57:56.073554 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073539 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:57:56.073554 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073542 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:57:56.073554 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073545 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:57:56.073554 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073548 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:57:56.073554 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073551 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:57:56.073554 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073554 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:57:56.073554 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073556 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:57:56.073554 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073559 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:57:56.073861 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073562 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:57:56.073861 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073565 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:57:56.073861 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073569 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:57:56.073861 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073573 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:57:56.073861 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073576 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:57:56.073861 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073579 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:57:56.073861 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073582 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:57:56.073861 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073589 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:57:56.073861 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073592 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:57:56.073861 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073594 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:57:56.073861 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073597 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:57:56.073861 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073599 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:57:56.073861 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073602 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:57:56.073861 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073604 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:57:56.073861 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073607 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:57:56.073861 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073609 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:57:56.073861 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073615 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:57:56.073861 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073618 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:57:56.073861 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073620 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:57:56.073861 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073623 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:57:56.074356 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073626 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:57:56.074356 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073629 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:57:56.074356 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073631 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:57:56.074356 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073634 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:57:56.074356 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073636 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:57:56.074356 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073639 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:57:56.074356 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073641 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:57:56.074356 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073644 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:57:56.074356 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073646 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:57:56.074356 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073649 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:57:56.074356 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073652 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:57:56.074356 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073654 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:57:56.074356 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073657 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:57:56.074356 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073661 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:57:56.074356 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073663 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:57:56.074356 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073666 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:57:56.074356 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073669 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:57:56.074356 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073672 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:57:56.074356 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073675 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:57:56.074356 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073677 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:57:56.074855 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073680 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:57:56.074855 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073682 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:57:56.074855 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073685 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:57:56.074855 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073688 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:57:56.074855 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073690 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:57:56.074855 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073693 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:57:56.074855 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073695 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:57:56.074855 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073698 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:57:56.074855 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073701 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:57:56.074855 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073703 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:57:56.074855 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073706 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:57:56.074855 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073708 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:57:56.074855 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073711 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:57:56.074855 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073713 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:57:56.074855 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073716 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:57:56.074855 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073718 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:57:56.074855 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073721 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:57:56.074855 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073723 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:57:56.074855 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073726 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:57:56.074855 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073728 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:57:56.075347 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073731 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:57:56.075347 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073734 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:57:56.075347 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073736 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:57:56.075347 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073739 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:57:56.075347 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073741 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:57:56.075347 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073746 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:57:56.075347 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073749 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:57:56.075347 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073752 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:57:56.075347 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073754 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:57:56.075347 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073758 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:57:56.075347 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073762 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:57:56.075347 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.073765 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:57:56.075347 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074146 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:57:56.075347 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074151 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:57:56.075347 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074154 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:57:56.075347 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074157 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:57:56.075347 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074160 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:57:56.075347 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074162 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:57:56.075347 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074165 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:57:56.075799 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074168 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:57:56.075799 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074170 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:57:56.075799 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074173 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:57:56.075799 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074176 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:57:56.075799 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074178 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:57:56.075799 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074181 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:57:56.075799 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074183 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:57:56.075799 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074186 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:57:56.075799 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074188 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:57:56.075799 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074191 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:57:56.075799 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074193 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:57:56.075799 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074196 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:57:56.075799 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074198 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:57:56.075799 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074201 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:57:56.075799 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074204 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:57:56.075799 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074206 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:57:56.075799 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074209 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:57:56.075799 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074212 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:57:56.075799 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074216 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:57:56.075799 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074219 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:57:56.076300 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074221 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:57:56.076300 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074224 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:57:56.076300 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074226 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:57:56.076300 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074229 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:57:56.076300 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074231 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:57:56.076300 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074234 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:57:56.076300 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074237 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:57:56.076300 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074239 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:57:56.076300 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074242 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:57:56.076300 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074244 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:57:56.076300 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074247 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:57:56.076300 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074249 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:57:56.076300 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074251 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:57:56.076300 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074254 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:57:56.076300 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074256 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:57:56.076300 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074259 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:57:56.076300 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074261 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:57:56.076300 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074264 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:57:56.076300 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074267 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:57:56.076300 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074269 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:57:56.076785 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074272 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:57:56.076785 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074274 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:57:56.076785 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074277 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:57:56.076785 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074279 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:57:56.076785 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074282 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:57:56.076785 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074284 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:57:56.076785 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074288 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:57:56.076785 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074292 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:57:56.076785 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074295 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:57:56.076785 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074303 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:57:56.076785 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074308 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:57:56.076785 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074311 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:57:56.076785 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074315 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:57:56.076785 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074318 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:57:56.076785 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074321 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:57:56.076785 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074323 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:57:56.076785 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074326 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:57:56.076785 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074329 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:57:56.076785 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074331 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:57:56.077268 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074334 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:57:56.077268 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074336 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:57:56.077268 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074339 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:57:56.077268 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074341 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:57:56.077268 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074344 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:57:56.077268 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074347 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:57:56.077268 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074349 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:57:56.077268 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074352 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:57:56.077268 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074355 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:57:56.077268 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074358 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:57:56.077268 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074361 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:57:56.077268 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074364 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:57:56.077268 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074367 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:57:56.077268 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074369 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:57:56.077268 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074372 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:57:56.077268 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074374 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:57:56.077268 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074377 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:57:56.077268 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074379 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:57:56.077268 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074382 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:57:56.077733 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.074384 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:57:56.077733 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075764 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 19:57:56.077733 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075773 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 19:57:56.077733 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075778 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 19:57:56.077733 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075784 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 19:57:56.077733 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075789 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 19:57:56.077733 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075792 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 19:57:56.077733 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075796 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 19:57:56.077733 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075801 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 19:57:56.077733 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075804 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 19:57:56.077733 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075807 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 19:57:56.077733 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075811 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 19:57:56.077733 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075814 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 19:57:56.077733 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075817 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 19:57:56.077733 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075820 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 22 19:57:56.077733 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075823 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 19:57:56.077733 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075826 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 22 19:57:56.077733 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075829 2575 flags.go:64] FLAG: --cloud-config=""
Apr 22 19:57:56.077733 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075832 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 19:57:56.077733 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075835 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 19:57:56.077733 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075839 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 22 19:57:56.077733 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075842 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 19:57:56.077733 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075845 2575 flags.go:64] FLAG: --config-dir=""
Apr 22 19:57:56.077733 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075848 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 19:57:56.078349 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075851 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 19:57:56.078349 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075855 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 19:57:56.078349 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075858 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 19:57:56.078349 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075861 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 19:57:56.078349 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075864 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 19:57:56.078349 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075867 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 19:57:56.078349 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075870 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 19:57:56.078349 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075873 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 19:57:56.078349 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075876 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 19:57:56.078349 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075879 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 19:57:56.078349 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075883 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 19:57:56.078349 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075887 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 19:57:56.078349 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075891 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 19:57:56.078349 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075894 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 19:57:56.078349 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075897 2575 flags.go:64] FLAG: --enable-server="true"
Apr 22 19:57:56.078349 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075899 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 19:57:56.078349 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075903 2575 flags.go:64] FLAG: --event-burst="100"
Apr 22 19:57:56.078349 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075907 2575 flags.go:64] FLAG: --event-qps="50"
Apr 22 19:57:56.078349 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075910 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 19:57:56.078349 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075912 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 19:57:56.078349 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075915 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 22 19:57:56.078349 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075919 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 19:57:56.078349 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075922 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 19:57:56.078349 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075925 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 19:57:56.078349 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075928 2575 flags.go:64] FLAG: --eviction-soft=""
Apr 22 19:57:56.078952 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075931 2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 19:57:56.078952 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075934 2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 19:57:56.078952 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075937 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 19:57:56.078952 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075940 2575 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 19:57:56.078952 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075943 2575 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 19:57:56.078952 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075946 2575 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 19:57:56.078952 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075949 2575 flags.go:64] FLAG: --feature-gates=""
Apr 22 19:57:56.078952 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075953 2575 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 19:57:56.078952 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075956 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 19:57:56.078952 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075958 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 19:57:56.078952 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075962 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 22 19:57:56.078952 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075965 2575 flags.go:64] FLAG: --healthz-port="10248"
Apr 22 19:57:56.078952 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075969 2575 flags.go:64] FLAG: --help="false"
Apr 22 19:57:56.078952 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075972 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-128-239.ec2.internal"
Apr 22 19:57:56.078952 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075975 2575 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 22 19:57:56.078952 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075978 2575 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 22 19:57:56.078952 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075981 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 22 19:57:56.078952 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075985 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 22 19:57:56.078952 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075988 2575 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 22 19:57:56.078952 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075992 2575 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 22 19:57:56.078952 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075995 2575 flags.go:64] FLAG: --image-service-endpoint=""
Apr 22 19:57:56.078952 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.075998 2575 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 22 19:57:56.078952 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076000 2575 flags.go:64] FLAG: --kube-api-burst="100"
Apr 22 19:57:56.079530 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076003 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 22 19:57:56.079530 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076006 2575 flags.go:64] FLAG: --kube-api-qps="50"
Apr 22 19:57:56.079530 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076009 2575 flags.go:64] FLAG: --kube-reserved=""
Apr 22 19:57:56.079530 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076012 2575 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 22 19:57:56.079530 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076015 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 22 19:57:56.079530 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076018 2575 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 22 19:57:56.079530 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076021 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 22 19:57:56.079530 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076023 2575 flags.go:64] FLAG: --lock-file=""
Apr 22 19:57:56.079530 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076026 2575 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 22 19:57:56.079530 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076043 2575 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 22 19:57:56.079530 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076048 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 22 19:57:56.079530 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076055 2575 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 22 19:57:56.079530 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076059 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 22 19:57:56.079530 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076062 2575 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 22 19:57:56.079530 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076064 2575 flags.go:64] FLAG: --logging-format="text"
Apr 22 19:57:56.079530 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076067 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 22 19:57:56.079530 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076071 2575 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 22 19:57:56.079530 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076073 2575 flags.go:64] FLAG: --manifest-url=""
Apr 22 19:57:56.079530 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076076 2575 flags.go:64] FLAG: --manifest-url-header=""
Apr 22 19:57:56.079530 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076085 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 22 19:57:56.079530 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076089 2575 flags.go:64] FLAG: --max-open-files="1000000"
Apr 22 19:57:56.079530 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076093 2575 flags.go:64] FLAG: --max-pods="110"
Apr 22 19:57:56.079530 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076096 2575 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 22 19:57:56.079530 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076099 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 22 19:57:56.079530 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076102 2575 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 22 19:57:56.080227 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076105 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 22 19:57:56.080227 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076108 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 22 19:57:56.080227 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076111 2575 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 22 19:57:56.080227 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076114 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 22 19:57:56.080227 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076122 2575 flags.go:64] FLAG: --node-status-max-images="50"
Apr 22 19:57:56.080227 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076125 2575 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 22 19:57:56.080227 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076128 2575 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 22 19:57:56.080227 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076131 2575 flags.go:64] FLAG: --pod-cidr=""
Apr 22 19:57:56.080227 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076134 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 22 19:57:56.080227 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076139 2575 flags.go:64] FLAG: --pod-manifest-path=""
Apr 22 19:57:56.080227 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076142 2575 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 22 19:57:56.080227 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076145 2575 flags.go:64] FLAG: --pods-per-core="0"
Apr 22 19:57:56.080227 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076148 2575 flags.go:64] FLAG: --port="10250"
Apr 22 19:57:56.080227 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076151 2575 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 22 19:57:56.080227 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076154 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e9432bcda103634a"
Apr 22 19:57:56.080227 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076157 2575 flags.go:64] FLAG: --qos-reserved=""
Apr 22 19:57:56.080227 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076161 2575 flags.go:64] FLAG: --read-only-port="10255"
Apr 22 19:57:56.080227 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076164 2575 flags.go:64] FLAG: --register-node="true"
Apr 22 19:57:56.080227 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076166 2575 flags.go:64] FLAG: --register-schedulable="true"
Apr 22 19:57:56.080227 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076169 2575 flags.go:64] FLAG: --register-with-taints=""
Apr 22 19:57:56.080227 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076173 2575 flags.go:64] FLAG: --registry-burst="10"
Apr 22 19:57:56.080227 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076176 2575 flags.go:64] FLAG: --registry-qps="5"
Apr 22 19:57:56.080227 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076179 2575 flags.go:64] FLAG: --reserved-cpus=""
Apr 22 19:57:56.080227 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076182 2575 flags.go:64] FLAG: --reserved-memory=""
Apr 22 19:57:56.080227 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076186 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 22 19:57:56.080825 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076189 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 22 19:57:56.080825 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076192 2575 flags.go:64] FLAG: --rotate-certificates="false"
Apr 22 19:57:56.080825 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076195 2575 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 22 19:57:56.080825 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076198 2575 flags.go:64] FLAG: --runonce="false"
Apr 22 19:57:56.080825 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076200 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 22 19:57:56.080825 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076203 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 22 19:57:56.080825 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076206 2575 flags.go:64] FLAG: --seccomp-default="false"
Apr 22 19:57:56.080825 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076209 2575 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 22 19:57:56.080825 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076212 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 22 19:57:56.080825 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076215 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 22 19:57:56.080825 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076218 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 22 19:57:56.080825 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076221 2575 flags.go:64] FLAG: --storage-driver-password="root"
Apr 22 19:57:56.080825 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076224 2575 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 22 19:57:56.080825 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076227 2575 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 22 19:57:56.080825 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076230 2575 flags.go:64] FLAG: --storage-driver-user="root"
Apr 22 19:57:56.080825 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076233 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 22 19:57:56.080825 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076235 2575 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 22 19:57:56.080825 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076238 2575 flags.go:64] FLAG: --system-cgroups=""
Apr 22 19:57:56.080825 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076241 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 22 19:57:56.080825 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076246 2575 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 22 19:57:56.080825 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076249 2575 flags.go:64] FLAG: --tls-cert-file=""
Apr 22 19:57:56.080825 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076252 2575 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 22 19:57:56.080825 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076259 2575 flags.go:64] FLAG: --tls-min-version=""
Apr 22 19:57:56.080825 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076262 2575 flags.go:64] FLAG: --tls-private-key-file=""
Apr 22 19:57:56.080825 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076265 2575 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 22 19:57:56.081494 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076267 2575 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 22 19:57:56.081494 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076270 2575 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 22 19:57:56.081494 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076273 2575 flags.go:64] FLAG: --v="2"
Apr 22 19:57:56.081494 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076278 2575 flags.go:64] FLAG: --version="false"
Apr 22 19:57:56.081494 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076281 2575 flags.go:64] FLAG: --vmodule=""
Apr 22 19:57:56.081494 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076285 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 22 19:57:56.081494 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.076288 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 22 19:57:56.081494 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076381 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:57:56.081494 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076385 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:57:56.081494 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076389 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:57:56.081494 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076392 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:57:56.081494 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076395 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:57:56.081494 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076397 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:57:56.081494 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076400 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:57:56.081494 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076403 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:57:56.081494 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076405 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:57:56.081494 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076408 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:57:56.081494 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076413 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:57:56.081494 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076415 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:57:56.081494 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076418 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:57:56.081494 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076421 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:57:56.082019 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076424 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:57:56.082019 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076428 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:57:56.082019 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076431 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:57:56.082019 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076434 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:57:56.082019 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076436 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:57:56.082019 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076439 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:57:56.082019 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076441 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:57:56.082019 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076444 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:57:56.082019 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076447 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:57:56.082019 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076449 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:57:56.082019 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076452 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:57:56.082019 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076455 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:57:56.082019 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076457 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:57:56.082019 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076460 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:57:56.082019 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076462 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:57:56.082019 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076465 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:57:56.082019 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076467 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:57:56.082019 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076470 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:57:56.082019 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076473 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:57:56.082019 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076476 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:57:56.082569 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076479 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:57:56.082569 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076481 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:57:56.082569 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076484 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:57:56.082569 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076486 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:57:56.082569 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076489 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:57:56.082569 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076492 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:57:56.082569 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076494 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:57:56.082569 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076497 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:57:56.082569 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076501 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:57:56.082569 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076504 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:57:56.082569 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076506 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:57:56.082569 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076509 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:57:56.082569 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076512 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:57:56.082569 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076515 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:57:56.082569 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076517 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:57:56.082569 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076519 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:57:56.082569 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076522 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:57:56.082569 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076525 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:57:56.082569 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076527 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:57:56.083041 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076530 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:57:56.083041 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076532 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:57:56.083041 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076535 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:57:56.083041 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076537 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:57:56.083041 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076540 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:57:56.083041 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076542 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:57:56.083041 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076545 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:57:56.083041 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076549 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:57:56.083041 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076552 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:57:56.083041 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076554 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:57:56.083041 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076557 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:57:56.083041 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076560 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:57:56.083041 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076562 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:57:56.083041 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076565 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:57:56.083041 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076568 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:57:56.083041 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076570 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:57:56.083041 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076573 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:57:56.083041 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076575 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:57:56.083041 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076578 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:57:56.083041 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076580 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:57:56.083531 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076584 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:57:56.083531 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076589 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:57:56.083531 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076592 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:57:56.083531 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076595 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:57:56.083531 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076598 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:57:56.083531 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076600 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:57:56.083531 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076604 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:57:56.083531 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076607 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:57:56.083531 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076610 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:57:56.083531 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076613 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:57:56.083531 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076615 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:57:56.083531 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076618 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:57:56.083531 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.076620 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:57:56.083531 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.077132 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:57:56.083877 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.083492 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 19:57:56.083877 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.083593 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 19:57:56.083877 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083645 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:57:56.083877 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083650 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:57:56.083877 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083654 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:57:56.083877 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083657 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:57:56.083877 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083660 2575
feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:57:56.083877 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083663 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:57:56.083877 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083665 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:57:56.083877 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083668 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:57:56.083877 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083671 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:57:56.083877 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083673 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:57:56.083877 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083676 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:57:56.083877 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083678 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:57:56.083877 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083681 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:57:56.083877 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083683 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:57:56.083877 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083686 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:57:56.083877 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083688 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:57:56.083877 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083691 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:57:56.083877 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083693 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:57:56.084369 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083696 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:57:56.084369 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083699 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:57:56.084369 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083701 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:57:56.084369 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083704 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:57:56.084369 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083707 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:57:56.084369 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083709 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:57:56.084369 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083712 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:57:56.084369 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083714 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:57:56.084369 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083716 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:57:56.084369 ip-10-0-128-239 
kubenswrapper[2575]: W0422 19:57:56.083719 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:57:56.084369 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083721 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:57:56.084369 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083724 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:57:56.084369 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083726 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:57:56.084369 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083734 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:57:56.084369 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083737 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:57:56.084369 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083739 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:57:56.084369 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083742 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:57:56.084369 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083744 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:57:56.084369 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083747 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:57:56.084369 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083749 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:57:56.084890 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083752 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:57:56.084890 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083754 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:57:56.084890 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083757 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:57:56.084890 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083759 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:57:56.084890 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083762 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:57:56.084890 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083764 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:57:56.084890 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083767 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:57:56.084890 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083769 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:57:56.084890 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083771 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:57:56.084890 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083774 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:57:56.084890 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083776 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:57:56.084890 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083779 2575 
feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:57:56.084890 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083781 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:57:56.084890 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083784 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:57:56.084890 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083786 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:57:56.084890 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083789 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:57:56.084890 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083791 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:57:56.084890 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083794 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:57:56.084890 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083796 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:57:56.085360 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083799 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:57:56.085360 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083802 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:57:56.085360 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083804 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:57:56.085360 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083807 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:57:56.085360 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083809 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:57:56.085360 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083811 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:57:56.085360 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083814 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:57:56.085360 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083820 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:57:56.085360 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083823 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:57:56.085360 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083827 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 19:57:56.085360 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083831 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:57:56.085360 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083833 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:57:56.085360 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083837 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 19:57:56.085360 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083841 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:57:56.085360 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083844 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:57:56.085360 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083846 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:57:56.085360 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083849 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:57:56.085360 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083851 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:57:56.085360 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083854 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:57:56.085828 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083857 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:57:56.085828 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083859 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:57:56.085828 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083861 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:57:56.085828 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083864 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:57:56.085828 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083866 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:57:56.085828 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083869 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:57:56.085828 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083871 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:57:56.085828 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083874 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:57:56.085828 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083877 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:57:56.085828 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.083879 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:57:56.085828 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.083884 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 19:57:56.085828 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084010 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:57:56.085828 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084061 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:57:56.085828 ip-10-0-128-239 kubenswrapper[2575]: W0422 
19:57:56.084065 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 19:57:56.085828 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084069 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:57:56.086281 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084072 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:57:56.086281 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084075 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:57:56.086281 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084078 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:57:56.086281 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084081 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:57:56.086281 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084084 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:57:56.086281 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084087 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:57:56.086281 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084095 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:57:56.086281 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084098 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:57:56.086281 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084101 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:57:56.086281 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084104 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:57:56.086281 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084106 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:57:56.086281 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084109 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:57:56.086281 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084111 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:57:56.086281 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084114 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:57:56.086281 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084116 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:57:56.086281 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084119 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:57:56.086281 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084127 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:57:56.086281 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084129 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:57:56.086281 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084132 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:57:56.086281 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084134 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:57:56.086759 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084136 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 
19:57:56.086759 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084139 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:57:56.086759 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084141 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:57:56.086759 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084144 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:57:56.086759 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084146 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:57:56.086759 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084148 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:57:56.086759 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084151 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:57:56.086759 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084154 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:57:56.086759 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084156 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:57:56.086759 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084158 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:57:56.086759 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084161 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:57:56.086759 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084163 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:57:56.086759 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084166 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:57:56.086759 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084168 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:57:56.086759 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084171 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:57:56.086759 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084173 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:57:56.086759 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084176 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:57:56.086759 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084178 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:57:56.086759 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084181 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:57:56.086759 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084189 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:57:56.087249 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084191 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:57:56.087249 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084194 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:57:56.087249 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084196 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:57:56.087249 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084199 
2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:57:56.087249 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084201 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:57:56.087249 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084204 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:57:56.087249 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084206 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:57:56.087249 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084209 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:57:56.087249 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084211 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:57:56.087249 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084214 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:57:56.087249 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084217 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:57:56.087249 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084219 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:57:56.087249 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084222 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:57:56.087249 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084224 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:57:56.087249 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084227 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:57:56.087249 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084229 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:57:56.087249 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084232 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:57:56.087249 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084234 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:57:56.087249 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084236 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:57:56.087249 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084239 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:57:56.087810 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084241 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:57:56.087810 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084244 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:57:56.087810 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084246 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:57:56.087810 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084249 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:57:56.087810 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084251 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:57:56.087810 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084253 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 
19:57:56.087810 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084256 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:57:56.087810 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084258 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:57:56.087810 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084261 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:57:56.087810 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084264 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:57:56.087810 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084266 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:57:56.087810 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084269 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:57:56.087810 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084276 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:57:56.087810 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084279 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:57:56.087810 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084281 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:57:56.087810 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084284 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:57:56.087810 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084286 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:57:56.087810 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084289 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:57:56.087810 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084291 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:57:56.088281 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084294 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:57:56.088281 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084296 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:57:56.088281 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:56.084300 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 19:57:56.088281 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.084305 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 19:57:56.088281 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.084988 2575 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 19:57:56.088281 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.086845 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 19:57:56.088281 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.087601 2575 server.go:1019] "Starting client certificate rotation" Apr 22 19:57:56.088281 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.087694 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 19:57:56.088480 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.088374 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 19:57:56.110975 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.110957 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 19:57:56.113518 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.113501 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 19:57:56.125682 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.125663 2575 log.go:25] "Validated CRI v1 runtime API" Apr 22 19:57:56.130869 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.130855 2575 log.go:25] "Validated CRI v1 image API" Apr 22 19:57:56.132220 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.132205 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 19:57:56.136360 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.136341 2575 fs.go:135] Filesystem UUIDs: map[1a7a6ab7-1234-4730-b9a4-87bd16b6e277:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 b16385a3-723c-49d4-9f29-f27f1f78c6b0:/dev/nvme0n1p4] Apr 22 19:57:56.136430 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.136360 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 19:57:56.139312 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.139296 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 19:57:56.142192 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.142091 2575 manager.go:217] Machine: 
{Timestamp:2026-04-22 19:57:56.140310996 +0000 UTC m=+0.391810876 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100181 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec28e5846a04156258275c4c5e5b1570 SystemUUID:ec28e584-6a04-1562-5827-5c4c5e5b1570 BootID:6aee5040-5007-4503-87f7-164828b92cb1 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:e8:8f:b4:f2:ff Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:e8:8f:b4:f2:ff Speed:0 Mtu:9001} {Name:ovs-system MacAddress:42:5a:6d:97:b8:64 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 19:57:56.142192 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.142187 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 22 19:57:56.142291 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.142257 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 19:57:56.143175 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.143151 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 19:57:56.143316 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.143176 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-239.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 19:57:56.143360 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.143324 2575 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 19:57:56.143360 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.143332 2575 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 19:57:56.143360 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.143344 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:57:56.144059 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.144049 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:57:56.145088 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.145078 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:57:56.145182 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.145174 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 19:57:56.147676 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.147667 2575 kubelet.go:491] "Attempting to sync node with API server" Apr 22 19:57:56.147711 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.147678 2575 kubelet.go:386] "Adding static pod 
path" path="/etc/kubernetes/manifests" Apr 22 19:57:56.147711 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.147689 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 19:57:56.147711 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.147698 2575 kubelet.go:397] "Adding apiserver pod source" Apr 22 19:57:56.147711 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.147705 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 19:57:56.148617 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.148606 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:57:56.148661 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.148623 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:57:56.151217 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.151193 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 19:57:56.152366 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.152353 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 19:57:56.153802 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.153787 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 19:57:56.153843 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.153808 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 19:57:56.153843 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.153819 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 19:57:56.153843 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.153826 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 19:57:56.153843 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.153832 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 19:57:56.153843 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.153837 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 19:57:56.153843 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.153843 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 19:57:56.154001 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.153848 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 19:57:56.154001 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.153855 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 19:57:56.154001 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.153861 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 19:57:56.154001 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.153869 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 19:57:56.154130 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.154072 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 19:57:56.156173 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.156159 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 19:57:56.156234 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.156174 2575 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/image" Apr 22 19:57:56.159566 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.159551 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 19:57:56.159641 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.159585 2575 server.go:1295] "Started kubelet" Apr 22 19:57:56.159694 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.159656 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 19:57:56.159694 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.159662 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 19:57:56.159750 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.159721 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 19:57:56.161003 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.160982 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 19:57:56.161010 ip-10-0-128-239 systemd[1]: Started Kubernetes Kubelet. Apr 22 19:57:56.161758 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.161734 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ll5zp" Apr 22 19:57:56.162125 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.162100 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-239.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 19:57:56.162221 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:56.162191 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 19:57:56.162445 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:56.162421 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-239.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 19:57:56.164181 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.164165 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 22 19:57:56.166633 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.166614 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 19:57:56.166707 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.166638 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 19:57:56.167378 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.167362 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 19:57:56.167378 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.167379 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 19:57:56.167508 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.167479 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 19:57:56.167556 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.167551 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 22 19:57:56.167600 ip-10-0-128-239 
kubenswrapper[2575]: I0422 19:57:56.167559 2575 reconciler.go:26] "Reconciler: start to sync state" Apr 22 19:57:56.167664 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:56.166331 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-239.ec2.internal.18a8c61be83068c7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-239.ec2.internal,UID:ip-10-0-128-239.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-128-239.ec2.internal,},FirstTimestamp:2026-04-22 19:57:56.159563975 +0000 UTC m=+0.411063854,LastTimestamp:2026-04-22 19:57:56.159563975 +0000 UTC m=+0.411063854,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-239.ec2.internal,}" Apr 22 19:57:56.167664 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:56.167629 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-239.ec2.internal\" not found" Apr 22 19:57:56.168725 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.168701 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 19:57:56.168804 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.168742 2575 factory.go:55] Registering systemd factory Apr 22 19:57:56.168804 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.168754 2575 factory.go:223] Registration of the systemd container factory successfully Apr 22 19:57:56.169468 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.169448 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ll5zp" Apr 22 19:57:56.170238 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.170221 2575 factory.go:153] Registering CRI-O factory Apr 22 19:57:56.170238 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.170240 2575 factory.go:223] Registration of the crio container factory successfully Apr 22 19:57:56.170375 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.170262 2575 factory.go:103] Registering Raw factory Apr 22 19:57:56.170375 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.170277 2575 manager.go:1196] Started watching for new ooms in manager Apr 22 19:57:56.170669 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.170656 2575 manager.go:319] Starting recovery of all containers Apr 22 19:57:56.173750 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:56.173731 2575 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 19:57:56.178359 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.178339 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:57:56.180599 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.180583 2575 manager.go:324] Recovery completed Apr 22 19:57:56.181421 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:56.181401 2575 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-128-239.ec2.internal\" not found" node="ip-10-0-128-239.ec2.internal" Apr 22 19:57:56.184741 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.184729 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:57:56.186915 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.186901 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-239.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:57:56.186974 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.186927 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-239.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:57:56.186974 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.186940 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-239.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:57:56.187389 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.187375 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 19:57:56.187389 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.187387 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 19:57:56.187483 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.187404 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:57:56.189509 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.189493 2575 policy_none.go:49] "None policy: Start" Apr 22 19:57:56.189558 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.189518 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 19:57:56.189558 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.189533 2575 state_mem.go:35] "Initializing new in-memory state store" Apr 22 19:57:56.237535 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.237522 2575 manager.go:341] "Starting Device Plugin manager" Apr 22 19:57:56.237623 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:56.237572 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 19:57:56.237623 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.237584 2575 server.go:85] "Starting device plugin registration server" Apr 22 19:57:56.237811 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.237796 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 19:57:56.237905 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.237812 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 19:57:56.237961 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.237919 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 19:57:56.238013 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.237999 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin 
watcher) starts" Apr 22 19:57:56.238013 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.238008 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 19:57:56.238643 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:56.238624 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 19:57:56.238724 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:56.238662 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-239.ec2.internal\" not found" Apr 22 19:57:56.298012 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.297987 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 19:57:56.299097 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.299083 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 19:57:56.299187 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.299110 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 19:57:56.299187 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.299129 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 19:57:56.299187 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.299137 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 19:57:56.299320 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:56.299208 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 19:57:56.301998 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.301981 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:57:56.338632 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.338575 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:57:56.339455 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.339440 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-239.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:57:56.339520 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.339469 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-239.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:57:56.339520 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.339483 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-239.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:57:56.339520 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.339510 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-239.ec2.internal" Apr 22 19:57:56.347742 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.347728 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-239.ec2.internal" Apr 22 19:57:56.347819 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:56.347750 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-239.ec2.internal\": node \"ip-10-0-128-239.ec2.internal\" not found" Apr 22 19:57:56.366940 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:56.366921 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-239.ec2.internal\" not found" Apr 22 
Apr 22 19:57:56.400010 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.399990 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-239.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-239.ec2.internal"]
Apr 22 19:57:56.400091 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.400075 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:57:56.400836 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.400822 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-239.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:57:56.400916 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.400852 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-239.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:57:56.400916 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.400866 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-239.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:57:56.402074 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.402060 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:57:56.402201 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.402185 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-239.ec2.internal"
Apr 22 19:57:56.402247 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.402223 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:57:56.402791 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.402775 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-239.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:57:56.402843 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.402804 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-239.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:57:56.402843 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.402814 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-239.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:57:56.402906 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.402775 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-239.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:57:56.402906 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.402889 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-239.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:57:56.402906 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.402906 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-239.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:57:56.403998 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.403983 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-239.ec2.internal"
Apr 22 19:57:56.404064 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.404009 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 19:57:56.404636 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.404622 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-239.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 19:57:56.404692 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.404648 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-239.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 19:57:56.404692 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.404658 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-239.ec2.internal" event="NodeHasSufficientPID"
Apr 22 19:57:56.432524 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:56.432500 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-239.ec2.internal\" not found" node="ip-10-0-128-239.ec2.internal"
Apr 22 19:57:56.436656 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:56.436640 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-239.ec2.internal\" not found" node="ip-10-0-128-239.ec2.internal"
Apr 22 19:57:56.467205 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:56.467185 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-239.ec2.internal\" not found"
Apr 22 19:57:56.469421 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.469405 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e09675d338f4e7033174368cf4768c0b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-239.ec2.internal\" (UID: \"e09675d338f4e7033174368cf4768c0b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-239.ec2.internal"
Apr 22 19:57:56.469468 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.469429 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e09675d338f4e7033174368cf4768c0b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-239.ec2.internal\" (UID: \"e09675d338f4e7033174368cf4768c0b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-239.ec2.internal"
Apr 22 19:57:56.469468 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.469446 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6653588b6ccc7c46306d530ef0bfbee3-config\") pod \"kube-apiserver-proxy-ip-10-0-128-239.ec2.internal\" (UID: \"6653588b6ccc7c46306d530ef0bfbee3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-239.ec2.internal"
Apr 22 19:57:56.568177 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:56.568139 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-239.ec2.internal\" not found"
Apr 22 19:57:56.570350 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.570334 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e09675d338f4e7033174368cf4768c0b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-239.ec2.internal\" (UID: \"e09675d338f4e7033174368cf4768c0b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-239.ec2.internal"
Apr 22 19:57:56.570397 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.570358 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e09675d338f4e7033174368cf4768c0b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-239.ec2.internal\" (UID: \"e09675d338f4e7033174368cf4768c0b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-239.ec2.internal"
Apr 22 19:57:56.570397 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.570374 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6653588b6ccc7c46306d530ef0bfbee3-config\") pod \"kube-apiserver-proxy-ip-10-0-128-239.ec2.internal\" (UID: \"6653588b6ccc7c46306d530ef0bfbee3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-239.ec2.internal"
Apr 22 19:57:56.570472 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.570415 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6653588b6ccc7c46306d530ef0bfbee3-config\") pod \"kube-apiserver-proxy-ip-10-0-128-239.ec2.internal\" (UID: \"6653588b6ccc7c46306d530ef0bfbee3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-239.ec2.internal"
Apr 22 19:57:56.570472 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.570429 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e09675d338f4e7033174368cf4768c0b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-239.ec2.internal\" (UID: \"e09675d338f4e7033174368cf4768c0b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-239.ec2.internal"
Apr 22 19:57:56.570472 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.570441 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e09675d338f4e7033174368cf4768c0b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-239.ec2.internal\" (UID: \"e09675d338f4e7033174368cf4768c0b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-239.ec2.internal"
Apr 22 19:57:56.668923 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:56.668842 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-239.ec2.internal\" not found"
Apr 22 19:57:56.734171 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.734145 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-239.ec2.internal"
Apr 22 19:57:56.739635 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:56.739614 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-239.ec2.internal"
Apr 22 19:57:56.769884 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:56.769860 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-239.ec2.internal\" not found"
Apr 22 19:57:56.870439 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:56.870411 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-239.ec2.internal\" not found"
Apr 22 19:57:56.970950 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:56.970885 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-239.ec2.internal\" not found"
Apr 22 19:57:57.008475 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:57.008452 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:57:57.072044 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:57.072001 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-239.ec2.internal\" not found"
Apr 22 19:57:57.088255 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:57.088213 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 19:57:57.088386 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:57.088365 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 19:57:57.088475 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:57.088365 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 19:57:57.088475 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:57.088388 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 19:57:57.166764 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:57.166743 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 19:57:57.171939 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:57.171914 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 19:52:56 +0000 UTC" deadline="2027-10-01 01:12:05.032410395 +0000 UTC"
Apr 22 19:57:57.171939 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:57.171938 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12629h14m7.860475164s"
Apr 22 19:57:57.172560 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:57.172538 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-239.ec2.internal\" not found"
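Note: the certificate_manager entries above show the rotation schedule being derived from the certificate itself. The logged deadline (2027-10-01) falls roughly 80% of the way through the client certificate's three-year validity window, and the manager then sleeps until that deadline fires. A simplified sketch of that computation (the real jitter policy lives in client-go's certificate Manager; the 70-90% band and the NotBefore value below are assumptions for illustration):

package main

import (
	"fmt"
	"math/rand"
	"time"
)

// rotationDeadline picks a jittered point inside the certificate's
// validity window, approximating the behavior logged above.
func rotationDeadline(notBefore, notAfter time.Time) time.Time {
	total := notAfter.Sub(notBefore)
	jitter := 0.7 + 0.2*rand.Float64() // assumed 70-90% band
	return notBefore.Add(time.Duration(float64(total) * jitter))
}

func main() {
	// Assumed issuance time; expiration matches the logged value.
	notBefore := time.Date(2025, 4, 22, 19, 52, 56, 0, time.UTC)
	notAfter := time.Date(2028, 4, 21, 19, 52, 56, 0, time.UTC)
	deadline := rotationDeadline(notBefore, notAfter)
	fmt.Println("rotate at:", deadline, "sleep:", time.Until(deadline))
}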
type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 19:57:57.189027 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:57.189001 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode09675d338f4e7033174368cf4768c0b.slice/crio-ba0da4cabea64958f753d3e2cbd9afd5ff81daa3822e66806af7934aac73542e WatchSource:0}: Error finding container ba0da4cabea64958f753d3e2cbd9afd5ff81daa3822e66806af7934aac73542e: Status 404 returned error can't find the container with id ba0da4cabea64958f753d3e2cbd9afd5ff81daa3822e66806af7934aac73542e Apr 22 19:57:57.189643 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:57.189624 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6653588b6ccc7c46306d530ef0bfbee3.slice/crio-88d226dd9a3477afd9a71d0089fe05760b803a476028b9f593ca53ee66b51d84 WatchSource:0}: Error finding container 88d226dd9a3477afd9a71d0089fe05760b803a476028b9f593ca53ee66b51d84: Status 404 returned error can't find the container with id 88d226dd9a3477afd9a71d0089fe05760b803a476028b9f593ca53ee66b51d84 Apr 22 19:57:57.194291 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:57.194277 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:57:57.205410 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:57.205390 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-4hfcr" Apr 22 19:57:57.212636 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:57.212620 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-4hfcr" Apr 22 19:57:57.272350 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:57.272309 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:57:57.301746 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:57.301707 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-239.ec2.internal" event={"ID":"e09675d338f4e7033174368cf4768c0b","Type":"ContainerStarted","Data":"ba0da4cabea64958f753d3e2cbd9afd5ff81daa3822e66806af7934aac73542e"} Apr 22 19:57:57.302654 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:57.302636 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-239.ec2.internal" event={"ID":"6653588b6ccc7c46306d530ef0bfbee3","Type":"ContainerStarted","Data":"88d226dd9a3477afd9a71d0089fe05760b803a476028b9f593ca53ee66b51d84"} Apr 22 19:57:57.366781 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:57.366749 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-239.ec2.internal" Apr 22 19:57:57.379266 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:57.379245 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 19:57:57.380126 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:57.380114 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-239.ec2.internal" Apr 22 19:57:57.392475 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:57.392456 2575 warnings.go:110] "Warning: 
metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 19:57:58.148930 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.148902 2575 apiserver.go:52] "Watching apiserver" Apr 22 19:57:58.156234 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.156212 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 19:57:58.156621 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.156597 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-f7mbc","kube-system/konnectivity-agent-9jzqm","kube-system/kube-apiserver-proxy-ip-10-0-128-239.ec2.internal","openshift-cluster-node-tuning-operator/tuned-lqzhq","openshift-image-registry/node-ca-nhz6m","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-239.ec2.internal","openshift-multus/multus-hrx7n","openshift-multus/network-metrics-daemon-gptpd","openshift-network-diagnostics/network-check-target-v2xhn","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bs9mh","openshift-multus/multus-additional-cni-plugins-m69k4","openshift-network-operator/iptables-alerter-sflf8"] Apr 22 19:57:58.158014 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.157992 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.159024 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.159001 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-9jzqm" Apr 22 19:57:58.160129 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.160100 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lqzhq" Apr 22 19:57:58.160341 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.160321 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 19:57:58.161095 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.161077 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-nhz6m" Apr 22 19:57:58.161800 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.161470 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 19:57:58.161800 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.161517 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 19:57:58.161800 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.161564 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 19:57:58.161800 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.161577 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 19:57:58.161800 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.161476 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 19:57:58.161800 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.161628 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-jmxb4\"" Apr 22 19:57:58.161800 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.161688 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 19:57:58.161800 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.161709 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-4pxgt\"" Apr 22 19:57:58.162233 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.161898 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 19:57:58.162233 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.162075 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.162816 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.162666 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-8kl9w\"" Apr 22 19:57:58.162816 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.162676 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 19:57:58.162816 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.162692 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:57:58.163151 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.163131 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gptpd" Apr 22 19:57:58.163246 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:58.163216 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gptpd" podUID="80c4f1f9-74db-4e1b-bb9c-05d1618ca285" Apr 22 19:57:58.163483 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.163444 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 19:57:58.163578 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.163507 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-qqmvh\"" Apr 22 19:57:58.163578 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.163528 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 19:57:58.163691 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.163567 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 19:57:58.164260 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.164242 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v2xhn" Apr 22 19:57:58.164334 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:58.164302 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v2xhn" podUID="a0ee49c5-2186-4f09-9a83-ceabc81a52e5" Apr 22 19:57:58.164456 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.164436 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-mflnj\"" Apr 22 19:57:58.164532 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.164467 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 19:57:58.164532 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.164493 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 19:57:58.164743 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.164724 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 19:57:58.164818 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.164789 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 19:57:58.165500 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.165451 2575 util.go:30] "No sandbox for pod can be found. 
Apr 22 19:57:58.165500 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.165451 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bs9mh"
Apr 22 19:57:58.167938 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.167806 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 19:57:58.168847 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.168103 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 19:57:58.168847 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.168437 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 19:57:58.169608 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.169570 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-sflf8"
Apr 22 19:57:58.170287 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.170065 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-f8gcl\""
Apr 22 19:57:58.170287 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.170263 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-m69k4"
Apr 22 19:57:58.171802 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.171784 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 19:57:58.172368 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.172288 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-cx7zt\""
Apr 22 19:57:58.172368 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.172308 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 19:57:58.172368 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.172309 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 19:57:58.172988 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.172873 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-j7bzm\""
Apr 22 19:57:58.172988 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.172892 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 19:57:58.172988 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.172919 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 19:57:58.177799 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.177771 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e1780916-1341-4e58-886f-d52f29877102-sys-fs\") pod \"aws-ebs-csi-driver-node-bs9mh\" (UID: \"e1780916-1341-4e58-886f-d52f29877102\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bs9mh"
Apr 22 19:57:58.177892 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.177807 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-sys\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq"
Apr 22 19:57:58.177892 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.177847 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-tmp\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq"
Apr 22 19:57:58.177892 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.177873 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e1780916-1341-4e58-886f-d52f29877102-socket-dir\") pod \"aws-ebs-csi-driver-node-bs9mh\" (UID: \"e1780916-1341-4e58-886f-d52f29877102\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bs9mh"
Apr 22 19:57:58.178067 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.177902 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-run\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq"
Apr 22 19:57:58.178067 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.177936 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/658fb26d-962e-43ef-9be4-c89b573ecb41-host\") pod \"node-ca-nhz6m\" (UID: \"658fb26d-962e-43ef-9be4-c89b573ecb41\") " pod="openshift-image-registry/node-ca-nhz6m"
Apr 22 19:57:58.178067 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.177959 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6e1ef928-0940-490e-89a9-e75af398fadb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m69k4\" (UID: \"6e1ef928-0940-490e-89a9-e75af398fadb\") " pod="openshift-multus/multus-additional-cni-plugins-m69k4"
Apr 22 19:57:58.178067 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178010 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-log-socket\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc"
Apr 22 19:57:58.178067 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178050 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e1780916-1341-4e58-886f-d52f29877102-etc-selinux\") pod \"aws-ebs-csi-driver-node-bs9mh\" (UID: \"e1780916-1341-4e58-886f-d52f29877102\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bs9mh"
Apr 22 19:57:58.178306 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178074 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-host\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq"
Apr 22 19:57:58.178306 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178098 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-multus-cni-dir\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n"
Apr 22 19:57:58.178306 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178123 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6e1ef928-0940-490e-89a9-e75af398fadb-cnibin\") pod \"multus-additional-cni-plugins-m69k4\" (UID: \"6e1ef928-0940-490e-89a9-e75af398fadb\") " pod="openshift-multus/multus-additional-cni-plugins-m69k4"
Apr 22 19:57:58.178306 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178149 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-etc-openvswitch\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc"
Apr 22 19:57:58.178306 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178184 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pd5n\" (UniqueName: \"kubernetes.io/projected/a6c5f956-034b-486c-80ee-8f8ff4328b7f-kube-api-access-8pd5n\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc"
Apr 22 19:57:58.178306 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178209 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-os-release\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n"
Apr 22 19:57:58.178306 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178260 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6e1ef928-0940-490e-89a9-e75af398fadb-cni-binary-copy\") pod \"multus-additional-cni-plugins-m69k4\" (UID: \"6e1ef928-0940-490e-89a9-e75af398fadb\") " pod="openshift-multus/multus-additional-cni-plugins-m69k4"
Apr 22 19:57:58.178306 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178294 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a6c5f956-034b-486c-80ee-8f8ff4328b7f-env-overrides\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc"
Apr 22 19:57:58.178618 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178318 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80c4f1f9-74db-4e1b-bb9c-05d1618ca285-metrics-certs\") pod \"network-metrics-daemon-gptpd\" (UID: \"80c4f1f9-74db-4e1b-bb9c-05d1618ca285\") " pod="openshift-multus/network-metrics-daemon-gptpd"
Apr 22 19:57:58.178618 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178335 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-var-lib-kubelet\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq"
Apr 22 19:57:58.178618 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178350 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phjb4\" (UniqueName: \"kubernetes.io/projected/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-kube-api-access-phjb4\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq"
Apr 22 19:57:58.178618 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178366 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-cni-binary-copy\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n"
Apr 22 19:57:58.178618 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178388 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-multus-conf-dir\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n"
Apr 22 19:57:58.178618 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178410 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-multus-daemon-config\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n"
Apr 22 19:57:58.178618 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178432 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-etc-kubernetes\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n"
Apr 22 19:57:58.178618 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178453 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfkng\" (UniqueName: \"kubernetes.io/projected/a0ee49c5-2186-4f09-9a83-ceabc81a52e5-kube-api-access-hfkng\") pod \"network-check-target-v2xhn\" (UID: \"a0ee49c5-2186-4f09-9a83-ceabc81a52e5\") " pod="openshift-network-diagnostics/network-check-target-v2xhn"
Apr 22 19:57:58.178618 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178471 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbzdf\" (UniqueName: \"kubernetes.io/projected/658fb26d-962e-43ef-9be4-c89b573ecb41-kube-api-access-kbzdf\") pod \"node-ca-nhz6m\" (UID: \"658fb26d-962e-43ef-9be4-c89b573ecb41\") " pod="openshift-image-registry/node-ca-nhz6m"
Apr 22 19:57:58.178618 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178485 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-host-run-k8s-cni-cncf-io\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n"
Apr 22 19:57:58.178618 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178525 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6e1ef928-0940-490e-89a9-e75af398fadb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m69k4\" (UID: \"6e1ef928-0940-490e-89a9-e75af398fadb\") " pod="openshift-multus/multus-additional-cni-plugins-m69k4"
Apr 22 19:57:58.178618 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178558 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e1780916-1341-4e58-886f-d52f29877102-registration-dir\") pod \"aws-ebs-csi-driver-node-bs9mh\" (UID: \"e1780916-1341-4e58-886f-d52f29877102\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bs9mh"
Apr 22 19:57:58.178618 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178584 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-hostroot\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n"
Apr 22 19:57:58.178618 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178607 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6e1ef928-0940-490e-89a9-e75af398fadb-os-release\") pod \"multus-additional-cni-plugins-m69k4\" (UID: \"6e1ef928-0940-490e-89a9-e75af398fadb\") " pod="openshift-multus/multus-additional-cni-plugins-m69k4"
Apr 22 19:57:58.179250 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178633 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-host-slash\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc"
Apr 22 19:57:58.179250 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178655 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-run-openvswitch\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc"
Apr 22 19:57:58.179250 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178669 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-etc-sysctl-conf\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq"
Apr 22 19:57:58.179250 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178689 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-host-run-multus-certs\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n"
Apr 22 19:57:58.179250 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178711 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbt72\" (UniqueName: \"kubernetes.io/projected/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-kube-api-access-tbt72\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n"
Apr 22 19:57:58.179250 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178730 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-host-kubelet\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc"
Apr 22 19:57:58.179250 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178744 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-var-lib-openvswitch\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc"
Apr 22 19:57:58.179250 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178765 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1780916-1341-4e58-886f-d52f29877102-kubelet-dir\") pod \"aws-ebs-csi-driver-node-bs9mh\" (UID: \"e1780916-1341-4e58-886f-d52f29877102\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bs9mh"
Apr 22 19:57:58.179250 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178787 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-etc-tuned\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq"
Apr 22 19:57:58.179250 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178811 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9pwt\" (UniqueName: \"kubernetes.io/projected/6e1ef928-0940-490e-89a9-e75af398fadb-kube-api-access-x9pwt\") pod \"multus-additional-cni-plugins-m69k4\" (UID: \"6e1ef928-0940-490e-89a9-e75af398fadb\") " pod="openshift-multus/multus-additional-cni-plugins-m69k4"
Apr 22 19:57:58.179250 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178825 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-host-run-netns\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc"
Apr 22 19:57:58.179250 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178838 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-host-run-ovn-kubernetes\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc"
Apr 22 19:57:58.179250 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178854 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a6c5f956-034b-486c-80ee-8f8ff4328b7f-ovn-node-metrics-cert\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc"
Apr 22 19:57:58.179250 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178878 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njlqm\" (UniqueName: \"kubernetes.io/projected/80c4f1f9-74db-4e1b-bb9c-05d1618ca285-kube-api-access-njlqm\") pod \"network-metrics-daemon-gptpd\" (UID: \"80c4f1f9-74db-4e1b-bb9c-05d1618ca285\") " pod="openshift-multus/network-metrics-daemon-gptpd"
Apr 22 19:57:58.179250 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178901 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-etc-modprobe-d\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq"
Apr 22 19:57:58.179250 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178938 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-lib-modules\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq"
Apr 22 19:57:58.180007 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.178984 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/658fb26d-962e-43ef-9be4-c89b573ecb41-serviceca\") pod \"node-ca-nhz6m\" (UID: \"658fb26d-962e-43ef-9be4-c89b573ecb41\") " pod="openshift-image-registry/node-ca-nhz6m"
Apr 22 19:57:58.180007 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.179058 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-host-var-lib-cni-bin\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n"
Apr 22 19:57:58.180007 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.179086 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-host-var-lib-kubelet\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n"
Apr 22 19:57:58.180007 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.179109 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-host-cni-bin\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc"
Apr 22 19:57:58.180007 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.179138 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-etc-sysconfig\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq"
Apr 22 19:57:58.180007 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.179161 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-system-cni-dir\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n"
Apr 22 19:57:58.180007 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.179189 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6e1ef928-0940-490e-89a9-e75af398fadb-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-m69k4\" (UID: \"6e1ef928-0940-490e-89a9-e75af398fadb\") " pod="openshift-multus/multus-additional-cni-plugins-m69k4"
Apr 22 19:57:58.180007 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.179235 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-run-ovn\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc"
Apr 22 19:57:58.180007 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.179276 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-node-log\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc"
Apr 22 19:57:58.180007 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.179311 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-etc-kubernetes\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq"
Apr 22 19:57:58.180007 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.179362 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-cnibin\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n"
Apr 22 19:57:58.180007 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.179389 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-host-run-netns\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n"
Apr 22 19:57:58.180007 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.179410 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e1ef928-0940-490e-89a9-e75af398fadb-system-cni-dir\") pod \"multus-additional-cni-plugins-m69k4\" (UID: \"6e1ef928-0940-490e-89a9-e75af398fadb\") " pod="openshift-multus/multus-additional-cni-plugins-m69k4"
Apr 22 19:57:58.180007 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.179436 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-systemd-units\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc"
Apr 22 19:57:58.180007 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.179451 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-run-systemd\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc"
Apr 22 19:57:58.180007 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.179474 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb5md\" (UniqueName: \"kubernetes.io/projected/e1780916-1341-4e58-886f-d52f29877102-kube-api-access-rb5md\") pod \"aws-ebs-csi-driver-node-bs9mh\" (UID: \"e1780916-1341-4e58-886f-d52f29877102\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bs9mh"
Apr 22 19:57:58.180661 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.179496 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7d736e66-a7dc-4492-9e64-68b254bc8afa-agent-certs\") pod \"konnectivity-agent-9jzqm\" (UID: \"7d736e66-a7dc-4492-9e64-68b254bc8afa\") " pod="kube-system/konnectivity-agent-9jzqm"
Apr 22 19:57:58.180661 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.179520 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e1780916-1341-4e58-886f-d52f29877102-device-dir\") pod \"aws-ebs-csi-driver-node-bs9mh\" (UID: \"e1780916-1341-4e58-886f-d52f29877102\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bs9mh"
Apr 22 19:57:58.180661 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.179537 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a6c5f956-034b-486c-80ee-8f8ff4328b7f-ovnkube-config\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc"
Apr 22 19:57:58.180661 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.179566 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-etc-sysctl-d\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq"
Apr 22 19:57:58.180661 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.179605 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-multus-socket-dir-parent\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n"
Apr 22 19:57:58.180661 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.179631 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-host-var-lib-cni-multus\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n"
Apr 22 19:57:58.180661 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.179653 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-host-cni-netd\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc"
Apr 22 19:57:58.180661 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.179678 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc"
Apr 22 19:57:58.180661 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.179701 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a6c5f956-034b-486c-80ee-8f8ff4328b7f-ovnkube-script-lib\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc"
Apr 22 19:57:58.180661 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.179726 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7d736e66-a7dc-4492-9e64-68b254bc8afa-konnectivity-ca\") pod \"konnectivity-agent-9jzqm\" (UID: \"7d736e66-a7dc-4492-9e64-68b254bc8afa\") " pod="kube-system/konnectivity-agent-9jzqm"
Apr 22 19:57:58.180661 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.179763 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-etc-systemd\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq"
Apr 22 19:57:58.213366 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.213284 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:52:57 +0000 UTC" deadline="2028-02-04 07:00:26.839297206 +0000 UTC"
Apr 22 19:57:58.213366 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.213312 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15659h2m28.625988797s"
Apr 22 19:57:58.270590 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.268565 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 19:57:58.280390 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.280361 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-etc-tuned\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq"
Apr 22 19:57:58.280483 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.280406 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9pwt\" (UniqueName: \"kubernetes.io/projected/6e1ef928-0940-490e-89a9-e75af398fadb-kube-api-access-x9pwt\") pod \"multus-additional-cni-plugins-m69k4\" (UID: \"6e1ef928-0940-490e-89a9-e75af398fadb\") " pod="openshift-multus/multus-additional-cni-plugins-m69k4"
Apr 22 19:57:58.280483 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.280435 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-host-run-netns\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc"
Apr 22 19:57:58.280483 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.280459 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-host-run-ovn-kubernetes\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc"
Apr 22 19:57:58.280639 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.280525 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-host-run-netns\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc"
Apr 22 19:57:58.280639 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.280569 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-host-run-ovn-kubernetes\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc"
Apr 22 19:57:58.280639 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.280606 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a6c5f956-034b-486c-80ee-8f8ff4328b7f-ovn-node-metrics-cert\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc"
Apr 22 19:57:58.280639 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.280622 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 19:57:58.280803 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.280639 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-njlqm\" (UniqueName: \"kubernetes.io/projected/80c4f1f9-74db-4e1b-bb9c-05d1618ca285-kube-api-access-njlqm\") pod \"network-metrics-daemon-gptpd\" (UID: \"80c4f1f9-74db-4e1b-bb9c-05d1618ca285\") " pod="openshift-multus/network-metrics-daemon-gptpd" Apr 22 19:57:58.280803 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.280665 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-etc-modprobe-d\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq" Apr 22 19:57:58.280803 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.280691 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-lib-modules\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq" Apr 22 19:57:58.280950 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.280804 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/658fb26d-962e-43ef-9be4-c89b573ecb41-serviceca\") pod \"node-ca-nhz6m\" (UID: \"658fb26d-962e-43ef-9be4-c89b573ecb41\") " pod="openshift-image-registry/node-ca-nhz6m" Apr 22 19:57:58.280950 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.280813 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-etc-modprobe-d\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq" Apr 22 19:57:58.280950 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.280809 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-lib-modules\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq" Apr 22 19:57:58.280950 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.280845 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-host-var-lib-cni-bin\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.280950 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.280873 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-host-var-lib-kubelet\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.280950 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.280905 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-host-cni-bin\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.280950 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.280909 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-host-var-lib-cni-bin\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.280950 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.280933 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-etc-sysconfig\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq" Apr 22 19:57:58.281332 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.280964 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-host-cni-bin\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.281332 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.280980 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-etc-sysconfig\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq" Apr 22 19:57:58.281332 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.280989 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-system-cni-dir\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.281332 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.280962 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-host-var-lib-kubelet\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.281332 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281059 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-system-cni-dir\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.281332 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281064 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6e1ef928-0940-490e-89a9-e75af398fadb-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-m69k4\" (UID: \"6e1ef928-0940-490e-89a9-e75af398fadb\") " pod="openshift-multus/multus-additional-cni-plugins-m69k4" Apr 22 19:57:58.281332 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281094 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-run-ovn\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.281332 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281116 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-node-log\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.281332 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281137 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-etc-kubernetes\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq" Apr 22 19:57:58.281332 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281158 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-cnibin\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.281332 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281148 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-run-ovn\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.281332 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281182 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-host-run-netns\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.281332 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281219 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-node-log\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.281332 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281233 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-cnibin\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.281332 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281234 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/658fb26d-962e-43ef-9be4-c89b573ecb41-serviceca\") pod \"node-ca-nhz6m\" (UID: \"658fb26d-962e-43ef-9be4-c89b573ecb41\") " pod="openshift-image-registry/node-ca-nhz6m" Apr 22 19:57:58.281332 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281236 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e1ef928-0940-490e-89a9-e75af398fadb-system-cni-dir\") pod 
\"multus-additional-cni-plugins-m69k4\" (UID: \"6e1ef928-0940-490e-89a9-e75af398fadb\") " pod="openshift-multus/multus-additional-cni-plugins-m69k4" Apr 22 19:57:58.281332 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281266 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-host-run-netns\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.281332 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281268 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-systemd-units\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.282186 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281294 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-run-systemd\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.282186 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281302 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-etc-kubernetes\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq" Apr 22 19:57:58.282186 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281304 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6e1ef928-0940-490e-89a9-e75af398fadb-system-cni-dir\") pod \"multus-additional-cni-plugins-m69k4\" (UID: \"6e1ef928-0940-490e-89a9-e75af398fadb\") " pod="openshift-multus/multus-additional-cni-plugins-m69k4" Apr 22 19:57:58.282186 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281319 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-systemd-units\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.282186 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281320 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rb5md\" (UniqueName: \"kubernetes.io/projected/e1780916-1341-4e58-886f-d52f29877102-kube-api-access-rb5md\") pod \"aws-ebs-csi-driver-node-bs9mh\" (UID: \"e1780916-1341-4e58-886f-d52f29877102\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bs9mh" Apr 22 19:57:58.282186 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281334 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-run-systemd\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.282186 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281354 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" 
(UniqueName: \"kubernetes.io/secret/7d736e66-a7dc-4492-9e64-68b254bc8afa-agent-certs\") pod \"konnectivity-agent-9jzqm\" (UID: \"7d736e66-a7dc-4492-9e64-68b254bc8afa\") " pod="kube-system/konnectivity-agent-9jzqm" Apr 22 19:57:58.282186 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281381 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4w67\" (UniqueName: \"kubernetes.io/projected/e5e122d2-2065-483b-a689-7d9721ed4c07-kube-api-access-d4w67\") pod \"iptables-alerter-sflf8\" (UID: \"e5e122d2-2065-483b-a689-7d9721ed4c07\") " pod="openshift-network-operator/iptables-alerter-sflf8" Apr 22 19:57:58.282186 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281402 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e1780916-1341-4e58-886f-d52f29877102-device-dir\") pod \"aws-ebs-csi-driver-node-bs9mh\" (UID: \"e1780916-1341-4e58-886f-d52f29877102\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bs9mh" Apr 22 19:57:58.282186 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281424 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a6c5f956-034b-486c-80ee-8f8ff4328b7f-ovnkube-config\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.282186 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281447 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-etc-sysctl-d\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq" Apr 22 19:57:58.282186 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281469 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-multus-socket-dir-parent\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.282186 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281491 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-host-var-lib-cni-multus\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.282186 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281515 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-host-cni-netd\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.282186 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281540 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 
19:57:58.282186 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281570 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a6c5f956-034b-486c-80ee-8f8ff4328b7f-ovnkube-script-lib\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.282186 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281577 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6e1ef928-0940-490e-89a9-e75af398fadb-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-m69k4\" (UID: \"6e1ef928-0940-490e-89a9-e75af398fadb\") " pod="openshift-multus/multus-additional-cni-plugins-m69k4" Apr 22 19:57:58.282953 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281590 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e1780916-1341-4e58-886f-d52f29877102-device-dir\") pod \"aws-ebs-csi-driver-node-bs9mh\" (UID: \"e1780916-1341-4e58-886f-d52f29877102\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bs9mh" Apr 22 19:57:58.282953 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281593 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7d736e66-a7dc-4492-9e64-68b254bc8afa-konnectivity-ca\") pod \"konnectivity-agent-9jzqm\" (UID: \"7d736e66-a7dc-4492-9e64-68b254bc8afa\") " pod="kube-system/konnectivity-agent-9jzqm" Apr 22 19:57:58.282953 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281636 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-host-cni-netd\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.282953 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281653 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-etc-systemd\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq" Apr 22 19:57:58.282953 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281686 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e5e122d2-2065-483b-a689-7d9721ed4c07-host-slash\") pod \"iptables-alerter-sflf8\" (UID: \"e5e122d2-2065-483b-a689-7d9721ed4c07\") " pod="openshift-network-operator/iptables-alerter-sflf8" Apr 22 19:57:58.282953 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281715 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e1780916-1341-4e58-886f-d52f29877102-sys-fs\") pod \"aws-ebs-csi-driver-node-bs9mh\" (UID: \"e1780916-1341-4e58-886f-d52f29877102\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bs9mh" Apr 22 19:57:58.282953 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281738 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-sys\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq" Apr 22 19:57:58.282953 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281780 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-tmp\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq" Apr 22 19:57:58.282953 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281794 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-multus-socket-dir-parent\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.282953 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281805 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e1780916-1341-4e58-886f-d52f29877102-socket-dir\") pod \"aws-ebs-csi-driver-node-bs9mh\" (UID: \"e1780916-1341-4e58-886f-d52f29877102\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bs9mh" Apr 22 19:57:58.282953 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281835 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-run\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq" Apr 22 19:57:58.282953 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281860 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/658fb26d-962e-43ef-9be4-c89b573ecb41-host\") pod \"node-ca-nhz6m\" (UID: \"658fb26d-962e-43ef-9be4-c89b573ecb41\") " pod="openshift-image-registry/node-ca-nhz6m" Apr 22 19:57:58.282953 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281902 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6e1ef928-0940-490e-89a9-e75af398fadb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m69k4\" (UID: \"6e1ef928-0940-490e-89a9-e75af398fadb\") " pod="openshift-multus/multus-additional-cni-plugins-m69k4" Apr 22 19:57:58.282953 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281905 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e1780916-1341-4e58-886f-d52f29877102-socket-dir\") pod \"aws-ebs-csi-driver-node-bs9mh\" (UID: \"e1780916-1341-4e58-886f-d52f29877102\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bs9mh" Apr 22 19:57:58.282953 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281941 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-log-socket\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.282953 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281969 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e1780916-1341-4e58-886f-d52f29877102-etc-selinux\") pod \"aws-ebs-csi-driver-node-bs9mh\" (UID: \"e1780916-1341-4e58-886f-d52f29877102\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bs9mh" Apr 22 19:57:58.282953 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.282001 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-etc-sysctl-d\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq" Apr 22 19:57:58.283786 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.282014 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-host\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq" Apr 22 19:57:58.283786 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.282056 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-multus-cni-dir\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.283786 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.282064 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-run\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq" Apr 22 19:57:58.283786 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.282080 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6e1ef928-0940-490e-89a9-e75af398fadb-cnibin\") pod \"multus-additional-cni-plugins-m69k4\" (UID: \"6e1ef928-0940-490e-89a9-e75af398fadb\") " pod="openshift-multus/multus-additional-cni-plugins-m69k4" Apr 22 19:57:58.283786 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.282104 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/658fb26d-962e-43ef-9be4-c89b573ecb41-host\") pod \"node-ca-nhz6m\" (UID: \"658fb26d-962e-43ef-9be4-c89b573ecb41\") " pod="openshift-image-registry/node-ca-nhz6m" Apr 22 19:57:58.283786 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.282106 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-etc-openvswitch\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.283786 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.282129 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7d736e66-a7dc-4492-9e64-68b254bc8afa-konnectivity-ca\") pod \"konnectivity-agent-9jzqm\" (UID: \"7d736e66-a7dc-4492-9e64-68b254bc8afa\") " pod="kube-system/konnectivity-agent-9jzqm" Apr 22 19:57:58.283786 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.282144 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-host-var-lib-cni-multus\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.283786 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.282133 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8pd5n\" (UniqueName: \"kubernetes.io/projected/a6c5f956-034b-486c-80ee-8f8ff4328b7f-kube-api-access-8pd5n\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.283786 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.282186 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-os-release\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.283786 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.282193 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-etc-systemd\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq" Apr 22 19:57:58.283786 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.282196 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-sys\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq" Apr 22 19:57:58.283786 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.281749 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.283786 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.282242 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e1780916-1341-4e58-886f-d52f29877102-sys-fs\") pod \"aws-ebs-csi-driver-node-bs9mh\" (UID: \"e1780916-1341-4e58-886f-d52f29877102\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bs9mh" Apr 22 19:57:58.283786 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.282241 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a6c5f956-034b-486c-80ee-8f8ff4328b7f-ovnkube-config\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.283786 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.282323 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-os-release\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.283786 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.282376 2575 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6e1ef928-0940-490e-89a9-e75af398fadb-cni-binary-copy\") pod \"multus-additional-cni-plugins-m69k4\" (UID: \"6e1ef928-0940-490e-89a9-e75af398fadb\") " pod="openshift-multus/multus-additional-cni-plugins-m69k4" Apr 22 19:57:58.283786 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.282411 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a6c5f956-034b-486c-80ee-8f8ff4328b7f-env-overrides\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.284623 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.282435 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-multus-cni-dir\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.284623 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.282441 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80c4f1f9-74db-4e1b-bb9c-05d1618ca285-metrics-certs\") pod \"network-metrics-daemon-gptpd\" (UID: \"80c4f1f9-74db-4e1b-bb9c-05d1618ca285\") " pod="openshift-multus/network-metrics-daemon-gptpd" Apr 22 19:57:58.284623 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.282468 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-var-lib-kubelet\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq" Apr 22 19:57:58.284623 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.282469 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-etc-openvswitch\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.284623 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.282506 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e1780916-1341-4e58-886f-d52f29877102-etc-selinux\") pod \"aws-ebs-csi-driver-node-bs9mh\" (UID: \"e1780916-1341-4e58-886f-d52f29877102\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bs9mh" Apr 22 19:57:58.284623 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.282541 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-var-lib-kubelet\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq" Apr 22 19:57:58.284623 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.282570 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-phjb4\" (UniqueName: \"kubernetes.io/projected/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-kube-api-access-phjb4\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq" Apr 22 19:57:58.284623 ip-10-0-128-239 
kubenswrapper[2575]: I0422 19:57:58.282597 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-cni-binary-copy\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.284623 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.282622 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-multus-conf-dir\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.284623 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.282648 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-multus-daemon-config\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.284623 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.282673 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6e1ef928-0940-490e-89a9-e75af398fadb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m69k4\" (UID: \"6e1ef928-0940-490e-89a9-e75af398fadb\") " pod="openshift-multus/multus-additional-cni-plugins-m69k4" Apr 22 19:57:58.284623 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.282703 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-etc-kubernetes\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.284623 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.282710 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-multus-conf-dir\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.284623 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.282730 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hfkng\" (UniqueName: \"kubernetes.io/projected/a0ee49c5-2186-4f09-9a83-ceabc81a52e5-kube-api-access-hfkng\") pod \"network-check-target-v2xhn\" (UID: \"a0ee49c5-2186-4f09-9a83-ceabc81a52e5\") " pod="openshift-network-diagnostics/network-check-target-v2xhn" Apr 22 19:57:58.284623 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.282757 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbzdf\" (UniqueName: \"kubernetes.io/projected/658fb26d-962e-43ef-9be4-c89b573ecb41-kube-api-access-kbzdf\") pod \"node-ca-nhz6m\" (UID: \"658fb26d-962e-43ef-9be4-c89b573ecb41\") " pod="openshift-image-registry/node-ca-nhz6m" Apr 22 19:57:58.284623 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:58.282768 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:58.284623 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.282784 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-host-run-k8s-cni-cncf-io\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.284623 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.282813 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6e1ef928-0940-490e-89a9-e75af398fadb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m69k4\" (UID: \"6e1ef928-0940-490e-89a9-e75af398fadb\") " pod="openshift-multus/multus-additional-cni-plugins-m69k4" Apr 22 19:57:58.285322 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:58.282838 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80c4f1f9-74db-4e1b-bb9c-05d1618ca285-metrics-certs podName:80c4f1f9-74db-4e1b-bb9c-05d1618ca285 nodeName:}" failed. No retries permitted until 2026-04-22 19:57:58.782818193 +0000 UTC m=+3.034318076 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/80c4f1f9-74db-4e1b-bb9c-05d1618ca285-metrics-certs") pod "network-metrics-daemon-gptpd" (UID: "80c4f1f9-74db-4e1b-bb9c-05d1618ca285") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:58.285322 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.283083 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6e1ef928-0940-490e-89a9-e75af398fadb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m69k4\" (UID: \"6e1ef928-0940-490e-89a9-e75af398fadb\") " pod="openshift-multus/multus-additional-cni-plugins-m69k4" Apr 22 19:57:58.285322 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.283129 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-host-run-k8s-cni-cncf-io\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.285322 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.283332 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-host\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq" Apr 22 19:57:58.285322 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.283370 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a6c5f956-034b-486c-80ee-8f8ff4328b7f-env-overrides\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.285322 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.283426 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e5e122d2-2065-483b-a689-7d9721ed4c07-iptables-alerter-script\") pod \"iptables-alerter-sflf8\" (UID: \"e5e122d2-2065-483b-a689-7d9721ed4c07\") " pod="openshift-network-operator/iptables-alerter-sflf8" Apr 22 19:57:58.285322 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.283458 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e1780916-1341-4e58-886f-d52f29877102-registration-dir\") pod \"aws-ebs-csi-driver-node-bs9mh\" (UID: \"e1780916-1341-4e58-886f-d52f29877102\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bs9mh" Apr 22 19:57:58.285322 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.283483 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-hostroot\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.285322 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.283511 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6e1ef928-0940-490e-89a9-e75af398fadb-os-release\") pod \"multus-additional-cni-plugins-m69k4\" (UID: \"6e1ef928-0940-490e-89a9-e75af398fadb\") " pod="openshift-multus/multus-additional-cni-plugins-m69k4" Apr 22 19:57:58.285322 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.283576 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-host-slash\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.285322 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.283600 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-run-openvswitch\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.285322 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.283606 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-etc-kubernetes\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.285322 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.283623 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-etc-sysctl-conf\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq" Apr 22 19:57:58.285322 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.283648 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-log-socket\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.285322 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.283654 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-host-run-multus-certs\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.285322 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.283681 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tbt72\" (UniqueName: \"kubernetes.io/projected/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-kube-api-access-tbt72\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.285927 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.283693 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e1780916-1341-4e58-886f-d52f29877102-registration-dir\") pod \"aws-ebs-csi-driver-node-bs9mh\" (UID: \"e1780916-1341-4e58-886f-d52f29877102\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bs9mh" Apr 22 19:57:58.285927 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.283735 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-host-run-multus-certs\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.285927 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.283771 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-host-kubelet\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.285927 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.283818 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a6c5f956-034b-486c-80ee-8f8ff4328b7f-ovnkube-script-lib\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.285927 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.283739 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-host-kubelet\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.285927 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.283856 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-etc-sysctl-conf\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq" Apr 22 19:57:58.285927 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.283874 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-var-lib-openvswitch\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.285927 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.283903 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1780916-1341-4e58-886f-d52f29877102-kubelet-dir\") pod \"aws-ebs-csi-driver-node-bs9mh\" (UID: \"e1780916-1341-4e58-886f-d52f29877102\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bs9mh" Apr 22 19:57:58.285927 
ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.283949 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1780916-1341-4e58-886f-d52f29877102-kubelet-dir\") pod \"aws-ebs-csi-driver-node-bs9mh\" (UID: \"e1780916-1341-4e58-886f-d52f29877102\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bs9mh" Apr 22 19:57:58.285927 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.283948 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-host-slash\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.285927 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.283988 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-var-lib-openvswitch\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.285927 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.283993 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a6c5f956-034b-486c-80ee-8f8ff4328b7f-run-openvswitch\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.285927 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.284050 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-hostroot\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.285927 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.284089 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6e1ef928-0940-490e-89a9-e75af398fadb-cnibin\") pod \"multus-additional-cni-plugins-m69k4\" (UID: \"6e1ef928-0940-490e-89a9-e75af398fadb\") " pod="openshift-multus/multus-additional-cni-plugins-m69k4" Apr 22 19:57:58.285927 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.284115 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6e1ef928-0940-490e-89a9-e75af398fadb-os-release\") pod \"multus-additional-cni-plugins-m69k4\" (UID: \"6e1ef928-0940-490e-89a9-e75af398fadb\") " pod="openshift-multus/multus-additional-cni-plugins-m69k4" Apr 22 19:57:58.285927 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.284397 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-multus-daemon-config\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.285927 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.284433 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6e1ef928-0940-490e-89a9-e75af398fadb-cni-binary-copy\") pod \"multus-additional-cni-plugins-m69k4\" (UID: \"6e1ef928-0940-490e-89a9-e75af398fadb\") " 
pod="openshift-multus/multus-additional-cni-plugins-m69k4" Apr 22 19:57:58.286454 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.284444 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a6c5f956-034b-486c-80ee-8f8ff4328b7f-ovn-node-metrics-cert\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.286454 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.284505 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-cni-binary-copy\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.286454 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.284797 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7d736e66-a7dc-4492-9e64-68b254bc8afa-agent-certs\") pod \"konnectivity-agent-9jzqm\" (UID: \"7d736e66-a7dc-4492-9e64-68b254bc8afa\") " pod="kube-system/konnectivity-agent-9jzqm" Apr 22 19:57:58.286454 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.284960 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-etc-tuned\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq" Apr 22 19:57:58.286454 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.285964 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-tmp\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq" Apr 22 19:57:58.288051 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.288016 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-njlqm\" (UniqueName: \"kubernetes.io/projected/80c4f1f9-74db-4e1b-bb9c-05d1618ca285-kube-api-access-njlqm\") pod \"network-metrics-daemon-gptpd\" (UID: \"80c4f1f9-74db-4e1b-bb9c-05d1618ca285\") " pod="openshift-multus/network-metrics-daemon-gptpd" Apr 22 19:57:58.288364 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.288335 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9pwt\" (UniqueName: \"kubernetes.io/projected/6e1ef928-0940-490e-89a9-e75af398fadb-kube-api-access-x9pwt\") pod \"multus-additional-cni-plugins-m69k4\" (UID: \"6e1ef928-0940-490e-89a9-e75af398fadb\") " pod="openshift-multus/multus-additional-cni-plugins-m69k4" Apr 22 19:57:58.291635 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:58.291613 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:57:58.291635 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:58.291639 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:57:58.292123 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:58.291652 2575 projected.go:194] Error preparing data for projected volume kube-api-access-hfkng for pod 
openshift-network-diagnostics/network-check-target-v2xhn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:58.292123 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:58.291719 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a0ee49c5-2186-4f09-9a83-ceabc81a52e5-kube-api-access-hfkng podName:a0ee49c5-2186-4f09-9a83-ceabc81a52e5 nodeName:}" failed. No retries permitted until 2026-04-22 19:57:58.791702348 +0000 UTC m=+3.043202230 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-hfkng" (UniqueName: "kubernetes.io/projected/a0ee49c5-2186-4f09-9a83-ceabc81a52e5-kube-api-access-hfkng") pod "network-check-target-v2xhn" (UID: "a0ee49c5-2186-4f09-9a83-ceabc81a52e5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:58.293101 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.293083 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb5md\" (UniqueName: \"kubernetes.io/projected/e1780916-1341-4e58-886f-d52f29877102-kube-api-access-rb5md\") pod \"aws-ebs-csi-driver-node-bs9mh\" (UID: \"e1780916-1341-4e58-886f-d52f29877102\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bs9mh" Apr 22 19:57:58.293951 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.293934 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-phjb4\" (UniqueName: \"kubernetes.io/projected/46001a3b-bba4-48eb-b1ca-5e0377ba88fd-kube-api-access-phjb4\") pod \"tuned-lqzhq\" (UID: \"46001a3b-bba4-48eb-b1ca-5e0377ba88fd\") " pod="openshift-cluster-node-tuning-operator/tuned-lqzhq" Apr 22 19:57:58.294019 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.294004 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pd5n\" (UniqueName: \"kubernetes.io/projected/a6c5f956-034b-486c-80ee-8f8ff4328b7f-kube-api-access-8pd5n\") pod \"ovnkube-node-f7mbc\" (UID: \"a6c5f956-034b-486c-80ee-8f8ff4328b7f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.294172 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.294154 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbt72\" (UniqueName: \"kubernetes.io/projected/bc9dd9ba-6ee7-42ef-bef4-834abd02ac13-kube-api-access-tbt72\") pod \"multus-hrx7n\" (UID: \"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13\") " pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.294372 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.294356 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbzdf\" (UniqueName: \"kubernetes.io/projected/658fb26d-962e-43ef-9be4-c89b573ecb41-kube-api-access-kbzdf\") pod \"node-ca-nhz6m\" (UID: \"658fb26d-962e-43ef-9be4-c89b573ecb41\") " pod="openshift-image-registry/node-ca-nhz6m" Apr 22 19:57:58.384472 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.384442 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e5e122d2-2065-483b-a689-7d9721ed4c07-iptables-alerter-script\") pod \"iptables-alerter-sflf8\" (UID: \"e5e122d2-2065-483b-a689-7d9721ed4c07\") " pod="openshift-network-operator/iptables-alerter-sflf8" Apr 22 19:57:58.384622 ip-10-0-128-239 
kubenswrapper[2575]: I0422 19:57:58.384483 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4w67\" (UniqueName: \"kubernetes.io/projected/e5e122d2-2065-483b-a689-7d9721ed4c07-kube-api-access-d4w67\") pod \"iptables-alerter-sflf8\" (UID: \"e5e122d2-2065-483b-a689-7d9721ed4c07\") " pod="openshift-network-operator/iptables-alerter-sflf8" Apr 22 19:57:58.384622 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.384503 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e5e122d2-2065-483b-a689-7d9721ed4c07-host-slash\") pod \"iptables-alerter-sflf8\" (UID: \"e5e122d2-2065-483b-a689-7d9721ed4c07\") " pod="openshift-network-operator/iptables-alerter-sflf8" Apr 22 19:57:58.384622 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.384557 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e5e122d2-2065-483b-a689-7d9721ed4c07-host-slash\") pod \"iptables-alerter-sflf8\" (UID: \"e5e122d2-2065-483b-a689-7d9721ed4c07\") " pod="openshift-network-operator/iptables-alerter-sflf8" Apr 22 19:57:58.385651 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.385626 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e5e122d2-2065-483b-a689-7d9721ed4c07-iptables-alerter-script\") pod \"iptables-alerter-sflf8\" (UID: \"e5e122d2-2065-483b-a689-7d9721ed4c07\") " pod="openshift-network-operator/iptables-alerter-sflf8" Apr 22 19:57:58.400967 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.400879 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4w67\" (UniqueName: \"kubernetes.io/projected/e5e122d2-2065-483b-a689-7d9721ed4c07-kube-api-access-d4w67\") pod \"iptables-alerter-sflf8\" (UID: \"e5e122d2-2065-483b-a689-7d9721ed4c07\") " pod="openshift-network-operator/iptables-alerter-sflf8" Apr 22 19:57:58.404613 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.404595 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:57:58.460337 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.460312 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:57:58.471419 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.471398 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:57:58.478088 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.478069 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-9jzqm" Apr 22 19:57:58.485670 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.485653 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lqzhq" Apr 22 19:57:58.490980 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.490966 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-nhz6m" Apr 22 19:57:58.497471 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.497451 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-hrx7n" Apr 22 19:57:58.503982 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.503964 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bs9mh" Apr 22 19:57:58.510472 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.510457 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-sflf8" Apr 22 19:57:58.516045 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.516014 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-m69k4" Apr 22 19:57:58.788489 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.788420 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80c4f1f9-74db-4e1b-bb9c-05d1618ca285-metrics-certs\") pod \"network-metrics-daemon-gptpd\" (UID: \"80c4f1f9-74db-4e1b-bb9c-05d1618ca285\") " pod="openshift-multus/network-metrics-daemon-gptpd" Apr 22 19:57:58.788641 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:58.788578 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:58.788704 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:58.788650 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80c4f1f9-74db-4e1b-bb9c-05d1618ca285-metrics-certs podName:80c4f1f9-74db-4e1b-bb9c-05d1618ca285 nodeName:}" failed. No retries permitted until 2026-04-22 19:57:59.78863085 +0000 UTC m=+4.040130732 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/80c4f1f9-74db-4e1b-bb9c-05d1618ca285-metrics-certs") pod "network-metrics-daemon-gptpd" (UID: "80c4f1f9-74db-4e1b-bb9c-05d1618ca285") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:58.810840 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:58.810819 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc9dd9ba_6ee7_42ef_bef4_834abd02ac13.slice/crio-5ad647879fc3626aa0e2d743811c35441d783e145feab2246d5625f5baf9d953 WatchSource:0}: Error finding container 5ad647879fc3626aa0e2d743811c35441d783e145feab2246d5625f5baf9d953: Status 404 returned error can't find the container with id 5ad647879fc3626aa0e2d743811c35441d783e145feab2246d5625f5baf9d953 Apr 22 19:57:58.812655 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:58.812485 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d736e66_a7dc_4492_9e64_68b254bc8afa.slice/crio-5e546b03c711fef5582334a00c9684abf2e780f8220922890b4531148951feda WatchSource:0}: Error finding container 5e546b03c711fef5582334a00c9684abf2e780f8220922890b4531148951feda: Status 404 returned error can't find the container with id 5e546b03c711fef5582334a00c9684abf2e780f8220922890b4531148951feda Apr 22 19:57:58.813713 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:58.813627 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod658fb26d_962e_43ef_9be4_c89b573ecb41.slice/crio-b06735054c7b836c916e7113a8f1a4cf8edf6ad97408ff056ec1f7cd6d706812 WatchSource:0}: Error finding container 
b06735054c7b836c916e7113a8f1a4cf8edf6ad97408ff056ec1f7cd6d706812: Status 404 returned error can't find the container with id b06735054c7b836c916e7113a8f1a4cf8edf6ad97408ff056ec1f7cd6d706812 Apr 22 19:57:58.815136 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:58.815024 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e1ef928_0940_490e_89a9_e75af398fadb.slice/crio-d31d5064e18c6e54525762688ac204121922dcc104f2383372f1f27e8dbd7cff WatchSource:0}: Error finding container d31d5064e18c6e54525762688ac204121922dcc104f2383372f1f27e8dbd7cff: Status 404 returned error can't find the container with id d31d5064e18c6e54525762688ac204121922dcc104f2383372f1f27e8dbd7cff Apr 22 19:57:58.817678 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:58.817652 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6c5f956_034b_486c_80ee_8f8ff4328b7f.slice/crio-fd4ab945fab8267bb42418957d5c3e6a5a19959cc05eb4115b8eca4f13e45c9a WatchSource:0}: Error finding container fd4ab945fab8267bb42418957d5c3e6a5a19959cc05eb4115b8eca4f13e45c9a: Status 404 returned error can't find the container with id fd4ab945fab8267bb42418957d5c3e6a5a19959cc05eb4115b8eca4f13e45c9a Apr 22 19:57:58.819308 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:58.819280 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46001a3b_bba4_48eb_b1ca_5e0377ba88fd.slice/crio-3d0217b704e3f347043d3078c44eed1dd5c690125f5ffa04cc51047c61f65233 WatchSource:0}: Error finding container 3d0217b704e3f347043d3078c44eed1dd5c690125f5ffa04cc51047c61f65233: Status 404 returned error can't find the container with id 3d0217b704e3f347043d3078c44eed1dd5c690125f5ffa04cc51047c61f65233 Apr 22 19:57:58.820991 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:58.819919 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5e122d2_2065_483b_a689_7d9721ed4c07.slice/crio-4a478ae20bee7bca240e88f5139ae2df610ec40688781107ba2505da9031510c WatchSource:0}: Error finding container 4a478ae20bee7bca240e88f5139ae2df610ec40688781107ba2505da9031510c: Status 404 returned error can't find the container with id 4a478ae20bee7bca240e88f5139ae2df610ec40688781107ba2505da9031510c Apr 22 19:57:58.821965 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:57:58.821924 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1780916_1341_4e58_886f_d52f29877102.slice/crio-41b235c46a433bbf1a22c2da6a6f07c549ee997eabbfd42496b83c9da11d1aa6 WatchSource:0}: Error finding container 41b235c46a433bbf1a22c2da6a6f07c549ee997eabbfd42496b83c9da11d1aa6: Status 404 returned error can't find the container with id 41b235c46a433bbf1a22c2da6a6f07c549ee997eabbfd42496b83c9da11d1aa6 Apr 22 19:57:58.888942 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:58.888919 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hfkng\" (UniqueName: \"kubernetes.io/projected/a0ee49c5-2186-4f09-9a83-ceabc81a52e5-kube-api-access-hfkng\") pod \"network-check-target-v2xhn\" (UID: \"a0ee49c5-2186-4f09-9a83-ceabc81a52e5\") " pod="openshift-network-diagnostics/network-check-target-v2xhn" Apr 22 19:57:58.889097 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:58.889077 2575 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:57:58.889173 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:58.889103 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:57:58.889173 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:58.889114 2575 projected.go:194] Error preparing data for projected volume kube-api-access-hfkng for pod openshift-network-diagnostics/network-check-target-v2xhn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:58.889173 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:58.889157 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a0ee49c5-2186-4f09-9a83-ceabc81a52e5-kube-api-access-hfkng podName:a0ee49c5-2186-4f09-9a83-ceabc81a52e5 nodeName:}" failed. No retries permitted until 2026-04-22 19:57:59.889143242 +0000 UTC m=+4.140643109 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-hfkng" (UniqueName: "kubernetes.io/projected/a0ee49c5-2186-4f09-9a83-ceabc81a52e5-kube-api-access-hfkng") pod "network-check-target-v2xhn" (UID: "a0ee49c5-2186-4f09-9a83-ceabc81a52e5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:59.215046 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:59.214932 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:52:57 +0000 UTC" deadline="2027-12-08 22:35:30.989350966 +0000 UTC" Apr 22 19:57:59.215046 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:59.214969 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14282h37m31.774385911s" Apr 22 19:57:59.251693 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:59.250891 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-d46mn"] Apr 22 19:57:59.253131 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:59.252546 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d46mn" Apr 22 19:57:59.253131 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:59.252625 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-d46mn" podUID="d02dd904-b70b-4d97-9033-5614a158edbf" Apr 22 19:57:59.292291 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:59.292084 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d02dd904-b70b-4d97-9033-5614a158edbf-dbus\") pod \"global-pull-secret-syncer-d46mn\" (UID: \"d02dd904-b70b-4d97-9033-5614a158edbf\") " pod="kube-system/global-pull-secret-syncer-d46mn" Apr 22 19:57:59.292291 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:59.292178 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d02dd904-b70b-4d97-9033-5614a158edbf-original-pull-secret\") pod \"global-pull-secret-syncer-d46mn\" (UID: \"d02dd904-b70b-4d97-9033-5614a158edbf\") " pod="kube-system/global-pull-secret-syncer-d46mn" Apr 22 19:57:59.292291 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:59.292218 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d02dd904-b70b-4d97-9033-5614a158edbf-kubelet-config\") pod \"global-pull-secret-syncer-d46mn\" (UID: \"d02dd904-b70b-4d97-9033-5614a158edbf\") " pod="kube-system/global-pull-secret-syncer-d46mn" Apr 22 19:57:59.300739 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:59.300023 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v2xhn" Apr 22 19:57:59.300739 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:59.300164 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v2xhn" podUID="a0ee49c5-2186-4f09-9a83-ceabc81a52e5" Apr 22 19:57:59.300739 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:59.300586 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gptpd" Apr 22 19:57:59.300739 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:59.300695 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gptpd" podUID="80c4f1f9-74db-4e1b-bb9c-05d1618ca285" Apr 22 19:57:59.318737 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:59.318706 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nhz6m" event={"ID":"658fb26d-962e-43ef-9be4-c89b573ecb41","Type":"ContainerStarted","Data":"b06735054c7b836c916e7113a8f1a4cf8edf6ad97408ff056ec1f7cd6d706812"} Apr 22 19:57:59.327143 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:59.327077 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hrx7n" event={"ID":"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13","Type":"ContainerStarted","Data":"5ad647879fc3626aa0e2d743811c35441d783e145feab2246d5625f5baf9d953"} Apr 22 19:57:59.338922 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:59.338897 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-sflf8" event={"ID":"e5e122d2-2065-483b-a689-7d9721ed4c07","Type":"ContainerStarted","Data":"4a478ae20bee7bca240e88f5139ae2df610ec40688781107ba2505da9031510c"} Apr 22 19:57:59.347164 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:59.347143 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" event={"ID":"a6c5f956-034b-486c-80ee-8f8ff4328b7f","Type":"ContainerStarted","Data":"fd4ab945fab8267bb42418957d5c3e6a5a19959cc05eb4115b8eca4f13e45c9a"} Apr 22 19:57:59.352157 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:59.352132 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m69k4" event={"ID":"6e1ef928-0940-490e-89a9-e75af398fadb","Type":"ContainerStarted","Data":"d31d5064e18c6e54525762688ac204121922dcc104f2383372f1f27e8dbd7cff"} Apr 22 19:57:59.357781 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:59.357161 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-239.ec2.internal" event={"ID":"6653588b6ccc7c46306d530ef0bfbee3","Type":"ContainerStarted","Data":"a88d3ba812c2a1443eb9178c1089142932ffc2f93d22976b6add989485224c56"} Apr 22 19:57:59.359220 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:59.359175 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bs9mh" event={"ID":"e1780916-1341-4e58-886f-d52f29877102","Type":"ContainerStarted","Data":"41b235c46a433bbf1a22c2da6a6f07c549ee997eabbfd42496b83c9da11d1aa6"} Apr 22 19:57:59.362828 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:59.362796 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lqzhq" event={"ID":"46001a3b-bba4-48eb-b1ca-5e0377ba88fd","Type":"ContainerStarted","Data":"3d0217b704e3f347043d3078c44eed1dd5c690125f5ffa04cc51047c61f65233"} Apr 22 19:57:59.371599 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:59.371558 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9jzqm" event={"ID":"7d736e66-a7dc-4492-9e64-68b254bc8afa","Type":"ContainerStarted","Data":"5e546b03c711fef5582334a00c9684abf2e780f8220922890b4531148951feda"} Apr 22 19:57:59.393179 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:59.392552 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d02dd904-b70b-4d97-9033-5614a158edbf-original-pull-secret\") pod \"global-pull-secret-syncer-d46mn\" (UID: \"d02dd904-b70b-4d97-9033-5614a158edbf\") " 
pod="kube-system/global-pull-secret-syncer-d46mn" Apr 22 19:57:59.393179 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:59.392606 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d02dd904-b70b-4d97-9033-5614a158edbf-kubelet-config\") pod \"global-pull-secret-syncer-d46mn\" (UID: \"d02dd904-b70b-4d97-9033-5614a158edbf\") " pod="kube-system/global-pull-secret-syncer-d46mn" Apr 22 19:57:59.393179 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:59.392650 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d02dd904-b70b-4d97-9033-5614a158edbf-dbus\") pod \"global-pull-secret-syncer-d46mn\" (UID: \"d02dd904-b70b-4d97-9033-5614a158edbf\") " pod="kube-system/global-pull-secret-syncer-d46mn" Apr 22 19:57:59.393179 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:59.392766 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d02dd904-b70b-4d97-9033-5614a158edbf-dbus\") pod \"global-pull-secret-syncer-d46mn\" (UID: \"d02dd904-b70b-4d97-9033-5614a158edbf\") " pod="kube-system/global-pull-secret-syncer-d46mn" Apr 22 19:57:59.393179 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:59.392860 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:57:59.393179 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:59.392918 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d02dd904-b70b-4d97-9033-5614a158edbf-original-pull-secret podName:d02dd904-b70b-4d97-9033-5614a158edbf nodeName:}" failed. No retries permitted until 2026-04-22 19:57:59.892901333 +0000 UTC m=+4.144401202 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d02dd904-b70b-4d97-9033-5614a158edbf-original-pull-secret") pod "global-pull-secret-syncer-d46mn" (UID: "d02dd904-b70b-4d97-9033-5614a158edbf") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:57:59.393179 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:59.393136 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d02dd904-b70b-4d97-9033-5614a158edbf-kubelet-config\") pod \"global-pull-secret-syncer-d46mn\" (UID: \"d02dd904-b70b-4d97-9033-5614a158edbf\") " pod="kube-system/global-pull-secret-syncer-d46mn" Apr 22 19:57:59.797332 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:59.796702 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80c4f1f9-74db-4e1b-bb9c-05d1618ca285-metrics-certs\") pod \"network-metrics-daemon-gptpd\" (UID: \"80c4f1f9-74db-4e1b-bb9c-05d1618ca285\") " pod="openshift-multus/network-metrics-daemon-gptpd" Apr 22 19:57:59.797332 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:59.796886 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:59.797332 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:59.796950 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80c4f1f9-74db-4e1b-bb9c-05d1618ca285-metrics-certs podName:80c4f1f9-74db-4e1b-bb9c-05d1618ca285 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:58:01.796930085 +0000 UTC m=+6.048429972 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/80c4f1f9-74db-4e1b-bb9c-05d1618ca285-metrics-certs") pod "network-metrics-daemon-gptpd" (UID: "80c4f1f9-74db-4e1b-bb9c-05d1618ca285") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:59.897993 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:59.897737 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d02dd904-b70b-4d97-9033-5614a158edbf-original-pull-secret\") pod \"global-pull-secret-syncer-d46mn\" (UID: \"d02dd904-b70b-4d97-9033-5614a158edbf\") " pod="kube-system/global-pull-secret-syncer-d46mn" Apr 22 19:57:59.897993 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:57:59.897787 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hfkng\" (UniqueName: \"kubernetes.io/projected/a0ee49c5-2186-4f09-9a83-ceabc81a52e5-kube-api-access-hfkng\") pod \"network-check-target-v2xhn\" (UID: \"a0ee49c5-2186-4f09-9a83-ceabc81a52e5\") " pod="openshift-network-diagnostics/network-check-target-v2xhn" Apr 22 19:57:59.897993 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:59.897847 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:57:59.897993 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:59.897905 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d02dd904-b70b-4d97-9033-5614a158edbf-original-pull-secret podName:d02dd904-b70b-4d97-9033-5614a158edbf nodeName:}" failed. No retries permitted until 2026-04-22 19:58:00.897886861 +0000 UTC m=+5.149386745 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d02dd904-b70b-4d97-9033-5614a158edbf-original-pull-secret") pod "global-pull-secret-syncer-d46mn" (UID: "d02dd904-b70b-4d97-9033-5614a158edbf") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:57:59.897993 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:59.897906 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:57:59.897993 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:59.897922 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:57:59.897993 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:59.897932 2575 projected.go:194] Error preparing data for projected volume kube-api-access-hfkng for pod openshift-network-diagnostics/network-check-target-v2xhn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:59.897993 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:57:59.897967 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a0ee49c5-2186-4f09-9a83-ceabc81a52e5-kube-api-access-hfkng podName:a0ee49c5-2186-4f09-9a83-ceabc81a52e5 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:01.897957214 +0000 UTC m=+6.149457084 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-hfkng" (UniqueName: "kubernetes.io/projected/a0ee49c5-2186-4f09-9a83-ceabc81a52e5-kube-api-access-hfkng") pod "network-check-target-v2xhn" (UID: "a0ee49c5-2186-4f09-9a83-ceabc81a52e5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:00.024644 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:00.024594 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:58:00.406580 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:00.405958 2575 generic.go:358] "Generic (PLEG): container finished" podID="e09675d338f4e7033174368cf4768c0b" containerID="e5da8e1d8bfaaffa2df11e18cf5cfea07b60a5af795abacb724eaed502301c77" exitCode=0 Apr 22 19:58:00.406580 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:00.406091 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-239.ec2.internal" event={"ID":"e09675d338f4e7033174368cf4768c0b","Type":"ContainerDied","Data":"e5da8e1d8bfaaffa2df11e18cf5cfea07b60a5af795abacb724eaed502301c77"} Apr 22 19:58:00.418884 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:00.418838 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-239.ec2.internal" podStartSLOduration=3.418820726 podStartE2EDuration="3.418820726s" podCreationTimestamp="2026-04-22 19:57:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:57:59.3741859 +0000 UTC m=+3.625685791" watchObservedRunningTime="2026-04-22 19:58:00.418820726 +0000 UTC m=+4.670320618" Apr 22 19:58:00.908143 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:00.908108 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d02dd904-b70b-4d97-9033-5614a158edbf-original-pull-secret\") pod \"global-pull-secret-syncer-d46mn\" (UID: \"d02dd904-b70b-4d97-9033-5614a158edbf\") " pod="kube-system/global-pull-secret-syncer-d46mn" Apr 22 19:58:00.908302 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:00.908271 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:00.908362 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:00.908327 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d02dd904-b70b-4d97-9033-5614a158edbf-original-pull-secret podName:d02dd904-b70b-4d97-9033-5614a158edbf nodeName:}" failed. No retries permitted until 2026-04-22 19:58:02.90830945 +0000 UTC m=+7.159809340 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d02dd904-b70b-4d97-9033-5614a158edbf-original-pull-secret") pod "global-pull-secret-syncer-d46mn" (UID: "d02dd904-b70b-4d97-9033-5614a158edbf") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:01.300207 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:01.300126 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-d46mn" Apr 22 19:58:01.300424 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:01.300271 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d46mn" podUID="d02dd904-b70b-4d97-9033-5614a158edbf" Apr 22 19:58:01.300690 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:01.300670 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gptpd" Apr 22 19:58:01.300808 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:01.300788 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gptpd" podUID="80c4f1f9-74db-4e1b-bb9c-05d1618ca285" Apr 22 19:58:01.300887 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:01.300843 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v2xhn" Apr 22 19:58:01.300939 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:01.300920 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-v2xhn" podUID="a0ee49c5-2186-4f09-9a83-ceabc81a52e5" Apr 22 19:58:01.410866 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:01.410829 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-239.ec2.internal" event={"ID":"e09675d338f4e7033174368cf4768c0b","Type":"ContainerStarted","Data":"bccb25b5566d284037277f683d86acc5a11ce9c0063e63d58c8514b0146a2d7f"} Apr 22 19:58:01.426882 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:01.426834 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-239.ec2.internal" podStartSLOduration=4.426818387 podStartE2EDuration="4.426818387s" podCreationTimestamp="2026-04-22 19:57:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:58:01.426609506 +0000 UTC m=+5.678109396" watchObservedRunningTime="2026-04-22 19:58:01.426818387 +0000 UTC m=+5.678318280" Apr 22 19:58:01.815708 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:01.815115 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80c4f1f9-74db-4e1b-bb9c-05d1618ca285-metrics-certs\") pod \"network-metrics-daemon-gptpd\" (UID: \"80c4f1f9-74db-4e1b-bb9c-05d1618ca285\") " pod="openshift-multus/network-metrics-daemon-gptpd" Apr 22 19:58:01.815708 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:01.815298 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:01.815708 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:01.815359 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80c4f1f9-74db-4e1b-bb9c-05d1618ca285-metrics-certs podName:80c4f1f9-74db-4e1b-bb9c-05d1618ca285 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:05.815340612 +0000 UTC m=+10.066840494 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/80c4f1f9-74db-4e1b-bb9c-05d1618ca285-metrics-certs") pod "network-metrics-daemon-gptpd" (UID: "80c4f1f9-74db-4e1b-bb9c-05d1618ca285") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:01.915725 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:01.915684 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hfkng\" (UniqueName: \"kubernetes.io/projected/a0ee49c5-2186-4f09-9a83-ceabc81a52e5-kube-api-access-hfkng\") pod \"network-check-target-v2xhn\" (UID: \"a0ee49c5-2186-4f09-9a83-ceabc81a52e5\") " pod="openshift-network-diagnostics/network-check-target-v2xhn" Apr 22 19:58:01.915909 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:01.915854 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:58:01.915909 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:01.915874 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:58:01.915909 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:01.915887 2575 projected.go:194] Error preparing data for projected volume kube-api-access-hfkng for pod openshift-network-diagnostics/network-check-target-v2xhn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:01.916091 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:01.915946 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a0ee49c5-2186-4f09-9a83-ceabc81a52e5-kube-api-access-hfkng podName:a0ee49c5-2186-4f09-9a83-ceabc81a52e5 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:05.915927357 +0000 UTC m=+10.167427238 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-hfkng" (UniqueName: "kubernetes.io/projected/a0ee49c5-2186-4f09-9a83-ceabc81a52e5-kube-api-access-hfkng") pod "network-check-target-v2xhn" (UID: "a0ee49c5-2186-4f09-9a83-ceabc81a52e5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:02.841832 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:02.841180 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-mht87"] Apr 22 19:58:02.844084 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:02.843929 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-mht87" Apr 22 19:58:02.849999 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:02.849918 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-gxmxw\"" Apr 22 19:58:02.850591 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:02.850571 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 19:58:02.851175 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:02.851002 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 19:58:02.924344 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:02.924306 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1e73951e-095f-4406-93d0-afb41cb12c4b-tmp-dir\") pod \"node-resolver-mht87\" (UID: \"1e73951e-095f-4406-93d0-afb41cb12c4b\") " pod="openshift-dns/node-resolver-mht87" Apr 22 19:58:02.924490 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:02.924361 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d02dd904-b70b-4d97-9033-5614a158edbf-original-pull-secret\") pod \"global-pull-secret-syncer-d46mn\" (UID: \"d02dd904-b70b-4d97-9033-5614a158edbf\") " pod="kube-system/global-pull-secret-syncer-d46mn" Apr 22 19:58:02.924490 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:02.924413 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqw4g\" (UniqueName: \"kubernetes.io/projected/1e73951e-095f-4406-93d0-afb41cb12c4b-kube-api-access-hqw4g\") pod \"node-resolver-mht87\" (UID: \"1e73951e-095f-4406-93d0-afb41cb12c4b\") " pod="openshift-dns/node-resolver-mht87" Apr 22 19:58:02.924490 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:02.924458 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1e73951e-095f-4406-93d0-afb41cb12c4b-hosts-file\") pod \"node-resolver-mht87\" (UID: \"1e73951e-095f-4406-93d0-afb41cb12c4b\") " pod="openshift-dns/node-resolver-mht87" Apr 22 19:58:02.924652 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:02.924516 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:02.924652 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:02.924588 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d02dd904-b70b-4d97-9033-5614a158edbf-original-pull-secret podName:d02dd904-b70b-4d97-9033-5614a158edbf nodeName:}" failed. No retries permitted until 2026-04-22 19:58:06.924568278 +0000 UTC m=+11.176068147 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d02dd904-b70b-4d97-9033-5614a158edbf-original-pull-secret") pod "global-pull-secret-syncer-d46mn" (UID: "d02dd904-b70b-4d97-9033-5614a158edbf") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:03.024986 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:03.024954 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqw4g\" (UniqueName: \"kubernetes.io/projected/1e73951e-095f-4406-93d0-afb41cb12c4b-kube-api-access-hqw4g\") pod \"node-resolver-mht87\" (UID: \"1e73951e-095f-4406-93d0-afb41cb12c4b\") " pod="openshift-dns/node-resolver-mht87" Apr 22 19:58:03.025196 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:03.025006 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1e73951e-095f-4406-93d0-afb41cb12c4b-hosts-file\") pod \"node-resolver-mht87\" (UID: \"1e73951e-095f-4406-93d0-afb41cb12c4b\") " pod="openshift-dns/node-resolver-mht87" Apr 22 19:58:03.025196 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:03.025137 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1e73951e-095f-4406-93d0-afb41cb12c4b-hosts-file\") pod \"node-resolver-mht87\" (UID: \"1e73951e-095f-4406-93d0-afb41cb12c4b\") " pod="openshift-dns/node-resolver-mht87" Apr 22 19:58:03.025196 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:03.025155 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1e73951e-095f-4406-93d0-afb41cb12c4b-tmp-dir\") pod \"node-resolver-mht87\" (UID: \"1e73951e-095f-4406-93d0-afb41cb12c4b\") " pod="openshift-dns/node-resolver-mht87" Apr 22 19:58:03.025551 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:03.025528 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1e73951e-095f-4406-93d0-afb41cb12c4b-tmp-dir\") pod \"node-resolver-mht87\" (UID: \"1e73951e-095f-4406-93d0-afb41cb12c4b\") " pod="openshift-dns/node-resolver-mht87" Apr 22 19:58:03.040219 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:03.040018 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqw4g\" (UniqueName: \"kubernetes.io/projected/1e73951e-095f-4406-93d0-afb41cb12c4b-kube-api-access-hqw4g\") pod \"node-resolver-mht87\" (UID: \"1e73951e-095f-4406-93d0-afb41cb12c4b\") " pod="openshift-dns/node-resolver-mht87" Apr 22 19:58:03.154913 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:03.154841 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mht87" Apr 22 19:58:03.299984 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:03.299952 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v2xhn" Apr 22 19:58:03.300152 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:03.300091 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-v2xhn" podUID="a0ee49c5-2186-4f09-9a83-ceabc81a52e5" Apr 22 19:58:03.300775 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:03.300482 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d46mn" Apr 22 19:58:03.300775 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:03.300578 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d46mn" podUID="d02dd904-b70b-4d97-9033-5614a158edbf" Apr 22 19:58:03.300775 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:03.300657 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gptpd" Apr 22 19:58:03.300775 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:03.300737 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gptpd" podUID="80c4f1f9-74db-4e1b-bb9c-05d1618ca285" Apr 22 19:58:05.299666 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:05.299636 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v2xhn" Apr 22 19:58:05.300151 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:05.299679 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d46mn" Apr 22 19:58:05.300151 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:05.299758 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v2xhn" podUID="a0ee49c5-2186-4f09-9a83-ceabc81a52e5" Apr 22 19:58:05.300151 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:05.299825 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gptpd" Apr 22 19:58:05.300151 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:05.299931 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gptpd" podUID="80c4f1f9-74db-4e1b-bb9c-05d1618ca285" Apr 22 19:58:05.300151 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:05.300014 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-d46mn" podUID="d02dd904-b70b-4d97-9033-5614a158edbf" Apr 22 19:58:05.847444 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:05.847407 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80c4f1f9-74db-4e1b-bb9c-05d1618ca285-metrics-certs\") pod \"network-metrics-daemon-gptpd\" (UID: \"80c4f1f9-74db-4e1b-bb9c-05d1618ca285\") " pod="openshift-multus/network-metrics-daemon-gptpd" Apr 22 19:58:05.847623 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:05.847607 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:05.847682 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:05.847668 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80c4f1f9-74db-4e1b-bb9c-05d1618ca285-metrics-certs podName:80c4f1f9-74db-4e1b-bb9c-05d1618ca285 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:13.84764996 +0000 UTC m=+18.099149841 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/80c4f1f9-74db-4e1b-bb9c-05d1618ca285-metrics-certs") pod "network-metrics-daemon-gptpd" (UID: "80c4f1f9-74db-4e1b-bb9c-05d1618ca285") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:05.948588 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:05.948554 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hfkng\" (UniqueName: \"kubernetes.io/projected/a0ee49c5-2186-4f09-9a83-ceabc81a52e5-kube-api-access-hfkng\") pod \"network-check-target-v2xhn\" (UID: \"a0ee49c5-2186-4f09-9a83-ceabc81a52e5\") " pod="openshift-network-diagnostics/network-check-target-v2xhn" Apr 22 19:58:05.948733 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:05.948691 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:58:05.948733 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:05.948714 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:58:05.948733 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:05.948728 2575 projected.go:194] Error preparing data for projected volume kube-api-access-hfkng for pod openshift-network-diagnostics/network-check-target-v2xhn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:05.948905 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:05.948783 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a0ee49c5-2186-4f09-9a83-ceabc81a52e5-kube-api-access-hfkng podName:a0ee49c5-2186-4f09-9a83-ceabc81a52e5 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:13.948764665 +0000 UTC m=+18.200264537 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-hfkng" (UniqueName: "kubernetes.io/projected/a0ee49c5-2186-4f09-9a83-ceabc81a52e5-kube-api-access-hfkng") pod "network-check-target-v2xhn" (UID: "a0ee49c5-2186-4f09-9a83-ceabc81a52e5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:06.955601 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:06.955564 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d02dd904-b70b-4d97-9033-5614a158edbf-original-pull-secret\") pod \"global-pull-secret-syncer-d46mn\" (UID: \"d02dd904-b70b-4d97-9033-5614a158edbf\") " pod="kube-system/global-pull-secret-syncer-d46mn" Apr 22 19:58:06.956079 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:06.955737 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:06.956079 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:06.955811 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d02dd904-b70b-4d97-9033-5614a158edbf-original-pull-secret podName:d02dd904-b70b-4d97-9033-5614a158edbf nodeName:}" failed. No retries permitted until 2026-04-22 19:58:14.955790578 +0000 UTC m=+19.207290457 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d02dd904-b70b-4d97-9033-5614a158edbf-original-pull-secret") pod "global-pull-secret-syncer-d46mn" (UID: "d02dd904-b70b-4d97-9033-5614a158edbf") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:07.299652 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:07.299580 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v2xhn" Apr 22 19:58:07.299803 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:07.299724 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d46mn" Apr 22 19:58:07.299803 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:07.299733 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v2xhn" podUID="a0ee49c5-2186-4f09-9a83-ceabc81a52e5" Apr 22 19:58:07.299882 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:07.299825 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d46mn" podUID="d02dd904-b70b-4d97-9033-5614a158edbf" Apr 22 19:58:07.299882 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:07.299873 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gptpd" Apr 22 19:58:07.299948 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:07.299935 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gptpd" podUID="80c4f1f9-74db-4e1b-bb9c-05d1618ca285" Apr 22 19:58:09.300005 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:09.299937 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v2xhn" Apr 22 19:58:09.300005 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:09.299978 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d46mn" Apr 22 19:58:09.300005 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:09.299937 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gptpd" Apr 22 19:58:09.300511 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:09.300076 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v2xhn" podUID="a0ee49c5-2186-4f09-9a83-ceabc81a52e5" Apr 22 19:58:09.300511 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:09.300261 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d46mn" podUID="d02dd904-b70b-4d97-9033-5614a158edbf" Apr 22 19:58:09.300511 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:09.300396 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gptpd" podUID="80c4f1f9-74db-4e1b-bb9c-05d1618ca285" Apr 22 19:58:11.299781 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:11.299709 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v2xhn" Apr 22 19:58:11.300303 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:11.299709 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d46mn" Apr 22 19:58:11.300303 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:11.299824 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-v2xhn" podUID="a0ee49c5-2186-4f09-9a83-ceabc81a52e5" Apr 22 19:58:11.300303 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:11.299721 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gptpd" Apr 22 19:58:11.300303 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:11.299893 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d46mn" podUID="d02dd904-b70b-4d97-9033-5614a158edbf" Apr 22 19:58:11.300303 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:11.300016 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gptpd" podUID="80c4f1f9-74db-4e1b-bb9c-05d1618ca285" Apr 22 19:58:13.300376 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:13.300341 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gptpd" Apr 22 19:58:13.300376 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:13.300370 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v2xhn" Apr 22 19:58:13.300851 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:13.300396 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d46mn" Apr 22 19:58:13.300851 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:13.300491 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gptpd" podUID="80c4f1f9-74db-4e1b-bb9c-05d1618ca285" Apr 22 19:58:13.300851 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:13.300544 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v2xhn" podUID="a0ee49c5-2186-4f09-9a83-ceabc81a52e5" Apr 22 19:58:13.300851 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:13.300609 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-d46mn" podUID="d02dd904-b70b-4d97-9033-5614a158edbf" Apr 22 19:58:13.909979 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:13.909945 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80c4f1f9-74db-4e1b-bb9c-05d1618ca285-metrics-certs\") pod \"network-metrics-daemon-gptpd\" (UID: \"80c4f1f9-74db-4e1b-bb9c-05d1618ca285\") " pod="openshift-multus/network-metrics-daemon-gptpd" Apr 22 19:58:13.910186 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:13.910107 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:13.910255 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:13.910196 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80c4f1f9-74db-4e1b-bb9c-05d1618ca285-metrics-certs podName:80c4f1f9-74db-4e1b-bb9c-05d1618ca285 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:29.910169612 +0000 UTC m=+34.161669495 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/80c4f1f9-74db-4e1b-bb9c-05d1618ca285-metrics-certs") pod "network-metrics-daemon-gptpd" (UID: "80c4f1f9-74db-4e1b-bb9c-05d1618ca285") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:14.011227 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:14.011195 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hfkng\" (UniqueName: \"kubernetes.io/projected/a0ee49c5-2186-4f09-9a83-ceabc81a52e5-kube-api-access-hfkng\") pod \"network-check-target-v2xhn\" (UID: \"a0ee49c5-2186-4f09-9a83-ceabc81a52e5\") " pod="openshift-network-diagnostics/network-check-target-v2xhn" Apr 22 19:58:14.011408 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:14.011392 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:58:14.011465 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:14.011414 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:58:14.011465 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:14.011427 2575 projected.go:194] Error preparing data for projected volume kube-api-access-hfkng for pod openshift-network-diagnostics/network-check-target-v2xhn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:14.011530 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:14.011482 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a0ee49c5-2186-4f09-9a83-ceabc81a52e5-kube-api-access-hfkng podName:a0ee49c5-2186-4f09-9a83-ceabc81a52e5 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:30.011467644 +0000 UTC m=+34.262967535 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-hfkng" (UniqueName: "kubernetes.io/projected/a0ee49c5-2186-4f09-9a83-ceabc81a52e5-kube-api-access-hfkng") pod "network-check-target-v2xhn" (UID: "a0ee49c5-2186-4f09-9a83-ceabc81a52e5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:15.019760 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:15.019720 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d02dd904-b70b-4d97-9033-5614a158edbf-original-pull-secret\") pod \"global-pull-secret-syncer-d46mn\" (UID: \"d02dd904-b70b-4d97-9033-5614a158edbf\") " pod="kube-system/global-pull-secret-syncer-d46mn" Apr 22 19:58:15.020242 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:15.019855 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:15.020242 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:15.019915 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d02dd904-b70b-4d97-9033-5614a158edbf-original-pull-secret podName:d02dd904-b70b-4d97-9033-5614a158edbf nodeName:}" failed. No retries permitted until 2026-04-22 19:58:31.019900364 +0000 UTC m=+35.271400236 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d02dd904-b70b-4d97-9033-5614a158edbf-original-pull-secret") pod "global-pull-secret-syncer-d46mn" (UID: "d02dd904-b70b-4d97-9033-5614a158edbf") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:15.300270 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:15.300207 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v2xhn" Apr 22 19:58:15.300270 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:15.300229 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gptpd" Apr 22 19:58:15.300413 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:15.300217 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d46mn" Apr 22 19:58:15.300413 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:15.300297 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v2xhn" podUID="a0ee49c5-2186-4f09-9a83-ceabc81a52e5" Apr 22 19:58:15.300413 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:15.300391 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
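The deferral windows in these nestedpendingoperations entries double from failure to failure: 8s at 19:58:05-06, 16s here, and the fresh mounts at 19:58:28 further down start at 500ms. A minimal sketch of that doubling backoff; the cap is an assumption, and this is only the pattern the timestamps imply, not the kubelet's actual implementation:

package main

import (
	"fmt"
	"time"
)

// backoff doubles the wait on every consecutive failure of the same
// operation, capped at a maximum. Values chosen to match the retry delays
// visible in this log: 500ms for a fresh operation, then 1s, 2s, 4s, 8s, 16s.
type backoff struct {
	initial, max, current time.Duration
}

func (b *backoff) next() time.Duration {
	if b.current == 0 {
		b.current = b.initial
	} else {
		b.current *= 2
		if b.current > b.max {
			b.current = b.max // 2m cap is assumed for this sketch
		}
	}
	return b.current
}

func main() {
	b := backoff{initial: 500 * time.Millisecond, max: 2 * time.Minute}
	for i := 0; i < 6; i++ {
		fmt.Printf("retry %d deferred %v\n", i+1, b.next())
	}
	// Prints 500ms, 1s, 2s, 4s, 8s, 16s -- the 8s and 16s deferrals match
	// the "durationBeforeRetry" values in the entries above.
}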
pod="kube-system/global-pull-secret-syncer-d46mn" podUID="d02dd904-b70b-4d97-9033-5614a158edbf" Apr 22 19:58:15.300506 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:15.300486 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gptpd" podUID="80c4f1f9-74db-4e1b-bb9c-05d1618ca285" Apr 22 19:58:15.404802 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:58:15.404776 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e73951e_095f_4406_93d0_afb41cb12c4b.slice/crio-3352fa4c361c09cefc52faae0bece263d9a0b1a0a6a0d9d1ba1fcab28f2e62d4 WatchSource:0}: Error finding container 3352fa4c361c09cefc52faae0bece263d9a0b1a0a6a0d9d1ba1fcab28f2e62d4: Status 404 returned error can't find the container with id 3352fa4c361c09cefc52faae0bece263d9a0b1a0a6a0d9d1ba1fcab28f2e62d4 Apr 22 19:58:15.434963 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:15.434942 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mht87" event={"ID":"1e73951e-095f-4406-93d0-afb41cb12c4b","Type":"ContainerStarted","Data":"3352fa4c361c09cefc52faae0bece263d9a0b1a0a6a0d9d1ba1fcab28f2e62d4"} Apr 22 19:58:16.438180 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:16.437836 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bs9mh" event={"ID":"e1780916-1341-4e58-886f-d52f29877102","Type":"ContainerStarted","Data":"e666f726c848da302acfd0e9f58bf91616641bb2f03215eebe175b1f691640a4"} Apr 22 19:58:16.439235 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:16.439207 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lqzhq" event={"ID":"46001a3b-bba4-48eb-b1ca-5e0377ba88fd","Type":"ContainerStarted","Data":"f30a2812064ed16748e7cd2a102d1ebecaf90eccf26530bc071dd9b77e558d08"} Apr 22 19:58:16.440657 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:16.440631 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9jzqm" event={"ID":"7d736e66-a7dc-4492-9e64-68b254bc8afa","Type":"ContainerStarted","Data":"8dd00b548c77b7b8d6af5b6ed0bf135e1d438865a5204e8bd3acd859ddad9fd8"} Apr 22 19:58:16.442089 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:16.442061 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nhz6m" event={"ID":"658fb26d-962e-43ef-9be4-c89b573ecb41","Type":"ContainerStarted","Data":"e3c559dff25d1782666bd30ef01568c9f9545bdabe64efb7116e4d6ceefcbc59"} Apr 22 19:58:16.443477 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:16.443445 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hrx7n" event={"ID":"bc9dd9ba-6ee7-42ef-bef4-834abd02ac13","Type":"ContainerStarted","Data":"0f295f0a46088040a926e97939499dc9b502dfc637ce1f62634f0bb0ac115a07"} Apr 22 19:58:16.446005 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:16.445984 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" event={"ID":"a6c5f956-034b-486c-80ee-8f8ff4328b7f","Type":"ContainerStarted","Data":"f16cfd8e99a163b077e2e892ead44aef5621ed8843fd6c96d8f07e192b50da44"} Apr 22 19:58:16.446108 ip-10-0-128-239 kubenswrapper[2575]: 
I0422 19:58:16.446009 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" event={"ID":"a6c5f956-034b-486c-80ee-8f8ff4328b7f","Type":"ContainerStarted","Data":"d177c007943c3a7f56a33f0f511e60e496bc881eda0fef8b8989cde544486d0a"} Apr 22 19:58:16.446108 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:16.446022 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" event={"ID":"a6c5f956-034b-486c-80ee-8f8ff4328b7f","Type":"ContainerStarted","Data":"bc282b60ae4d94f531c3079c1a3f53e2b03a696163a0d7999c05f4d9ee66d563"} Apr 22 19:58:16.446108 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:16.446044 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" event={"ID":"a6c5f956-034b-486c-80ee-8f8ff4328b7f","Type":"ContainerStarted","Data":"82f331b67c51f9929dca6a11119eedf6ee5935ae4cef88984ef08d04ea734339"} Apr 22 19:58:16.446108 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:16.446056 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" event={"ID":"a6c5f956-034b-486c-80ee-8f8ff4328b7f","Type":"ContainerStarted","Data":"569643c6be7d242f4f59f224d81f4654961eec99473b3ecbdaad6b4114f3b579"} Apr 22 19:58:16.447370 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:16.447345 2575 generic.go:358] "Generic (PLEG): container finished" podID="6e1ef928-0940-490e-89a9-e75af398fadb" containerID="c1ab373dd7785eeb44ff59a22956cd8fae16a307b0399687b77d98220eab19fc" exitCode=0 Apr 22 19:58:16.447446 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:16.447379 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m69k4" event={"ID":"6e1ef928-0940-490e-89a9-e75af398fadb","Type":"ContainerDied","Data":"c1ab373dd7785eeb44ff59a22956cd8fae16a307b0399687b77d98220eab19fc"} Apr 22 19:58:16.448532 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:16.448509 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mht87" event={"ID":"1e73951e-095f-4406-93d0-afb41cb12c4b","Type":"ContainerStarted","Data":"c9287ed9193e506105c857576836a9797e2b9ae841255e198d6ec6071f9aefb9"} Apr 22 19:58:16.477477 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:16.477439 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-lqzhq" podStartSLOduration=3.897030918 podStartE2EDuration="20.477426594s" podCreationTimestamp="2026-04-22 19:57:56 +0000 UTC" firstStartedPulling="2026-04-22 19:57:58.821359091 +0000 UTC m=+3.072858961" lastFinishedPulling="2026-04-22 19:58:15.401754762 +0000 UTC m=+19.653254637" observedRunningTime="2026-04-22 19:58:16.457669606 +0000 UTC m=+20.709169497" watchObservedRunningTime="2026-04-22 19:58:16.477426594 +0000 UTC m=+20.728926517" Apr 22 19:58:16.494319 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:16.494281 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-mht87" podStartSLOduration=14.494268362 podStartE2EDuration="14.494268362s" podCreationTimestamp="2026-04-22 19:58:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:58:16.493928748 +0000 UTC m=+20.745428628" watchObservedRunningTime="2026-04-22 19:58:16.494268362 +0000 UTC m=+20.745768251" Apr 22 19:58:16.509718 ip-10-0-128-239 kubenswrapper[2575]: I0422 
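The two durations in each pod_startup_latency_tracker entry are related arithmetically: podStartE2EDuration is the watch-observed running time minus podCreationTimestamp, and podStartSLOduration subtracts the image-pull window (lastFinishedPulling minus firstStartedPulling, most precisely via the monotonic m=+ offsets). This subtraction rule is inferred from the values in this log, not quoted from kubelet documentation. Recomputing tuned-lqzhq's figures from the entry above:

package main

import "fmt"

// Recomputes tuned-lqzhq's startup figures from the log entry above using the
// monotonic m=+ offsets: the SLO duration is the end-to-end duration minus
// time spent pulling images.
func main() {
	const (
		e2e              = 20.477426594 // podStartE2EDuration, seconds
		firstStartedPull = 3.072858961  // firstStartedPulling m=+ offset
		lastFinishedPull = 19.653254637 // lastFinishedPulling m=+ offset
	)
	slo := e2e - (lastFinishedPull - firstStartedPull)
	fmt.Printf("%.9f\n", slo) // 3.897030918, matching podStartSLOduration
}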
Apr 22 19:58:16.532665 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:16.532621 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-nhz6m" podStartSLOduration=3.9469346119999997 podStartE2EDuration="20.532608486s" podCreationTimestamp="2026-04-22 19:57:56 +0000 UTC" firstStartedPulling="2026-04-22 19:57:58.816088222 +0000 UTC m=+3.067588103" lastFinishedPulling="2026-04-22 19:58:15.401762098 +0000 UTC m=+19.653261977" observedRunningTime="2026-04-22 19:58:16.532159976 +0000 UTC m=+20.783659866" watchObservedRunningTime="2026-04-22 19:58:16.532608486 +0000 UTC m=+20.784108375"
Apr 22 19:58:16.569836 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:16.569801 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-hrx7n" podStartSLOduration=3.943318577 podStartE2EDuration="20.569790515s" podCreationTimestamp="2026-04-22 19:57:56 +0000 UTC" firstStartedPulling="2026-04-22 19:57:58.814243399 +0000 UTC m=+3.065743271" lastFinishedPulling="2026-04-22 19:58:15.440715339 +0000 UTC m=+19.692215209" observedRunningTime="2026-04-22 19:58:16.569471862 +0000 UTC m=+20.820971745" watchObservedRunningTime="2026-04-22 19:58:16.569790515 +0000 UTC m=+20.821290405"
Apr 22 19:58:16.621504 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:16.621483 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 22 19:58:17.250221 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:17.250067 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T19:58:16.621499789Z","UUID":"71410b7d-b70d-49fb-8362-08a7241b87d3","Handler":null,"Name":"","Endpoint":""}
Apr 22 19:58:17.251924 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:17.251898 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 22 19:58:17.252061 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:17.251937 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 22 19:58:17.299933 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:17.299903 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v2xhn"
Apr 22 19:58:17.300075 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:17.299903 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d46mn"
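The plugin_watcher / RegisterPlugin / csi_plugin sequence above is the kubelet's plugin registration path: a filesystem watcher spots the driver's registration socket under /var/lib/kubelet/plugins_registry/, and the kubelet then dials it to validate and register the driver (here ebs.csi.aws.com). A minimal watch in the same spirit using fsnotify; this is illustrative only and omits the kubelet's state bookkeeping and the registration gRPC handshake:

package main

import (
	"fmt"
	"log"
	"strings"

	"github.com/fsnotify/fsnotify"
)

func main() {
	w, err := fsnotify.NewWatcher()
	if err != nil {
		log.Fatal(err)
	}
	defer w.Close()

	// Same directory the plugin_watcher log line names for registration sockets.
	if err := w.Add("/var/lib/kubelet/plugins_registry"); err != nil {
		log.Fatal(err)
	}
	for ev := range w.Events {
		// A new *-reg.sock file is how a driver announces itself; the kubelet
		// would then dial this socket and query the plugin's info before
		// registering it, as the csi_plugin.go lines above show.
		if ev.Op&fsnotify.Create != 0 && strings.HasSuffix(ev.Name, "-reg.sock") {
			fmt.Println("adding socket path to desired state cache:", ev.Name)
		}
	}
}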
Need to start a new one" pod="kube-system/global-pull-secret-syncer-d46mn" Apr 22 19:58:17.300075 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:17.300001 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v2xhn" podUID="a0ee49c5-2186-4f09-9a83-ceabc81a52e5" Apr 22 19:58:17.300187 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:17.300109 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d46mn" podUID="d02dd904-b70b-4d97-9033-5614a158edbf" Apr 22 19:58:17.300187 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:17.299907 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gptpd" Apr 22 19:58:17.300271 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:17.300210 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gptpd" podUID="80c4f1f9-74db-4e1b-bb9c-05d1618ca285" Apr 22 19:58:17.453906 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:17.453877 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bs9mh" event={"ID":"e1780916-1341-4e58-886f-d52f29877102","Type":"ContainerStarted","Data":"c70a317112103fe6753997acb394843586b88eb9c2117d89cf3c248a78efa4c5"} Apr 22 19:58:17.455770 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:17.455734 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-sflf8" event={"ID":"e5e122d2-2065-483b-a689-7d9721ed4c07","Type":"ContainerStarted","Data":"3fd1970acaa5ab6fb85af8e9a497cfff088a9943d6b7caf1fec8c70910688770"} Apr 22 19:58:17.460011 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:17.459018 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" event={"ID":"a6c5f956-034b-486c-80ee-8f8ff4328b7f","Type":"ContainerStarted","Data":"eaffd43aaeebdfdc502e3848444ad1ba18d6c301ad6d8506d14ff3f2ba69592d"} Apr 22 19:58:17.470574 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:17.470528 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-sflf8" podStartSLOduration=4.890876721 podStartE2EDuration="21.470512747s" podCreationTimestamp="2026-04-22 19:57:56 +0000 UTC" firstStartedPulling="2026-04-22 19:57:58.822148171 +0000 UTC m=+3.073648052" lastFinishedPulling="2026-04-22 19:58:15.401784197 +0000 UTC m=+19.653284078" observedRunningTime="2026-04-22 19:58:17.470287461 +0000 UTC m=+21.721787351" watchObservedRunningTime="2026-04-22 19:58:17.470512747 +0000 UTC m=+21.722012636" Apr 22 19:58:18.072154 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:18.072109 2575 kubelet.go:2658] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-9jzqm" Apr 22 19:58:18.073180 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:18.073153 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-9jzqm" Apr 22 19:58:18.463275 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:18.463194 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bs9mh" event={"ID":"e1780916-1341-4e58-886f-d52f29877102","Type":"ContainerStarted","Data":"328b39bfc1efc4df7fc7bdeb2f6829b7b2f7c0512fe4a79cedb056d2946b6927"} Apr 22 19:58:18.463858 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:18.463492 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-9jzqm" Apr 22 19:58:18.464391 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:18.464374 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-9jzqm" Apr 22 19:58:18.496626 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:18.496582 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bs9mh" podStartSLOduration=3.920237278 podStartE2EDuration="22.496556632s" podCreationTimestamp="2026-04-22 19:57:56 +0000 UTC" firstStartedPulling="2026-04-22 19:57:58.824127918 +0000 UTC m=+3.075627796" lastFinishedPulling="2026-04-22 19:58:17.400447279 +0000 UTC m=+21.651947150" observedRunningTime="2026-04-22 19:58:18.481583963 +0000 UTC m=+22.733083852" watchObservedRunningTime="2026-04-22 19:58:18.496556632 +0000 UTC m=+22.748056522" Apr 22 19:58:19.299909 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:19.299873 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v2xhn" Apr 22 19:58:19.300113 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:19.299984 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v2xhn" podUID="a0ee49c5-2186-4f09-9a83-ceabc81a52e5" Apr 22 19:58:19.300113 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:19.299873 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gptpd" Apr 22 19:58:19.300113 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:19.299873 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d46mn" Apr 22 19:58:19.300284 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:19.300123 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gptpd" podUID="80c4f1f9-74db-4e1b-bb9c-05d1618ca285" Apr 22 19:58:19.300284 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:19.300199 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d46mn" podUID="d02dd904-b70b-4d97-9033-5614a158edbf" Apr 22 19:58:19.468199 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:19.468155 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" event={"ID":"a6c5f956-034b-486c-80ee-8f8ff4328b7f","Type":"ContainerStarted","Data":"d22a22c3cd4be1b10ac77385bf260fe4bab47f64ece54f17d4d05bb21adfc082"} Apr 22 19:58:21.299659 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:21.299431 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d46mn" Apr 22 19:58:21.299999 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:21.299441 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v2xhn" Apr 22 19:58:21.299999 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:21.299699 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d46mn" podUID="d02dd904-b70b-4d97-9033-5614a158edbf" Apr 22 19:58:21.299999 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:21.299441 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gptpd" Apr 22 19:58:21.299999 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:21.299759 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v2xhn" podUID="a0ee49c5-2186-4f09-9a83-ceabc81a52e5" Apr 22 19:58:21.299999 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:21.299853 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gptpd" podUID="80c4f1f9-74db-4e1b-bb9c-05d1618ca285" Apr 22 19:58:21.473852 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:21.473826 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" event={"ID":"a6c5f956-034b-486c-80ee-8f8ff4328b7f","Type":"ContainerStarted","Data":"40d41ba026dce8494c57e5a3a2927433980b488bacfe8afb101e8ab68e6f6254"} Apr 22 19:58:21.474194 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:21.474162 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:58:21.474194 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:21.474198 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:58:21.475745 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:21.475717 2575 generic.go:358] "Generic (PLEG): container finished" podID="6e1ef928-0940-490e-89a9-e75af398fadb" containerID="450506d63148e19e87301c26dc529ebca8b561ecedfd67111af1d18090412047" exitCode=0 Apr 22 19:58:21.475849 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:21.475767 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m69k4" event={"ID":"6e1ef928-0940-490e-89a9-e75af398fadb","Type":"ContainerDied","Data":"450506d63148e19e87301c26dc529ebca8b561ecedfd67111af1d18090412047"} Apr 22 19:58:21.489053 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:21.489019 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:58:21.489157 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:21.489138 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:58:21.501738 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:21.501701 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" podStartSLOduration=8.407121949 podStartE2EDuration="25.501686262s" podCreationTimestamp="2026-04-22 19:57:56 +0000 UTC" firstStartedPulling="2026-04-22 19:57:58.819445665 +0000 UTC m=+3.070945532" lastFinishedPulling="2026-04-22 19:58:15.914009975 +0000 UTC m=+20.165509845" observedRunningTime="2026-04-22 19:58:21.5016048 +0000 UTC m=+25.753104690" watchObservedRunningTime="2026-04-22 19:58:21.501686262 +0000 UTC m=+25.753186151" Apr 22 19:58:22.479136 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:22.479103 2575 generic.go:358] "Generic (PLEG): container finished" podID="6e1ef928-0940-490e-89a9-e75af398fadb" containerID="6f701bd66a525105e31924cd194ed7b8619b6f32bb7e3bed4172cbe11a6ed449" exitCode=0 Apr 22 19:58:22.479686 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:22.479164 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m69k4" event={"ID":"6e1ef928-0940-490e-89a9-e75af398fadb","Type":"ContainerDied","Data":"6f701bd66a525105e31924cd194ed7b8619b6f32bb7e3bed4172cbe11a6ed449"} Apr 22 19:58:22.479686 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:22.479337 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 19:58:22.583916 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:22.583857 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-v2xhn"] Apr 22 19:58:22.584084 ip-10-0-128-239 
kubenswrapper[2575]: I0422 19:58:22.583963 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v2xhn" Apr 22 19:58:22.584153 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:22.584081 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v2xhn" podUID="a0ee49c5-2186-4f09-9a83-ceabc81a52e5" Apr 22 19:58:22.584613 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:22.584565 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-d46mn"] Apr 22 19:58:22.584704 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:22.584665 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d46mn" Apr 22 19:58:22.584787 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:22.584756 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d46mn" podUID="d02dd904-b70b-4d97-9033-5614a158edbf" Apr 22 19:58:22.585172 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:22.585153 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gptpd"] Apr 22 19:58:22.585261 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:22.585250 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gptpd" Apr 22 19:58:22.585344 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:22.585323 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gptpd" podUID="80c4f1f9-74db-4e1b-bb9c-05d1618ca285" Apr 22 19:58:23.482564 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:23.482489 2575 generic.go:358] "Generic (PLEG): container finished" podID="6e1ef928-0940-490e-89a9-e75af398fadb" containerID="89d0cfe152ee0f12e7535b8295dddfeb008e7ab99cc1baa8ddd67984c74e9482" exitCode=0 Apr 22 19:58:23.483085 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:23.482572 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m69k4" event={"ID":"6e1ef928-0940-490e-89a9-e75af398fadb","Type":"ContainerDied","Data":"89d0cfe152ee0f12e7535b8295dddfeb008e7ab99cc1baa8ddd67984c74e9482"} Apr 22 19:58:23.483085 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:23.482628 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 19:58:24.064395 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:24.064367 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:58:24.299750 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:24.299721 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gptpd" Apr 22 19:58:24.299918 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:24.299722 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d46mn" Apr 22 19:58:24.299918 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:24.299842 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v2xhn" Apr 22 19:58:24.299918 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:24.299839 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gptpd" podUID="80c4f1f9-74db-4e1b-bb9c-05d1618ca285" Apr 22 19:58:24.300068 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:24.299931 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d46mn" podUID="d02dd904-b70b-4d97-9033-5614a158edbf" Apr 22 19:58:24.300068 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:24.300021 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v2xhn" podUID="a0ee49c5-2186-4f09-9a83-ceabc81a52e5" Apr 22 19:58:26.300903 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:26.300825 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gptpd" Apr 22 19:58:26.301437 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:26.300944 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gptpd" podUID="80c4f1f9-74db-4e1b-bb9c-05d1618ca285" Apr 22 19:58:26.301437 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:26.301043 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v2xhn" Apr 22 19:58:26.301437 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:26.301144 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v2xhn" podUID="a0ee49c5-2186-4f09-9a83-ceabc81a52e5" Apr 22 19:58:26.301437 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:26.301189 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-d46mn" Apr 22 19:58:26.301437 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:26.301263 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d46mn" podUID="d02dd904-b70b-4d97-9033-5614a158edbf" Apr 22 19:58:28.300072 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.299819 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v2xhn" Apr 22 19:58:28.300499 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.299857 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gptpd" Apr 22 19:58:28.300499 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:28.300172 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v2xhn" podUID="a0ee49c5-2186-4f09-9a83-ceabc81a52e5" Apr 22 19:58:28.300499 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.299909 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d46mn" Apr 22 19:58:28.300499 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:28.300284 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gptpd" podUID="80c4f1f9-74db-4e1b-bb9c-05d1618ca285" Apr 22 19:58:28.300499 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:28.300316 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-d46mn" podUID="d02dd904-b70b-4d97-9033-5614a158edbf" Apr 22 19:58:28.615951 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.615873 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-239.ec2.internal" event="NodeReady" Apr 22 19:58:28.616130 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.616050 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 19:58:28.640146 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.640118 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mht87_1e73951e-095f-4406-93d0-afb41cb12c4b/dns-node-resolver/0.log" Apr 22 19:58:28.652329 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.652300 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7b97ccfb5-7tjp6"] Apr 22 19:58:28.679501 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.679450 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7b97ccfb5-7tjp6"] Apr 22 19:58:28.679501 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.679483 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-vjxnz"] Apr 22 19:58:28.679670 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.679560 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" Apr 22 19:58:28.681884 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.681851 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 19:58:28.681884 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.681863 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-md5kj\"" Apr 22 19:58:28.681884 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.681856 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 19:58:28.682353 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.682335 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 19:58:28.687822 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.687753 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 19:58:28.695117 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.695098 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-mdzds"] Apr 22 19:58:28.695253 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.695239 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vjxnz" Apr 22 19:58:28.697668 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.697644 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xt4d8\"" Apr 22 19:58:28.697953 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.697939 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 19:58:28.698950 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.698179 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 19:58:28.698950 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.698213 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 19:58:28.713681 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.713656 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vjxnz"] Apr 22 19:58:28.713784 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.713689 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mdzds"] Apr 22 19:58:28.713839 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.713809 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mdzds" Apr 22 19:58:28.716926 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.716800 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 19:58:28.717020 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.716944 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 19:58:28.717020 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.717005 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-9r99g\"" Apr 22 19:58:28.828120 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.828085 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/48a20871-3902-4d97-9b7d-6c533dbb4d37-installation-pull-secrets\") pod \"image-registry-7b97ccfb5-7tjp6\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" Apr 22 19:58:28.828285 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.828128 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/48a20871-3902-4d97-9b7d-6c533dbb4d37-ca-trust-extracted\") pod \"image-registry-7b97ccfb5-7tjp6\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" Apr 22 19:58:28.828285 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.828160 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/48a20871-3902-4d97-9b7d-6c533dbb4d37-bound-sa-token\") pod \"image-registry-7b97ccfb5-7tjp6\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" Apr 22 19:58:28.828285 ip-10-0-128-239 kubenswrapper[2575]: I0422 
19:58:28.828185 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmfz4\" (UniqueName: \"kubernetes.io/projected/85f994f1-e18a-4886-bc92-4b762096129a-kube-api-access-fmfz4\") pod \"ingress-canary-vjxnz\" (UID: \"85f994f1-e18a-4886-bc92-4b762096129a\") " pod="openshift-ingress-canary/ingress-canary-vjxnz" Apr 22 19:58:28.828285 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.828233 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7cb084c1-c4af-443e-afce-025ccb08ba3f-config-volume\") pod \"dns-default-mdzds\" (UID: \"7cb084c1-c4af-443e-afce-025ccb08ba3f\") " pod="openshift-dns/dns-default-mdzds" Apr 22 19:58:28.828285 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.828276 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4nbr\" (UniqueName: \"kubernetes.io/projected/7cb084c1-c4af-443e-afce-025ccb08ba3f-kube-api-access-k4nbr\") pod \"dns-default-mdzds\" (UID: \"7cb084c1-c4af-443e-afce-025ccb08ba3f\") " pod="openshift-dns/dns-default-mdzds" Apr 22 19:58:28.828513 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.828305 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndq52\" (UniqueName: \"kubernetes.io/projected/48a20871-3902-4d97-9b7d-6c533dbb4d37-kube-api-access-ndq52\") pod \"image-registry-7b97ccfb5-7tjp6\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" Apr 22 19:58:28.828513 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.828351 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/48a20871-3902-4d97-9b7d-6c533dbb4d37-registry-tls\") pod \"image-registry-7b97ccfb5-7tjp6\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" Apr 22 19:58:28.828513 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.828417 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7cb084c1-c4af-443e-afce-025ccb08ba3f-tmp-dir\") pod \"dns-default-mdzds\" (UID: \"7cb084c1-c4af-443e-afce-025ccb08ba3f\") " pod="openshift-dns/dns-default-mdzds" Apr 22 19:58:28.828513 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.828467 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85f994f1-e18a-4886-bc92-4b762096129a-cert\") pod \"ingress-canary-vjxnz\" (UID: \"85f994f1-e18a-4886-bc92-4b762096129a\") " pod="openshift-ingress-canary/ingress-canary-vjxnz" Apr 22 19:58:28.828513 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.828493 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/48a20871-3902-4d97-9b7d-6c533dbb4d37-image-registry-private-configuration\") pod \"image-registry-7b97ccfb5-7tjp6\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" Apr 22 19:58:28.828710 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.828518 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7cb084c1-c4af-443e-afce-025ccb08ba3f-metrics-tls\") pod \"dns-default-mdzds\" (UID: \"7cb084c1-c4af-443e-afce-025ccb08ba3f\") " pod="openshift-dns/dns-default-mdzds" Apr 22 19:58:28.828710 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.828534 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/48a20871-3902-4d97-9b7d-6c533dbb4d37-registry-certificates\") pod \"image-registry-7b97ccfb5-7tjp6\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" Apr 22 19:58:28.828710 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.828608 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/48a20871-3902-4d97-9b7d-6c533dbb4d37-trusted-ca\") pod \"image-registry-7b97ccfb5-7tjp6\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" Apr 22 19:58:28.929948 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.929913 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7cb084c1-c4af-443e-afce-025ccb08ba3f-metrics-tls\") pod \"dns-default-mdzds\" (UID: \"7cb084c1-c4af-443e-afce-025ccb08ba3f\") " pod="openshift-dns/dns-default-mdzds" Apr 22 19:58:28.929948 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.929951 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/48a20871-3902-4d97-9b7d-6c533dbb4d37-registry-certificates\") pod \"image-registry-7b97ccfb5-7tjp6\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" Apr 22 19:58:28.930219 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.929973 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/48a20871-3902-4d97-9b7d-6c533dbb4d37-trusted-ca\") pod \"image-registry-7b97ccfb5-7tjp6\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" Apr 22 19:58:28.930219 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.930003 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/48a20871-3902-4d97-9b7d-6c533dbb4d37-installation-pull-secrets\") pod \"image-registry-7b97ccfb5-7tjp6\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" Apr 22 19:58:28.930219 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.930023 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/48a20871-3902-4d97-9b7d-6c533dbb4d37-ca-trust-extracted\") pod \"image-registry-7b97ccfb5-7tjp6\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" Apr 22 19:58:28.930219 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.930063 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/48a20871-3902-4d97-9b7d-6c533dbb4d37-bound-sa-token\") pod 
\"image-registry-7b97ccfb5-7tjp6\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" Apr 22 19:58:28.930219 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.930083 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmfz4\" (UniqueName: \"kubernetes.io/projected/85f994f1-e18a-4886-bc92-4b762096129a-kube-api-access-fmfz4\") pod \"ingress-canary-vjxnz\" (UID: \"85f994f1-e18a-4886-bc92-4b762096129a\") " pod="openshift-ingress-canary/ingress-canary-vjxnz" Apr 22 19:58:28.930219 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:28.930095 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:58:28.930219 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.930102 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7cb084c1-c4af-443e-afce-025ccb08ba3f-config-volume\") pod \"dns-default-mdzds\" (UID: \"7cb084c1-c4af-443e-afce-025ccb08ba3f\") " pod="openshift-dns/dns-default-mdzds" Apr 22 19:58:28.930219 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:28.930170 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cb084c1-c4af-443e-afce-025ccb08ba3f-metrics-tls podName:7cb084c1-c4af-443e-afce-025ccb08ba3f nodeName:}" failed. No retries permitted until 2026-04-22 19:58:29.430148438 +0000 UTC m=+33.681648322 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7cb084c1-c4af-443e-afce-025ccb08ba3f-metrics-tls") pod "dns-default-mdzds" (UID: "7cb084c1-c4af-443e-afce-025ccb08ba3f") : secret "dns-default-metrics-tls" not found Apr 22 19:58:28.930219 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.930215 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4nbr\" (UniqueName: \"kubernetes.io/projected/7cb084c1-c4af-443e-afce-025ccb08ba3f-kube-api-access-k4nbr\") pod \"dns-default-mdzds\" (UID: \"7cb084c1-c4af-443e-afce-025ccb08ba3f\") " pod="openshift-dns/dns-default-mdzds" Apr 22 19:58:28.930774 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.930243 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ndq52\" (UniqueName: \"kubernetes.io/projected/48a20871-3902-4d97-9b7d-6c533dbb4d37-kube-api-access-ndq52\") pod \"image-registry-7b97ccfb5-7tjp6\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" Apr 22 19:58:28.930774 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.930288 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/48a20871-3902-4d97-9b7d-6c533dbb4d37-registry-tls\") pod \"image-registry-7b97ccfb5-7tjp6\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" Apr 22 19:58:28.930774 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.930334 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7cb084c1-c4af-443e-afce-025ccb08ba3f-tmp-dir\") pod \"dns-default-mdzds\" (UID: \"7cb084c1-c4af-443e-afce-025ccb08ba3f\") " pod="openshift-dns/dns-default-mdzds" Apr 22 19:58:28.930774 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.930386 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85f994f1-e18a-4886-bc92-4b762096129a-cert\") pod \"ingress-canary-vjxnz\" (UID: \"85f994f1-e18a-4886-bc92-4b762096129a\") " pod="openshift-ingress-canary/ingress-canary-vjxnz" Apr 22 19:58:28.930774 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.930414 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/48a20871-3902-4d97-9b7d-6c533dbb4d37-ca-trust-extracted\") pod \"image-registry-7b97ccfb5-7tjp6\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" Apr 22 19:58:28.930774 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.930419 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/48a20871-3902-4d97-9b7d-6c533dbb4d37-image-registry-private-configuration\") pod \"image-registry-7b97ccfb5-7tjp6\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" Apr 22 19:58:28.930774 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:28.930618 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:58:28.930774 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:28.930670 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:58:28.930774 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:28.930686 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7b97ccfb5-7tjp6: secret "image-registry-tls" not found Apr 22 19:58:28.930774 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:28.930706 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85f994f1-e18a-4886-bc92-4b762096129a-cert podName:85f994f1-e18a-4886-bc92-4b762096129a nodeName:}" failed. No retries permitted until 2026-04-22 19:58:29.430689871 +0000 UTC m=+33.682189739 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85f994f1-e18a-4886-bc92-4b762096129a-cert") pod "ingress-canary-vjxnz" (UID: "85f994f1-e18a-4886-bc92-4b762096129a") : secret "canary-serving-cert" not found Apr 22 19:58:28.930774 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:28.930731 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/48a20871-3902-4d97-9b7d-6c533dbb4d37-registry-tls podName:48a20871-3902-4d97-9b7d-6c533dbb4d37 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:29.43071485 +0000 UTC m=+33.682214743 (durationBeforeRetry 500ms). 
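
The durationBeforeRetry values in these entries and in the retries that follow (500ms, 1s, 2s, 4s, 8s here; 32s below for a volume that has been failing longer) trace an exponential backoff on failed mount operations. A minimal Go sketch of that doubling schedule; the initial delay, factor, and cap are inferred from this log rather than taken from kubelet source.

package main

import (
	"fmt"
	"time"
)

// Minimal sketch of the doubling retry delay visible in the
// durationBeforeRetry fields above: 500ms, 1s, 2s, 4s, 8s, ...
// Initial delay, factor, and ceiling are assumptions inferred from this log.
func backoffDelays(initial time.Duration, factor float64, ceiling time.Duration, attempts int) []time.Duration {
	delays := make([]time.Duration, 0, attempts)
	d := initial
	for i := 0; i < attempts; i++ {
		delays = append(delays, d)
		d = time.Duration(float64(d) * factor)
		if d > ceiling {
			d = ceiling
		}
	}
	return delays
}

func main() {
	for _, d := range backoffDelays(500*time.Millisecond, 2, 2*time.Minute, 7) {
		fmt.Println(d) // 500ms 1s 2s 4s 8s 16s 32s
	}
}
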
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/48a20871-3902-4d97-9b7d-6c533dbb4d37-registry-tls") pod "image-registry-7b97ccfb5-7tjp6" (UID: "48a20871-3902-4d97-9b7d-6c533dbb4d37") : secret "image-registry-tls" not found Apr 22 19:58:28.930774 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.930806 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/48a20871-3902-4d97-9b7d-6c533dbb4d37-registry-certificates\") pod \"image-registry-7b97ccfb5-7tjp6\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" Apr 22 19:58:28.931256 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.930896 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7cb084c1-c4af-443e-afce-025ccb08ba3f-config-volume\") pod \"dns-default-mdzds\" (UID: \"7cb084c1-c4af-443e-afce-025ccb08ba3f\") " pod="openshift-dns/dns-default-mdzds" Apr 22 19:58:28.931256 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.931097 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7cb084c1-c4af-443e-afce-025ccb08ba3f-tmp-dir\") pod \"dns-default-mdzds\" (UID: \"7cb084c1-c4af-443e-afce-025ccb08ba3f\") " pod="openshift-dns/dns-default-mdzds" Apr 22 19:58:28.931256 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.931105 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/48a20871-3902-4d97-9b7d-6c533dbb4d37-trusted-ca\") pod \"image-registry-7b97ccfb5-7tjp6\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" Apr 22 19:58:28.934091 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.934073 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/48a20871-3902-4d97-9b7d-6c533dbb4d37-image-registry-private-configuration\") pod \"image-registry-7b97ccfb5-7tjp6\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" Apr 22 19:58:28.934091 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.934087 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/48a20871-3902-4d97-9b7d-6c533dbb4d37-installation-pull-secrets\") pod \"image-registry-7b97ccfb5-7tjp6\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" Apr 22 19:58:28.938338 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.938317 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/48a20871-3902-4d97-9b7d-6c533dbb4d37-bound-sa-token\") pod \"image-registry-7b97ccfb5-7tjp6\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" Apr 22 19:58:28.938813 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.938780 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmfz4\" (UniqueName: \"kubernetes.io/projected/85f994f1-e18a-4886-bc92-4b762096129a-kube-api-access-fmfz4\") pod \"ingress-canary-vjxnz\" (UID: \"85f994f1-e18a-4886-bc92-4b762096129a\") " 
pod="openshift-ingress-canary/ingress-canary-vjxnz" Apr 22 19:58:28.939065 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.938981 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4nbr\" (UniqueName: \"kubernetes.io/projected/7cb084c1-c4af-443e-afce-025ccb08ba3f-kube-api-access-k4nbr\") pod \"dns-default-mdzds\" (UID: \"7cb084c1-c4af-443e-afce-025ccb08ba3f\") " pod="openshift-dns/dns-default-mdzds" Apr 22 19:58:28.939219 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:28.939202 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndq52\" (UniqueName: \"kubernetes.io/projected/48a20871-3902-4d97-9b7d-6c533dbb4d37-kube-api-access-ndq52\") pod \"image-registry-7b97ccfb5-7tjp6\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" Apr 22 19:58:29.433489 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:29.433453 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7cb084c1-c4af-443e-afce-025ccb08ba3f-metrics-tls\") pod \"dns-default-mdzds\" (UID: \"7cb084c1-c4af-443e-afce-025ccb08ba3f\") " pod="openshift-dns/dns-default-mdzds" Apr 22 19:58:29.433841 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:29.433546 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/48a20871-3902-4d97-9b7d-6c533dbb4d37-registry-tls\") pod \"image-registry-7b97ccfb5-7tjp6\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" Apr 22 19:58:29.433841 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:29.433586 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:58:29.433841 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:29.433625 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85f994f1-e18a-4886-bc92-4b762096129a-cert\") pod \"ingress-canary-vjxnz\" (UID: \"85f994f1-e18a-4886-bc92-4b762096129a\") " pod="openshift-ingress-canary/ingress-canary-vjxnz" Apr 22 19:58:29.433841 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:29.433649 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cb084c1-c4af-443e-afce-025ccb08ba3f-metrics-tls podName:7cb084c1-c4af-443e-afce-025ccb08ba3f nodeName:}" failed. No retries permitted until 2026-04-22 19:58:30.433629489 +0000 UTC m=+34.685129362 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7cb084c1-c4af-443e-afce-025ccb08ba3f-metrics-tls") pod "dns-default-mdzds" (UID: "7cb084c1-c4af-443e-afce-025ccb08ba3f") : secret "dns-default-metrics-tls" not found Apr 22 19:58:29.433841 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:29.433666 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:58:29.433841 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:29.433679 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7b97ccfb5-7tjp6: secret "image-registry-tls" not found Apr 22 19:58:29.433841 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:29.433712 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/48a20871-3902-4d97-9b7d-6c533dbb4d37-registry-tls podName:48a20871-3902-4d97-9b7d-6c533dbb4d37 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:30.433701526 +0000 UTC m=+34.685201401 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/48a20871-3902-4d97-9b7d-6c533dbb4d37-registry-tls") pod "image-registry-7b97ccfb5-7tjp6" (UID: "48a20871-3902-4d97-9b7d-6c533dbb4d37") : secret "image-registry-tls" not found Apr 22 19:58:29.433841 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:29.433714 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:58:29.433841 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:29.433770 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85f994f1-e18a-4886-bc92-4b762096129a-cert podName:85f994f1-e18a-4886-bc92-4b762096129a nodeName:}" failed. No retries permitted until 2026-04-22 19:58:30.433753567 +0000 UTC m=+34.685253450 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85f994f1-e18a-4886-bc92-4b762096129a-cert") pod "ingress-canary-vjxnz" (UID: "85f994f1-e18a-4886-bc92-4b762096129a") : secret "canary-serving-cert" not found Apr 22 19:58:29.496453 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:29.496415 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m69k4" event={"ID":"6e1ef928-0940-490e-89a9-e75af398fadb","Type":"ContainerStarted","Data":"6bb0a85353102ad1b5eb6d784cc07ab39a24be7704e672a35a6a2a5b86bb0095"} Apr 22 19:58:29.625254 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:29.625232 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-nhz6m_658fb26d-962e-43ef-9be4-c89b573ecb41/node-ca/0.log" Apr 22 19:58:29.937758 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:29.937721 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80c4f1f9-74db-4e1b-bb9c-05d1618ca285-metrics-certs\") pod \"network-metrics-daemon-gptpd\" (UID: \"80c4f1f9-74db-4e1b-bb9c-05d1618ca285\") " pod="openshift-multus/network-metrics-daemon-gptpd" Apr 22 19:58:29.937899 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:29.937854 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:29.937940 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:29.937911 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80c4f1f9-74db-4e1b-bb9c-05d1618ca285-metrics-certs podName:80c4f1f9-74db-4e1b-bb9c-05d1618ca285 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:01.937897475 +0000 UTC m=+66.189397341 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/80c4f1f9-74db-4e1b-bb9c-05d1618ca285-metrics-certs") pod "network-metrics-daemon-gptpd" (UID: "80c4f1f9-74db-4e1b-bb9c-05d1618ca285") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:30.038975 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:30.038949 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hfkng\" (UniqueName: \"kubernetes.io/projected/a0ee49c5-2186-4f09-9a83-ceabc81a52e5-kube-api-access-hfkng\") pod \"network-check-target-v2xhn\" (UID: \"a0ee49c5-2186-4f09-9a83-ceabc81a52e5\") " pod="openshift-network-diagnostics/network-check-target-v2xhn" Apr 22 19:58:30.039162 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:30.039141 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:58:30.039231 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:30.039168 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:58:30.039231 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:30.039181 2575 projected.go:194] Error preparing data for projected volume kube-api-access-hfkng for pod openshift-network-diagnostics/network-check-target-v2xhn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:30.039329 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:30.039248 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a0ee49c5-2186-4f09-9a83-ceabc81a52e5-kube-api-access-hfkng podName:a0ee49c5-2186-4f09-9a83-ceabc81a52e5 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:02.039227674 +0000 UTC m=+66.290727556 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-hfkng" (UniqueName: "kubernetes.io/projected/a0ee49c5-2186-4f09-9a83-ceabc81a52e5-kube-api-access-hfkng") pod "network-check-target-v2xhn" (UID: "a0ee49c5-2186-4f09-9a83-ceabc81a52e5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:30.300235 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:30.300166 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gptpd" Apr 22 19:58:30.300235 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:30.300224 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d46mn" Apr 22 19:58:30.300385 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:30.300250 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v2xhn" Apr 22 19:58:30.303046 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:30.303008 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 19:58:30.303046 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:30.303040 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 19:58:30.304008 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:30.303991 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 19:58:30.304133 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:30.304107 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qxcj4\"" Apr 22 19:58:30.304133 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:30.304115 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 19:58:30.304233 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:30.304160 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tjpls\"" Apr 22 19:58:30.441616 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:30.441596 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/48a20871-3902-4d97-9b7d-6c533dbb4d37-registry-tls\") pod \"image-registry-7b97ccfb5-7tjp6\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" Apr 22 19:58:30.441895 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:30.441640 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85f994f1-e18a-4886-bc92-4b762096129a-cert\") pod \"ingress-canary-vjxnz\" (UID: \"85f994f1-e18a-4886-bc92-4b762096129a\") " pod="openshift-ingress-canary/ingress-canary-vjxnz" Apr 22 19:58:30.441895 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:30.441671 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7cb084c1-c4af-443e-afce-025ccb08ba3f-metrics-tls\") pod \"dns-default-mdzds\" (UID: \"7cb084c1-c4af-443e-afce-025ccb08ba3f\") " pod="openshift-dns/dns-default-mdzds" Apr 22 19:58:30.441895 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:30.441741 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:58:30.441895 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:30.441743 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:58:30.441895 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:30.441758 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7b97ccfb5-7tjp6: secret "image-registry-tls" not found Apr 22 19:58:30.441895 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:30.441781 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cb084c1-c4af-443e-afce-025ccb08ba3f-metrics-tls podName:7cb084c1-c4af-443e-afce-025ccb08ba3f nodeName:}" failed. 
No retries permitted until 2026-04-22 19:58:32.441768662 +0000 UTC m=+36.693268528 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7cb084c1-c4af-443e-afce-025ccb08ba3f-metrics-tls") pod "dns-default-mdzds" (UID: "7cb084c1-c4af-443e-afce-025ccb08ba3f") : secret "dns-default-metrics-tls" not found Apr 22 19:58:30.441895 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:30.441747 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:58:30.441895 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:30.441793 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/48a20871-3902-4d97-9b7d-6c533dbb4d37-registry-tls podName:48a20871-3902-4d97-9b7d-6c533dbb4d37 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:32.441787278 +0000 UTC m=+36.693287144 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/48a20871-3902-4d97-9b7d-6c533dbb4d37-registry-tls") pod "image-registry-7b97ccfb5-7tjp6" (UID: "48a20871-3902-4d97-9b7d-6c533dbb4d37") : secret "image-registry-tls" not found Apr 22 19:58:30.441895 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:30.441816 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85f994f1-e18a-4886-bc92-4b762096129a-cert podName:85f994f1-e18a-4886-bc92-4b762096129a nodeName:}" failed. No retries permitted until 2026-04-22 19:58:32.441804779 +0000 UTC m=+36.693304646 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85f994f1-e18a-4886-bc92-4b762096129a-cert") pod "ingress-canary-vjxnz" (UID: "85f994f1-e18a-4886-bc92-4b762096129a") : secret "canary-serving-cert" not found Apr 22 19:58:30.500559 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:30.500535 2575 generic.go:358] "Generic (PLEG): container finished" podID="6e1ef928-0940-490e-89a9-e75af398fadb" containerID="6bb0a85353102ad1b5eb6d784cc07ab39a24be7704e672a35a6a2a5b86bb0095" exitCode=0 Apr 22 19:58:30.500652 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:30.500562 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m69k4" event={"ID":"6e1ef928-0940-490e-89a9-e75af398fadb","Type":"ContainerDied","Data":"6bb0a85353102ad1b5eb6d784cc07ab39a24be7704e672a35a6a2a5b86bb0095"} Apr 22 19:58:30.507889 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:30.507869 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-4klwc"] Apr 22 19:58:30.538368 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:30.538339 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-4klwc"] Apr 22 19:58:30.538486 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:30.538470 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-4klwc" Apr 22 19:58:30.541367 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:30.541144 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-ntclk\"" Apr 22 19:58:30.541566 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:30.541546 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 22 19:58:30.541659 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:30.541570 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 22 19:58:30.541659 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:30.541643 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 22 19:58:30.541995 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:30.541843 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 22 19:58:30.643227 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:30.643205 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c1ce87b0-f8c8-41be-90cd-62d4e6d1a878-signing-cabundle\") pod \"service-ca-865cb79987-4klwc\" (UID: \"c1ce87b0-f8c8-41be-90cd-62d4e6d1a878\") " pod="openshift-service-ca/service-ca-865cb79987-4klwc" Apr 22 19:58:30.643307 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:30.643284 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpkrm\" (UniqueName: \"kubernetes.io/projected/c1ce87b0-f8c8-41be-90cd-62d4e6d1a878-kube-api-access-jpkrm\") pod \"service-ca-865cb79987-4klwc\" (UID: \"c1ce87b0-f8c8-41be-90cd-62d4e6d1a878\") " pod="openshift-service-ca/service-ca-865cb79987-4klwc" Apr 22 19:58:30.643366 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:30.643352 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c1ce87b0-f8c8-41be-90cd-62d4e6d1a878-signing-key\") pod \"service-ca-865cb79987-4klwc\" (UID: \"c1ce87b0-f8c8-41be-90cd-62d4e6d1a878\") " pod="openshift-service-ca/service-ca-865cb79987-4klwc" Apr 22 19:58:30.743741 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:30.743720 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c1ce87b0-f8c8-41be-90cd-62d4e6d1a878-signing-cabundle\") pod \"service-ca-865cb79987-4klwc\" (UID: \"c1ce87b0-f8c8-41be-90cd-62d4e6d1a878\") " pod="openshift-service-ca/service-ca-865cb79987-4klwc" Apr 22 19:58:30.743839 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:30.743756 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jpkrm\" (UniqueName: \"kubernetes.io/projected/c1ce87b0-f8c8-41be-90cd-62d4e6d1a878-kube-api-access-jpkrm\") pod \"service-ca-865cb79987-4klwc\" (UID: \"c1ce87b0-f8c8-41be-90cd-62d4e6d1a878\") " pod="openshift-service-ca/service-ca-865cb79987-4klwc" Apr 22 19:58:30.743839 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:30.743804 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/c1ce87b0-f8c8-41be-90cd-62d4e6d1a878-signing-key\") pod \"service-ca-865cb79987-4klwc\" (UID: \"c1ce87b0-f8c8-41be-90cd-62d4e6d1a878\") " pod="openshift-service-ca/service-ca-865cb79987-4klwc" Apr 22 19:58:30.744299 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:30.744281 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c1ce87b0-f8c8-41be-90cd-62d4e6d1a878-signing-cabundle\") pod \"service-ca-865cb79987-4klwc\" (UID: \"c1ce87b0-f8c8-41be-90cd-62d4e6d1a878\") " pod="openshift-service-ca/service-ca-865cb79987-4klwc" Apr 22 19:58:30.745945 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:30.745926 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c1ce87b0-f8c8-41be-90cd-62d4e6d1a878-signing-key\") pod \"service-ca-865cb79987-4klwc\" (UID: \"c1ce87b0-f8c8-41be-90cd-62d4e6d1a878\") " pod="openshift-service-ca/service-ca-865cb79987-4klwc" Apr 22 19:58:30.751421 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:30.751403 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpkrm\" (UniqueName: \"kubernetes.io/projected/c1ce87b0-f8c8-41be-90cd-62d4e6d1a878-kube-api-access-jpkrm\") pod \"service-ca-865cb79987-4klwc\" (UID: \"c1ce87b0-f8c8-41be-90cd-62d4e6d1a878\") " pod="openshift-service-ca/service-ca-865cb79987-4klwc" Apr 22 19:58:30.849942 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:30.849886 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-4klwc" Apr 22 19:58:31.019342 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:31.019308 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-4klwc"] Apr 22 19:58:31.032606 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:58:31.032392 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1ce87b0_f8c8_41be_90cd_62d4e6d1a878.slice/crio-b4690de0363ff13305fb9112955ef0690578f434b645fe8c0e62e5c68b223c7c WatchSource:0}: Error finding container b4690de0363ff13305fb9112955ef0690578f434b645fe8c0e62e5c68b223c7c: Status 404 returned error can't find the container with id b4690de0363ff13305fb9112955ef0690578f434b645fe8c0e62e5c68b223c7c Apr 22 19:58:31.046771 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:31.046744 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d02dd904-b70b-4d97-9033-5614a158edbf-original-pull-secret\") pod \"global-pull-secret-syncer-d46mn\" (UID: \"d02dd904-b70b-4d97-9033-5614a158edbf\") " pod="kube-system/global-pull-secret-syncer-d46mn" Apr 22 19:58:31.049861 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:31.049842 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d02dd904-b70b-4d97-9033-5614a158edbf-original-pull-secret\") pod \"global-pull-secret-syncer-d46mn\" (UID: \"d02dd904-b70b-4d97-9033-5614a158edbf\") " pod="kube-system/global-pull-secret-syncer-d46mn" Apr 22 19:58:31.210017 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:31.209983 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-d46mn" Apr 22 19:58:31.325832 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:31.325807 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-d46mn"] Apr 22 19:58:31.331560 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:58:31.331527 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd02dd904_b70b_4d97_9033_5614a158edbf.slice/crio-3b9de199967023a03ad87e663d4dee15029a5461ac10cf785f5c6fabca094188 WatchSource:0}: Error finding container 3b9de199967023a03ad87e663d4dee15029a5461ac10cf785f5c6fabca094188: Status 404 returned error can't find the container with id 3b9de199967023a03ad87e663d4dee15029a5461ac10cf785f5c6fabca094188 Apr 22 19:58:31.503166 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:31.503096 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-d46mn" event={"ID":"d02dd904-b70b-4d97-9033-5614a158edbf","Type":"ContainerStarted","Data":"3b9de199967023a03ad87e663d4dee15029a5461ac10cf785f5c6fabca094188"} Apr 22 19:58:31.505682 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:31.505656 2575 generic.go:358] "Generic (PLEG): container finished" podID="6e1ef928-0940-490e-89a9-e75af398fadb" containerID="cc3f927aca7607dd12b7d6c1bcfdbb88fb38bc214fce351a7a1c0bb8288fc44c" exitCode=0 Apr 22 19:58:31.505777 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:31.505722 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m69k4" event={"ID":"6e1ef928-0940-490e-89a9-e75af398fadb","Type":"ContainerDied","Data":"cc3f927aca7607dd12b7d6c1bcfdbb88fb38bc214fce351a7a1c0bb8288fc44c"} Apr 22 19:58:31.506743 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:31.506626 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-4klwc" event={"ID":"c1ce87b0-f8c8-41be-90cd-62d4e6d1a878","Type":"ContainerStarted","Data":"b4690de0363ff13305fb9112955ef0690578f434b645fe8c0e62e5c68b223c7c"} Apr 22 19:58:32.458651 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:32.458262 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85f994f1-e18a-4886-bc92-4b762096129a-cert\") pod \"ingress-canary-vjxnz\" (UID: \"85f994f1-e18a-4886-bc92-4b762096129a\") " pod="openshift-ingress-canary/ingress-canary-vjxnz" Apr 22 19:58:32.458651 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:32.458320 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7cb084c1-c4af-443e-afce-025ccb08ba3f-metrics-tls\") pod \"dns-default-mdzds\" (UID: \"7cb084c1-c4af-443e-afce-025ccb08ba3f\") " pod="openshift-dns/dns-default-mdzds" Apr 22 19:58:32.458651 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:32.458410 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/48a20871-3902-4d97-9b7d-6c533dbb4d37-registry-tls\") pod \"image-registry-7b97ccfb5-7tjp6\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" Apr 22 19:58:32.458651 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:32.458496 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:58:32.458651 
Apr 22 19:58:32.458651 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:32.458536 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 19:58:32.458651 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:32.458549 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7b97ccfb5-7tjp6: secret "image-registry-tls" not found
Apr 22 19:58:32.458651 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:32.458574 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85f994f1-e18a-4886-bc92-4b762096129a-cert podName:85f994f1-e18a-4886-bc92-4b762096129a nodeName:}" failed. No retries permitted until 2026-04-22 19:58:36.458550918 +0000 UTC m=+40.710050808 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85f994f1-e18a-4886-bc92-4b762096129a-cert") pod "ingress-canary-vjxnz" (UID: "85f994f1-e18a-4886-bc92-4b762096129a") : secret "canary-serving-cert" not found
Apr 22 19:58:32.458651 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:32.458496 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:58:32.458651 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:32.458594 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/48a20871-3902-4d97-9b7d-6c533dbb4d37-registry-tls podName:48a20871-3902-4d97-9b7d-6c533dbb4d37 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:36.458584161 +0000 UTC m=+40.710084033 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/48a20871-3902-4d97-9b7d-6c533dbb4d37-registry-tls") pod "image-registry-7b97ccfb5-7tjp6" (UID: "48a20871-3902-4d97-9b7d-6c533dbb4d37") : secret "image-registry-tls" not found
Apr 22 19:58:32.458651 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:32.458610 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cb084c1-c4af-443e-afce-025ccb08ba3f-metrics-tls podName:7cb084c1-c4af-443e-afce-025ccb08ba3f nodeName:}" failed. No retries permitted until 2026-04-22 19:58:36.458602729 +0000 UTC m=+40.710102599 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7cb084c1-c4af-443e-afce-025ccb08ba3f-metrics-tls") pod "dns-default-mdzds" (UID: "7cb084c1-c4af-443e-afce-025ccb08ba3f") : secret "dns-default-metrics-tls" not found
Apr 22 19:58:32.512234 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:32.512199 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m69k4" event={"ID":"6e1ef928-0940-490e-89a9-e75af398fadb","Type":"ContainerStarted","Data":"649dcb8145685b4ee8c3730145e11eee64d0770953f929f7becf446d38ce50d1"}
Apr 22 19:58:32.539024 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:32.538975 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-m69k4" podStartSLOduration=6.053583661 podStartE2EDuration="36.53895885s" podCreationTimestamp="2026-04-22 19:57:56 +0000 UTC" firstStartedPulling="2026-04-22 19:57:58.816941088 +0000 UTC m=+3.068440959" lastFinishedPulling="2026-04-22 19:58:29.302316277 +0000 UTC m=+33.553816148" observedRunningTime="2026-04-22 19:58:32.537803954 +0000 UTC m=+36.789303844" watchObservedRunningTime="2026-04-22 19:58:32.53895885 +0000 UTC m=+36.790458740"
Apr 22 19:58:33.516294 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:33.516215 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-4klwc" event={"ID":"c1ce87b0-f8c8-41be-90cd-62d4e6d1a878","Type":"ContainerStarted","Data":"31054a5f0f899b6cf86b25330f2421b66e1807e3b862c83dcaa5d4eff7e38ae7"}
Apr 22 19:58:33.530689 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:33.530646 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-4klwc" podStartSLOduration=1.344553162 podStartE2EDuration="3.530633775s" podCreationTimestamp="2026-04-22 19:58:30 +0000 UTC" firstStartedPulling="2026-04-22 19:58:31.03432797 +0000 UTC m=+35.285827837" lastFinishedPulling="2026-04-22 19:58:33.220408569 +0000 UTC m=+37.471908450" observedRunningTime="2026-04-22 19:58:33.529387266 +0000 UTC m=+37.780887156" watchObservedRunningTime="2026-04-22 19:58:33.530633775 +0000 UTC m=+37.782133664"
Apr 22 19:58:36.492902 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:36.492870 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/48a20871-3902-4d97-9b7d-6c533dbb4d37-registry-tls\") pod \"image-registry-7b97ccfb5-7tjp6\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6"
Apr 22 19:58:36.493295 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:36.492917 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85f994f1-e18a-4886-bc92-4b762096129a-cert\") pod \"ingress-canary-vjxnz\" (UID: \"85f994f1-e18a-4886-bc92-4b762096129a\") " pod="openshift-ingress-canary/ingress-canary-vjxnz"
Apr 22 19:58:36.493295 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:36.492940 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7cb084c1-c4af-443e-afce-025ccb08ba3f-metrics-tls\") pod \"dns-default-mdzds\" (UID: \"7cb084c1-c4af-443e-afce-025ccb08ba3f\") " pod="openshift-dns/dns-default-mdzds"
Apr 22 19:58:36.493295 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:36.493008 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 19:58:36.493295 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:36.493022 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:58:36.493295 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:36.493046 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7b97ccfb5-7tjp6: secret "image-registry-tls" not found
Apr 22 19:58:36.493295 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:36.493074 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:58:36.493295 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:36.493092 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7cb084c1-c4af-443e-afce-025ccb08ba3f-metrics-tls podName:7cb084c1-c4af-443e-afce-025ccb08ba3f nodeName:}" failed. No retries permitted until 2026-04-22 19:58:44.493078577 +0000 UTC m=+48.744578457 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7cb084c1-c4af-443e-afce-025ccb08ba3f-metrics-tls") pod "dns-default-mdzds" (UID: "7cb084c1-c4af-443e-afce-025ccb08ba3f") : secret "dns-default-metrics-tls" not found
Apr 22 19:58:36.493295 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:36.493105 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/48a20871-3902-4d97-9b7d-6c533dbb4d37-registry-tls podName:48a20871-3902-4d97-9b7d-6c533dbb4d37 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:44.493099311 +0000 UTC m=+48.744599179 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/48a20871-3902-4d97-9b7d-6c533dbb4d37-registry-tls") pod "image-registry-7b97ccfb5-7tjp6" (UID: "48a20871-3902-4d97-9b7d-6c533dbb4d37") : secret "image-registry-tls" not found
Apr 22 19:58:36.493295 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:58:36.493114 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85f994f1-e18a-4886-bc92-4b762096129a-cert podName:85f994f1-e18a-4886-bc92-4b762096129a nodeName:}" failed. No retries permitted until 2026-04-22 19:58:44.493109374 +0000 UTC m=+48.744609240 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85f994f1-e18a-4886-bc92-4b762096129a-cert") pod "ingress-canary-vjxnz" (UID: "85f994f1-e18a-4886-bc92-4b762096129a") : secret "canary-serving-cert" not found
Apr 22 19:58:36.525740 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:36.525708 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-d46mn" event={"ID":"d02dd904-b70b-4d97-9033-5614a158edbf","Type":"ContainerStarted","Data":"ba7850e96a247db16907bdc5256aaaadd4612df45d857df1c940df084d071971"}
Apr 22 19:58:36.540043 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:36.540002 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-d46mn" podStartSLOduration=33.345164683 podStartE2EDuration="37.539991471s" podCreationTimestamp="2026-04-22 19:57:59 +0000 UTC" firstStartedPulling="2026-04-22 19:58:31.333244353 +0000 UTC m=+35.584744221" lastFinishedPulling="2026-04-22 19:58:35.528071136 +0000 UTC m=+39.779571009" observedRunningTime="2026-04-22 19:58:36.539839735 +0000 UTC m=+40.791339624" watchObservedRunningTime="2026-04-22 19:58:36.539991471 +0000 UTC m=+40.791491360"
Apr 22 19:58:44.549557 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:44.549529 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7cb084c1-c4af-443e-afce-025ccb08ba3f-metrics-tls\") pod \"dns-default-mdzds\" (UID: \"7cb084c1-c4af-443e-afce-025ccb08ba3f\") " pod="openshift-dns/dns-default-mdzds"
Apr 22 19:58:44.549937 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:44.549584 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/48a20871-3902-4d97-9b7d-6c533dbb4d37-registry-tls\") pod \"image-registry-7b97ccfb5-7tjp6\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6"
Apr 22 19:58:44.549937 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:44.549725 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85f994f1-e18a-4886-bc92-4b762096129a-cert\") pod \"ingress-canary-vjxnz\" (UID: \"85f994f1-e18a-4886-bc92-4b762096129a\") " pod="openshift-ingress-canary/ingress-canary-vjxnz"
Apr 22 19:58:44.552289 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:44.552264 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/48a20871-3902-4d97-9b7d-6c533dbb4d37-registry-tls\") pod \"image-registry-7b97ccfb5-7tjp6\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6"
Apr 22 19:58:44.552417 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:44.552397 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85f994f1-e18a-4886-bc92-4b762096129a-cert\") pod \"ingress-canary-vjxnz\" (UID: \"85f994f1-e18a-4886-bc92-4b762096129a\") " pod="openshift-ingress-canary/ingress-canary-vjxnz"
Apr 22 19:58:44.561514 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:44.561485 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7cb084c1-c4af-443e-afce-025ccb08ba3f-metrics-tls\") pod \"dns-default-mdzds\" (UID: \"7cb084c1-c4af-443e-afce-025ccb08ba3f\") " pod="openshift-dns/dns-default-mdzds"
Apr 22 19:58:44.592464 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:44.592439 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6"
Apr 22 19:58:44.612772 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:44.612564 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vjxnz"
Apr 22 19:58:44.623528 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:44.623056 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mdzds"
Apr 22 19:58:44.750802 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:44.750753 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7b97ccfb5-7tjp6"]
Apr 22 19:58:44.764857 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:44.764824 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mdzds"]
Apr 22 19:58:44.768551 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:58:44.768522 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cb084c1_c4af_443e_afce_025ccb08ba3f.slice/crio-15c51ce507a8790fdf5499ecfaea574394014c1349cfc31cd011aefa54517cec WatchSource:0}: Error finding container 15c51ce507a8790fdf5499ecfaea574394014c1349cfc31cd011aefa54517cec: Status 404 returned error can't find the container with id 15c51ce507a8790fdf5499ecfaea574394014c1349cfc31cd011aefa54517cec
Apr 22 19:58:44.783983 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:44.783963 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vjxnz"]
Apr 22 19:58:44.798076 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:58:44.798052 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85f994f1_e18a_4886_bc92_4b762096129a.slice/crio-11fe8df78bab63a544a4af2f84aa66a2b63fa5c991da3fde931b085a67600d8a WatchSource:0}: Error finding container 11fe8df78bab63a544a4af2f84aa66a2b63fa5c991da3fde931b085a67600d8a: Status 404 returned error can't find the container with id 11fe8df78bab63a544a4af2f84aa66a2b63fa5c991da3fde931b085a67600d8a
Apr 22 19:58:45.544653 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:45.544615 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mdzds" event={"ID":"7cb084c1-c4af-443e-afce-025ccb08ba3f","Type":"ContainerStarted","Data":"15c51ce507a8790fdf5499ecfaea574394014c1349cfc31cd011aefa54517cec"}
Apr 22 19:58:45.546934 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:45.546572 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" event={"ID":"48a20871-3902-4d97-9b7d-6c533dbb4d37","Type":"ContainerStarted","Data":"007facee288895f1b5529293d7abca821f948e3d55ef661d56df2edc812d9da0"}
Apr 22 19:58:45.546934 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:45.546608 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" event={"ID":"48a20871-3902-4d97-9b7d-6c533dbb4d37","Type":"ContainerStarted","Data":"de432f97b5f39c2e253d4efbbdefa8bed322b5cb056135a8d307793015ed3234"}
Apr 22 19:58:45.546934 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:45.546681 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6"
Apr 22 19:58:45.548019 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:45.547983 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vjxnz" event={"ID":"85f994f1-e18a-4886-bc92-4b762096129a","Type":"ContainerStarted","Data":"11fe8df78bab63a544a4af2f84aa66a2b63fa5c991da3fde931b085a67600d8a"}
Apr 22 19:58:45.566667 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:45.566624 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" podStartSLOduration=47.566613559 podStartE2EDuration="47.566613559s" podCreationTimestamp="2026-04-22 19:57:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:58:45.565100521 +0000 UTC m=+49.816600412" watchObservedRunningTime="2026-04-22 19:58:45.566613559 +0000 UTC m=+49.818113447"
Apr 22 19:58:47.554902 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:47.554868 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mdzds" event={"ID":"7cb084c1-c4af-443e-afce-025ccb08ba3f","Type":"ContainerStarted","Data":"944a2f33b70d6964599d1c86c7d0083a7a1eb5dfde267ff0bd156fb8aac6fbf7"}
Apr 22 19:58:47.554902 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:47.554901 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mdzds" event={"ID":"7cb084c1-c4af-443e-afce-025ccb08ba3f","Type":"ContainerStarted","Data":"4fb6e326ba9c9f57eec81f3d81b90e6182044f6342ae4d4d5bb9566cd81d425d"}
Apr 22 19:58:47.555409 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:47.554962 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-mdzds"
Apr 22 19:58:47.556073 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:47.556049 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vjxnz" event={"ID":"85f994f1-e18a-4886-bc92-4b762096129a","Type":"ContainerStarted","Data":"8acbac23dee41f76a52431ed7f74c0cc654872e4de2fae62b7ef08afb223e51e"}
Apr 22 19:58:47.572242 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:47.572202 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-mdzds" podStartSLOduration=17.344148981 podStartE2EDuration="19.572192542s" podCreationTimestamp="2026-04-22 19:58:28 +0000 UTC" firstStartedPulling="2026-04-22 19:58:44.771625919 +0000 UTC m=+49.023125791" lastFinishedPulling="2026-04-22 19:58:46.999669471 +0000 UTC m=+51.251169352" observedRunningTime="2026-04-22 19:58:47.571369595 +0000 UTC m=+51.822869482" watchObservedRunningTime="2026-04-22 19:58:47.572192542 +0000 UTC m=+51.823692425"
Apr 22 19:58:50.899906 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:50.899850 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-vjxnz" podStartSLOduration=20.697091455 podStartE2EDuration="22.899835544s" podCreationTimestamp="2026-04-22 19:58:28 +0000 UTC" firstStartedPulling="2026-04-22 19:58:44.799750237 +0000 UTC m=+49.051250103" lastFinishedPulling="2026-04-22 19:58:47.00249431 +0000 UTC m=+51.253994192" observedRunningTime="2026-04-22 19:58:47.585721315 +0000 UTC m=+51.837221206" watchObservedRunningTime="2026-04-22 19:58:50.899835544 +0000 UTC m=+55.151335432"
Apr 22 19:58:50.900281 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:50.900247 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7b97ccfb5-7tjp6"]
Apr 22 19:58:50.912989 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:50.912966 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-974789bfc-zsbbj"]
Apr 22 19:58:50.915832 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:50.915817 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-974789bfc-zsbbj"
Apr 22 19:58:50.917434 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:50.917401 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-8565f89794-vk6gz"]
Apr 22 19:58:50.918710 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:50.918689 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 22 19:58:50.918816 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:50.918690 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 22 19:58:50.918862 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:50.918824 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 22 19:58:50.919021 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:50.919006 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-glrp4\""
Apr 22 19:58:50.919125 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:50.919108 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 22 19:58:50.919797 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:50.919779 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64f95dbfc4-6xscx"]
Apr 22 19:58:50.919914 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:50.919898 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8565f89794-vk6gz"
Apr 22 19:58:50.921764 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:50.921746 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 22 19:58:50.922422 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:50.922402 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64f95dbfc4-6xscx"
Apr 22 19:58:50.924363 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:50.924344 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 22 19:58:50.924459 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:50.924435 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 22 19:58:50.924523 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:50.924459 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 22 19:58:50.924523 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:50.924481 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 22 19:58:50.929915 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:50.929893 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-974789bfc-zsbbj"]
Apr 22 19:58:50.930615 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:50.930595 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-8565f89794-vk6gz"]
Apr 22 19:58:50.932777 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:50.932756 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64f95dbfc4-6xscx"]
Apr 22 19:58:50.996898 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:50.996877 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7bqd\" (UniqueName: \"kubernetes.io/projected/5031df66-81a1-4741-89a0-7881243ec7fd-kube-api-access-n7bqd\") pod \"cluster-proxy-proxy-agent-64f95dbfc4-6xscx\" (UID: \"5031df66-81a1-4741-89a0-7881243ec7fd\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64f95dbfc4-6xscx"
Apr 22 19:58:50.997118 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:50.996907 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7a14051d-411f-447f-bd34-1abaf46edaa2-tmp\") pod \"klusterlet-addon-workmgr-8565f89794-vk6gz\" (UID: \"7a14051d-411f-447f-bd34-1abaf46edaa2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8565f89794-vk6gz"
Apr 22 19:58:50.997118 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:50.996925 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/7a14051d-411f-447f-bd34-1abaf46edaa2-klusterlet-config\") pod \"klusterlet-addon-workmgr-8565f89794-vk6gz\" (UID: \"7a14051d-411f-447f-bd34-1abaf46edaa2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8565f89794-vk6gz"
Apr 22 19:58:50.997118 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:50.996943 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/5031df66-81a1-4741-89a0-7881243ec7fd-ca\") pod \"cluster-proxy-proxy-agent-64f95dbfc4-6xscx\" (UID: \"5031df66-81a1-4741-89a0-7881243ec7fd\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64f95dbfc4-6xscx"
Apr 22 19:58:50.997118 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:50.997073 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c749q\" (UniqueName: \"kubernetes.io/projected/7a14051d-411f-447f-bd34-1abaf46edaa2-kube-api-access-c749q\") pod \"klusterlet-addon-workmgr-8565f89794-vk6gz\" (UID: \"7a14051d-411f-447f-bd34-1abaf46edaa2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8565f89794-vk6gz"
Apr 22 19:58:50.997334 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:50.997116 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/5031df66-81a1-4741-89a0-7881243ec7fd-hub\") pod \"cluster-proxy-proxy-agent-64f95dbfc4-6xscx\" (UID: \"5031df66-81a1-4741-89a0-7881243ec7fd\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64f95dbfc4-6xscx"
Apr 22 19:58:50.997334 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:50.997142 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/5031df66-81a1-4741-89a0-7881243ec7fd-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-64f95dbfc4-6xscx\" (UID: \"5031df66-81a1-4741-89a0-7881243ec7fd\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64f95dbfc4-6xscx"
Apr 22 19:58:50.997334 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:50.997179 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5031df66-81a1-4741-89a0-7881243ec7fd-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-64f95dbfc4-6xscx\" (UID: \"5031df66-81a1-4741-89a0-7881243ec7fd\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64f95dbfc4-6xscx"
Apr 22 19:58:50.997334 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:50.997209 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/7865aab6-2a06-423e-ba38-7d4717be80a1-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-974789bfc-zsbbj\" (UID: \"7865aab6-2a06-423e-ba38-7d4717be80a1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-974789bfc-zsbbj"
Apr 22 19:58:50.997334 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:50.997269 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9qpz\" (UniqueName: \"kubernetes.io/projected/7865aab6-2a06-423e-ba38-7d4717be80a1-kube-api-access-z9qpz\") pod \"managed-serviceaccount-addon-agent-974789bfc-zsbbj\" (UID: \"7865aab6-2a06-423e-ba38-7d4717be80a1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-974789bfc-zsbbj"
Apr 22 19:58:50.997334 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:50.997294 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/5031df66-81a1-4741-89a0-7881243ec7fd-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-64f95dbfc4-6xscx\" (UID: \"5031df66-81a1-4741-89a0-7881243ec7fd\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64f95dbfc4-6xscx"
Apr 22
19:58:51.017558 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.017536 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7c8cf6cfd5-wrpnx"] Apr 22 19:58:51.022416 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.022399 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7c8cf6cfd5-wrpnx" Apr 22 19:58:51.041535 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.041485 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7c8cf6cfd5-wrpnx"] Apr 22 19:58:51.050459 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.050440 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-jldrx"] Apr 22 19:58:51.053974 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.053959 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-jldrx" Apr 22 19:58:51.060132 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.060113 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 19:58:51.060225 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.060145 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 19:58:51.060376 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.060364 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 19:58:51.060919 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.060903 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-h76wn\"" Apr 22 19:58:51.061011 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.060941 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 19:58:51.070949 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.070932 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-jldrx"] Apr 22 19:58:51.098224 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.098199 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14040862-cf36-42dc-928c-1c43754396db-trusted-ca\") pod \"image-registry-7c8cf6cfd5-wrpnx\" (UID: \"14040862-cf36-42dc-928c-1c43754396db\") " pod="openshift-image-registry/image-registry-7c8cf6cfd5-wrpnx" Apr 22 19:58:51.098308 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.098232 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5031df66-81a1-4741-89a0-7881243ec7fd-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-64f95dbfc4-6xscx\" (UID: \"5031df66-81a1-4741-89a0-7881243ec7fd\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64f95dbfc4-6xscx" Apr 22 19:58:51.098308 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.098263 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/36d813c0-2bc0-48c5-a696-ae872c3dcca4-crio-socket\") pod \"insights-runtime-extractor-jldrx\" (UID: 
\"36d813c0-2bc0-48c5-a696-ae872c3dcca4\") " pod="openshift-insights/insights-runtime-extractor-jldrx" Apr 22 19:58:51.098387 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.098363 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/7865aab6-2a06-423e-ba38-7d4717be80a1-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-974789bfc-zsbbj\" (UID: \"7865aab6-2a06-423e-ba38-7d4717be80a1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-974789bfc-zsbbj" Apr 22 19:58:51.098422 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.098396 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/36d813c0-2bc0-48c5-a696-ae872c3dcca4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jldrx\" (UID: \"36d813c0-2bc0-48c5-a696-ae872c3dcca4\") " pod="openshift-insights/insights-runtime-extractor-jldrx" Apr 22 19:58:51.098465 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.098425 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z9qpz\" (UniqueName: \"kubernetes.io/projected/7865aab6-2a06-423e-ba38-7d4717be80a1-kube-api-access-z9qpz\") pod \"managed-serviceaccount-addon-agent-974789bfc-zsbbj\" (UID: \"7865aab6-2a06-423e-ba38-7d4717be80a1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-974789bfc-zsbbj" Apr 22 19:58:51.098465 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.098441 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/5031df66-81a1-4741-89a0-7881243ec7fd-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-64f95dbfc4-6xscx\" (UID: \"5031df66-81a1-4741-89a0-7881243ec7fd\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64f95dbfc4-6xscx" Apr 22 19:58:51.098540 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.098469 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14040862-cf36-42dc-928c-1c43754396db-registry-certificates\") pod \"image-registry-7c8cf6cfd5-wrpnx\" (UID: \"14040862-cf36-42dc-928c-1c43754396db\") " pod="openshift-image-registry/image-registry-7c8cf6cfd5-wrpnx" Apr 22 19:58:51.098540 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.098486 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl9k6\" (UniqueName: \"kubernetes.io/projected/14040862-cf36-42dc-928c-1c43754396db-kube-api-access-fl9k6\") pod \"image-registry-7c8cf6cfd5-wrpnx\" (UID: \"14040862-cf36-42dc-928c-1c43754396db\") " pod="openshift-image-registry/image-registry-7c8cf6cfd5-wrpnx" Apr 22 19:58:51.098540 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.098510 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/36d813c0-2bc0-48c5-a696-ae872c3dcca4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jldrx\" (UID: \"36d813c0-2bc0-48c5-a696-ae872c3dcca4\") " pod="openshift-insights/insights-runtime-extractor-jldrx" Apr 22 19:58:51.098679 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.098541 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-n7bqd\" (UniqueName: \"kubernetes.io/projected/5031df66-81a1-4741-89a0-7881243ec7fd-kube-api-access-n7bqd\") pod \"cluster-proxy-proxy-agent-64f95dbfc4-6xscx\" (UID: \"5031df66-81a1-4741-89a0-7881243ec7fd\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64f95dbfc4-6xscx" Apr 22 19:58:51.098679 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.098591 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7a14051d-411f-447f-bd34-1abaf46edaa2-tmp\") pod \"klusterlet-addon-workmgr-8565f89794-vk6gz\" (UID: \"7a14051d-411f-447f-bd34-1abaf46edaa2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8565f89794-vk6gz" Apr 22 19:58:51.098679 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.098622 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/7a14051d-411f-447f-bd34-1abaf46edaa2-klusterlet-config\") pod \"klusterlet-addon-workmgr-8565f89794-vk6gz\" (UID: \"7a14051d-411f-447f-bd34-1abaf46edaa2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8565f89794-vk6gz" Apr 22 19:58:51.098679 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.098653 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/5031df66-81a1-4741-89a0-7881243ec7fd-ca\") pod \"cluster-proxy-proxy-agent-64f95dbfc4-6xscx\" (UID: \"5031df66-81a1-4741-89a0-7881243ec7fd\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64f95dbfc4-6xscx" Apr 22 19:58:51.098931 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.098689 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14040862-cf36-42dc-928c-1c43754396db-installation-pull-secrets\") pod \"image-registry-7c8cf6cfd5-wrpnx\" (UID: \"14040862-cf36-42dc-928c-1c43754396db\") " pod="openshift-image-registry/image-registry-7c8cf6cfd5-wrpnx" Apr 22 19:58:51.098931 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.098721 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv2k5\" (UniqueName: \"kubernetes.io/projected/36d813c0-2bc0-48c5-a696-ae872c3dcca4-kube-api-access-bv2k5\") pod \"insights-runtime-extractor-jldrx\" (UID: \"36d813c0-2bc0-48c5-a696-ae872c3dcca4\") " pod="openshift-insights/insights-runtime-extractor-jldrx" Apr 22 19:58:51.098931 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.098754 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14040862-cf36-42dc-928c-1c43754396db-bound-sa-token\") pod \"image-registry-7c8cf6cfd5-wrpnx\" (UID: \"14040862-cf36-42dc-928c-1c43754396db\") " pod="openshift-image-registry/image-registry-7c8cf6cfd5-wrpnx" Apr 22 19:58:51.098931 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.098799 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/14040862-cf36-42dc-928c-1c43754396db-image-registry-private-configuration\") pod \"image-registry-7c8cf6cfd5-wrpnx\" (UID: \"14040862-cf36-42dc-928c-1c43754396db\") " pod="openshift-image-registry/image-registry-7c8cf6cfd5-wrpnx" Apr 22 19:58:51.098931 ip-10-0-128-239 
kubenswrapper[2575]: I0422 19:58:51.098824 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/36d813c0-2bc0-48c5-a696-ae872c3dcca4-data-volume\") pod \"insights-runtime-extractor-jldrx\" (UID: \"36d813c0-2bc0-48c5-a696-ae872c3dcca4\") " pod="openshift-insights/insights-runtime-extractor-jldrx" Apr 22 19:58:51.098931 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.098893 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c749q\" (UniqueName: \"kubernetes.io/projected/7a14051d-411f-447f-bd34-1abaf46edaa2-kube-api-access-c749q\") pod \"klusterlet-addon-workmgr-8565f89794-vk6gz\" (UID: \"7a14051d-411f-447f-bd34-1abaf46edaa2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8565f89794-vk6gz" Apr 22 19:58:51.098931 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.098924 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14040862-cf36-42dc-928c-1c43754396db-ca-trust-extracted\") pod \"image-registry-7c8cf6cfd5-wrpnx\" (UID: \"14040862-cf36-42dc-928c-1c43754396db\") " pod="openshift-image-registry/image-registry-7c8cf6cfd5-wrpnx" Apr 22 19:58:51.099302 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.098955 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/5031df66-81a1-4741-89a0-7881243ec7fd-hub\") pod \"cluster-proxy-proxy-agent-64f95dbfc4-6xscx\" (UID: \"5031df66-81a1-4741-89a0-7881243ec7fd\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64f95dbfc4-6xscx" Apr 22 19:58:51.099302 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.098980 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/5031df66-81a1-4741-89a0-7881243ec7fd-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-64f95dbfc4-6xscx\" (UID: \"5031df66-81a1-4741-89a0-7881243ec7fd\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64f95dbfc4-6xscx" Apr 22 19:58:51.099302 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.099003 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7a14051d-411f-447f-bd34-1abaf46edaa2-tmp\") pod \"klusterlet-addon-workmgr-8565f89794-vk6gz\" (UID: \"7a14051d-411f-447f-bd34-1abaf46edaa2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8565f89794-vk6gz" Apr 22 19:58:51.099302 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.099012 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14040862-cf36-42dc-928c-1c43754396db-registry-tls\") pod \"image-registry-7c8cf6cfd5-wrpnx\" (UID: \"14040862-cf36-42dc-928c-1c43754396db\") " pod="openshift-image-registry/image-registry-7c8cf6cfd5-wrpnx" Apr 22 19:58:51.099302 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.099296 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/5031df66-81a1-4741-89a0-7881243ec7fd-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-64f95dbfc4-6xscx\" (UID: \"5031df66-81a1-4741-89a0-7881243ec7fd\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64f95dbfc4-6xscx" Apr 22 19:58:51.101253 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.101232 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5031df66-81a1-4741-89a0-7881243ec7fd-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-64f95dbfc4-6xscx\" (UID: \"5031df66-81a1-4741-89a0-7881243ec7fd\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64f95dbfc4-6xscx" Apr 22 19:58:51.101384 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.101365 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/5031df66-81a1-4741-89a0-7881243ec7fd-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-64f95dbfc4-6xscx\" (UID: \"5031df66-81a1-4741-89a0-7881243ec7fd\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64f95dbfc4-6xscx" Apr 22 19:58:51.101517 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.101497 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/7865aab6-2a06-423e-ba38-7d4717be80a1-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-974789bfc-zsbbj\" (UID: \"7865aab6-2a06-423e-ba38-7d4717be80a1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-974789bfc-zsbbj" Apr 22 19:58:51.101579 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.101518 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/7a14051d-411f-447f-bd34-1abaf46edaa2-klusterlet-config\") pod \"klusterlet-addon-workmgr-8565f89794-vk6gz\" (UID: \"7a14051d-411f-447f-bd34-1abaf46edaa2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8565f89794-vk6gz" Apr 22 19:58:51.101579 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.101522 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/5031df66-81a1-4741-89a0-7881243ec7fd-ca\") pod \"cluster-proxy-proxy-agent-64f95dbfc4-6xscx\" (UID: \"5031df66-81a1-4741-89a0-7881243ec7fd\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64f95dbfc4-6xscx" Apr 22 19:58:51.101869 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.101851 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/5031df66-81a1-4741-89a0-7881243ec7fd-hub\") pod \"cluster-proxy-proxy-agent-64f95dbfc4-6xscx\" (UID: \"5031df66-81a1-4741-89a0-7881243ec7fd\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64f95dbfc4-6xscx" Apr 22 19:58:51.114615 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.114591 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9qpz\" (UniqueName: \"kubernetes.io/projected/7865aab6-2a06-423e-ba38-7d4717be80a1-kube-api-access-z9qpz\") pod \"managed-serviceaccount-addon-agent-974789bfc-zsbbj\" (UID: \"7865aab6-2a06-423e-ba38-7d4717be80a1\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-974789bfc-zsbbj" Apr 22 19:58:51.114751 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.114725 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c749q\" (UniqueName: \"kubernetes.io/projected/7a14051d-411f-447f-bd34-1abaf46edaa2-kube-api-access-c749q\") 
pod \"klusterlet-addon-workmgr-8565f89794-vk6gz\" (UID: \"7a14051d-411f-447f-bd34-1abaf46edaa2\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8565f89794-vk6gz" Apr 22 19:58:51.115820 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.115797 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7bqd\" (UniqueName: \"kubernetes.io/projected/5031df66-81a1-4741-89a0-7881243ec7fd-kube-api-access-n7bqd\") pod \"cluster-proxy-proxy-agent-64f95dbfc4-6xscx\" (UID: \"5031df66-81a1-4741-89a0-7881243ec7fd\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64f95dbfc4-6xscx" Apr 22 19:58:51.199525 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.199446 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14040862-cf36-42dc-928c-1c43754396db-ca-trust-extracted\") pod \"image-registry-7c8cf6cfd5-wrpnx\" (UID: \"14040862-cf36-42dc-928c-1c43754396db\") " pod="openshift-image-registry/image-registry-7c8cf6cfd5-wrpnx" Apr 22 19:58:51.199525 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.199509 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14040862-cf36-42dc-928c-1c43754396db-registry-tls\") pod \"image-registry-7c8cf6cfd5-wrpnx\" (UID: \"14040862-cf36-42dc-928c-1c43754396db\") " pod="openshift-image-registry/image-registry-7c8cf6cfd5-wrpnx" Apr 22 19:58:51.199730 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.199544 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14040862-cf36-42dc-928c-1c43754396db-trusted-ca\") pod \"image-registry-7c8cf6cfd5-wrpnx\" (UID: \"14040862-cf36-42dc-928c-1c43754396db\") " pod="openshift-image-registry/image-registry-7c8cf6cfd5-wrpnx" Apr 22 19:58:51.199730 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.199574 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/36d813c0-2bc0-48c5-a696-ae872c3dcca4-crio-socket\") pod \"insights-runtime-extractor-jldrx\" (UID: \"36d813c0-2bc0-48c5-a696-ae872c3dcca4\") " pod="openshift-insights/insights-runtime-extractor-jldrx" Apr 22 19:58:51.199730 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.199609 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/36d813c0-2bc0-48c5-a696-ae872c3dcca4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jldrx\" (UID: \"36d813c0-2bc0-48c5-a696-ae872c3dcca4\") " pod="openshift-insights/insights-runtime-extractor-jldrx" Apr 22 19:58:51.199730 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.199672 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14040862-cf36-42dc-928c-1c43754396db-registry-certificates\") pod \"image-registry-7c8cf6cfd5-wrpnx\" (UID: \"14040862-cf36-42dc-928c-1c43754396db\") " pod="openshift-image-registry/image-registry-7c8cf6cfd5-wrpnx" Apr 22 19:58:51.199730 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.199699 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fl9k6\" (UniqueName: \"kubernetes.io/projected/14040862-cf36-42dc-928c-1c43754396db-kube-api-access-fl9k6\") pod 
\"image-registry-7c8cf6cfd5-wrpnx\" (UID: \"14040862-cf36-42dc-928c-1c43754396db\") " pod="openshift-image-registry/image-registry-7c8cf6cfd5-wrpnx" Apr 22 19:58:51.199730 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.199723 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/36d813c0-2bc0-48c5-a696-ae872c3dcca4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jldrx\" (UID: \"36d813c0-2bc0-48c5-a696-ae872c3dcca4\") " pod="openshift-insights/insights-runtime-extractor-jldrx" Apr 22 19:58:51.200017 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.199733 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/36d813c0-2bc0-48c5-a696-ae872c3dcca4-crio-socket\") pod \"insights-runtime-extractor-jldrx\" (UID: \"36d813c0-2bc0-48c5-a696-ae872c3dcca4\") " pod="openshift-insights/insights-runtime-extractor-jldrx" Apr 22 19:58:51.200017 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.199792 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14040862-cf36-42dc-928c-1c43754396db-installation-pull-secrets\") pod \"image-registry-7c8cf6cfd5-wrpnx\" (UID: \"14040862-cf36-42dc-928c-1c43754396db\") " pod="openshift-image-registry/image-registry-7c8cf6cfd5-wrpnx" Apr 22 19:58:51.200017 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.199830 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bv2k5\" (UniqueName: \"kubernetes.io/projected/36d813c0-2bc0-48c5-a696-ae872c3dcca4-kube-api-access-bv2k5\") pod \"insights-runtime-extractor-jldrx\" (UID: \"36d813c0-2bc0-48c5-a696-ae872c3dcca4\") " pod="openshift-insights/insights-runtime-extractor-jldrx" Apr 22 19:58:51.200017 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.199861 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14040862-cf36-42dc-928c-1c43754396db-bound-sa-token\") pod \"image-registry-7c8cf6cfd5-wrpnx\" (UID: \"14040862-cf36-42dc-928c-1c43754396db\") " pod="openshift-image-registry/image-registry-7c8cf6cfd5-wrpnx" Apr 22 19:58:51.200017 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.199885 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/14040862-cf36-42dc-928c-1c43754396db-ca-trust-extracted\") pod \"image-registry-7c8cf6cfd5-wrpnx\" (UID: \"14040862-cf36-42dc-928c-1c43754396db\") " pod="openshift-image-registry/image-registry-7c8cf6cfd5-wrpnx" Apr 22 19:58:51.200017 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.199907 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/14040862-cf36-42dc-928c-1c43754396db-image-registry-private-configuration\") pod \"image-registry-7c8cf6cfd5-wrpnx\" (UID: \"14040862-cf36-42dc-928c-1c43754396db\") " pod="openshift-image-registry/image-registry-7c8cf6cfd5-wrpnx" Apr 22 19:58:51.200017 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.199934 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/36d813c0-2bc0-48c5-a696-ae872c3dcca4-data-volume\") pod \"insights-runtime-extractor-jldrx\" (UID: 
\"36d813c0-2bc0-48c5-a696-ae872c3dcca4\") " pod="openshift-insights/insights-runtime-extractor-jldrx" Apr 22 19:58:51.200345 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.200266 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/36d813c0-2bc0-48c5-a696-ae872c3dcca4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jldrx\" (UID: \"36d813c0-2bc0-48c5-a696-ae872c3dcca4\") " pod="openshift-insights/insights-runtime-extractor-jldrx" Apr 22 19:58:51.200400 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.200358 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/36d813c0-2bc0-48c5-a696-ae872c3dcca4-data-volume\") pod \"insights-runtime-extractor-jldrx\" (UID: \"36d813c0-2bc0-48c5-a696-ae872c3dcca4\") " pod="openshift-insights/insights-runtime-extractor-jldrx" Apr 22 19:58:51.200565 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.200512 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/14040862-cf36-42dc-928c-1c43754396db-trusted-ca\") pod \"image-registry-7c8cf6cfd5-wrpnx\" (UID: \"14040862-cf36-42dc-928c-1c43754396db\") " pod="openshift-image-registry/image-registry-7c8cf6cfd5-wrpnx" Apr 22 19:58:51.200694 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.200582 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/14040862-cf36-42dc-928c-1c43754396db-registry-certificates\") pod \"image-registry-7c8cf6cfd5-wrpnx\" (UID: \"14040862-cf36-42dc-928c-1c43754396db\") " pod="openshift-image-registry/image-registry-7c8cf6cfd5-wrpnx" Apr 22 19:58:51.202443 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.202417 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/36d813c0-2bc0-48c5-a696-ae872c3dcca4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jldrx\" (UID: \"36d813c0-2bc0-48c5-a696-ae872c3dcca4\") " pod="openshift-insights/insights-runtime-extractor-jldrx" Apr 22 19:58:51.202602 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.202584 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/14040862-cf36-42dc-928c-1c43754396db-image-registry-private-configuration\") pod \"image-registry-7c8cf6cfd5-wrpnx\" (UID: \"14040862-cf36-42dc-928c-1c43754396db\") " pod="openshift-image-registry/image-registry-7c8cf6cfd5-wrpnx" Apr 22 19:58:51.202806 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.202785 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/14040862-cf36-42dc-928c-1c43754396db-installation-pull-secrets\") pod \"image-registry-7c8cf6cfd5-wrpnx\" (UID: \"14040862-cf36-42dc-928c-1c43754396db\") " pod="openshift-image-registry/image-registry-7c8cf6cfd5-wrpnx" Apr 22 19:58:51.202905 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.202888 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/14040862-cf36-42dc-928c-1c43754396db-registry-tls\") pod \"image-registry-7c8cf6cfd5-wrpnx\" (UID: \"14040862-cf36-42dc-928c-1c43754396db\") " pod="openshift-image-registry/image-registry-7c8cf6cfd5-wrpnx" Apr 
22 19:58:51.217129 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.217104 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl9k6\" (UniqueName: \"kubernetes.io/projected/14040862-cf36-42dc-928c-1c43754396db-kube-api-access-fl9k6\") pod \"image-registry-7c8cf6cfd5-wrpnx\" (UID: \"14040862-cf36-42dc-928c-1c43754396db\") " pod="openshift-image-registry/image-registry-7c8cf6cfd5-wrpnx" Apr 22 19:58:51.217207 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.217136 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/14040862-cf36-42dc-928c-1c43754396db-bound-sa-token\") pod \"image-registry-7c8cf6cfd5-wrpnx\" (UID: \"14040862-cf36-42dc-928c-1c43754396db\") " pod="openshift-image-registry/image-registry-7c8cf6cfd5-wrpnx" Apr 22 19:58:51.217207 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.217191 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv2k5\" (UniqueName: \"kubernetes.io/projected/36d813c0-2bc0-48c5-a696-ae872c3dcca4-kube-api-access-bv2k5\") pod \"insights-runtime-extractor-jldrx\" (UID: \"36d813c0-2bc0-48c5-a696-ae872c3dcca4\") " pod="openshift-insights/insights-runtime-extractor-jldrx" Apr 22 19:58:51.235297 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.235274 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-974789bfc-zsbbj" Apr 22 19:58:51.241975 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.241959 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8565f89794-vk6gz" Apr 22 19:58:51.247007 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.246989 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64f95dbfc4-6xscx" Apr 22 19:58:51.348847 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.348732 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7c8cf6cfd5-wrpnx" Apr 22 19:58:51.362007 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.361984 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-jldrx" Apr 22 19:58:51.372178 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.372158 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-974789bfc-zsbbj"] Apr 22 19:58:51.385737 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:58:51.385711 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7865aab6_2a06_423e_ba38_7d4717be80a1.slice/crio-eae0d5558ceb90db6717d3fb97492b9df2da3b9b8cc8006efbae26fc2df3ad91 WatchSource:0}: Error finding container eae0d5558ceb90db6717d3fb97492b9df2da3b9b8cc8006efbae26fc2df3ad91: Status 404 returned error can't find the container with id eae0d5558ceb90db6717d3fb97492b9df2da3b9b8cc8006efbae26fc2df3ad91 Apr 22 19:58:51.485511 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.485479 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-jldrx"] Apr 22 19:58:51.488578 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:58:51.488552 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36d813c0_2bc0_48c5_a696_ae872c3dcca4.slice/crio-0d1004387c4d66e1e3c9aa429f0064686e5a60b521f76269b4fc44b6d88187a4 WatchSource:0}: Error finding container 0d1004387c4d66e1e3c9aa429f0064686e5a60b521f76269b4fc44b6d88187a4: Status 404 returned error can't find the container with id 0d1004387c4d66e1e3c9aa429f0064686e5a60b521f76269b4fc44b6d88187a4 Apr 22 19:58:51.500902 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.500879 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7c8cf6cfd5-wrpnx"] Apr 22 19:58:51.503099 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:58:51.503075 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14040862_cf36_42dc_928c_1c43754396db.slice/crio-e3c5780801b6de6144bf37cac672fb2f343e10555327c5e3ea7073516de4cc9d WatchSource:0}: Error finding container e3c5780801b6de6144bf37cac672fb2f343e10555327c5e3ea7073516de4cc9d: Status 404 returned error can't find the container with id e3c5780801b6de6144bf37cac672fb2f343e10555327c5e3ea7073516de4cc9d Apr 22 19:58:51.566559 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.566527 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-974789bfc-zsbbj" event={"ID":"7865aab6-2a06-423e-ba38-7d4717be80a1","Type":"ContainerStarted","Data":"eae0d5558ceb90db6717d3fb97492b9df2da3b9b8cc8006efbae26fc2df3ad91"} Apr 22 19:58:51.567507 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.567484 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jldrx" event={"ID":"36d813c0-2bc0-48c5-a696-ae872c3dcca4","Type":"ContainerStarted","Data":"0d1004387c4d66e1e3c9aa429f0064686e5a60b521f76269b4fc44b6d88187a4"} Apr 22 19:58:51.568540 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.568518 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7c8cf6cfd5-wrpnx" event={"ID":"14040862-cf36-42dc-928c-1c43754396db","Type":"ContainerStarted","Data":"e3c5780801b6de6144bf37cac672fb2f343e10555327c5e3ea7073516de4cc9d"} Apr 22 19:58:51.599697 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.599671 2575 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64f95dbfc4-6xscx"] Apr 22 19:58:51.602682 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:51.602641 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-8565f89794-vk6gz"] Apr 22 19:58:51.602911 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:58:51.602870 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5031df66_81a1_4741_89a0_7881243ec7fd.slice/crio-6a01c85fe7f21767a8bd032c2d5c3d3176928a86376d64ff5b69428368dcce58 WatchSource:0}: Error finding container 6a01c85fe7f21767a8bd032c2d5c3d3176928a86376d64ff5b69428368dcce58: Status 404 returned error can't find the container with id 6a01c85fe7f21767a8bd032c2d5c3d3176928a86376d64ff5b69428368dcce58 Apr 22 19:58:51.606387 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:58:51.606362 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a14051d_411f_447f_bd34_1abaf46edaa2.slice/crio-78d5db7193f3aeed21a0b900f4cf2b83b3ae3ec56b94de8c98af939c6149a745 WatchSource:0}: Error finding container 78d5db7193f3aeed21a0b900f4cf2b83b3ae3ec56b94de8c98af939c6149a745: Status 404 returned error can't find the container with id 78d5db7193f3aeed21a0b900f4cf2b83b3ae3ec56b94de8c98af939c6149a745 Apr 22 19:58:52.576718 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:52.576679 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jldrx" event={"ID":"36d813c0-2bc0-48c5-a696-ae872c3dcca4","Type":"ContainerStarted","Data":"f87f934cb162743eaf6f28b3f057c51e07ce05157aeda4905e4611ca1431e621"} Apr 22 19:58:52.580760 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:52.580731 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7c8cf6cfd5-wrpnx" event={"ID":"14040862-cf36-42dc-928c-1c43754396db","Type":"ContainerStarted","Data":"3e324dd74dd9549d4c1fc56bdc8e2c72ae97dafbc6cc94d733a08f8142051984"} Apr 22 19:58:52.582078 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:52.581644 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7c8cf6cfd5-wrpnx" Apr 22 19:58:52.583265 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:52.582840 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64f95dbfc4-6xscx" event={"ID":"5031df66-81a1-4741-89a0-7881243ec7fd","Type":"ContainerStarted","Data":"6a01c85fe7f21767a8bd032c2d5c3d3176928a86376d64ff5b69428368dcce58"} Apr 22 19:58:52.585794 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:52.585769 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8565f89794-vk6gz" event={"ID":"7a14051d-411f-447f-bd34-1abaf46edaa2","Type":"ContainerStarted","Data":"78d5db7193f3aeed21a0b900f4cf2b83b3ae3ec56b94de8c98af939c6149a745"} Apr 22 19:58:53.593076 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:53.592986 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jldrx" event={"ID":"36d813c0-2bc0-48c5-a696-ae872c3dcca4","Type":"ContainerStarted","Data":"406d965d00d36c8150d0f4b9d805d0ffd2098e3edaa992a345473bfae741ac6d"} Apr 22 19:58:54.497860 ip-10-0-128-239 kubenswrapper[2575]: 
I0422 19:58:54.497704 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f7mbc" Apr 22 19:58:54.531722 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:54.531476 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7c8cf6cfd5-wrpnx" podStartSLOduration=4.531460596 podStartE2EDuration="4.531460596s" podCreationTimestamp="2026-04-22 19:58:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:58:52.602699933 +0000 UTC m=+56.854199824" watchObservedRunningTime="2026-04-22 19:58:54.531460596 +0000 UTC m=+58.782960486" Apr 22 19:58:57.561343 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:57.561313 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-mdzds" Apr 22 19:58:57.606105 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:57.606071 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jldrx" event={"ID":"36d813c0-2bc0-48c5-a696-ae872c3dcca4","Type":"ContainerStarted","Data":"f35e149e8672b2cd86d6bdfbe20632a3b67219f991504bc051ae73a4fd2a9f60"} Apr 22 19:58:57.607506 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:57.607478 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-974789bfc-zsbbj" event={"ID":"7865aab6-2a06-423e-ba38-7d4717be80a1","Type":"ContainerStarted","Data":"c429f583d01982009b7b243dd31ca87d6b8d50046a1f7738c133e5e9ab092b6e"} Apr 22 19:58:57.608876 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:57.608846 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64f95dbfc4-6xscx" event={"ID":"5031df66-81a1-4741-89a0-7881243ec7fd","Type":"ContainerStarted","Data":"1fa8a94ce409a0b7183e5f12dd76eca1d819f70f1ecaebec5ddf3d7a1905a0dc"} Apr 22 19:58:57.610330 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:57.610309 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8565f89794-vk6gz" event={"ID":"7a14051d-411f-447f-bd34-1abaf46edaa2","Type":"ContainerStarted","Data":"6461bbf76c9d4f0721490083917d2c6d24405523fb9d7ddbc427089d564b146c"} Apr 22 19:58:57.610542 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:57.610520 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8565f89794-vk6gz" Apr 22 19:58:57.611883 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:57.611864 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8565f89794-vk6gz" Apr 22 19:58:57.626188 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:57.626153 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-jldrx" podStartSLOduration=1.087287103 podStartE2EDuration="6.626142827s" podCreationTimestamp="2026-04-22 19:58:51 +0000 UTC" firstStartedPulling="2026-04-22 19:58:51.585617845 +0000 UTC m=+55.837117716" lastFinishedPulling="2026-04-22 19:58:57.124473572 +0000 UTC m=+61.375973440" observedRunningTime="2026-04-22 19:58:57.625768039 +0000 UTC m=+61.877267928" watchObservedRunningTime="2026-04-22 19:58:57.626142827 +0000 UTC m=+61.877642716" 
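
The `pod_startup_latency_tracker` entries above report, for each pod, a `podStartSLOduration` (startup time excluding image pulls; it equals `podStartE2EDuration` when `firstStartedPulling` is the zero timestamp, as in the image-registry entries) and the end-to-end duration measured from `podCreationTimestamp` to the observed running time. A minimal sketch for tabulating these fields from a journal dump like this one — the `kubelet.log` filename and the regex are assumptions; the field names come from the lines themselves:

```python
#!/usr/bin/env python3
# Sketch: pull per-pod startup durations out of a kubelet journal dump.
# Matches entries like:
#   ... "Observed pod startup duration" pod="ns/name" podStartSLOduration=17.34... podStartE2EDuration="19.57...s" ...
import re

FIELDS = re.compile(
    r'pod="(?P<pod>[^"]+)".*?'            # namespace/name of the pod
    r'podStartSLOduration=(?P<slo>[\d.]+)\s+'
    r'podStartE2EDuration="(?P<e2e>[^"]+)"'
)

with open("kubelet.log", encoding="utf-8") as log:
    for line in log:
        if "Observed pod startup duration" not in line:
            continue
        m = FIELDS.search(line)
        if m:
            print(f'{m["pod"]}: SLO={m["slo"]}s, e2e={m["e2e"]}')
```
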
Apr 22 19:58:57.644574 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:57.644535 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-8565f89794-vk6gz" podStartSLOduration=2.135989744 podStartE2EDuration="7.644525138s" podCreationTimestamp="2026-04-22 19:58:50 +0000 UTC" firstStartedPulling="2026-04-22 19:58:51.610349786 +0000 UTC m=+55.861849656" lastFinishedPulling="2026-04-22 19:58:57.118885179 +0000 UTC m=+61.370385050" observedRunningTime="2026-04-22 19:58:57.64407567 +0000 UTC m=+61.895575559" watchObservedRunningTime="2026-04-22 19:58:57.644525138 +0000 UTC m=+61.896025027" Apr 22 19:58:57.663715 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:58:57.663670 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-974789bfc-zsbbj" podStartSLOduration=1.96875577 podStartE2EDuration="7.663659682s" podCreationTimestamp="2026-04-22 19:58:50 +0000 UTC" firstStartedPulling="2026-04-22 19:58:51.388083437 +0000 UTC m=+55.639583304" lastFinishedPulling="2026-04-22 19:58:57.082987349 +0000 UTC m=+61.334487216" observedRunningTime="2026-04-22 19:58:57.662476287 +0000 UTC m=+61.913976189" watchObservedRunningTime="2026-04-22 19:58:57.663659682 +0000 UTC m=+61.915159570" Apr 22 19:59:00.618986 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:00.618946 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64f95dbfc4-6xscx" event={"ID":"5031df66-81a1-4741-89a0-7881243ec7fd","Type":"ContainerStarted","Data":"1f318e7c589da2092764aad200d77a48099cb9be221c70b5d4a8b50e045ba58b"} Apr 22 19:59:00.618986 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:00.618987 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64f95dbfc4-6xscx" event={"ID":"5031df66-81a1-4741-89a0-7881243ec7fd","Type":"ContainerStarted","Data":"7b0d3facd4c749a93fa879adaf0e90435f23e1089a0c29a3c4f975a9401ee17a"} Apr 22 19:59:00.637017 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:00.636975 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-64f95dbfc4-6xscx" podStartSLOduration=2.604439246 podStartE2EDuration="10.636963542s" podCreationTimestamp="2026-04-22 19:58:50 +0000 UTC" firstStartedPulling="2026-04-22 19:58:51.604989834 +0000 UTC m=+55.856489707" lastFinishedPulling="2026-04-22 19:58:59.637514136 +0000 UTC m=+63.889014003" observedRunningTime="2026-04-22 19:59:00.636473702 +0000 UTC m=+64.887973590" watchObservedRunningTime="2026-04-22 19:59:00.636963542 +0000 UTC m=+64.888463430" Apr 22 19:59:00.906155 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:00.906075 2575 patch_prober.go:28] interesting pod/image-registry-7b97ccfb5-7tjp6 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 19:59:00.906155 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:00.906124 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" podUID="48a20871-3902-4d97-9b7d-6c533dbb4d37" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" 
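
The readiness-probe 503 above comes from the old registry replica, `image-registry-7b97ccfb5-7tjp6`: the API `SyncLoop DELETE` for that pod was logged at 19:58:50.900, and its replacement `image-registry-7c8cf6cfd5-wrpnx` was observed running at 19:58:52, so the failure is consistent with a pod being torn down during a deployment rollout rather than a registry outage. A sketch for telling such one-off failures apart from a persistently failing pod, by counting `"Probe failed"` entries per pod — again assuming a `kubelet.log` dump; the key names are the ones `prober.go` emits in the lines above:

```python
#!/usr/bin/env python3
# Sketch: count probe failures per (pod, probe type) so a single 503 during
# a rollout stands out from a pod that keeps failing its probe.
import re
from collections import Counter

FAIL = re.compile(
    r'"Probe failed".*?'
    r'probeType="(?P<type>[^"]+)".*?'
    r'pod="(?P<pod>[^"]+)"'
)

counts = Counter()
with open("kubelet.log", encoding="utf-8") as log:
    for line in log:
        m = FAIL.search(line)
        if m:
            counts[(m["pod"], m["type"])] += 1

for (pod, ptype), n in counts.most_common():
    print(f"{n:4d}  {ptype:<9}  {pod}")
```
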
Apr 22 19:59:01.989914 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:01.989876 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80c4f1f9-74db-4e1b-bb9c-05d1618ca285-metrics-certs\") pod \"network-metrics-daemon-gptpd\" (UID: \"80c4f1f9-74db-4e1b-bb9c-05d1618ca285\") " pod="openshift-multus/network-metrics-daemon-gptpd" Apr 22 19:59:01.992234 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:01.992216 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 19:59:02.003304 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:02.003277 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80c4f1f9-74db-4e1b-bb9c-05d1618ca285-metrics-certs\") pod \"network-metrics-daemon-gptpd\" (UID: \"80c4f1f9-74db-4e1b-bb9c-05d1618ca285\") " pod="openshift-multus/network-metrics-daemon-gptpd" Apr 22 19:59:02.090209 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:02.090181 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hfkng\" (UniqueName: \"kubernetes.io/projected/a0ee49c5-2186-4f09-9a83-ceabc81a52e5-kube-api-access-hfkng\") pod \"network-check-target-v2xhn\" (UID: \"a0ee49c5-2186-4f09-9a83-ceabc81a52e5\") " pod="openshift-network-diagnostics/network-check-target-v2xhn" Apr 22 19:59:02.093010 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:02.092993 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 19:59:02.103155 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:02.103141 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 19:59:02.113444 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:02.113416 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfkng\" (UniqueName: \"kubernetes.io/projected/a0ee49c5-2186-4f09-9a83-ceabc81a52e5-kube-api-access-hfkng\") pod \"network-check-target-v2xhn\" (UID: \"a0ee49c5-2186-4f09-9a83-ceabc81a52e5\") " pod="openshift-network-diagnostics/network-check-target-v2xhn" Apr 22 19:59:02.117396 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:02.117372 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tjpls\"" Apr 22 19:59:02.121250 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:02.121232 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qxcj4\"" Apr 22 19:59:02.125027 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:02.125008 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v2xhn" Apr 22 19:59:02.129657 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:02.129638 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gptpd" Apr 22 19:59:02.263904 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:02.263840 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-v2xhn"] Apr 22 19:59:02.268446 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:59:02.268416 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0ee49c5_2186_4f09_9a83_ceabc81a52e5.slice/crio-08f837e82698087fe51ef7b28cd4ee02059be468501fb0c2da211f248afeec64 WatchSource:0}: Error finding container 08f837e82698087fe51ef7b28cd4ee02059be468501fb0c2da211f248afeec64: Status 404 returned error can't find the container with id 08f837e82698087fe51ef7b28cd4ee02059be468501fb0c2da211f248afeec64 Apr 22 19:59:02.281241 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:02.281218 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gptpd"] Apr 22 19:59:02.284484 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:59:02.284461 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80c4f1f9_74db_4e1b_bb9c_05d1618ca285.slice/crio-ff60c7710da0c81875da9434e93c93a6a70f7c7d6914ba52c608af6cf2038dff WatchSource:0}: Error finding container ff60c7710da0c81875da9434e93c93a6a70f7c7d6914ba52c608af6cf2038dff: Status 404 returned error can't find the container with id ff60c7710da0c81875da9434e93c93a6a70f7c7d6914ba52c608af6cf2038dff Apr 22 19:59:02.624387 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:02.624351 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gptpd" event={"ID":"80c4f1f9-74db-4e1b-bb9c-05d1618ca285","Type":"ContainerStarted","Data":"ff60c7710da0c81875da9434e93c93a6a70f7c7d6914ba52c608af6cf2038dff"} Apr 22 19:59:02.625401 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:02.625375 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-v2xhn" event={"ID":"a0ee49c5-2186-4f09-9a83-ceabc81a52e5","Type":"ContainerStarted","Data":"08f837e82698087fe51ef7b28cd4ee02059be468501fb0c2da211f248afeec64"} Apr 22 19:59:03.629639 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:03.629609 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gptpd" event={"ID":"80c4f1f9-74db-4e1b-bb9c-05d1618ca285","Type":"ContainerStarted","Data":"3d13a0038d359d659f2094c62bfcb97cc76a322744edb0e7a907ab453929b90e"} Apr 22 19:59:04.498050 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.497453 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-phnrd"] Apr 22 19:59:04.502532 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.502505 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-phnrd" Apr 22 19:59:04.504991 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.504969 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 19:59:04.507151 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.507128 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 19:59:04.507644 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.507304 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 19:59:04.507644 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.507410 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 19:59:04.507644 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.507494 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-98m5q\"" Apr 22 19:59:04.507644 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.507248 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 19:59:04.508599 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.507779 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 19:59:04.609167 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.608663 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/201a3fbe-7126-4565-b184-2dbb1cc12e69-metrics-client-ca\") pod \"node-exporter-phnrd\" (UID: \"201a3fbe-7126-4565-b184-2dbb1cc12e69\") " pod="openshift-monitoring/node-exporter-phnrd" Apr 22 19:59:04.609167 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.608727 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/201a3fbe-7126-4565-b184-2dbb1cc12e69-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-phnrd\" (UID: \"201a3fbe-7126-4565-b184-2dbb1cc12e69\") " pod="openshift-monitoring/node-exporter-phnrd" Apr 22 19:59:04.609167 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.608763 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk66k\" (UniqueName: \"kubernetes.io/projected/201a3fbe-7126-4565-b184-2dbb1cc12e69-kube-api-access-dk66k\") pod \"node-exporter-phnrd\" (UID: \"201a3fbe-7126-4565-b184-2dbb1cc12e69\") " pod="openshift-monitoring/node-exporter-phnrd" Apr 22 19:59:04.609167 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.608808 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/201a3fbe-7126-4565-b184-2dbb1cc12e69-node-exporter-accelerators-collector-config\") pod \"node-exporter-phnrd\" (UID: \"201a3fbe-7126-4565-b184-2dbb1cc12e69\") " pod="openshift-monitoring/node-exporter-phnrd" Apr 22 19:59:04.609167 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.608839 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/201a3fbe-7126-4565-b184-2dbb1cc12e69-sys\") pod \"node-exporter-phnrd\" (UID: \"201a3fbe-7126-4565-b184-2dbb1cc12e69\") " pod="openshift-monitoring/node-exporter-phnrd" Apr 22 19:59:04.609167 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.608909 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/201a3fbe-7126-4565-b184-2dbb1cc12e69-node-exporter-tls\") pod \"node-exporter-phnrd\" (UID: \"201a3fbe-7126-4565-b184-2dbb1cc12e69\") " pod="openshift-monitoring/node-exporter-phnrd" Apr 22 19:59:04.609167 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.608935 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/201a3fbe-7126-4565-b184-2dbb1cc12e69-node-exporter-wtmp\") pod \"node-exporter-phnrd\" (UID: \"201a3fbe-7126-4565-b184-2dbb1cc12e69\") " pod="openshift-monitoring/node-exporter-phnrd" Apr 22 19:59:04.609167 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.608971 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/201a3fbe-7126-4565-b184-2dbb1cc12e69-node-exporter-textfile\") pod \"node-exporter-phnrd\" (UID: \"201a3fbe-7126-4565-b184-2dbb1cc12e69\") " pod="openshift-monitoring/node-exporter-phnrd" Apr 22 19:59:04.609167 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.608997 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/201a3fbe-7126-4565-b184-2dbb1cc12e69-root\") pod \"node-exporter-phnrd\" (UID: \"201a3fbe-7126-4565-b184-2dbb1cc12e69\") " pod="openshift-monitoring/node-exporter-phnrd" Apr 22 19:59:04.636880 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.636842 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gptpd" event={"ID":"80c4f1f9-74db-4e1b-bb9c-05d1618ca285","Type":"ContainerStarted","Data":"a1727e8c6a661e43afb8a9d108b9e9c0109f2c49384d89035901401b067673c3"} Apr 22 19:59:04.654825 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.654773 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-gptpd" podStartSLOduration=67.540423156 podStartE2EDuration="1m8.654756898s" podCreationTimestamp="2026-04-22 19:57:56 +0000 UTC" firstStartedPulling="2026-04-22 19:59:02.286212648 +0000 UTC m=+66.537712515" lastFinishedPulling="2026-04-22 19:59:03.400546386 +0000 UTC m=+67.652046257" observedRunningTime="2026-04-22 19:59:04.652791448 +0000 UTC m=+68.904291532" watchObservedRunningTime="2026-04-22 19:59:04.654756898 +0000 UTC m=+68.906256789" Apr 22 19:59:04.709371 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.709344 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/201a3fbe-7126-4565-b184-2dbb1cc12e69-metrics-client-ca\") pod \"node-exporter-phnrd\" (UID: \"201a3fbe-7126-4565-b184-2dbb1cc12e69\") " pod="openshift-monitoring/node-exporter-phnrd" Apr 22 19:59:04.709481 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.709403 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/201a3fbe-7126-4565-b184-2dbb1cc12e69-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-phnrd\" (UID: \"201a3fbe-7126-4565-b184-2dbb1cc12e69\") " pod="openshift-monitoring/node-exporter-phnrd" Apr 22 19:59:04.709481 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.709440 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dk66k\" (UniqueName: \"kubernetes.io/projected/201a3fbe-7126-4565-b184-2dbb1cc12e69-kube-api-access-dk66k\") pod \"node-exporter-phnrd\" (UID: \"201a3fbe-7126-4565-b184-2dbb1cc12e69\") " pod="openshift-monitoring/node-exporter-phnrd" Apr 22 19:59:04.709481 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.709466 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/201a3fbe-7126-4565-b184-2dbb1cc12e69-node-exporter-accelerators-collector-config\") pod \"node-exporter-phnrd\" (UID: \"201a3fbe-7126-4565-b184-2dbb1cc12e69\") " pod="openshift-monitoring/node-exporter-phnrd" Apr 22 19:59:04.709645 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.709495 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/201a3fbe-7126-4565-b184-2dbb1cc12e69-sys\") pod \"node-exporter-phnrd\" (UID: \"201a3fbe-7126-4565-b184-2dbb1cc12e69\") " pod="openshift-monitoring/node-exporter-phnrd" Apr 22 19:59:04.709645 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.709552 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/201a3fbe-7126-4565-b184-2dbb1cc12e69-node-exporter-tls\") pod \"node-exporter-phnrd\" (UID: \"201a3fbe-7126-4565-b184-2dbb1cc12e69\") " pod="openshift-monitoring/node-exporter-phnrd" Apr 22 19:59:04.709645 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.709577 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/201a3fbe-7126-4565-b184-2dbb1cc12e69-node-exporter-wtmp\") pod \"node-exporter-phnrd\" (UID: \"201a3fbe-7126-4565-b184-2dbb1cc12e69\") " pod="openshift-monitoring/node-exporter-phnrd" Apr 22 19:59:04.709645 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.709610 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/201a3fbe-7126-4565-b184-2dbb1cc12e69-node-exporter-textfile\") pod \"node-exporter-phnrd\" (UID: \"201a3fbe-7126-4565-b184-2dbb1cc12e69\") " pod="openshift-monitoring/node-exporter-phnrd" Apr 22 19:59:04.709645 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.709636 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/201a3fbe-7126-4565-b184-2dbb1cc12e69-root\") pod \"node-exporter-phnrd\" (UID: \"201a3fbe-7126-4565-b184-2dbb1cc12e69\") " pod="openshift-monitoring/node-exporter-phnrd" Apr 22 19:59:04.710890 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.710218 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/201a3fbe-7126-4565-b184-2dbb1cc12e69-sys\") pod \"node-exporter-phnrd\" (UID: \"201a3fbe-7126-4565-b184-2dbb1cc12e69\") " pod="openshift-monitoring/node-exporter-phnrd" Apr 22 19:59:04.710890 ip-10-0-128-239 
kubenswrapper[2575]: I0422 19:59:04.710349 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/201a3fbe-7126-4565-b184-2dbb1cc12e69-node-exporter-wtmp\") pod \"node-exporter-phnrd\" (UID: \"201a3fbe-7126-4565-b184-2dbb1cc12e69\") " pod="openshift-monitoring/node-exporter-phnrd" Apr 22 19:59:04.710890 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:59:04.710430 2575 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 19:59:04.710890 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:59:04.710489 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/201a3fbe-7126-4565-b184-2dbb1cc12e69-node-exporter-tls podName:201a3fbe-7126-4565-b184-2dbb1cc12e69 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:05.21046823 +0000 UTC m=+69.461968111 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/201a3fbe-7126-4565-b184-2dbb1cc12e69-node-exporter-tls") pod "node-exporter-phnrd" (UID: "201a3fbe-7126-4565-b184-2dbb1cc12e69") : secret "node-exporter-tls" not found Apr 22 19:59:04.710890 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.710508 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/201a3fbe-7126-4565-b184-2dbb1cc12e69-metrics-client-ca\") pod \"node-exporter-phnrd\" (UID: \"201a3fbe-7126-4565-b184-2dbb1cc12e69\") " pod="openshift-monitoring/node-exporter-phnrd" Apr 22 19:59:04.710890 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.710708 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/201a3fbe-7126-4565-b184-2dbb1cc12e69-root\") pod \"node-exporter-phnrd\" (UID: \"201a3fbe-7126-4565-b184-2dbb1cc12e69\") " pod="openshift-monitoring/node-exporter-phnrd" Apr 22 19:59:04.710890 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.710730 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/201a3fbe-7126-4565-b184-2dbb1cc12e69-node-exporter-textfile\") pod \"node-exporter-phnrd\" (UID: \"201a3fbe-7126-4565-b184-2dbb1cc12e69\") " pod="openshift-monitoring/node-exporter-phnrd" Apr 22 19:59:04.711351 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.711182 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/201a3fbe-7126-4565-b184-2dbb1cc12e69-node-exporter-accelerators-collector-config\") pod \"node-exporter-phnrd\" (UID: \"201a3fbe-7126-4565-b184-2dbb1cc12e69\") " pod="openshift-monitoring/node-exporter-phnrd" Apr 22 19:59:04.713711 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.713671 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/201a3fbe-7126-4565-b184-2dbb1cc12e69-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-phnrd\" (UID: \"201a3fbe-7126-4565-b184-2dbb1cc12e69\") " pod="openshift-monitoring/node-exporter-phnrd" Apr 22 19:59:04.735232 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:04.735163 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk66k\" (UniqueName: 
\"kubernetes.io/projected/201a3fbe-7126-4565-b184-2dbb1cc12e69-kube-api-access-dk66k\") pod \"node-exporter-phnrd\" (UID: \"201a3fbe-7126-4565-b184-2dbb1cc12e69\") " pod="openshift-monitoring/node-exporter-phnrd" Apr 22 19:59:05.214150 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:05.214128 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/201a3fbe-7126-4565-b184-2dbb1cc12e69-node-exporter-tls\") pod \"node-exporter-phnrd\" (UID: \"201a3fbe-7126-4565-b184-2dbb1cc12e69\") " pod="openshift-monitoring/node-exporter-phnrd" Apr 22 19:59:05.216504 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:05.216473 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/201a3fbe-7126-4565-b184-2dbb1cc12e69-node-exporter-tls\") pod \"node-exporter-phnrd\" (UID: \"201a3fbe-7126-4565-b184-2dbb1cc12e69\") " pod="openshift-monitoring/node-exporter-phnrd" Apr 22 19:59:05.416644 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:05.416615 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-phnrd" Apr 22 19:59:05.425211 ip-10-0-128-239 kubenswrapper[2575]: W0422 19:59:05.425186 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod201a3fbe_7126_4565_b184_2dbb1cc12e69.slice/crio-327b6ed46d57949377ed9585aa141ae48b7ff9b8e7448779acae7313808af84c WatchSource:0}: Error finding container 327b6ed46d57949377ed9585aa141ae48b7ff9b8e7448779acae7313808af84c: Status 404 returned error can't find the container with id 327b6ed46d57949377ed9585aa141ae48b7ff9b8e7448779acae7313808af84c Apr 22 19:59:05.641224 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:05.641186 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-v2xhn" event={"ID":"a0ee49c5-2186-4f09-9a83-ceabc81a52e5","Type":"ContainerStarted","Data":"00808a73fd67a0048e91740afb07c45c90ecf80814231ad72e087f61ec1942ea"} Apr 22 19:59:05.641610 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:05.641239 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-v2xhn" Apr 22 19:59:05.642251 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:05.642224 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-phnrd" event={"ID":"201a3fbe-7126-4565-b184-2dbb1cc12e69","Type":"ContainerStarted","Data":"327b6ed46d57949377ed9585aa141ae48b7ff9b8e7448779acae7313808af84c"} Apr 22 19:59:05.680310 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:05.680260 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-v2xhn" podStartSLOduration=66.745266994 podStartE2EDuration="1m9.680244844s" podCreationTimestamp="2026-04-22 19:57:56 +0000 UTC" firstStartedPulling="2026-04-22 19:59:02.270493739 +0000 UTC m=+66.521993614" lastFinishedPulling="2026-04-22 19:59:05.205471583 +0000 UTC m=+69.456971464" observedRunningTime="2026-04-22 19:59:05.670963076 +0000 UTC m=+69.922462965" watchObservedRunningTime="2026-04-22 19:59:05.680244844 +0000 UTC m=+69.931744734" Apr 22 19:59:06.645873 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:06.645843 2575 generic.go:358] "Generic (PLEG): container finished" podID="201a3fbe-7126-4565-b184-2dbb1cc12e69" 
containerID="85e1c74176cdd5b12fc5613c30acefc177ba8379e784020df3016fc20254f7e5" exitCode=0 Apr 22 19:59:06.646284 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:06.645937 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-phnrd" event={"ID":"201a3fbe-7126-4565-b184-2dbb1cc12e69","Type":"ContainerDied","Data":"85e1c74176cdd5b12fc5613c30acefc177ba8379e784020df3016fc20254f7e5"} Apr 22 19:59:07.650879 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:07.650848 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-phnrd" event={"ID":"201a3fbe-7126-4565-b184-2dbb1cc12e69","Type":"ContainerStarted","Data":"e87a1e738cf084592159f5269a6fffd5b14cf524481aff69defb57c1fff5acde"} Apr 22 19:59:07.650879 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:07.650883 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-phnrd" event={"ID":"201a3fbe-7126-4565-b184-2dbb1cc12e69","Type":"ContainerStarted","Data":"3c44d35d1295febcfd9ca37857ea66ce8612ffd96906c288b6e691326d661848"} Apr 22 19:59:07.671570 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:07.671521 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-phnrd" podStartSLOduration=3.016954554 podStartE2EDuration="3.671508301s" podCreationTimestamp="2026-04-22 19:59:04 +0000 UTC" firstStartedPulling="2026-04-22 19:59:05.426819459 +0000 UTC m=+69.678319326" lastFinishedPulling="2026-04-22 19:59:06.081373192 +0000 UTC m=+70.332873073" observedRunningTime="2026-04-22 19:59:07.669819452 +0000 UTC m=+71.921319364" watchObservedRunningTime="2026-04-22 19:59:07.671508301 +0000 UTC m=+71.923008191" Apr 22 19:59:10.904693 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:10.904662 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" Apr 22 19:59:14.601051 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:14.601010 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7c8cf6cfd5-wrpnx" Apr 22 19:59:15.919917 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:15.919777 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" podUID="48a20871-3902-4d97-9b7d-6c533dbb4d37" containerName="registry" containerID="cri-o://007facee288895f1b5529293d7abca821f948e3d55ef661d56df2edc812d9da0" gracePeriod=30 Apr 22 19:59:16.152143 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:16.152121 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" Apr 22 19:59:16.288248 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:16.288192 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/48a20871-3902-4d97-9b7d-6c533dbb4d37-registry-certificates\") pod \"48a20871-3902-4d97-9b7d-6c533dbb4d37\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " Apr 22 19:59:16.288248 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:16.288221 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/48a20871-3902-4d97-9b7d-6c533dbb4d37-installation-pull-secrets\") pod \"48a20871-3902-4d97-9b7d-6c533dbb4d37\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " Apr 22 19:59:16.288248 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:16.288239 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/48a20871-3902-4d97-9b7d-6c533dbb4d37-bound-sa-token\") pod \"48a20871-3902-4d97-9b7d-6c533dbb4d37\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " Apr 22 19:59:16.288501 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:16.288259 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/48a20871-3902-4d97-9b7d-6c533dbb4d37-image-registry-private-configuration\") pod \"48a20871-3902-4d97-9b7d-6c533dbb4d37\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " Apr 22 19:59:16.288501 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:16.288276 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/48a20871-3902-4d97-9b7d-6c533dbb4d37-trusted-ca\") pod \"48a20871-3902-4d97-9b7d-6c533dbb4d37\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " Apr 22 19:59:16.288501 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:16.288345 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndq52\" (UniqueName: \"kubernetes.io/projected/48a20871-3902-4d97-9b7d-6c533dbb4d37-kube-api-access-ndq52\") pod \"48a20871-3902-4d97-9b7d-6c533dbb4d37\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " Apr 22 19:59:16.288501 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:16.288418 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/48a20871-3902-4d97-9b7d-6c533dbb4d37-registry-tls\") pod \"48a20871-3902-4d97-9b7d-6c533dbb4d37\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " Apr 22 19:59:16.288501 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:16.288449 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/48a20871-3902-4d97-9b7d-6c533dbb4d37-ca-trust-extracted\") pod \"48a20871-3902-4d97-9b7d-6c533dbb4d37\" (UID: \"48a20871-3902-4d97-9b7d-6c533dbb4d37\") " Apr 22 19:59:16.288738 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:16.288634 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48a20871-3902-4d97-9b7d-6c533dbb4d37-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "48a20871-3902-4d97-9b7d-6c533dbb4d37" (UID: "48a20871-3902-4d97-9b7d-6c533dbb4d37"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:59:16.288738 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:16.288681 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48a20871-3902-4d97-9b7d-6c533dbb4d37-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "48a20871-3902-4d97-9b7d-6c533dbb4d37" (UID: "48a20871-3902-4d97-9b7d-6c533dbb4d37"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:59:16.291164 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:16.291110 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48a20871-3902-4d97-9b7d-6c533dbb4d37-kube-api-access-ndq52" (OuterVolumeSpecName: "kube-api-access-ndq52") pod "48a20871-3902-4d97-9b7d-6c533dbb4d37" (UID: "48a20871-3902-4d97-9b7d-6c533dbb4d37"). InnerVolumeSpecName "kube-api-access-ndq52". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:59:16.291264 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:16.291170 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48a20871-3902-4d97-9b7d-6c533dbb4d37-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "48a20871-3902-4d97-9b7d-6c533dbb4d37" (UID: "48a20871-3902-4d97-9b7d-6c533dbb4d37"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:59:16.291264 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:16.291176 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48a20871-3902-4d97-9b7d-6c533dbb4d37-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "48a20871-3902-4d97-9b7d-6c533dbb4d37" (UID: "48a20871-3902-4d97-9b7d-6c533dbb4d37"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:59:16.291264 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:16.291207 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48a20871-3902-4d97-9b7d-6c533dbb4d37-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "48a20871-3902-4d97-9b7d-6c533dbb4d37" (UID: "48a20871-3902-4d97-9b7d-6c533dbb4d37"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:59:16.291366 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:16.291258 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48a20871-3902-4d97-9b7d-6c533dbb4d37-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "48a20871-3902-4d97-9b7d-6c533dbb4d37" (UID: "48a20871-3902-4d97-9b7d-6c533dbb4d37"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:59:16.296799 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:16.296776 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48a20871-3902-4d97-9b7d-6c533dbb4d37-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "48a20871-3902-4d97-9b7d-6c533dbb4d37" (UID: "48a20871-3902-4d97-9b7d-6c533dbb4d37"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:59:16.389623 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:16.389598 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/48a20871-3902-4d97-9b7d-6c533dbb4d37-ca-trust-extracted\") on node \"ip-10-0-128-239.ec2.internal\" DevicePath \"\"" Apr 22 19:59:16.389623 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:16.389619 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/48a20871-3902-4d97-9b7d-6c533dbb4d37-registry-certificates\") on node \"ip-10-0-128-239.ec2.internal\" DevicePath \"\"" Apr 22 19:59:16.389756 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:16.389629 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/48a20871-3902-4d97-9b7d-6c533dbb4d37-installation-pull-secrets\") on node \"ip-10-0-128-239.ec2.internal\" DevicePath \"\"" Apr 22 19:59:16.389756 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:16.389638 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/48a20871-3902-4d97-9b7d-6c533dbb4d37-bound-sa-token\") on node \"ip-10-0-128-239.ec2.internal\" DevicePath \"\"" Apr 22 19:59:16.389756 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:16.389648 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/48a20871-3902-4d97-9b7d-6c533dbb4d37-image-registry-private-configuration\") on node \"ip-10-0-128-239.ec2.internal\" DevicePath \"\"" Apr 22 19:59:16.389756 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:16.389657 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/48a20871-3902-4d97-9b7d-6c533dbb4d37-trusted-ca\") on node \"ip-10-0-128-239.ec2.internal\" DevicePath \"\"" Apr 22 19:59:16.389756 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:16.389665 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ndq52\" (UniqueName: \"kubernetes.io/projected/48a20871-3902-4d97-9b7d-6c533dbb4d37-kube-api-access-ndq52\") on node \"ip-10-0-128-239.ec2.internal\" DevicePath \"\"" Apr 22 19:59:16.389756 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:16.389673 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/48a20871-3902-4d97-9b7d-6c533dbb4d37-registry-tls\") on node \"ip-10-0-128-239.ec2.internal\" DevicePath \"\"" Apr 22 19:59:16.678541 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:16.678512 2575 generic.go:358] "Generic (PLEG): container finished" podID="48a20871-3902-4d97-9b7d-6c533dbb4d37" containerID="007facee288895f1b5529293d7abca821f948e3d55ef661d56df2edc812d9da0" exitCode=0 Apr 22 19:59:16.678657 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:16.678573 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" event={"ID":"48a20871-3902-4d97-9b7d-6c533dbb4d37","Type":"ContainerDied","Data":"007facee288895f1b5529293d7abca821f948e3d55ef661d56df2edc812d9da0"} Apr 22 19:59:16.678657 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:16.678579 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" Apr 22 19:59:16.678657 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:16.678599 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7b97ccfb5-7tjp6" event={"ID":"48a20871-3902-4d97-9b7d-6c533dbb4d37","Type":"ContainerDied","Data":"de432f97b5f39c2e253d4efbbdefa8bed322b5cb056135a8d307793015ed3234"} Apr 22 19:59:16.678657 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:16.678614 2575 scope.go:117] "RemoveContainer" containerID="007facee288895f1b5529293d7abca821f948e3d55ef661d56df2edc812d9da0" Apr 22 19:59:16.686059 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:16.686026 2575 scope.go:117] "RemoveContainer" containerID="007facee288895f1b5529293d7abca821f948e3d55ef661d56df2edc812d9da0" Apr 22 19:59:16.686333 ip-10-0-128-239 kubenswrapper[2575]: E0422 19:59:16.686312 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"007facee288895f1b5529293d7abca821f948e3d55ef661d56df2edc812d9da0\": container with ID starting with 007facee288895f1b5529293d7abca821f948e3d55ef661d56df2edc812d9da0 not found: ID does not exist" containerID="007facee288895f1b5529293d7abca821f948e3d55ef661d56df2edc812d9da0" Apr 22 19:59:16.686390 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:16.686341 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"007facee288895f1b5529293d7abca821f948e3d55ef661d56df2edc812d9da0"} err="failed to get container status \"007facee288895f1b5529293d7abca821f948e3d55ef661d56df2edc812d9da0\": rpc error: code = NotFound desc = could not find container \"007facee288895f1b5529293d7abca821f948e3d55ef661d56df2edc812d9da0\": container with ID starting with 007facee288895f1b5529293d7abca821f948e3d55ef661d56df2edc812d9da0 not found: ID does not exist" Apr 22 19:59:16.694494 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:16.694473 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7b97ccfb5-7tjp6"] Apr 22 19:59:16.698098 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:16.698080 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7b97ccfb5-7tjp6"] Apr 22 19:59:18.303854 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:18.303820 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48a20871-3902-4d97-9b7d-6c533dbb4d37" path="/var/lib/kubelet/pods/48a20871-3902-4d97-9b7d-6c533dbb4d37/volumes" Apr 22 19:59:33.539886 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:33.539855 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-vjxnz_85f994f1-e18a-4886-bc92-4b762096129a/serve-healthcheck-canary/0.log" Apr 22 19:59:36.648631 ip-10-0-128-239 kubenswrapper[2575]: I0422 19:59:36.648600 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-v2xhn" Apr 22 20:02:56.190178 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:02:56.190151 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 20:04:09.220599 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:09.220515 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-v7jgh"] Apr 22 20:04:09.221173 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:09.220739 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="48a20871-3902-4d97-9b7d-6c533dbb4d37" containerName="registry" Apr 22 20:04:09.221173 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:09.220749 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="48a20871-3902-4d97-9b7d-6c533dbb4d37" containerName="registry" Apr 22 20:04:09.221173 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:09.220796 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="48a20871-3902-4d97-9b7d-6c533dbb4d37" containerName="registry" Apr 22 20:04:09.223468 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:09.223451 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-v7jgh" Apr 22 20:04:09.227222 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:09.227195 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-p4pjx\"" Apr 22 20:04:09.227344 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:09.227265 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 22 20:04:09.229844 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:09.229824 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 20:04:09.231776 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:09.231757 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 20:04:09.238663 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:09.238639 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-v7jgh"] Apr 22 20:04:09.294474 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:09.294446 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmj82\" (UniqueName: \"kubernetes.io/projected/82ba789a-2f8a-44e1-99a9-e31c9d915ded-kube-api-access-mmj82\") pod \"odh-model-controller-696fc77849-v7jgh\" (UID: \"82ba789a-2f8a-44e1-99a9-e31c9d915ded\") " pod="kserve/odh-model-controller-696fc77849-v7jgh" Apr 22 20:04:09.294644 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:09.294485 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/82ba789a-2f8a-44e1-99a9-e31c9d915ded-cert\") pod \"odh-model-controller-696fc77849-v7jgh\" (UID: \"82ba789a-2f8a-44e1-99a9-e31c9d915ded\") " pod="kserve/odh-model-controller-696fc77849-v7jgh" Apr 22 20:04:09.395302 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:09.395275 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mmj82\" (UniqueName: \"kubernetes.io/projected/82ba789a-2f8a-44e1-99a9-e31c9d915ded-kube-api-access-mmj82\") pod \"odh-model-controller-696fc77849-v7jgh\" (UID: \"82ba789a-2f8a-44e1-99a9-e31c9d915ded\") " pod="kserve/odh-model-controller-696fc77849-v7jgh" Apr 22 20:04:09.395574 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:09.395549 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/82ba789a-2f8a-44e1-99a9-e31c9d915ded-cert\") pod \"odh-model-controller-696fc77849-v7jgh\" (UID: \"82ba789a-2f8a-44e1-99a9-e31c9d915ded\") " pod="kserve/odh-model-controller-696fc77849-v7jgh" Apr 22 20:04:09.395677 ip-10-0-128-239 kubenswrapper[2575]: E0422 20:04:09.395660 2575 secret.go:189] Couldn't get 
secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 22 20:04:09.395731 ip-10-0-128-239 kubenswrapper[2575]: E0422 20:04:09.395710 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82ba789a-2f8a-44e1-99a9-e31c9d915ded-cert podName:82ba789a-2f8a-44e1-99a9-e31c9d915ded nodeName:}" failed. No retries permitted until 2026-04-22 20:04:09.895696042 +0000 UTC m=+374.147195914 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/82ba789a-2f8a-44e1-99a9-e31c9d915ded-cert") pod "odh-model-controller-696fc77849-v7jgh" (UID: "82ba789a-2f8a-44e1-99a9-e31c9d915ded") : secret "odh-model-controller-webhook-cert" not found Apr 22 20:04:09.404101 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:09.404078 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmj82\" (UniqueName: \"kubernetes.io/projected/82ba789a-2f8a-44e1-99a9-e31c9d915ded-kube-api-access-mmj82\") pod \"odh-model-controller-696fc77849-v7jgh\" (UID: \"82ba789a-2f8a-44e1-99a9-e31c9d915ded\") " pod="kserve/odh-model-controller-696fc77849-v7jgh" Apr 22 20:04:09.898656 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:09.898629 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/82ba789a-2f8a-44e1-99a9-e31c9d915ded-cert\") pod \"odh-model-controller-696fc77849-v7jgh\" (UID: \"82ba789a-2f8a-44e1-99a9-e31c9d915ded\") " pod="kserve/odh-model-controller-696fc77849-v7jgh" Apr 22 20:04:09.898804 ip-10-0-128-239 kubenswrapper[2575]: E0422 20:04:09.898743 2575 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 22 20:04:09.898804 ip-10-0-128-239 kubenswrapper[2575]: E0422 20:04:09.898790 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82ba789a-2f8a-44e1-99a9-e31c9d915ded-cert podName:82ba789a-2f8a-44e1-99a9-e31c9d915ded nodeName:}" failed. No retries permitted until 2026-04-22 20:04:10.89877718 +0000 UTC m=+375.150277047 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/82ba789a-2f8a-44e1-99a9-e31c9d915ded-cert") pod "odh-model-controller-696fc77849-v7jgh" (UID: "82ba789a-2f8a-44e1-99a9-e31c9d915ded") : secret "odh-model-controller-webhook-cert" not found Apr 22 20:04:10.906697 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:10.906668 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/82ba789a-2f8a-44e1-99a9-e31c9d915ded-cert\") pod \"odh-model-controller-696fc77849-v7jgh\" (UID: \"82ba789a-2f8a-44e1-99a9-e31c9d915ded\") " pod="kserve/odh-model-controller-696fc77849-v7jgh" Apr 22 20:04:10.908907 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:10.908888 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/82ba789a-2f8a-44e1-99a9-e31c9d915ded-cert\") pod \"odh-model-controller-696fc77849-v7jgh\" (UID: \"82ba789a-2f8a-44e1-99a9-e31c9d915ded\") " pod="kserve/odh-model-controller-696fc77849-v7jgh" Apr 22 20:04:11.036148 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:11.036102 2575 util.go:30] "No sandbox for pod can be found. 
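Both failed secret mounts in this log follow the same doubling retry schedule: node-exporter-tls and odh-model-controller-webhook-cert are each retried after "durationBeforeRetry 500ms", then 1s, and so on until the secret exists — the normal resolution when a pod is scheduled before its certificate secret has been minted, as happens above when the mount succeeds one retry later. A minimal sketch of that exponential-backoff loop (the generic pattern, not kubelet source; the cap value is illustrative):

```go
package main

import (
	"errors"
	"fmt"
	"time"
)

// retryWithBackoff retries op, doubling the delay after each failure up to
// maxDelay — the schedule behind "durationBeforeRetry 500ms", then "1s", ...
func retryWithBackoff(op func() error, initial, maxDelay time.Duration) error {
	delay := initial
	for {
		err := op()
		if err == nil {
			return nil
		}
		fmt.Printf("failed: %v; no retries permitted for %v\n", err, delay)
		time.Sleep(delay)
		if delay *= 2; delay > maxDelay {
			delay = maxDelay
		}
	}
}

func main() {
	attempts := 0
	// Simulate a secret that only exists from the third attempt onward.
	mount := func() error {
		attempts++
		if attempts < 3 {
			return errors.New(`secret "odh-model-controller-webhook-cert" not found`)
		}
		return nil
	}
	if err := retryWithBackoff(mount, 500*time.Millisecond, 2*time.Minute); err == nil {
		fmt.Println("MountVolume.SetUp succeeded after", attempts, "attempts")
	}
}
```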
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-v7jgh" Apr 22 20:04:11.149367 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:11.149286 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-v7jgh"] Apr 22 20:04:11.151505 ip-10-0-128-239 kubenswrapper[2575]: W0422 20:04:11.151478 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82ba789a_2f8a_44e1_99a9_e31c9d915ded.slice/crio-af5618888bd9f74fd43a1bb447ddad787848c8c71cde5f5af35b4362397b213d WatchSource:0}: Error finding container af5618888bd9f74fd43a1bb447ddad787848c8c71cde5f5af35b4362397b213d: Status 404 returned error can't find the container with id af5618888bd9f74fd43a1bb447ddad787848c8c71cde5f5af35b4362397b213d Apr 22 20:04:11.152701 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:11.152679 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:04:11.412157 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:11.412127 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-v7jgh" event={"ID":"82ba789a-2f8a-44e1-99a9-e31c9d915ded","Type":"ContainerStarted","Data":"af5618888bd9f74fd43a1bb447ddad787848c8c71cde5f5af35b4362397b213d"} Apr 22 20:04:14.424706 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:14.424664 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-v7jgh" event={"ID":"82ba789a-2f8a-44e1-99a9-e31c9d915ded","Type":"ContainerStarted","Data":"b532fbee5e4fbed64db9f6a187981daf8610761436c9fbb1cfdae1de2befedc9"} Apr 22 20:04:14.425151 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:14.424903 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-v7jgh" Apr 22 20:04:14.439579 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:14.439541 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-v7jgh" podStartSLOduration=2.5139962259999997 podStartE2EDuration="5.439529523s" podCreationTimestamp="2026-04-22 20:04:09 +0000 UTC" firstStartedPulling="2026-04-22 20:04:11.152866462 +0000 UTC m=+375.404366344" lastFinishedPulling="2026-04-22 20:04:14.078399771 +0000 UTC m=+378.329899641" observedRunningTime="2026-04-22 20:04:14.439003503 +0000 UTC m=+378.690503392" watchObservedRunningTime="2026-04-22 20:04:14.439529523 +0000 UTC m=+378.691029460" Apr 22 20:04:25.429443 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:25.429411 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-v7jgh" Apr 22 20:04:46.396024 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:46.395981 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m"] Apr 22 20:04:46.399766 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:46.399744 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" Apr 22 20:04:46.401913 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:46.401895 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-hjbmb\"" Apr 22 20:04:46.405824 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:46.405786 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m"] Apr 22 20:04:46.529231 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:46.529191 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6c7431a5-1545-45be-9b51-ac8361ee3fcc-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m\" (UID: \"6c7431a5-1545-45be-9b51-ac8361ee3fcc\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" Apr 22 20:04:46.629513 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:46.629484 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6c7431a5-1545-45be-9b51-ac8361ee3fcc-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m\" (UID: \"6c7431a5-1545-45be-9b51-ac8361ee3fcc\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" Apr 22 20:04:46.629873 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:46.629856 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6c7431a5-1545-45be-9b51-ac8361ee3fcc-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m\" (UID: \"6c7431a5-1545-45be-9b51-ac8361ee3fcc\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" Apr 22 20:04:46.710585 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:46.710512 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" Apr 22 20:04:46.832102 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:46.832073 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m"] Apr 22 20:04:46.835240 ip-10-0-128-239 kubenswrapper[2575]: W0422 20:04:46.835213 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c7431a5_1545_45be_9b51_ac8361ee3fcc.slice/crio-dfe34d013c56cf35a6185096d9fcf4d31a245ce612e5fe6ef9406c02bd683828 WatchSource:0}: Error finding container dfe34d013c56cf35a6185096d9fcf4d31a245ce612e5fe6ef9406c02bd683828: Status 404 returned error can't find the container with id dfe34d013c56cf35a6185096d9fcf4d31a245ce612e5fe6ef9406c02bd683828 Apr 22 20:04:47.503080 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:47.503015 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" event={"ID":"6c7431a5-1545-45be-9b51-ac8361ee3fcc","Type":"ContainerStarted","Data":"dfe34d013c56cf35a6185096d9fcf4d31a245ce612e5fe6ef9406c02bd683828"} Apr 22 20:04:51.514438 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:51.514403 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" event={"ID":"6c7431a5-1545-45be-9b51-ac8361ee3fcc","Type":"ContainerStarted","Data":"612ffc1842fa0498e25dc7c2d5dbb89947382a65afdfe125949737ef2bb95b82"} Apr 22 20:04:55.524832 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:55.524801 2575 generic.go:358] "Generic (PLEG): container finished" podID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" containerID="612ffc1842fa0498e25dc7c2d5dbb89947382a65afdfe125949737ef2bb95b82" exitCode=0 Apr 22 20:04:55.525201 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:04:55.524874 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" event={"ID":"6c7431a5-1545-45be-9b51-ac8361ee3fcc","Type":"ContainerDied","Data":"612ffc1842fa0498e25dc7c2d5dbb89947382a65afdfe125949737ef2bb95b82"} Apr 22 20:05:10.571623 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:05:10.571587 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" event={"ID":"6c7431a5-1545-45be-9b51-ac8361ee3fcc","Type":"ContainerStarted","Data":"09ef23045e0c5b512897a80a1092faad792396910cda8ca340b92d8a2121ba20"} Apr 22 20:05:12.578736 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:05:12.578657 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" event={"ID":"6c7431a5-1545-45be-9b51-ac8361ee3fcc","Type":"ContainerStarted","Data":"1332ed8d4be903aa60e996f5fd11134ab4294d8161c1521e2b5fcdf931b522c4"} Apr 22 20:05:12.579104 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:05:12.578826 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" Apr 22 20:05:12.580168 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:05:12.580140 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" podUID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused" Apr 22 20:05:12.594873 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:05:12.594831 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" podStartSLOduration=1.136372645 podStartE2EDuration="26.594818442s" podCreationTimestamp="2026-04-22 20:04:46 +0000 UTC" firstStartedPulling="2026-04-22 20:04:46.836943728 +0000 UTC m=+411.088443599" lastFinishedPulling="2026-04-22 20:05:12.295389528 +0000 UTC m=+436.546889396" observedRunningTime="2026-04-22 20:05:12.593480764 +0000 UTC m=+436.844980666" watchObservedRunningTime="2026-04-22 20:05:12.594818442 +0000 UTC m=+436.846318331" Apr 22 20:05:13.580947 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:05:13.580909 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" Apr 22 20:05:13.581393 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:05:13.581024 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" podUID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused" Apr 22 20:05:13.581905 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:05:13.581883 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" podUID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:05:14.584090 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:05:14.584053 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" podUID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused" Apr 22 20:05:14.584455 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:05:14.584373 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" podUID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:05:24.584348 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:05:24.584264 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" podUID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused" Apr 22 20:05:24.584841 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:05:24.584785 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" podUID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:05:34.584226 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:05:34.584180 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" podUID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused" Apr 22 20:05:34.584677 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:05:34.584591 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" podUID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:05:44.584405 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:05:44.584351 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" podUID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused" Apr 22 20:05:44.584858 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:05:44.584789 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" podUID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:05:54.584867 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:05:54.584826 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" podUID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused" Apr 22 20:05:54.585381 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:05:54.585330 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" podUID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:06:04.584133 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:04.584090 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" podUID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused" Apr 22 20:06:04.584653 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:04.584485 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" podUID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:06:14.585233 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:14.585201 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" Apr 22 20:06:14.585760 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:14.585663 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" Apr 22 20:06:21.475695 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:21.475663 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m"] Apr 22 20:06:21.476259 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:21.476065 2575 kuberuntime_container.go:864] "Killing container 
with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" podUID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" containerName="kserve-container" containerID="cri-o://09ef23045e0c5b512897a80a1092faad792396910cda8ca340b92d8a2121ba20" gracePeriod=30 Apr 22 20:06:21.476722 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:21.476435 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" podUID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" containerName="agent" containerID="cri-o://1332ed8d4be903aa60e996f5fd11134ab4294d8161c1521e2b5fcdf931b522c4" gracePeriod=30 Apr 22 20:06:21.539595 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:21.539567 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-81bcf-predictor-745f867bd5-jj65m"] Apr 22 20:06:21.542650 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:21.542632 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-81bcf-predictor-745f867bd5-jj65m" Apr 22 20:06:21.554127 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:21.554101 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-81bcf-predictor-745f867bd5-jj65m"] Apr 22 20:06:21.588195 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:21.588170 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-81bcf-predictor-65c8ddb5cf-7nlsx"] Apr 22 20:06:21.591288 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:21.591268 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-81bcf-predictor-65c8ddb5cf-7nlsx" Apr 22 20:06:21.598225 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:21.598184 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-81bcf-predictor-65c8ddb5cf-7nlsx"] Apr 22 20:06:21.684945 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:21.684916 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c28960d2-0a72-41e2-b831-f4e123784e15-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-81bcf-predictor-745f867bd5-jj65m\" (UID: \"c28960d2-0a72-41e2-b831-f4e123784e15\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-81bcf-predictor-745f867bd5-jj65m" Apr 22 20:06:21.685119 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:21.684996 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ad94efb-b19f-4f67-8525-442359a14999-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-81bcf-predictor-65c8ddb5cf-7nlsx\" (UID: \"5ad94efb-b19f-4f67-8525-442359a14999\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-81bcf-predictor-65c8ddb5cf-7nlsx" Apr 22 20:06:21.786350 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:21.786277 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ad94efb-b19f-4f67-8525-442359a14999-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-81bcf-predictor-65c8ddb5cf-7nlsx\" (UID: \"5ad94efb-b19f-4f67-8525-442359a14999\") " 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-81bcf-predictor-65c8ddb5cf-7nlsx" Apr 22 20:06:21.786350 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:21.786322 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c28960d2-0a72-41e2-b831-f4e123784e15-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-81bcf-predictor-745f867bd5-jj65m\" (UID: \"c28960d2-0a72-41e2-b831-f4e123784e15\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-81bcf-predictor-745f867bd5-jj65m" Apr 22 20:06:21.786654 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:21.786636 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ad94efb-b19f-4f67-8525-442359a14999-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-81bcf-predictor-65c8ddb5cf-7nlsx\" (UID: \"5ad94efb-b19f-4f67-8525-442359a14999\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-81bcf-predictor-65c8ddb5cf-7nlsx" Apr 22 20:06:21.786700 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:21.786684 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c28960d2-0a72-41e2-b831-f4e123784e15-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-81bcf-predictor-745f867bd5-jj65m\" (UID: \"c28960d2-0a72-41e2-b831-f4e123784e15\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-81bcf-predictor-745f867bd5-jj65m" Apr 22 20:06:21.854001 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:21.853974 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-81bcf-predictor-745f867bd5-jj65m" Apr 22 20:06:21.902394 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:21.902362 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-81bcf-predictor-65c8ddb5cf-7nlsx" Apr 22 20:06:21.972747 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:21.972716 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-81bcf-predictor-745f867bd5-jj65m"] Apr 22 20:06:21.976140 ip-10-0-128-239 kubenswrapper[2575]: W0422 20:06:21.976106 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc28960d2_0a72_41e2_b831_f4e123784e15.slice/crio-f0b6d86a2e7c87aeb90c2969a013a86b1875b805c02c02a401c3a9d57782eab7 WatchSource:0}: Error finding container f0b6d86a2e7c87aeb90c2969a013a86b1875b805c02c02a401c3a9d57782eab7: Status 404 returned error can't find the container with id f0b6d86a2e7c87aeb90c2969a013a86b1875b805c02c02a401c3a9d57782eab7 Apr 22 20:06:22.031150 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:22.031123 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-81bcf-predictor-65c8ddb5cf-7nlsx"] Apr 22 20:06:22.039011 ip-10-0-128-239 kubenswrapper[2575]: W0422 20:06:22.038982 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ad94efb_b19f_4f67_8525_442359a14999.slice/crio-ceed2eb72852c5b961fae7ab1dce48fcf69f2f0e7c9db9e822ff5aa7530ce23f WatchSource:0}: Error finding container ceed2eb72852c5b961fae7ab1dce48fcf69f2f0e7c9db9e822ff5aa7530ce23f: Status 404 returned error can't find the container with id ceed2eb72852c5b961fae7ab1dce48fcf69f2f0e7c9db9e822ff5aa7530ce23f Apr 22 20:06:22.765117 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:22.765073 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-81bcf-predictor-745f867bd5-jj65m" event={"ID":"c28960d2-0a72-41e2-b831-f4e123784e15","Type":"ContainerStarted","Data":"aa131b3415b391cc50bbca2089aca07a37a7f4911b5e27b2bb0313a87ef2cb96"} Apr 22 20:06:22.765117 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:22.765124 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-81bcf-predictor-745f867bd5-jj65m" event={"ID":"c28960d2-0a72-41e2-b831-f4e123784e15","Type":"ContainerStarted","Data":"f0b6d86a2e7c87aeb90c2969a013a86b1875b805c02c02a401c3a9d57782eab7"} Apr 22 20:06:22.766452 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:22.766420 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-81bcf-predictor-65c8ddb5cf-7nlsx" event={"ID":"5ad94efb-b19f-4f67-8525-442359a14999","Type":"ContainerStarted","Data":"c2cb35794616d59ba1246f1a478cc3c579de411b77db028337c3ffeb8a1b0734"} Apr 22 20:06:22.766452 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:22.766454 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-81bcf-predictor-65c8ddb5cf-7nlsx" event={"ID":"5ad94efb-b19f-4f67-8525-442359a14999","Type":"ContainerStarted","Data":"ceed2eb72852c5b961fae7ab1dce48fcf69f2f0e7c9db9e822ff5aa7530ce23f"} Apr 22 20:06:24.584529 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:24.584485 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" podUID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused" Apr 22 20:06:24.587159 
ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:24.587132 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" podUID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:06:25.777328 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:25.777296 2575 generic.go:358] "Generic (PLEG): container finished" podID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" containerID="09ef23045e0c5b512897a80a1092faad792396910cda8ca340b92d8a2121ba20" exitCode=0 Apr 22 20:06:25.777634 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:25.777368 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" event={"ID":"6c7431a5-1545-45be-9b51-ac8361ee3fcc","Type":"ContainerDied","Data":"09ef23045e0c5b512897a80a1092faad792396910cda8ca340b92d8a2121ba20"} Apr 22 20:06:25.778513 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:25.778494 2575 generic.go:358] "Generic (PLEG): container finished" podID="c28960d2-0a72-41e2-b831-f4e123784e15" containerID="aa131b3415b391cc50bbca2089aca07a37a7f4911b5e27b2bb0313a87ef2cb96" exitCode=0 Apr 22 20:06:25.778600 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:25.778544 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-81bcf-predictor-745f867bd5-jj65m" event={"ID":"c28960d2-0a72-41e2-b831-f4e123784e15","Type":"ContainerDied","Data":"aa131b3415b391cc50bbca2089aca07a37a7f4911b5e27b2bb0313a87ef2cb96"} Apr 22 20:06:26.782697 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:26.782666 2575 generic.go:358] "Generic (PLEG): container finished" podID="5ad94efb-b19f-4f67-8525-442359a14999" containerID="c2cb35794616d59ba1246f1a478cc3c579de411b77db028337c3ffeb8a1b0734" exitCode=0 Apr 22 20:06:26.783209 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:26.782738 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-81bcf-predictor-65c8ddb5cf-7nlsx" event={"ID":"5ad94efb-b19f-4f67-8525-442359a14999","Type":"ContainerDied","Data":"c2cb35794616d59ba1246f1a478cc3c579de411b77db028337c3ffeb8a1b0734"} Apr 22 20:06:26.784270 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:26.784247 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-81bcf-predictor-745f867bd5-jj65m" event={"ID":"c28960d2-0a72-41e2-b831-f4e123784e15","Type":"ContainerStarted","Data":"4ad11ad393c738b6864b1d863f82be0af955849d4e1920de7d25eaee8f7082d1"} Apr 22 20:06:26.784488 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:26.784472 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-81bcf-predictor-745f867bd5-jj65m" Apr 22 20:06:26.785640 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:26.785618 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-81bcf-predictor-745f867bd5-jj65m" podUID="c28960d2-0a72-41e2-b831-f4e123784e15" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:8080: connect: connection refused" Apr 22 20:06:26.812119 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:26.812084 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-81bcf-predictor-745f867bd5-jj65m" podStartSLOduration=5.8120726430000005 
podStartE2EDuration="5.812072643s" podCreationTimestamp="2026-04-22 20:06:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:06:26.810927242 +0000 UTC m=+511.062427352" watchObservedRunningTime="2026-04-22 20:06:26.812072643 +0000 UTC m=+511.063572532" Apr 22 20:06:27.788313 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:27.788273 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-81bcf-predictor-745f867bd5-jj65m" podUID="c28960d2-0a72-41e2-b831-f4e123784e15" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:8080: connect: connection refused" Apr 22 20:06:34.584734 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:34.584693 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" podUID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused" Apr 22 20:06:34.586380 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:34.586348 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" podUID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:06:37.788704 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:37.788653 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-81bcf-predictor-745f867bd5-jj65m" podUID="c28960d2-0a72-41e2-b831-f4e123784e15" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:8080: connect: connection refused" Apr 22 20:06:44.584665 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:44.584622 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" podUID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.16:8080: connect: connection refused" Apr 22 20:06:44.585122 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:44.584770 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" Apr 22 20:06:44.586264 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:44.586227 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" podUID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:06:44.586428 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:44.586355 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" Apr 22 20:06:47.789407 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:47.789324 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-81bcf-predictor-745f867bd5-jj65m" podUID="c28960d2-0a72-41e2-b831-f4e123784e15" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:8080: connect: connection refused" Apr 22 20:06:48.857024 ip-10-0-128-239 kubenswrapper[2575]: 
I0422 20:06:48.856987 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-81bcf-predictor-65c8ddb5cf-7nlsx" event={"ID":"5ad94efb-b19f-4f67-8525-442359a14999","Type":"ContainerStarted","Data":"12da1dfbd5ae85c94875b44086867d9ec87cad8e9471680999b50c6c2d6a522b"} Apr 22 20:06:48.857452 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:48.857290 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-81bcf-predictor-65c8ddb5cf-7nlsx" Apr 22 20:06:48.858598 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:48.858573 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-81bcf-predictor-65c8ddb5cf-7nlsx" podUID="5ad94efb-b19f-4f67-8525-442359a14999" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused" Apr 22 20:06:48.871631 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:48.871592 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-81bcf-predictor-65c8ddb5cf-7nlsx" podStartSLOduration=6.662978148 podStartE2EDuration="27.871581032s" podCreationTimestamp="2026-04-22 20:06:21 +0000 UTC" firstStartedPulling="2026-04-22 20:06:26.784019364 +0000 UTC m=+511.035519231" lastFinishedPulling="2026-04-22 20:06:47.992622243 +0000 UTC m=+532.244122115" observedRunningTime="2026-04-22 20:06:48.870897483 +0000 UTC m=+533.122397373" watchObservedRunningTime="2026-04-22 20:06:48.871581032 +0000 UTC m=+533.123080921" Apr 22 20:06:49.860677 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:49.860637 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-81bcf-predictor-65c8ddb5cf-7nlsx" podUID="5ad94efb-b19f-4f67-8525-442359a14999" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused" Apr 22 20:06:51.613005 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:51.612983 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" Apr 22 20:06:51.627834 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:51.627813 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6c7431a5-1545-45be-9b51-ac8361ee3fcc-kserve-provision-location\") pod \"6c7431a5-1545-45be-9b51-ac8361ee3fcc\" (UID: \"6c7431a5-1545-45be-9b51-ac8361ee3fcc\") " Apr 22 20:06:51.628138 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:51.628115 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c7431a5-1545-45be-9b51-ac8361ee3fcc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6c7431a5-1545-45be-9b51-ac8361ee3fcc" (UID: "6c7431a5-1545-45be-9b51-ac8361ee3fcc"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:06:51.728364 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:51.728335 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6c7431a5-1545-45be-9b51-ac8361ee3fcc-kserve-provision-location\") on node \"ip-10-0-128-239.ec2.internal\" DevicePath \"\"" Apr 22 20:06:51.868468 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:51.868433 2575 generic.go:358] "Generic (PLEG): container finished" podID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" containerID="1332ed8d4be903aa60e996f5fd11134ab4294d8161c1521e2b5fcdf931b522c4" exitCode=0 Apr 22 20:06:51.868623 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:51.868520 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" Apr 22 20:06:51.868623 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:51.868533 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" event={"ID":"6c7431a5-1545-45be-9b51-ac8361ee3fcc","Type":"ContainerDied","Data":"1332ed8d4be903aa60e996f5fd11134ab4294d8161c1521e2b5fcdf931b522c4"} Apr 22 20:06:51.868623 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:51.868586 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m" event={"ID":"6c7431a5-1545-45be-9b51-ac8361ee3fcc","Type":"ContainerDied","Data":"dfe34d013c56cf35a6185096d9fcf4d31a245ce612e5fe6ef9406c02bd683828"} Apr 22 20:06:51.868623 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:51.868608 2575 scope.go:117] "RemoveContainer" containerID="1332ed8d4be903aa60e996f5fd11134ab4294d8161c1521e2b5fcdf931b522c4" Apr 22 20:06:51.875927 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:51.875898 2575 scope.go:117] "RemoveContainer" containerID="09ef23045e0c5b512897a80a1092faad792396910cda8ca340b92d8a2121ba20" Apr 22 20:06:51.883305 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:51.883290 2575 scope.go:117] "RemoveContainer" containerID="612ffc1842fa0498e25dc7c2d5dbb89947382a65afdfe125949737ef2bb95b82" Apr 22 20:06:51.890572 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:51.890546 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m"] Apr 22 20:06:51.891774 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:51.891267 2575 scope.go:117] "RemoveContainer" containerID="1332ed8d4be903aa60e996f5fd11134ab4294d8161c1521e2b5fcdf931b522c4" Apr 22 20:06:51.892243 ip-10-0-128-239 kubenswrapper[2575]: E0422 20:06:51.892108 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1332ed8d4be903aa60e996f5fd11134ab4294d8161c1521e2b5fcdf931b522c4\": container with ID starting with 1332ed8d4be903aa60e996f5fd11134ab4294d8161c1521e2b5fcdf931b522c4 not found: ID does not exist" containerID="1332ed8d4be903aa60e996f5fd11134ab4294d8161c1521e2b5fcdf931b522c4" Apr 22 20:06:51.892243 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:51.892142 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1332ed8d4be903aa60e996f5fd11134ab4294d8161c1521e2b5fcdf931b522c4"} err="failed to get container status \"1332ed8d4be903aa60e996f5fd11134ab4294d8161c1521e2b5fcdf931b522c4\": rpc error: code = NotFound desc = could not find 
container \"1332ed8d4be903aa60e996f5fd11134ab4294d8161c1521e2b5fcdf931b522c4\": container with ID starting with 1332ed8d4be903aa60e996f5fd11134ab4294d8161c1521e2b5fcdf931b522c4 not found: ID does not exist" Apr 22 20:06:51.892243 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:51.892165 2575 scope.go:117] "RemoveContainer" containerID="09ef23045e0c5b512897a80a1092faad792396910cda8ca340b92d8a2121ba20" Apr 22 20:06:51.892540 ip-10-0-128-239 kubenswrapper[2575]: E0422 20:06:51.892520 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09ef23045e0c5b512897a80a1092faad792396910cda8ca340b92d8a2121ba20\": container with ID starting with 09ef23045e0c5b512897a80a1092faad792396910cda8ca340b92d8a2121ba20 not found: ID does not exist" containerID="09ef23045e0c5b512897a80a1092faad792396910cda8ca340b92d8a2121ba20" Apr 22 20:06:51.892622 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:51.892548 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09ef23045e0c5b512897a80a1092faad792396910cda8ca340b92d8a2121ba20"} err="failed to get container status \"09ef23045e0c5b512897a80a1092faad792396910cda8ca340b92d8a2121ba20\": rpc error: code = NotFound desc = could not find container \"09ef23045e0c5b512897a80a1092faad792396910cda8ca340b92d8a2121ba20\": container with ID starting with 09ef23045e0c5b512897a80a1092faad792396910cda8ca340b92d8a2121ba20 not found: ID does not exist" Apr 22 20:06:51.892622 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:51.892570 2575 scope.go:117] "RemoveContainer" containerID="612ffc1842fa0498e25dc7c2d5dbb89947382a65afdfe125949737ef2bb95b82" Apr 22 20:06:51.892852 ip-10-0-128-239 kubenswrapper[2575]: E0422 20:06:51.892834 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"612ffc1842fa0498e25dc7c2d5dbb89947382a65afdfe125949737ef2bb95b82\": container with ID starting with 612ffc1842fa0498e25dc7c2d5dbb89947382a65afdfe125949737ef2bb95b82 not found: ID does not exist" containerID="612ffc1842fa0498e25dc7c2d5dbb89947382a65afdfe125949737ef2bb95b82" Apr 22 20:06:51.892913 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:51.892859 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"612ffc1842fa0498e25dc7c2d5dbb89947382a65afdfe125949737ef2bb95b82"} err="failed to get container status \"612ffc1842fa0498e25dc7c2d5dbb89947382a65afdfe125949737ef2bb95b82\": rpc error: code = NotFound desc = could not find container \"612ffc1842fa0498e25dc7c2d5dbb89947382a65afdfe125949737ef2bb95b82\": container with ID starting with 612ffc1842fa0498e25dc7c2d5dbb89947382a65afdfe125949737ef2bb95b82 not found: ID does not exist" Apr 22 20:06:51.893359 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:51.893341 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-586f1-predictor-55776767cf-j6t2m"] Apr 22 20:06:52.303045 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:52.302961 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" path="/var/lib/kubelet/pods/6c7431a5-1545-45be-9b51-ac8361ee3fcc/volumes" Apr 22 20:06:57.788932 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:57.788896 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-81bcf-predictor-745f867bd5-jj65m" podUID="c28960d2-0a72-41e2-b831-f4e123784e15" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:8080: connect: connection refused" Apr 22 20:06:59.861351 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:06:59.861302 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-81bcf-predictor-65c8ddb5cf-7nlsx" podUID="5ad94efb-b19f-4f67-8525-442359a14999" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused" Apr 22 20:07:07.788835 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:07.788792 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-81bcf-predictor-745f867bd5-jj65m" podUID="c28960d2-0a72-41e2-b831-f4e123784e15" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:8080: connect: connection refused" Apr 22 20:07:09.861445 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:09.861382 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-81bcf-predictor-65c8ddb5cf-7nlsx" podUID="5ad94efb-b19f-4f67-8525-442359a14999" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused" Apr 22 20:07:17.788960 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:17.788904 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-81bcf-predictor-745f867bd5-jj65m" podUID="c28960d2-0a72-41e2-b831-f4e123784e15" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:8080: connect: connection refused" Apr 22 20:07:19.861127 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:19.861082 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-81bcf-predictor-65c8ddb5cf-7nlsx" podUID="5ad94efb-b19f-4f67-8525-442359a14999" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused" Apr 22 20:07:27.788464 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:27.788423 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-81bcf-predictor-745f867bd5-jj65m" podUID="c28960d2-0a72-41e2-b831-f4e123784e15" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.17:8080: connect: connection refused" Apr 22 20:07:29.860997 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:29.860949 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-81bcf-predictor-65c8ddb5cf-7nlsx" podUID="5ad94efb-b19f-4f67-8525-442359a14999" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused" Apr 22 20:07:37.789234 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:37.789200 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-81bcf-predictor-745f867bd5-jj65m" Apr 22 20:07:39.861289 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:39.861245 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-81bcf-predictor-65c8ddb5cf-7nlsx" podUID="5ad94efb-b19f-4f67-8525-442359a14999" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.18:8080: connect: connection refused" Apr 22 20:07:49.862275 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:49.862241 2575 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-81bcf-predictor-65c8ddb5cf-7nlsx" Apr 22 20:07:51.579364 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:51.579331 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-81bcf-f88d6597b-xj4kr"] Apr 22 20:07:51.579792 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:51.579610 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" containerName="kserve-container" Apr 22 20:07:51.579792 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:51.579624 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" containerName="kserve-container" Apr 22 20:07:51.579792 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:51.579643 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" containerName="storage-initializer" Apr 22 20:07:51.579792 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:51.579649 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" containerName="storage-initializer" Apr 22 20:07:51.579792 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:51.579655 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" containerName="agent" Apr 22 20:07:51.579792 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:51.579660 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" containerName="agent" Apr 22 20:07:51.579792 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:51.579704 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" containerName="agent" Apr 22 20:07:51.579792 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:51.579711 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="6c7431a5-1545-45be-9b51-ac8361ee3fcc" containerName="kserve-container" Apr 22 20:07:51.582287 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:51.582271 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-81bcf-f88d6597b-xj4kr" Apr 22 20:07:51.584525 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:51.584503 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-81bcf-serving-cert\"" Apr 22 20:07:51.584675 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:51.584655 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 20:07:51.584756 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:51.584660 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-81bcf-kube-rbac-proxy-sar-config\"" Apr 22 20:07:51.590024 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:51.590003 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-81bcf-f88d6597b-xj4kr"] Apr 22 20:07:51.634288 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:51.634262 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/864b82cb-96bf-4004-a936-729e7e69e8b8-proxy-tls\") pod \"model-chainer-raw-81bcf-f88d6597b-xj4kr\" (UID: \"864b82cb-96bf-4004-a936-729e7e69e8b8\") " pod="kserve-ci-e2e-test/model-chainer-raw-81bcf-f88d6597b-xj4kr" Apr 22 20:07:51.634379 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:51.634306 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/864b82cb-96bf-4004-a936-729e7e69e8b8-openshift-service-ca-bundle\") pod \"model-chainer-raw-81bcf-f88d6597b-xj4kr\" (UID: \"864b82cb-96bf-4004-a936-729e7e69e8b8\") " pod="kserve-ci-e2e-test/model-chainer-raw-81bcf-f88d6597b-xj4kr" Apr 22 20:07:51.735300 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:51.735274 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/864b82cb-96bf-4004-a936-729e7e69e8b8-proxy-tls\") pod \"model-chainer-raw-81bcf-f88d6597b-xj4kr\" (UID: \"864b82cb-96bf-4004-a936-729e7e69e8b8\") " pod="kserve-ci-e2e-test/model-chainer-raw-81bcf-f88d6597b-xj4kr" Apr 22 20:07:51.735404 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:51.735326 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/864b82cb-96bf-4004-a936-729e7e69e8b8-openshift-service-ca-bundle\") pod \"model-chainer-raw-81bcf-f88d6597b-xj4kr\" (UID: \"864b82cb-96bf-4004-a936-729e7e69e8b8\") " pod="kserve-ci-e2e-test/model-chainer-raw-81bcf-f88d6597b-xj4kr" Apr 22 20:07:51.735465 ip-10-0-128-239 kubenswrapper[2575]: E0422 20:07:51.735404 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-81bcf-serving-cert: secret "model-chainer-raw-81bcf-serving-cert" not found Apr 22 20:07:51.735520 ip-10-0-128-239 kubenswrapper[2575]: E0422 20:07:51.735482 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/864b82cb-96bf-4004-a936-729e7e69e8b8-proxy-tls podName:864b82cb-96bf-4004-a936-729e7e69e8b8 nodeName:}" failed. No retries permitted until 2026-04-22 20:07:52.235464589 +0000 UTC m=+596.486964456 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/864b82cb-96bf-4004-a936-729e7e69e8b8-proxy-tls") pod "model-chainer-raw-81bcf-f88d6597b-xj4kr" (UID: "864b82cb-96bf-4004-a936-729e7e69e8b8") : secret "model-chainer-raw-81bcf-serving-cert" not found Apr 22 20:07:51.735988 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:51.735967 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/864b82cb-96bf-4004-a936-729e7e69e8b8-openshift-service-ca-bundle\") pod \"model-chainer-raw-81bcf-f88d6597b-xj4kr\" (UID: \"864b82cb-96bf-4004-a936-729e7e69e8b8\") " pod="kserve-ci-e2e-test/model-chainer-raw-81bcf-f88d6597b-xj4kr" Apr 22 20:07:52.239146 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:52.239101 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/864b82cb-96bf-4004-a936-729e7e69e8b8-proxy-tls\") pod \"model-chainer-raw-81bcf-f88d6597b-xj4kr\" (UID: \"864b82cb-96bf-4004-a936-729e7e69e8b8\") " pod="kserve-ci-e2e-test/model-chainer-raw-81bcf-f88d6597b-xj4kr" Apr 22 20:07:52.241414 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:52.241385 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/864b82cb-96bf-4004-a936-729e7e69e8b8-proxy-tls\") pod \"model-chainer-raw-81bcf-f88d6597b-xj4kr\" (UID: \"864b82cb-96bf-4004-a936-729e7e69e8b8\") " pod="kserve-ci-e2e-test/model-chainer-raw-81bcf-f88d6597b-xj4kr" Apr 22 20:07:52.492318 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:52.492236 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-81bcf-f88d6597b-xj4kr" Apr 22 20:07:52.605326 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:52.605296 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-81bcf-f88d6597b-xj4kr"] Apr 22 20:07:52.608247 ip-10-0-128-239 kubenswrapper[2575]: W0422 20:07:52.608224 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod864b82cb_96bf_4004_a936_729e7e69e8b8.slice/crio-685903ed98f543f61b1d502b2596be66f7f4871ec7ad5abe01ed4f90ed613eeb WatchSource:0}: Error finding container 685903ed98f543f61b1d502b2596be66f7f4871ec7ad5abe01ed4f90ed613eeb: Status 404 returned error can't find the container with id 685903ed98f543f61b1d502b2596be66f7f4871ec7ad5abe01ed4f90ed613eeb Apr 22 20:07:53.030765 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:53.030734 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-81bcf-f88d6597b-xj4kr" event={"ID":"864b82cb-96bf-4004-a936-729e7e69e8b8","Type":"ContainerStarted","Data":"685903ed98f543f61b1d502b2596be66f7f4871ec7ad5abe01ed4f90ed613eeb"} Apr 22 20:07:56.038973 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:56.038939 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-81bcf-f88d6597b-xj4kr" event={"ID":"864b82cb-96bf-4004-a936-729e7e69e8b8","Type":"ContainerStarted","Data":"46713e392b5b140e0df71f22f28010df643edf636ed5152e37661fbf96b3e2fc"} Apr 22 20:07:56.039450 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:07:56.039072 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-81bcf-f88d6597b-xj4kr" Apr 22 20:07:56.059624 ip-10-0-128-239 
kubenswrapper[2575]: I0422 20:07:56.059566 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-81bcf-f88d6597b-xj4kr" podStartSLOduration=2.674654254 podStartE2EDuration="5.05954707s" podCreationTimestamp="2026-04-22 20:07:51 +0000 UTC" firstStartedPulling="2026-04-22 20:07:52.610423114 +0000 UTC m=+596.861922981" lastFinishedPulling="2026-04-22 20:07:54.99531593 +0000 UTC m=+599.246815797" observedRunningTime="2026-04-22 20:07:56.057458855 +0000 UTC m=+600.308958746" watchObservedRunningTime="2026-04-22 20:07:56.05954707 +0000 UTC m=+600.311046960" Apr 22 20:08:01.610066 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:01.610017 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-81bcf-f88d6597b-xj4kr"] Apr 22 20:08:01.610541 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:01.610306 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-81bcf-f88d6597b-xj4kr" podUID="864b82cb-96bf-4004-a936-729e7e69e8b8" containerName="model-chainer-raw-81bcf" containerID="cri-o://46713e392b5b140e0df71f22f28010df643edf636ed5152e37661fbf96b3e2fc" gracePeriod=30 Apr 22 20:08:01.617220 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:01.617185 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-81bcf-f88d6597b-xj4kr" podUID="864b82cb-96bf-4004-a936-729e7e69e8b8" containerName="model-chainer-raw-81bcf" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:08:01.763732 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:01.763698 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-81bcf-predictor-745f867bd5-jj65m"] Apr 22 20:08:01.764079 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:01.764026 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-81bcf-predictor-745f867bd5-jj65m" podUID="c28960d2-0a72-41e2-b831-f4e123784e15" containerName="kserve-container" containerID="cri-o://4ad11ad393c738b6864b1d863f82be0af955849d4e1920de7d25eaee8f7082d1" gracePeriod=30 Apr 22 20:08:01.795744 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:01.795715 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3e39c-predictor-57c8568889-x96mr"] Apr 22 20:08:01.807676 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:01.807653 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3e39c-predictor-57c8568889-x96mr"] Apr 22 20:08:01.807804 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:01.807797 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3e39c-predictor-57c8568889-x96mr" Apr 22 20:08:01.866728 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:01.866669 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3e39c-predictor-7cf7df947-6rttt"] Apr 22 20:08:01.870086 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:01.870068 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3e39c-predictor-7cf7df947-6rttt" Apr 22 20:08:01.879740 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:01.879719 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3e39c-predictor-7cf7df947-6rttt"] Apr 22 20:08:01.908528 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:01.908500 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ebf23c2a-33ad-46fc-9385-3cbe6370d937-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-3e39c-predictor-57c8568889-x96mr\" (UID: \"ebf23c2a-33ad-46fc-9385-3cbe6370d937\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3e39c-predictor-57c8568889-x96mr" Apr 22 20:08:01.962446 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:01.962417 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-81bcf-predictor-65c8ddb5cf-7nlsx"] Apr 22 20:08:01.962721 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:01.962696 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-81bcf-predictor-65c8ddb5cf-7nlsx" podUID="5ad94efb-b19f-4f67-8525-442359a14999" containerName="kserve-container" containerID="cri-o://12da1dfbd5ae85c94875b44086867d9ec87cad8e9471680999b50c6c2d6a522b" gracePeriod=30 Apr 22 20:08:02.009372 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:02.009343 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ebf23c2a-33ad-46fc-9385-3cbe6370d937-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-3e39c-predictor-57c8568889-x96mr\" (UID: \"ebf23c2a-33ad-46fc-9385-3cbe6370d937\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3e39c-predictor-57c8568889-x96mr" Apr 22 20:08:02.009492 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:02.009386 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/296784ea-f704-464e-be95-0525b956e003-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-3e39c-predictor-7cf7df947-6rttt\" (UID: \"296784ea-f704-464e-be95-0525b956e003\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3e39c-predictor-7cf7df947-6rttt" Apr 22 20:08:02.009733 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:02.009713 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ebf23c2a-33ad-46fc-9385-3cbe6370d937-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-3e39c-predictor-57c8568889-x96mr\" (UID: \"ebf23c2a-33ad-46fc-9385-3cbe6370d937\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3e39c-predictor-57c8568889-x96mr" Apr 22 20:08:02.109730 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:02.109700 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/296784ea-f704-464e-be95-0525b956e003-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-3e39c-predictor-7cf7df947-6rttt\" (UID: \"296784ea-f704-464e-be95-0525b956e003\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3e39c-predictor-7cf7df947-6rttt" Apr 22 20:08:02.110055 ip-10-0-128-239 
kubenswrapper[2575]: I0422 20:08:02.110021 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/296784ea-f704-464e-be95-0525b956e003-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-3e39c-predictor-7cf7df947-6rttt\" (UID: \"296784ea-f704-464e-be95-0525b956e003\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3e39c-predictor-7cf7df947-6rttt" Apr 22 20:08:02.120834 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:02.120816 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3e39c-predictor-57c8568889-x96mr" Apr 22 20:08:02.180191 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:02.180159 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3e39c-predictor-7cf7df947-6rttt" Apr 22 20:08:02.251319 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:02.251271 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3e39c-predictor-57c8568889-x96mr"] Apr 22 20:08:02.253817 ip-10-0-128-239 kubenswrapper[2575]: W0422 20:08:02.253791 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebf23c2a_33ad_46fc_9385_3cbe6370d937.slice/crio-7980f1ebf65969c6491b7d1ec06000d9b1760f5397e391078d8c529aefacefbf WatchSource:0}: Error finding container 7980f1ebf65969c6491b7d1ec06000d9b1760f5397e391078d8c529aefacefbf: Status 404 returned error can't find the container with id 7980f1ebf65969c6491b7d1ec06000d9b1760f5397e391078d8c529aefacefbf Apr 22 20:08:02.312023 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:02.311997 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3e39c-predictor-7cf7df947-6rttt"] Apr 22 20:08:02.314671 ip-10-0-128-239 kubenswrapper[2575]: W0422 20:08:02.314645 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod296784ea_f704_464e_be95_0525b956e003.slice/crio-1e21c4c998dbc44501c2c6426010abbc4ba5aed5ee6cf5f1a799e8b1d2ca278d WatchSource:0}: Error finding container 1e21c4c998dbc44501c2c6426010abbc4ba5aed5ee6cf5f1a799e8b1d2ca278d: Status 404 returned error can't find the container with id 1e21c4c998dbc44501c2c6426010abbc4ba5aed5ee6cf5f1a799e8b1d2ca278d Apr 22 20:08:03.057884 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:03.057851 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3e39c-predictor-57c8568889-x96mr" event={"ID":"ebf23c2a-33ad-46fc-9385-3cbe6370d937","Type":"ContainerStarted","Data":"36a732b91a562a85efd07d210d4c4a2b7c3691ea5cfa50737fc77838601e6b8d"} Apr 22 20:08:03.057884 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:03.057884 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3e39c-predictor-57c8568889-x96mr" event={"ID":"ebf23c2a-33ad-46fc-9385-3cbe6370d937","Type":"ContainerStarted","Data":"7980f1ebf65969c6491b7d1ec06000d9b1760f5397e391078d8c529aefacefbf"} Apr 22 20:08:03.059071 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:03.059049 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3e39c-predictor-7cf7df947-6rttt" 
event={"ID":"296784ea-f704-464e-be95-0525b956e003","Type":"ContainerStarted","Data":"071a7f9f80d703de924ae96e7c871e0329f16a647452a5befe0679fd94d1d9f8"} Apr 22 20:08:03.059174 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:03.059075 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3e39c-predictor-7cf7df947-6rttt" event={"ID":"296784ea-f704-464e-be95-0525b956e003","Type":"ContainerStarted","Data":"1e21c4c998dbc44501c2c6426010abbc4ba5aed5ee6cf5f1a799e8b1d2ca278d"} Apr 22 20:08:05.394373 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:05.394352 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-81bcf-predictor-65c8ddb5cf-7nlsx" Apr 22 20:08:05.535072 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:05.534985 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ad94efb-b19f-4f67-8525-442359a14999-kserve-provision-location\") pod \"5ad94efb-b19f-4f67-8525-442359a14999\" (UID: \"5ad94efb-b19f-4f67-8525-442359a14999\") " Apr 22 20:08:05.535328 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:05.535301 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ad94efb-b19f-4f67-8525-442359a14999-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5ad94efb-b19f-4f67-8525-442359a14999" (UID: "5ad94efb-b19f-4f67-8525-442359a14999"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:08:05.635638 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:05.635614 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ad94efb-b19f-4f67-8525-442359a14999-kserve-provision-location\") on node \"ip-10-0-128-239.ec2.internal\" DevicePath \"\"" Apr 22 20:08:05.796634 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:05.796614 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-81bcf-predictor-745f867bd5-jj65m" Apr 22 20:08:05.938319 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:05.938294 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c28960d2-0a72-41e2-b831-f4e123784e15-kserve-provision-location\") pod \"c28960d2-0a72-41e2-b831-f4e123784e15\" (UID: \"c28960d2-0a72-41e2-b831-f4e123784e15\") " Apr 22 20:08:05.938611 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:05.938586 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c28960d2-0a72-41e2-b831-f4e123784e15-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c28960d2-0a72-41e2-b831-f4e123784e15" (UID: "c28960d2-0a72-41e2-b831-f4e123784e15"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:08:06.039741 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:06.039658 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c28960d2-0a72-41e2-b831-f4e123784e15-kserve-provision-location\") on node \"ip-10-0-128-239.ec2.internal\" DevicePath \"\""
Apr 22 20:08:06.069821 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:06.069789 2575 generic.go:358] "Generic (PLEG): container finished" podID="c28960d2-0a72-41e2-b831-f4e123784e15" containerID="4ad11ad393c738b6864b1d863f82be0af955849d4e1920de7d25eaee8f7082d1" exitCode=0
Apr 22 20:08:06.069928 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:06.069860 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-81bcf-predictor-745f867bd5-jj65m"
Apr 22 20:08:06.069928 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:06.069865 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-81bcf-predictor-745f867bd5-jj65m" event={"ID":"c28960d2-0a72-41e2-b831-f4e123784e15","Type":"ContainerDied","Data":"4ad11ad393c738b6864b1d863f82be0af955849d4e1920de7d25eaee8f7082d1"}
Apr 22 20:08:06.069928 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:06.069911 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-81bcf-predictor-745f867bd5-jj65m" event={"ID":"c28960d2-0a72-41e2-b831-f4e123784e15","Type":"ContainerDied","Data":"f0b6d86a2e7c87aeb90c2969a013a86b1875b805c02c02a401c3a9d57782eab7"}
Apr 22 20:08:06.069928 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:06.069927 2575 scope.go:117] "RemoveContainer" containerID="4ad11ad393c738b6864b1d863f82be0af955849d4e1920de7d25eaee8f7082d1"
Apr 22 20:08:06.071294 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:06.071268 2575 generic.go:358] "Generic (PLEG): container finished" podID="5ad94efb-b19f-4f67-8525-442359a14999" containerID="12da1dfbd5ae85c94875b44086867d9ec87cad8e9471680999b50c6c2d6a522b" exitCode=0
Apr 22 20:08:06.071401 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:06.071313 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-81bcf-predictor-65c8ddb5cf-7nlsx" event={"ID":"5ad94efb-b19f-4f67-8525-442359a14999","Type":"ContainerDied","Data":"12da1dfbd5ae85c94875b44086867d9ec87cad8e9471680999b50c6c2d6a522b"}
Apr 22 20:08:06.071401 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:06.071333 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-81bcf-predictor-65c8ddb5cf-7nlsx" event={"ID":"5ad94efb-b19f-4f67-8525-442359a14999","Type":"ContainerDied","Data":"ceed2eb72852c5b961fae7ab1dce48fcf69f2f0e7c9db9e822ff5aa7530ce23f"}
Apr 22 20:08:06.071401 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:06.071350 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-81bcf-predictor-65c8ddb5cf-7nlsx"
Apr 22 20:08:06.123626 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:06.123602 2575 scope.go:117] "RemoveContainer" containerID="aa131b3415b391cc50bbca2089aca07a37a7f4911b5e27b2bb0313a87ef2cb96"
Apr 22 20:08:06.130147 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:06.130130 2575 scope.go:117] "RemoveContainer" containerID="4ad11ad393c738b6864b1d863f82be0af955849d4e1920de7d25eaee8f7082d1"
Apr 22 20:08:06.130395 ip-10-0-128-239 kubenswrapper[2575]: E0422 20:08:06.130370 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ad11ad393c738b6864b1d863f82be0af955849d4e1920de7d25eaee8f7082d1\": container with ID starting with 4ad11ad393c738b6864b1d863f82be0af955849d4e1920de7d25eaee8f7082d1 not found: ID does not exist" containerID="4ad11ad393c738b6864b1d863f82be0af955849d4e1920de7d25eaee8f7082d1"
Apr 22 20:08:06.130437 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:06.130407 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ad11ad393c738b6864b1d863f82be0af955849d4e1920de7d25eaee8f7082d1"} err="failed to get container status \"4ad11ad393c738b6864b1d863f82be0af955849d4e1920de7d25eaee8f7082d1\": rpc error: code = NotFound desc = could not find container \"4ad11ad393c738b6864b1d863f82be0af955849d4e1920de7d25eaee8f7082d1\": container with ID starting with 4ad11ad393c738b6864b1d863f82be0af955849d4e1920de7d25eaee8f7082d1 not found: ID does not exist"
Apr 22 20:08:06.130437 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:06.130432 2575 scope.go:117] "RemoveContainer" containerID="aa131b3415b391cc50bbca2089aca07a37a7f4911b5e27b2bb0313a87ef2cb96"
Apr 22 20:08:06.130653 ip-10-0-128-239 kubenswrapper[2575]: E0422 20:08:06.130633 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa131b3415b391cc50bbca2089aca07a37a7f4911b5e27b2bb0313a87ef2cb96\": container with ID starting with aa131b3415b391cc50bbca2089aca07a37a7f4911b5e27b2bb0313a87ef2cb96 not found: ID does not exist" containerID="aa131b3415b391cc50bbca2089aca07a37a7f4911b5e27b2bb0313a87ef2cb96"
Apr 22 20:08:06.130739 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:06.130656 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa131b3415b391cc50bbca2089aca07a37a7f4911b5e27b2bb0313a87ef2cb96"} err="failed to get container status \"aa131b3415b391cc50bbca2089aca07a37a7f4911b5e27b2bb0313a87ef2cb96\": rpc error: code = NotFound desc = could not find container \"aa131b3415b391cc50bbca2089aca07a37a7f4911b5e27b2bb0313a87ef2cb96\": container with ID starting with aa131b3415b391cc50bbca2089aca07a37a7f4911b5e27b2bb0313a87ef2cb96 not found: ID does not exist"
Apr 22 20:08:06.130739 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:06.130671 2575 scope.go:117] "RemoveContainer" containerID="12da1dfbd5ae85c94875b44086867d9ec87cad8e9471680999b50c6c2d6a522b"
Apr 22 20:08:06.136743 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:06.136718 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-81bcf-predictor-65c8ddb5cf-7nlsx"]
Apr 22 20:08:06.137714 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:06.137697 2575 scope.go:117] "RemoveContainer" containerID="c2cb35794616d59ba1246f1a478cc3c579de411b77db028337c3ffeb8a1b0734"
Apr 22 20:08:06.139755 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:06.139731 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-81bcf-predictor-65c8ddb5cf-7nlsx"]
Apr 22 20:08:06.145244 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:06.145180 2575 scope.go:117] "RemoveContainer" containerID="12da1dfbd5ae85c94875b44086867d9ec87cad8e9471680999b50c6c2d6a522b"
Apr 22 20:08:06.145474 ip-10-0-128-239 kubenswrapper[2575]: E0422 20:08:06.145456 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12da1dfbd5ae85c94875b44086867d9ec87cad8e9471680999b50c6c2d6a522b\": container with ID starting with 12da1dfbd5ae85c94875b44086867d9ec87cad8e9471680999b50c6c2d6a522b not found: ID does not exist" containerID="12da1dfbd5ae85c94875b44086867d9ec87cad8e9471680999b50c6c2d6a522b"
Apr 22 20:08:06.145539 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:06.145481 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12da1dfbd5ae85c94875b44086867d9ec87cad8e9471680999b50c6c2d6a522b"} err="failed to get container status \"12da1dfbd5ae85c94875b44086867d9ec87cad8e9471680999b50c6c2d6a522b\": rpc error: code = NotFound desc = could not find container \"12da1dfbd5ae85c94875b44086867d9ec87cad8e9471680999b50c6c2d6a522b\": container with ID starting with 12da1dfbd5ae85c94875b44086867d9ec87cad8e9471680999b50c6c2d6a522b not found: ID does not exist"
Apr 22 20:08:06.145539 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:06.145499 2575 scope.go:117] "RemoveContainer" containerID="c2cb35794616d59ba1246f1a478cc3c579de411b77db028337c3ffeb8a1b0734"
Apr 22 20:08:06.145787 ip-10-0-128-239 kubenswrapper[2575]: E0422 20:08:06.145768 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2cb35794616d59ba1246f1a478cc3c579de411b77db028337c3ffeb8a1b0734\": container with ID starting with c2cb35794616d59ba1246f1a478cc3c579de411b77db028337c3ffeb8a1b0734 not found: ID does not exist" containerID="c2cb35794616d59ba1246f1a478cc3c579de411b77db028337c3ffeb8a1b0734"
Apr 22 20:08:06.145845 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:06.145791 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2cb35794616d59ba1246f1a478cc3c579de411b77db028337c3ffeb8a1b0734"} err="failed to get container status \"c2cb35794616d59ba1246f1a478cc3c579de411b77db028337c3ffeb8a1b0734\": rpc error: code = NotFound desc = could not find container \"c2cb35794616d59ba1246f1a478cc3c579de411b77db028337c3ffeb8a1b0734\": container with ID starting with c2cb35794616d59ba1246f1a478cc3c579de411b77db028337c3ffeb8a1b0734 not found: ID does not exist"
Apr 22 20:08:06.148799 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:06.148778 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-81bcf-predictor-745f867bd5-jj65m"]
Apr 22 20:08:06.150520 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:06.150501 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-81bcf-predictor-745f867bd5-jj65m"]
Apr 22 20:08:06.303564 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:06.303514 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ad94efb-b19f-4f67-8525-442359a14999" path="/var/lib/kubelet/pods/5ad94efb-b19f-4f67-8525-442359a14999/volumes"
Apr 22 20:08:06.303859 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:06.303846 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c28960d2-0a72-41e2-b831-f4e123784e15" path="/var/lib/kubelet/pods/c28960d2-0a72-41e2-b831-f4e123784e15/volumes"
Apr 22 20:08:06.614442 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:06.614361 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-81bcf-f88d6597b-xj4kr" podUID="864b82cb-96bf-4004-a936-729e7e69e8b8" containerName="model-chainer-raw-81bcf" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 20:08:07.075188 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:07.075155 2575 generic.go:358] "Generic (PLEG): container finished" podID="296784ea-f704-464e-be95-0525b956e003" containerID="071a7f9f80d703de924ae96e7c871e0329f16a647452a5befe0679fd94d1d9f8" exitCode=0
Apr 22 20:08:07.075445 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:07.075231 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3e39c-predictor-7cf7df947-6rttt" event={"ID":"296784ea-f704-464e-be95-0525b956e003","Type":"ContainerDied","Data":"071a7f9f80d703de924ae96e7c871e0329f16a647452a5befe0679fd94d1d9f8"}
Apr 22 20:08:07.077852 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:07.077831 2575 generic.go:358] "Generic (PLEG): container finished" podID="ebf23c2a-33ad-46fc-9385-3cbe6370d937" containerID="36a732b91a562a85efd07d210d4c4a2b7c3691ea5cfa50737fc77838601e6b8d" exitCode=0
Apr 22 20:08:07.077929 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:07.077880 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3e39c-predictor-57c8568889-x96mr" event={"ID":"ebf23c2a-33ad-46fc-9385-3cbe6370d937","Type":"ContainerDied","Data":"36a732b91a562a85efd07d210d4c4a2b7c3691ea5cfa50737fc77838601e6b8d"}
Apr 22 20:08:08.086431 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:08.086396 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3e39c-predictor-57c8568889-x96mr" event={"ID":"ebf23c2a-33ad-46fc-9385-3cbe6370d937","Type":"ContainerStarted","Data":"1e15368d1776e1d09397e796df1d26ad3fd83160ad428092b9a6d5301ba84a74"}
Apr 22 20:08:08.086887 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:08.086714 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3e39c-predictor-57c8568889-x96mr"
Apr 22 20:08:08.087941 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:08.087907 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3e39c-predictor-7cf7df947-6rttt" event={"ID":"296784ea-f704-464e-be95-0525b956e003","Type":"ContainerStarted","Data":"d88fe9302c93f4c7e0f21afc41a9b8020468d90e2067c2134b0a7c13e61a643e"}
Apr 22 20:08:08.088294 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:08.088250 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3e39c-predictor-57c8568889-x96mr" podUID="ebf23c2a-33ad-46fc-9385-3cbe6370d937" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused"
Apr 22 20:08:08.088473 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:08.088293 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3e39c-predictor-7cf7df947-6rttt"
Apr 22 20:08:08.089228 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:08.089204 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3e39c-predictor-7cf7df947-6rttt" podUID="296784ea-f704-464e-be95-0525b956e003" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused"
Apr 22 20:08:08.103131 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:08.103087 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3e39c-predictor-57c8568889-x96mr" podStartSLOduration=7.103073756 podStartE2EDuration="7.103073756s" podCreationTimestamp="2026-04-22 20:08:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:08:08.100607466 +0000 UTC m=+612.352107387" watchObservedRunningTime="2026-04-22 20:08:08.103073756 +0000 UTC m=+612.354573646"
Apr 22 20:08:08.115896 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:08.115858 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3e39c-predictor-7cf7df947-6rttt" podStartSLOduration=7.115849666 podStartE2EDuration="7.115849666s" podCreationTimestamp="2026-04-22 20:08:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:08:08.114209449 +0000 UTC m=+612.365709338" watchObservedRunningTime="2026-04-22 20:08:08.115849666 +0000 UTC m=+612.367349937"
Apr 22 20:08:09.091608 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:09.091563 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3e39c-predictor-57c8568889-x96mr" podUID="ebf23c2a-33ad-46fc-9385-3cbe6370d937" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused"
Apr 22 20:08:09.091974 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:09.091563 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3e39c-predictor-7cf7df947-6rttt" podUID="296784ea-f704-464e-be95-0525b956e003" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused"
Apr 22 20:08:11.614382 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:11.614347 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-81bcf-f88d6597b-xj4kr" podUID="864b82cb-96bf-4004-a936-729e7e69e8b8" containerName="model-chainer-raw-81bcf" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 20:08:16.614774 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:16.614693 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-81bcf-f88d6597b-xj4kr" podUID="864b82cb-96bf-4004-a936-729e7e69e8b8" containerName="model-chainer-raw-81bcf" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 20:08:19.091576 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:19.091533 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3e39c-predictor-7cf7df947-6rttt" podUID="296784ea-f704-464e-be95-0525b956e003" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused"
Apr 22 20:08:19.091935 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:19.091533 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3e39c-predictor-57c8568889-x96mr" podUID="ebf23c2a-33ad-46fc-9385-3cbe6370d937" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused"
Apr 22 20:08:21.616949 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:21.616888 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-81bcf-f88d6597b-xj4kr" podUID="864b82cb-96bf-4004-a936-729e7e69e8b8" containerName="model-chainer-raw-81bcf" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 20:08:26.614008 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:26.613959 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-81bcf-f88d6597b-xj4kr" podUID="864b82cb-96bf-4004-a936-729e7e69e8b8" containerName="model-chainer-raw-81bcf" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 20:08:29.092258 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:29.092215 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3e39c-predictor-7cf7df947-6rttt" podUID="296784ea-f704-464e-be95-0525b956e003" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused"
Apr 22 20:08:29.092635 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:29.092216 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3e39c-predictor-57c8568889-x96mr" podUID="ebf23c2a-33ad-46fc-9385-3cbe6370d937" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused"
Apr 22 20:08:31.614300 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:31.614266 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-81bcf-f88d6597b-xj4kr" podUID="864b82cb-96bf-4004-a936-729e7e69e8b8" containerName="model-chainer-raw-81bcf" probeResult="failure" output="Get \"https://10.133.0.19:8080/readyz\": dial tcp 10.133.0.19:8080: connect: connection refused"
Apr 22 20:08:32.156968 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:32.156865 2575 generic.go:358] "Generic (PLEG): container finished" podID="864b82cb-96bf-4004-a936-729e7e69e8b8" containerID="46713e392b5b140e0df71f22f28010df643edf636ed5152e37661fbf96b3e2fc" exitCode=0
Apr 22 20:08:32.156968 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:32.156944 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-81bcf-f88d6597b-xj4kr" event={"ID":"864b82cb-96bf-4004-a936-729e7e69e8b8","Type":"ContainerDied","Data":"46713e392b5b140e0df71f22f28010df643edf636ed5152e37661fbf96b3e2fc"}
Apr 22 20:08:32.243337 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:32.243315 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-81bcf-f88d6597b-xj4kr"
Apr 22 20:08:32.414776 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:32.414704 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/864b82cb-96bf-4004-a936-729e7e69e8b8-openshift-service-ca-bundle\") pod \"864b82cb-96bf-4004-a936-729e7e69e8b8\" (UID: \"864b82cb-96bf-4004-a936-729e7e69e8b8\") "
Apr 22 20:08:32.414776 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:32.414745 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/864b82cb-96bf-4004-a936-729e7e69e8b8-proxy-tls\") pod \"864b82cb-96bf-4004-a936-729e7e69e8b8\" (UID: \"864b82cb-96bf-4004-a936-729e7e69e8b8\") "
Apr 22 20:08:32.415064 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:32.415021 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/864b82cb-96bf-4004-a936-729e7e69e8b8-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "864b82cb-96bf-4004-a936-729e7e69e8b8" (UID: "864b82cb-96bf-4004-a936-729e7e69e8b8"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 20:08:32.416803 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:32.416783 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/864b82cb-96bf-4004-a936-729e7e69e8b8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "864b82cb-96bf-4004-a936-729e7e69e8b8" (UID: "864b82cb-96bf-4004-a936-729e7e69e8b8"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 20:08:32.515329 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:32.515296 2575 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/864b82cb-96bf-4004-a936-729e7e69e8b8-openshift-service-ca-bundle\") on node \"ip-10-0-128-239.ec2.internal\" DevicePath \"\""
Apr 22 20:08:32.515329 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:32.515325 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/864b82cb-96bf-4004-a936-729e7e69e8b8-proxy-tls\") on node \"ip-10-0-128-239.ec2.internal\" DevicePath \"\""
Apr 22 20:08:33.161184 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:33.161154 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-81bcf-f88d6597b-xj4kr"
Apr 22 20:08:33.161644 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:33.161154 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-81bcf-f88d6597b-xj4kr" event={"ID":"864b82cb-96bf-4004-a936-729e7e69e8b8","Type":"ContainerDied","Data":"685903ed98f543f61b1d502b2596be66f7f4871ec7ad5abe01ed4f90ed613eeb"}
Apr 22 20:08:33.161644 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:33.161272 2575 scope.go:117] "RemoveContainer" containerID="46713e392b5b140e0df71f22f28010df643edf636ed5152e37661fbf96b3e2fc"
Apr 22 20:08:33.180798 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:33.180773 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-81bcf-f88d6597b-xj4kr"]
Apr 22 20:08:33.185700 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:33.185668 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-81bcf-f88d6597b-xj4kr"]
Apr 22 20:08:34.303755 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:34.303716 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="864b82cb-96bf-4004-a936-729e7e69e8b8" path="/var/lib/kubelet/pods/864b82cb-96bf-4004-a936-729e7e69e8b8/volumes"
Apr 22 20:08:39.092561 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:39.092512 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3e39c-predictor-57c8568889-x96mr" podUID="ebf23c2a-33ad-46fc-9385-3cbe6370d937" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused"
Apr 22 20:08:39.093023 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:39.092517 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3e39c-predictor-7cf7df947-6rttt" podUID="296784ea-f704-464e-be95-0525b956e003" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused"
Apr 22 20:08:49.091877 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:49.091835 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3e39c-predictor-7cf7df947-6rttt" podUID="296784ea-f704-464e-be95-0525b956e003" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused"
Apr 22 20:08:49.092278 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:49.091835 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3e39c-predictor-57c8568889-x96mr" podUID="ebf23c2a-33ad-46fc-9385-3cbe6370d937" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused"
Apr 22 20:08:59.092241 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:59.092200 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3e39c-predictor-7cf7df947-6rttt" podUID="296784ea-f704-464e-be95-0525b956e003" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.21:8080: connect: connection refused"
Apr 22 20:08:59.092619 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:08:59.092210 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3e39c-predictor-57c8568889-x96mr" podUID="ebf23c2a-33ad-46fc-9385-3cbe6370d937" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.20:8080: connect: connection refused"
Apr 22 20:09:09.092003 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:09.091975 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3e39c-predictor-57c8568889-x96mr"
Apr 22 20:09:09.092583 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:09.092356 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3e39c-predictor-7cf7df947-6rttt"
Apr 22 20:09:21.823755 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:21.823718 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4"]
Apr 22 20:09:21.824227 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:21.823967 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c28960d2-0a72-41e2-b831-f4e123784e15" containerName="storage-initializer"
Apr 22 20:09:21.824227 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:21.823979 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c28960d2-0a72-41e2-b831-f4e123784e15" containerName="storage-initializer"
Apr 22 20:09:21.824227 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:21.823991 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c28960d2-0a72-41e2-b831-f4e123784e15" containerName="kserve-container"
Apr 22 20:09:21.824227 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:21.823996 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c28960d2-0a72-41e2-b831-f4e123784e15" containerName="kserve-container"
Apr 22 20:09:21.824227 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:21.824003 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="864b82cb-96bf-4004-a936-729e7e69e8b8" containerName="model-chainer-raw-81bcf"
Apr 22 20:09:21.824227 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:21.824009 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="864b82cb-96bf-4004-a936-729e7e69e8b8" containerName="model-chainer-raw-81bcf"
Apr 22 20:09:21.824227 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:21.824024 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ad94efb-b19f-4f67-8525-442359a14999" containerName="kserve-container"
Apr 22 20:09:21.824227 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:21.824050 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ad94efb-b19f-4f67-8525-442359a14999" containerName="kserve-container"
Apr 22 20:09:21.824227 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:21.824059 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ad94efb-b19f-4f67-8525-442359a14999" containerName="storage-initializer"
Apr 22 20:09:21.824227 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:21.824065 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ad94efb-b19f-4f67-8525-442359a14999" containerName="storage-initializer"
Apr 22 20:09:21.824227 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:21.824105 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="864b82cb-96bf-4004-a936-729e7e69e8b8" containerName="model-chainer-raw-81bcf"
Apr 22 20:09:21.824227 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:21.824114 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c28960d2-0a72-41e2-b831-f4e123784e15" containerName="kserve-container"
Apr 22 20:09:21.824227 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:21.824120 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ad94efb-b19f-4f67-8525-442359a14999" containerName="kserve-container"
Apr 22 20:09:21.826916 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:21.826899 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4"
Apr 22 20:09:21.829105 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:21.829081 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 22 20:09:21.829248 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:21.829105 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-3e39c-kube-rbac-proxy-sar-config\""
Apr 22 20:09:21.829248 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:21.829221 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-3e39c-serving-cert\""
Apr 22 20:09:21.834399 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:21.834377 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4"]
Apr 22 20:09:21.843226 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:21.843205 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c010b35c-b6e6-4b4e-ba52-1d9a65bc7391-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4\" (UID: \"c010b35c-b6e6-4b4e-ba52-1d9a65bc7391\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4"
Apr 22 20:09:21.843323 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:21.843260 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c010b35c-b6e6-4b4e-ba52-1d9a65bc7391-proxy-tls\") pod \"model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4\" (UID: \"c010b35c-b6e6-4b4e-ba52-1d9a65bc7391\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4"
Apr 22 20:09:21.944567 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:21.944533 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c010b35c-b6e6-4b4e-ba52-1d9a65bc7391-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4\" (UID: \"c010b35c-b6e6-4b4e-ba52-1d9a65bc7391\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4"
Apr 22 20:09:21.944725 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:21.944580 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c010b35c-b6e6-4b4e-ba52-1d9a65bc7391-proxy-tls\") pod \"model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4\" (UID: \"c010b35c-b6e6-4b4e-ba52-1d9a65bc7391\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4"
Apr 22 20:09:21.944725 ip-10-0-128-239 kubenswrapper[2575]: E0422 20:09:21.944672 2575 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-hpa-3e39c-serving-cert: secret "model-chainer-raw-hpa-3e39c-serving-cert" not found
Apr 22 20:09:21.944811 ip-10-0-128-239 kubenswrapper[2575]: E0422 20:09:21.944735 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c010b35c-b6e6-4b4e-ba52-1d9a65bc7391-proxy-tls podName:c010b35c-b6e6-4b4e-ba52-1d9a65bc7391 nodeName:}" failed. No retries permitted until 2026-04-22 20:09:22.44471967 +0000 UTC m=+686.696219537 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c010b35c-b6e6-4b4e-ba52-1d9a65bc7391-proxy-tls") pod "model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4" (UID: "c010b35c-b6e6-4b4e-ba52-1d9a65bc7391") : secret "model-chainer-raw-hpa-3e39c-serving-cert" not found
Apr 22 20:09:21.945193 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:21.945171 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c010b35c-b6e6-4b4e-ba52-1d9a65bc7391-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4\" (UID: \"c010b35c-b6e6-4b4e-ba52-1d9a65bc7391\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4"
Apr 22 20:09:22.448299 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:22.448269 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c010b35c-b6e6-4b4e-ba52-1d9a65bc7391-proxy-tls\") pod \"model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4\" (UID: \"c010b35c-b6e6-4b4e-ba52-1d9a65bc7391\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4"
Apr 22 20:09:22.450564 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:22.450545 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c010b35c-b6e6-4b4e-ba52-1d9a65bc7391-proxy-tls\") pod \"model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4\" (UID: \"c010b35c-b6e6-4b4e-ba52-1d9a65bc7391\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4"
Apr 22 20:09:22.738614 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:22.738522 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4"
Apr 22 20:09:22.849648 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:22.849621 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4"]
Apr 22 20:09:22.852772 ip-10-0-128-239 kubenswrapper[2575]: W0422 20:09:22.852747 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc010b35c_b6e6_4b4e_ba52_1d9a65bc7391.slice/crio-9912952e63ba085a30383015d9e8820765f64cdbb238c4fa3d0cdfd8367e11b5 WatchSource:0}: Error finding container 9912952e63ba085a30383015d9e8820765f64cdbb238c4fa3d0cdfd8367e11b5: Status 404 returned error can't find the container with id 9912952e63ba085a30383015d9e8820765f64cdbb238c4fa3d0cdfd8367e11b5
Apr 22 20:09:22.859308 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:22.854900 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 20:09:23.296409 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:23.296372 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4" event={"ID":"c010b35c-b6e6-4b4e-ba52-1d9a65bc7391","Type":"ContainerStarted","Data":"1c0290c106c279f214f1bdf7eb162a93ad66e2df4d5d8e96d1df28d6e6a4d26c"}
Apr 22 20:09:23.296409 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:23.296411 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4" event={"ID":"c010b35c-b6e6-4b4e-ba52-1d9a65bc7391","Type":"ContainerStarted","Data":"9912952e63ba085a30383015d9e8820765f64cdbb238c4fa3d0cdfd8367e11b5"}
Apr 22 20:09:23.296607 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:23.296515 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4"
Apr 22 20:09:23.313256 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:23.313217 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4" podStartSLOduration=2.31320518 podStartE2EDuration="2.31320518s" podCreationTimestamp="2026-04-22 20:09:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:09:23.311405726 +0000 UTC m=+687.562905615" watchObservedRunningTime="2026-04-22 20:09:23.31320518 +0000 UTC m=+687.564705073"
Apr 22 20:09:29.305454 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:29.305422 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4"
Apr 22 20:09:31.876848 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:31.876814 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4"]
Apr 22 20:09:31.877251 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:31.877062 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4" podUID="c010b35c-b6e6-4b4e-ba52-1d9a65bc7391" containerName="model-chainer-raw-hpa-3e39c" containerID="cri-o://1c0290c106c279f214f1bdf7eb162a93ad66e2df4d5d8e96d1df28d6e6a4d26c" gracePeriod=30
Apr 22 20:09:31.993217 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:31.993166 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3e39c-predictor-7cf7df947-6rttt"]
Apr 22 20:09:31.993458 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:31.993436 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3e39c-predictor-7cf7df947-6rttt" podUID="296784ea-f704-464e-be95-0525b956e003" containerName="kserve-container" containerID="cri-o://d88fe9302c93f4c7e0f21afc41a9b8020468d90e2067c2134b0a7c13e61a643e" gracePeriod=30
Apr 22 20:09:32.039006 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:32.038979 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3e39c-predictor-57c8568889-x96mr"]
Apr 22 20:09:32.039283 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:32.039262 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3e39c-predictor-57c8568889-x96mr" podUID="ebf23c2a-33ad-46fc-9385-3cbe6370d937" containerName="kserve-container" containerID="cri-o://1e15368d1776e1d09397e796df1d26ad3fd83160ad428092b9a6d5301ba84a74" gracePeriod=30
Apr 22 20:09:34.304738 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:34.304704 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4" podUID="c010b35c-b6e6-4b4e-ba52-1d9a65bc7391" containerName="model-chainer-raw-hpa-3e39c" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 20:09:35.533250 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:35.533229 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3e39c-predictor-7cf7df947-6rttt"
Apr 22 20:09:35.633015 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:35.632951 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/296784ea-f704-464e-be95-0525b956e003-kserve-provision-location\") pod \"296784ea-f704-464e-be95-0525b956e003\" (UID: \"296784ea-f704-464e-be95-0525b956e003\") "
Apr 22 20:09:35.633281 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:35.633259 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/296784ea-f704-464e-be95-0525b956e003-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "296784ea-f704-464e-be95-0525b956e003" (UID: "296784ea-f704-464e-be95-0525b956e003"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:09:35.734125 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:35.734089 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/296784ea-f704-464e-be95-0525b956e003-kserve-provision-location\") on node \"ip-10-0-128-239.ec2.internal\" DevicePath \"\""
Apr 22 20:09:36.328320 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:36.328289 2575 generic.go:358] "Generic (PLEG): container finished" podID="296784ea-f704-464e-be95-0525b956e003" containerID="d88fe9302c93f4c7e0f21afc41a9b8020468d90e2067c2134b0a7c13e61a643e" exitCode=0
Apr 22 20:09:36.328437 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:36.328361 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3e39c-predictor-7cf7df947-6rttt"
Apr 22 20:09:36.328437 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:36.328367 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3e39c-predictor-7cf7df947-6rttt" event={"ID":"296784ea-f704-464e-be95-0525b956e003","Type":"ContainerDied","Data":"d88fe9302c93f4c7e0f21afc41a9b8020468d90e2067c2134b0a7c13e61a643e"}
Apr 22 20:09:36.328437 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:36.328403 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3e39c-predictor-7cf7df947-6rttt" event={"ID":"296784ea-f704-464e-be95-0525b956e003","Type":"ContainerDied","Data":"1e21c4c998dbc44501c2c6426010abbc4ba5aed5ee6cf5f1a799e8b1d2ca278d"}
Apr 22 20:09:36.328437 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:36.328423 2575 scope.go:117] "RemoveContainer" containerID="d88fe9302c93f4c7e0f21afc41a9b8020468d90e2067c2134b0a7c13e61a643e"
Apr 22 20:09:36.337285 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:36.337267 2575 scope.go:117] "RemoveContainer" containerID="071a7f9f80d703de924ae96e7c871e0329f16a647452a5befe0679fd94d1d9f8"
Apr 22 20:09:36.342968 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:36.342944 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3e39c-predictor-7cf7df947-6rttt"]
Apr 22 20:09:36.344580 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:36.344560 2575 scope.go:117] "RemoveContainer" containerID="d88fe9302c93f4c7e0f21afc41a9b8020468d90e2067c2134b0a7c13e61a643e"
Apr 22 20:09:36.344817 ip-10-0-128-239 kubenswrapper[2575]: E0422 20:09:36.344800 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d88fe9302c93f4c7e0f21afc41a9b8020468d90e2067c2134b0a7c13e61a643e\": container with ID starting with d88fe9302c93f4c7e0f21afc41a9b8020468d90e2067c2134b0a7c13e61a643e not found: ID does not exist" containerID="d88fe9302c93f4c7e0f21afc41a9b8020468d90e2067c2134b0a7c13e61a643e"
Apr 22 20:09:36.344867 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:36.344824 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d88fe9302c93f4c7e0f21afc41a9b8020468d90e2067c2134b0a7c13e61a643e"} err="failed to get container status \"d88fe9302c93f4c7e0f21afc41a9b8020468d90e2067c2134b0a7c13e61a643e\": rpc error: code = NotFound desc = could not find container \"d88fe9302c93f4c7e0f21afc41a9b8020468d90e2067c2134b0a7c13e61a643e\": container with ID starting with d88fe9302c93f4c7e0f21afc41a9b8020468d90e2067c2134b0a7c13e61a643e not found: ID does not exist"
Apr 22 20:09:36.344867 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:36.344841 2575 scope.go:117] "RemoveContainer" containerID="071a7f9f80d703de924ae96e7c871e0329f16a647452a5befe0679fd94d1d9f8"
Apr 22 20:09:36.344988 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:36.344972 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-3e39c-predictor-7cf7df947-6rttt"]
Apr 22 20:09:36.345114 ip-10-0-128-239 kubenswrapper[2575]: E0422 20:09:36.345098 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"071a7f9f80d703de924ae96e7c871e0329f16a647452a5befe0679fd94d1d9f8\": container with ID starting with 071a7f9f80d703de924ae96e7c871e0329f16a647452a5befe0679fd94d1d9f8 not found: ID does not exist" containerID="071a7f9f80d703de924ae96e7c871e0329f16a647452a5befe0679fd94d1d9f8"
Apr 22 20:09:36.345173 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:36.345119 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"071a7f9f80d703de924ae96e7c871e0329f16a647452a5befe0679fd94d1d9f8"} err="failed to get container status \"071a7f9f80d703de924ae96e7c871e0329f16a647452a5befe0679fd94d1d9f8\": rpc error: code = NotFound desc = could not find container \"071a7f9f80d703de924ae96e7c871e0329f16a647452a5befe0679fd94d1d9f8\": container with ID starting with 071a7f9f80d703de924ae96e7c871e0329f16a647452a5befe0679fd94d1d9f8 not found: ID does not exist"
Apr 22 20:09:36.580799 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:36.580777 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3e39c-predictor-57c8568889-x96mr"
Apr 22 20:09:36.640505 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:36.640478 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ebf23c2a-33ad-46fc-9385-3cbe6370d937-kserve-provision-location\") pod \"ebf23c2a-33ad-46fc-9385-3cbe6370d937\" (UID: \"ebf23c2a-33ad-46fc-9385-3cbe6370d937\") "
Apr 22 20:09:36.640787 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:36.640764 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebf23c2a-33ad-46fc-9385-3cbe6370d937-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ebf23c2a-33ad-46fc-9385-3cbe6370d937" (UID: "ebf23c2a-33ad-46fc-9385-3cbe6370d937"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:09:36.741475 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:36.741453 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ebf23c2a-33ad-46fc-9385-3cbe6370d937-kserve-provision-location\") on node \"ip-10-0-128-239.ec2.internal\" DevicePath \"\""
Apr 22 20:09:37.333111 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:37.333068 2575 generic.go:358] "Generic (PLEG): container finished" podID="ebf23c2a-33ad-46fc-9385-3cbe6370d937" containerID="1e15368d1776e1d09397e796df1d26ad3fd83160ad428092b9a6d5301ba84a74" exitCode=0
Apr 22 20:09:37.333289 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:37.333121 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3e39c-predictor-57c8568889-x96mr" event={"ID":"ebf23c2a-33ad-46fc-9385-3cbe6370d937","Type":"ContainerDied","Data":"1e15368d1776e1d09397e796df1d26ad3fd83160ad428092b9a6d5301ba84a74"}
Apr 22 20:09:37.333289 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:37.333133 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3e39c-predictor-57c8568889-x96mr"
Apr 22 20:09:37.333289 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:37.333145 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3e39c-predictor-57c8568889-x96mr" event={"ID":"ebf23c2a-33ad-46fc-9385-3cbe6370d937","Type":"ContainerDied","Data":"7980f1ebf65969c6491b7d1ec06000d9b1760f5397e391078d8c529aefacefbf"}
Apr 22 20:09:37.333289 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:37.333163 2575 scope.go:117] "RemoveContainer" containerID="1e15368d1776e1d09397e796df1d26ad3fd83160ad428092b9a6d5301ba84a74"
Apr 22 20:09:37.340444 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:37.340424 2575 scope.go:117] "RemoveContainer" containerID="36a732b91a562a85efd07d210d4c4a2b7c3691ea5cfa50737fc77838601e6b8d"
Apr 22 20:09:37.347012 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:37.346994 2575 scope.go:117] "RemoveContainer" containerID="1e15368d1776e1d09397e796df1d26ad3fd83160ad428092b9a6d5301ba84a74"
Apr 22 20:09:37.347275 ip-10-0-128-239 kubenswrapper[2575]: E0422 20:09:37.347256 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e15368d1776e1d09397e796df1d26ad3fd83160ad428092b9a6d5301ba84a74\": container with ID starting with 1e15368d1776e1d09397e796df1d26ad3fd83160ad428092b9a6d5301ba84a74 not found: ID does not exist" containerID="1e15368d1776e1d09397e796df1d26ad3fd83160ad428092b9a6d5301ba84a74"
Apr 22 20:09:37.347325 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:37.347283 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e15368d1776e1d09397e796df1d26ad3fd83160ad428092b9a6d5301ba84a74"} err="failed to get container status \"1e15368d1776e1d09397e796df1d26ad3fd83160ad428092b9a6d5301ba84a74\": rpc error: code = NotFound desc = could not find container \"1e15368d1776e1d09397e796df1d26ad3fd83160ad428092b9a6d5301ba84a74\": container with ID starting with 1e15368d1776e1d09397e796df1d26ad3fd83160ad428092b9a6d5301ba84a74 not found: ID does not exist"
Apr 22 20:09:37.347325 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:37.347300 2575 scope.go:117] "RemoveContainer" containerID="36a732b91a562a85efd07d210d4c4a2b7c3691ea5cfa50737fc77838601e6b8d"
Apr 22 20:09:37.347542 ip-10-0-128-239 kubenswrapper[2575]: E0422 20:09:37.347522 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36a732b91a562a85efd07d210d4c4a2b7c3691ea5cfa50737fc77838601e6b8d\": container with ID starting with 36a732b91a562a85efd07d210d4c4a2b7c3691ea5cfa50737fc77838601e6b8d not found: ID does not exist" containerID="36a732b91a562a85efd07d210d4c4a2b7c3691ea5cfa50737fc77838601e6b8d"
Apr 22 20:09:37.347585 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:37.347550 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36a732b91a562a85efd07d210d4c4a2b7c3691ea5cfa50737fc77838601e6b8d"} err="failed to get container status \"36a732b91a562a85efd07d210d4c4a2b7c3691ea5cfa50737fc77838601e6b8d\": rpc error: code = NotFound desc = could not find container \"36a732b91a562a85efd07d210d4c4a2b7c3691ea5cfa50737fc77838601e6b8d\": container with ID starting with 36a732b91a562a85efd07d210d4c4a2b7c3691ea5cfa50737fc77838601e6b8d not found: ID does not exist"
Apr 22 20:09:37.353994 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:37.353975 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3e39c-predictor-57c8568889-x96mr"]
Apr 22 20:09:37.357279 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:37.357259 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-3e39c-predictor-57c8568889-x96mr"]
Apr 22 20:09:38.303876 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:38.303838 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="296784ea-f704-464e-be95-0525b956e003" path="/var/lib/kubelet/pods/296784ea-f704-464e-be95-0525b956e003/volumes"
Apr 22 20:09:38.304281 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:38.304224 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebf23c2a-33ad-46fc-9385-3cbe6370d937" path="/var/lib/kubelet/pods/ebf23c2a-33ad-46fc-9385-3cbe6370d937/volumes"
Apr 22 20:09:39.304331 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:39.304299 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4" podUID="c010b35c-b6e6-4b4e-ba52-1d9a65bc7391" containerName="model-chainer-raw-hpa-3e39c" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 20:09:42.079249 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:42.079222 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496"]
Apr 22 20:09:42.079670 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:42.079472 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="296784ea-f704-464e-be95-0525b956e003" containerName="storage-initializer"
Apr 22 20:09:42.079670 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:42.079485 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="296784ea-f704-464e-be95-0525b956e003" containerName="storage-initializer"
Apr 22 20:09:42.079670 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:42.079496 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebf23c2a-33ad-46fc-9385-3cbe6370d937" containerName="storage-initializer"
Apr 22 20:09:42.079670 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:42.079502 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf23c2a-33ad-46fc-9385-3cbe6370d937" containerName="storage-initializer"
Apr 22 20:09:42.079670 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:42.079512 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebf23c2a-33ad-46fc-9385-3cbe6370d937" containerName="kserve-container"
Apr 22 20:09:42.079670 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:42.079518 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf23c2a-33ad-46fc-9385-3cbe6370d937" containerName="kserve-container"
Apr 22 20:09:42.079670 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:42.079524 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="296784ea-f704-464e-be95-0525b956e003" containerName="kserve-container"
Apr 22 20:09:42.079670 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:42.079530 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="296784ea-f704-464e-be95-0525b956e003" containerName="kserve-container"
Apr 22 20:09:42.079670 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:42.079568 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="296784ea-f704-464e-be95-0525b956e003" containerName="kserve-container"
Apr 22 20:09:42.079670 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:42.079577 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="ebf23c2a-33ad-46fc-9385-3cbe6370d937" containerName="kserve-container"
Apr 22 20:09:42.084006 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:42.083989 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496"
Apr 22 20:09:42.090671 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:42.090647 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496"]
Apr 22 20:09:42.176816 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:42.176780 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b67eab5e-b3f4-434c-90e9-aa7ff413b66a-kserve-provision-location\") pod \"isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496\" (UID: \"b67eab5e-b3f4-434c-90e9-aa7ff413b66a\") " pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496"
Apr 22 20:09:42.278116 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:42.278087 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b67eab5e-b3f4-434c-90e9-aa7ff413b66a-kserve-provision-location\") pod \"isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496\" (UID: \"b67eab5e-b3f4-434c-90e9-aa7ff413b66a\") " pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496"
Apr 22 20:09:42.278367 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:42.278348 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b67eab5e-b3f4-434c-90e9-aa7ff413b66a-kserve-provision-location\") pod \"isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496\" (UID: \"b67eab5e-b3f4-434c-90e9-aa7ff413b66a\") " pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496"
Apr 22 20:09:42.395530 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:42.395506 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496"
Apr 22 20:09:42.506671 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:42.506644 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496"]
Apr 22 20:09:42.509321 ip-10-0-128-239 kubenswrapper[2575]: W0422 20:09:42.509293 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb67eab5e_b3f4_434c_90e9_aa7ff413b66a.slice/crio-c1fde3bbb86b0e02f0758816b04da3da4e3a6c3a92036af50fb1794e00b9e503 WatchSource:0}: Error finding container c1fde3bbb86b0e02f0758816b04da3da4e3a6c3a92036af50fb1794e00b9e503: Status 404 returned error can't find the container with id c1fde3bbb86b0e02f0758816b04da3da4e3a6c3a92036af50fb1794e00b9e503
Apr 22 20:09:43.348297 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:43.348260 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" event={"ID":"b67eab5e-b3f4-434c-90e9-aa7ff413b66a","Type":"ContainerStarted","Data":"b9546916efc9ee26ff977cf8564a4230938446319ebd0a71defc482c0d3e2525"}
Apr 22 20:09:43.348297 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:43.348295 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" event={"ID":"b67eab5e-b3f4-434c-90e9-aa7ff413b66a","Type":"ContainerStarted","Data":"c1fde3bbb86b0e02f0758816b04da3da4e3a6c3a92036af50fb1794e00b9e503"}
Apr 22 20:09:44.303830 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:44.303790 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4" podUID="c010b35c-b6e6-4b4e-ba52-1d9a65bc7391" containerName="model-chainer-raw-hpa-3e39c" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 20:09:44.304689 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:44.304667 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4"
Apr 22 20:09:46.357456 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:46.357422 2575 generic.go:358] "Generic (PLEG): container finished" podID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" containerID="b9546916efc9ee26ff977cf8564a4230938446319ebd0a71defc482c0d3e2525" exitCode=0
Apr 22 20:09:46.357812 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:46.357455 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" event={"ID":"b67eab5e-b3f4-434c-90e9-aa7ff413b66a","Type":"ContainerDied","Data":"b9546916efc9ee26ff977cf8564a4230938446319ebd0a71defc482c0d3e2525"}
Apr 22 20:09:47.362802 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:47.362764 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" event={"ID":"b67eab5e-b3f4-434c-90e9-aa7ff413b66a","Type":"ContainerStarted","Data":"3910ff0fa1c2eaea3c574387e9ade011f422b49149e76e452d70a496da041970"}
Apr 22 20:09:47.363191 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:47.362811 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" event={"ID":"b67eab5e-b3f4-434c-90e9-aa7ff413b66a","Type":"ContainerStarted","Data":"6fdba492049c44b2a0fcd9c9a8d843943aa3158a45cc81a40e0ae39d3a9b78f0"}
Apr 22 20:09:47.363191 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:47.363175 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" Apr 22 20:09:47.364420 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:47.364388 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" podUID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 22 20:09:47.380147 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:47.380112 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" podStartSLOduration=5.380100754 podStartE2EDuration="5.380100754s" podCreationTimestamp="2026-04-22 20:09:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:09:47.378511174 +0000 UTC m=+711.630011062" watchObservedRunningTime="2026-04-22 20:09:47.380100754 +0000 UTC m=+711.631600644" Apr 22 20:09:48.365468 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:48.365419 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" Apr 22 20:09:48.365876 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:48.365538 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" podUID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 22 20:09:48.366499 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:48.366466 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" podUID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:09:49.303841 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:49.303806 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4" podUID="c010b35c-b6e6-4b4e-ba52-1d9a65bc7391" containerName="model-chainer-raw-hpa-3e39c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:09:49.368098 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:49.368062 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" podUID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 22 20:09:49.368502 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:49.368315 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" podUID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:09:54.304709 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:54.304672 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4" podUID="c010b35c-b6e6-4b4e-ba52-1d9a65bc7391" 
containerName="model-chainer-raw-hpa-3e39c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:09:59.303883 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:59.303847 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4" podUID="c010b35c-b6e6-4b4e-ba52-1d9a65bc7391" containerName="model-chainer-raw-hpa-3e39c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:09:59.368471 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:59.368433 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" podUID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 22 20:09:59.368825 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:09:59.368803 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" podUID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:10:01.901768 ip-10-0-128-239 kubenswrapper[2575]: E0422 20:10:01.901738 2575 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc010b35c_b6e6_4b4e_ba52_1d9a65bc7391.slice/crio-1c0290c106c279f214f1bdf7eb162a93ad66e2df4d5d8e96d1df28d6e6a4d26c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc010b35c_b6e6_4b4e_ba52_1d9a65bc7391.slice/crio-conmon-1c0290c106c279f214f1bdf7eb162a93ad66e2df4d5d8e96d1df28d6e6a4d26c.scope\": RecentStats: unable to find data in memory cache]" Apr 22 20:10:01.902073 ip-10-0-128-239 kubenswrapper[2575]: E0422 20:10:01.901748 2575 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc010b35c_b6e6_4b4e_ba52_1d9a65bc7391.slice/crio-conmon-1c0290c106c279f214f1bdf7eb162a93ad66e2df4d5d8e96d1df28d6e6a4d26c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc010b35c_b6e6_4b4e_ba52_1d9a65bc7391.slice/crio-1c0290c106c279f214f1bdf7eb162a93ad66e2df4d5d8e96d1df28d6e6a4d26c.scope\": RecentStats: unable to find data in memory cache]" Apr 22 20:10:02.013925 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:02.013904 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4" Apr 22 20:10:02.117363 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:02.117335 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c010b35c-b6e6-4b4e-ba52-1d9a65bc7391-proxy-tls\") pod \"c010b35c-b6e6-4b4e-ba52-1d9a65bc7391\" (UID: \"c010b35c-b6e6-4b4e-ba52-1d9a65bc7391\") " Apr 22 20:10:02.117532 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:02.117376 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c010b35c-b6e6-4b4e-ba52-1d9a65bc7391-openshift-service-ca-bundle\") pod \"c010b35c-b6e6-4b4e-ba52-1d9a65bc7391\" (UID: \"c010b35c-b6e6-4b4e-ba52-1d9a65bc7391\") " Apr 22 20:10:02.117753 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:02.117730 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c010b35c-b6e6-4b4e-ba52-1d9a65bc7391-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "c010b35c-b6e6-4b4e-ba52-1d9a65bc7391" (UID: "c010b35c-b6e6-4b4e-ba52-1d9a65bc7391"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:10:02.119402 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:02.119378 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c010b35c-b6e6-4b4e-ba52-1d9a65bc7391-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c010b35c-b6e6-4b4e-ba52-1d9a65bc7391" (UID: "c010b35c-b6e6-4b4e-ba52-1d9a65bc7391"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:10:02.218787 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:02.218720 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c010b35c-b6e6-4b4e-ba52-1d9a65bc7391-proxy-tls\") on node \"ip-10-0-128-239.ec2.internal\" DevicePath \"\"" Apr 22 20:10:02.218787 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:02.218744 2575 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c010b35c-b6e6-4b4e-ba52-1d9a65bc7391-openshift-service-ca-bundle\") on node \"ip-10-0-128-239.ec2.internal\" DevicePath \"\"" Apr 22 20:10:02.398304 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:02.398272 2575 generic.go:358] "Generic (PLEG): container finished" podID="c010b35c-b6e6-4b4e-ba52-1d9a65bc7391" containerID="1c0290c106c279f214f1bdf7eb162a93ad66e2df4d5d8e96d1df28d6e6a4d26c" exitCode=0 Apr 22 20:10:02.398408 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:02.398332 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4" Apr 22 20:10:02.398408 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:02.398359 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4" event={"ID":"c010b35c-b6e6-4b4e-ba52-1d9a65bc7391","Type":"ContainerDied","Data":"1c0290c106c279f214f1bdf7eb162a93ad66e2df4d5d8e96d1df28d6e6a4d26c"} Apr 22 20:10:02.398408 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:02.398396 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4" event={"ID":"c010b35c-b6e6-4b4e-ba52-1d9a65bc7391","Type":"ContainerDied","Data":"9912952e63ba085a30383015d9e8820765f64cdbb238c4fa3d0cdfd8367e11b5"} Apr 22 20:10:02.398529 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:02.398417 2575 scope.go:117] "RemoveContainer" containerID="1c0290c106c279f214f1bdf7eb162a93ad66e2df4d5d8e96d1df28d6e6a4d26c" Apr 22 20:10:02.405660 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:02.405644 2575 scope.go:117] "RemoveContainer" containerID="1c0290c106c279f214f1bdf7eb162a93ad66e2df4d5d8e96d1df28d6e6a4d26c" Apr 22 20:10:02.405913 ip-10-0-128-239 kubenswrapper[2575]: E0422 20:10:02.405891 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c0290c106c279f214f1bdf7eb162a93ad66e2df4d5d8e96d1df28d6e6a4d26c\": container with ID starting with 1c0290c106c279f214f1bdf7eb162a93ad66e2df4d5d8e96d1df28d6e6a4d26c not found: ID does not exist" containerID="1c0290c106c279f214f1bdf7eb162a93ad66e2df4d5d8e96d1df28d6e6a4d26c" Apr 22 20:10:02.405962 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:02.405924 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c0290c106c279f214f1bdf7eb162a93ad66e2df4d5d8e96d1df28d6e6a4d26c"} err="failed to get container status \"1c0290c106c279f214f1bdf7eb162a93ad66e2df4d5d8e96d1df28d6e6a4d26c\": rpc error: code = NotFound desc = could not find container \"1c0290c106c279f214f1bdf7eb162a93ad66e2df4d5d8e96d1df28d6e6a4d26c\": container with ID starting with 1c0290c106c279f214f1bdf7eb162a93ad66e2df4d5d8e96d1df28d6e6a4d26c not found: ID does not exist" Apr 22 20:10:02.412808 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:02.412790 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4"] Apr 22 20:10:02.416235 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:02.416216 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-3e39c-565d6b7c6d-5xzs4"] Apr 22 20:10:04.303760 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:04.303727 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c010b35c-b6e6-4b4e-ba52-1d9a65bc7391" path="/var/lib/kubelet/pods/c010b35c-b6e6-4b4e-ba52-1d9a65bc7391/volumes" Apr 22 20:10:09.368113 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:09.368053 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" podUID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 22 20:10:09.368580 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:09.368505 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" podUID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:10:19.368356 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:19.368300 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" podUID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 22 20:10:19.368943 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:19.368756 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" podUID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:10:29.368527 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:29.368478 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" podUID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 22 20:10:29.369024 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:29.369002 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" podUID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:10:39.368103 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:39.368054 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" podUID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 22 20:10:39.368559 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:39.368514 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" podUID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:10:49.369249 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:49.369219 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" Apr 22 20:10:49.369677 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:49.369635 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" Apr 22 20:10:57.246365 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:57.246334 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496"] Apr 22 20:10:57.246793 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:57.246627 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" podUID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" containerName="kserve-container" containerID="cri-o://6fdba492049c44b2a0fcd9c9a8d843943aa3158a45cc81a40e0ae39d3a9b78f0" gracePeriod=30 Apr 22 20:10:57.246793 ip-10-0-128-239 kubenswrapper[2575]: I0422 
20:10:57.246675 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" podUID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" containerName="agent" containerID="cri-o://3910ff0fa1c2eaea3c574387e9ade011f422b49149e76e452d70a496da041970" gracePeriod=30 Apr 22 20:10:57.293459 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:57.293435 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j"] Apr 22 20:10:57.293683 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:57.293672 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c010b35c-b6e6-4b4e-ba52-1d9a65bc7391" containerName="model-chainer-raw-hpa-3e39c" Apr 22 20:10:57.293731 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:57.293685 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c010b35c-b6e6-4b4e-ba52-1d9a65bc7391" containerName="model-chainer-raw-hpa-3e39c" Apr 22 20:10:57.293765 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:57.293733 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c010b35c-b6e6-4b4e-ba52-1d9a65bc7391" containerName="model-chainer-raw-hpa-3e39c" Apr 22 20:10:57.296772 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:57.296754 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j" Apr 22 20:10:57.307362 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:57.307343 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j"] Apr 22 20:10:57.390943 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:57.390905 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c85295af-4265-48b5-9a4c-ac42f718f1b1-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j\" (UID: \"c85295af-4265-48b5-9a4c-ac42f718f1b1\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j" Apr 22 20:10:57.491746 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:57.491704 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c85295af-4265-48b5-9a4c-ac42f718f1b1-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j\" (UID: \"c85295af-4265-48b5-9a4c-ac42f718f1b1\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j" Apr 22 20:10:57.492055 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:57.492017 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c85295af-4265-48b5-9a4c-ac42f718f1b1-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j\" (UID: \"c85295af-4265-48b5-9a4c-ac42f718f1b1\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j" Apr 22 20:10:57.607289 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:57.607218 2575 util.go:30] "No sandbox for pod can be found. 
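
The "Killing container with a grace period ... gracePeriod=30" entries at 20:10:57 pair with the exitCode=137 logged for the agent container at 20:11:27 further down, almost exactly 30 seconds later: the process did not exit on SIGTERM within the grace window and was SIGKILLed, and runtimes report a signal death as 128 plus the signal number. A one-liner to confirm the arithmetic:

```go
package main

import (
	"fmt"
	"syscall"
)

func main() {
	// Signal deaths surface as 128 + signal number, so SIGKILL (9)
	// shows up as exit code 137 -- matching the exitCode=137 logged
	// for the "agent" container after its 30s grace period expired.
	fmt.Println(128 + int(syscall.SIGKILL)) // prints 137
}
```
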
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j" Apr 22 20:10:57.722397 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:57.722343 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j"] Apr 22 20:10:57.725019 ip-10-0-128-239 kubenswrapper[2575]: W0422 20:10:57.724992 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc85295af_4265_48b5_9a4c_ac42f718f1b1.slice/crio-56d6baa24b592d0df9c94d6cca5e8fc2cb5a7b0e132c0f997d3acbad2265d35b WatchSource:0}: Error finding container 56d6baa24b592d0df9c94d6cca5e8fc2cb5a7b0e132c0f997d3acbad2265d35b: Status 404 returned error can't find the container with id 56d6baa24b592d0df9c94d6cca5e8fc2cb5a7b0e132c0f997d3acbad2265d35b Apr 22 20:10:58.542433 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:58.542397 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j" event={"ID":"c85295af-4265-48b5-9a4c-ac42f718f1b1","Type":"ContainerStarted","Data":"21a29672e32f421d306037fc9ae5971a7dafad6ccc3318aac3675139ebe1a0bb"} Apr 22 20:10:58.542795 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:58.542438 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j" event={"ID":"c85295af-4265-48b5-9a4c-ac42f718f1b1","Type":"ContainerStarted","Data":"56d6baa24b592d0df9c94d6cca5e8fc2cb5a7b0e132c0f997d3acbad2265d35b"} Apr 22 20:10:59.368214 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:59.368172 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" podUID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 22 20:10:59.370869 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:10:59.370835 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" podUID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:11:01.550906 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:01.550878 2575 generic.go:358] "Generic (PLEG): container finished" podID="c85295af-4265-48b5-9a4c-ac42f718f1b1" containerID="21a29672e32f421d306037fc9ae5971a7dafad6ccc3318aac3675139ebe1a0bb" exitCode=0 Apr 22 20:11:01.551214 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:01.550948 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j" event={"ID":"c85295af-4265-48b5-9a4c-ac42f718f1b1","Type":"ContainerDied","Data":"21a29672e32f421d306037fc9ae5971a7dafad6ccc3318aac3675139ebe1a0bb"} Apr 22 20:11:01.552849 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:01.552828 2575 generic.go:358] "Generic (PLEG): container finished" podID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" containerID="6fdba492049c44b2a0fcd9c9a8d843943aa3158a45cc81a40e0ae39d3a9b78f0" exitCode=0 Apr 22 20:11:01.552946 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:01.552867 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" 
event={"ID":"b67eab5e-b3f4-434c-90e9-aa7ff413b66a","Type":"ContainerDied","Data":"6fdba492049c44b2a0fcd9c9a8d843943aa3158a45cc81a40e0ae39d3a9b78f0"} Apr 22 20:11:02.557834 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:02.557796 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j" event={"ID":"c85295af-4265-48b5-9a4c-ac42f718f1b1","Type":"ContainerStarted","Data":"99748d4dc0ce96820b6157115bd173a8eb92440d330a416bfc1b1fd83f0f7326"} Apr 22 20:11:02.558280 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:02.558113 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j" Apr 22 20:11:02.559671 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:02.559637 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j" podUID="c85295af-4265-48b5-9a4c-ac42f718f1b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 22 20:11:02.574796 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:02.574753 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j" podStartSLOduration=5.5747400030000005 podStartE2EDuration="5.574740003s" podCreationTimestamp="2026-04-22 20:10:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:11:02.572979242 +0000 UTC m=+786.824479134" watchObservedRunningTime="2026-04-22 20:11:02.574740003 +0000 UTC m=+786.826239959" Apr 22 20:11:03.561340 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:03.561302 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j" podUID="c85295af-4265-48b5-9a4c-ac42f718f1b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 22 20:11:09.368313 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:09.368270 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" podUID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 22 20:11:09.370961 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:09.370931 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" podUID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:11:13.561902 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:13.561860 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j" podUID="c85295af-4265-48b5-9a4c-ac42f718f1b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 22 20:11:19.369009 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:19.368910 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" podUID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.23:8080: connect: connection refused" Apr 22 20:11:19.369437 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:19.369087 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" Apr 22 20:11:19.370529 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:19.370494 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" podUID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:11:19.370619 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:19.370589 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" Apr 22 20:11:23.562278 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:23.562234 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j" podUID="c85295af-4265-48b5-9a4c-ac42f718f1b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 22 20:11:27.419774 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:27.419746 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" Apr 22 20:11:27.494175 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:27.494148 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b67eab5e-b3f4-434c-90e9-aa7ff413b66a-kserve-provision-location\") pod \"b67eab5e-b3f4-434c-90e9-aa7ff413b66a\" (UID: \"b67eab5e-b3f4-434c-90e9-aa7ff413b66a\") " Apr 22 20:11:27.494438 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:27.494417 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b67eab5e-b3f4-434c-90e9-aa7ff413b66a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b67eab5e-b3f4-434c-90e9-aa7ff413b66a" (UID: "b67eab5e-b3f4-434c-90e9-aa7ff413b66a"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:11:27.595054 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:27.594984 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b67eab5e-b3f4-434c-90e9-aa7ff413b66a-kserve-provision-location\") on node \"ip-10-0-128-239.ec2.internal\" DevicePath \"\"" Apr 22 20:11:27.618015 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:27.617986 2575 generic.go:358] "Generic (PLEG): container finished" podID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" containerID="3910ff0fa1c2eaea3c574387e9ade011f422b49149e76e452d70a496da041970" exitCode=137 Apr 22 20:11:27.618142 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:27.618066 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" event={"ID":"b67eab5e-b3f4-434c-90e9-aa7ff413b66a","Type":"ContainerDied","Data":"3910ff0fa1c2eaea3c574387e9ade011f422b49149e76e452d70a496da041970"} Apr 22 20:11:27.618142 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:27.618100 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" event={"ID":"b67eab5e-b3f4-434c-90e9-aa7ff413b66a","Type":"ContainerDied","Data":"c1fde3bbb86b0e02f0758816b04da3da4e3a6c3a92036af50fb1794e00b9e503"} Apr 22 20:11:27.618142 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:27.618115 2575 scope.go:117] "RemoveContainer" containerID="3910ff0fa1c2eaea3c574387e9ade011f422b49149e76e452d70a496da041970" Apr 22 20:11:27.618285 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:27.618071 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496" Apr 22 20:11:27.625672 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:27.625526 2575 scope.go:117] "RemoveContainer" containerID="6fdba492049c44b2a0fcd9c9a8d843943aa3158a45cc81a40e0ae39d3a9b78f0" Apr 22 20:11:27.632350 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:27.632330 2575 scope.go:117] "RemoveContainer" containerID="b9546916efc9ee26ff977cf8564a4230938446319ebd0a71defc482c0d3e2525" Apr 22 20:11:27.637800 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:27.637781 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496"] Apr 22 20:11:27.638736 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:27.638722 2575 scope.go:117] "RemoveContainer" containerID="3910ff0fa1c2eaea3c574387e9ade011f422b49149e76e452d70a496da041970" Apr 22 20:11:27.638984 ip-10-0-128-239 kubenswrapper[2575]: E0422 20:11:27.638967 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3910ff0fa1c2eaea3c574387e9ade011f422b49149e76e452d70a496da041970\": container with ID starting with 3910ff0fa1c2eaea3c574387e9ade011f422b49149e76e452d70a496da041970 not found: ID does not exist" containerID="3910ff0fa1c2eaea3c574387e9ade011f422b49149e76e452d70a496da041970" Apr 22 20:11:27.639063 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:27.638997 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3910ff0fa1c2eaea3c574387e9ade011f422b49149e76e452d70a496da041970"} err="failed to get container status \"3910ff0fa1c2eaea3c574387e9ade011f422b49149e76e452d70a496da041970\": rpc error: code = NotFound desc = could not find container 
\"3910ff0fa1c2eaea3c574387e9ade011f422b49149e76e452d70a496da041970\": container with ID starting with 3910ff0fa1c2eaea3c574387e9ade011f422b49149e76e452d70a496da041970 not found: ID does not exist" Apr 22 20:11:27.639063 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:27.639021 2575 scope.go:117] "RemoveContainer" containerID="6fdba492049c44b2a0fcd9c9a8d843943aa3158a45cc81a40e0ae39d3a9b78f0" Apr 22 20:11:27.639284 ip-10-0-128-239 kubenswrapper[2575]: E0422 20:11:27.639267 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fdba492049c44b2a0fcd9c9a8d843943aa3158a45cc81a40e0ae39d3a9b78f0\": container with ID starting with 6fdba492049c44b2a0fcd9c9a8d843943aa3158a45cc81a40e0ae39d3a9b78f0 not found: ID does not exist" containerID="6fdba492049c44b2a0fcd9c9a8d843943aa3158a45cc81a40e0ae39d3a9b78f0" Apr 22 20:11:27.639325 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:27.639292 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fdba492049c44b2a0fcd9c9a8d843943aa3158a45cc81a40e0ae39d3a9b78f0"} err="failed to get container status \"6fdba492049c44b2a0fcd9c9a8d843943aa3158a45cc81a40e0ae39d3a9b78f0\": rpc error: code = NotFound desc = could not find container \"6fdba492049c44b2a0fcd9c9a8d843943aa3158a45cc81a40e0ae39d3a9b78f0\": container with ID starting with 6fdba492049c44b2a0fcd9c9a8d843943aa3158a45cc81a40e0ae39d3a9b78f0 not found: ID does not exist" Apr 22 20:11:27.639325 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:27.639308 2575 scope.go:117] "RemoveContainer" containerID="b9546916efc9ee26ff977cf8564a4230938446319ebd0a71defc482c0d3e2525" Apr 22 20:11:27.639497 ip-10-0-128-239 kubenswrapper[2575]: E0422 20:11:27.639481 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9546916efc9ee26ff977cf8564a4230938446319ebd0a71defc482c0d3e2525\": container with ID starting with b9546916efc9ee26ff977cf8564a4230938446319ebd0a71defc482c0d3e2525 not found: ID does not exist" containerID="b9546916efc9ee26ff977cf8564a4230938446319ebd0a71defc482c0d3e2525" Apr 22 20:11:27.639532 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:27.639503 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9546916efc9ee26ff977cf8564a4230938446319ebd0a71defc482c0d3e2525"} err="failed to get container status \"b9546916efc9ee26ff977cf8564a4230938446319ebd0a71defc482c0d3e2525\": rpc error: code = NotFound desc = could not find container \"b9546916efc9ee26ff977cf8564a4230938446319ebd0a71defc482c0d3e2525\": container with ID starting with b9546916efc9ee26ff977cf8564a4230938446319ebd0a71defc482c0d3e2525 not found: ID does not exist" Apr 22 20:11:27.643328 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:27.643308 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-d0c94-predictor-84ddff5b74-vs496"] Apr 22 20:11:28.303713 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:28.303682 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" path="/var/lib/kubelet/pods/b67eab5e-b3f4-434c-90e9-aa7ff413b66a/volumes" Apr 22 20:11:33.562052 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:33.562004 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j" podUID="c85295af-4265-48b5-9a4c-ac42f718f1b1" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 22 20:11:43.561714 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:43.561675 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j" podUID="c85295af-4265-48b5-9a4c-ac42f718f1b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 22 20:11:53.561568 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:11:53.561525 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j" podUID="c85295af-4265-48b5-9a4c-ac42f718f1b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 22 20:12:03.561449 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:12:03.561399 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j" podUID="c85295af-4265-48b5-9a4c-ac42f718f1b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 22 20:12:13.561349 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:12:13.561304 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j" podUID="c85295af-4265-48b5-9a4c-ac42f718f1b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 22 20:12:20.299942 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:12:20.299894 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j" podUID="c85295af-4265-48b5-9a4c-ac42f718f1b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 22 20:12:30.304388 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:12:30.304345 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j" podUID="c85295af-4265-48b5-9a4c-ac42f718f1b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 22 20:12:40.300445 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:12:40.300398 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j" podUID="c85295af-4265-48b5-9a4c-ac42f718f1b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 22 20:12:50.300425 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:12:50.300338 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j" podUID="c85295af-4265-48b5-9a4c-ac42f718f1b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 22 20:13:00.300712 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:00.300670 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j" podUID="c85295af-4265-48b5-9a4c-ac42f718f1b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: 
connect: connection refused" Apr 22 20:13:10.304551 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:10.304523 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j" Apr 22 20:13:17.471257 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:17.471225 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j"] Apr 22 20:13:17.471695 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:17.471553 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j" podUID="c85295af-4265-48b5-9a4c-ac42f718f1b1" containerName="kserve-container" containerID="cri-o://99748d4dc0ce96820b6157115bd173a8eb92440d330a416bfc1b1fd83f0f7326" gracePeriod=30 Apr 22 20:13:17.573258 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:17.573229 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-13f916-predictor-7f4b544bb-7sn97"] Apr 22 20:13:17.573475 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:17.573463 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" containerName="storage-initializer" Apr 22 20:13:17.573517 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:17.573476 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" containerName="storage-initializer" Apr 22 20:13:17.573517 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:17.573491 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" containerName="agent" Apr 22 20:13:17.573517 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:17.573497 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" containerName="agent" Apr 22 20:13:17.573517 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:17.573505 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" containerName="kserve-container" Apr 22 20:13:17.573517 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:17.573510 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" containerName="kserve-container" Apr 22 20:13:17.573654 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:17.573548 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" containerName="kserve-container" Apr 22 20:13:17.573654 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:17.573557 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="b67eab5e-b3f4-434c-90e9-aa7ff413b66a" containerName="agent" Apr 22 20:13:17.576349 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:17.576333 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-13f916-predictor-7f4b544bb-7sn97" Apr 22 20:13:17.584922 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:17.584901 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-13f916-predictor-7f4b544bb-7sn97"] Apr 22 20:13:17.625904 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:17.625883 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c279315-6c3a-4cdd-89fb-e644689ecff5-kserve-provision-location\") pod \"isvc-primary-13f916-predictor-7f4b544bb-7sn97\" (UID: \"7c279315-6c3a-4cdd-89fb-e644689ecff5\") " pod="kserve-ci-e2e-test/isvc-primary-13f916-predictor-7f4b544bb-7sn97" Apr 22 20:13:17.726230 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:17.726161 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c279315-6c3a-4cdd-89fb-e644689ecff5-kserve-provision-location\") pod \"isvc-primary-13f916-predictor-7f4b544bb-7sn97\" (UID: \"7c279315-6c3a-4cdd-89fb-e644689ecff5\") " pod="kserve-ci-e2e-test/isvc-primary-13f916-predictor-7f4b544bb-7sn97" Apr 22 20:13:17.726516 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:17.726496 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c279315-6c3a-4cdd-89fb-e644689ecff5-kserve-provision-location\") pod \"isvc-primary-13f916-predictor-7f4b544bb-7sn97\" (UID: \"7c279315-6c3a-4cdd-89fb-e644689ecff5\") " pod="kserve-ci-e2e-test/isvc-primary-13f916-predictor-7f4b544bb-7sn97" Apr 22 20:13:17.887632 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:17.887602 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-13f916-predictor-7f4b544bb-7sn97" Apr 22 20:13:17.997466 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:17.997400 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-13f916-predictor-7f4b544bb-7sn97"] Apr 22 20:13:18.000214 ip-10-0-128-239 kubenswrapper[2575]: W0422 20:13:18.000179 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c279315_6c3a_4cdd_89fb_e644689ecff5.slice/crio-1f025b0e584e1c0d441195b43404b9173ca594681f4dd75bb6dfb68ff37c031d WatchSource:0}: Error finding container 1f025b0e584e1c0d441195b43404b9173ca594681f4dd75bb6dfb68ff37c031d: Status 404 returned error can't find the container with id 1f025b0e584e1c0d441195b43404b9173ca594681f4dd75bb6dfb68ff37c031d Apr 22 20:13:18.898928 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:18.898891 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-13f916-predictor-7f4b544bb-7sn97" event={"ID":"7c279315-6c3a-4cdd-89fb-e644689ecff5","Type":"ContainerStarted","Data":"4ba9def09a4d5c1174e8576f455302fa589688b191bf5121c41969f62e2b35d9"} Apr 22 20:13:18.898928 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:18.898927 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-13f916-predictor-7f4b544bb-7sn97" event={"ID":"7c279315-6c3a-4cdd-89fb-e644689ecff5","Type":"ContainerStarted","Data":"1f025b0e584e1c0d441195b43404b9173ca594681f4dd75bb6dfb68ff37c031d"} Apr 22 20:13:20.300324 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:20.300275 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j" podUID="c85295af-4265-48b5-9a4c-ac42f718f1b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 22 20:13:21.907271 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:21.907243 2575 generic.go:358] "Generic (PLEG): container finished" podID="7c279315-6c3a-4cdd-89fb-e644689ecff5" containerID="4ba9def09a4d5c1174e8576f455302fa589688b191bf5121c41969f62e2b35d9" exitCode=0 Apr 22 20:13:21.907636 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:21.907299 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-13f916-predictor-7f4b544bb-7sn97" event={"ID":"7c279315-6c3a-4cdd-89fb-e644689ecff5","Type":"ContainerDied","Data":"4ba9def09a4d5c1174e8576f455302fa589688b191bf5121c41969f62e2b35d9"} Apr 22 20:13:22.911536 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:22.911501 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-13f916-predictor-7f4b544bb-7sn97" event={"ID":"7c279315-6c3a-4cdd-89fb-e644689ecff5","Type":"ContainerStarted","Data":"79f1f551db9f0e3ae3259eefda85068c533b4f16d3dbb1468408a14f0d67c916"} Apr 22 20:13:22.911900 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:22.911875 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-13f916-predictor-7f4b544bb-7sn97" Apr 22 20:13:22.913073 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:22.913050 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-13f916-predictor-7f4b544bb-7sn97" podUID="7c279315-6c3a-4cdd-89fb-e644689ecff5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: 
connection refused" Apr 22 20:13:22.928172 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:22.928127 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-13f916-predictor-7f4b544bb-7sn97" podStartSLOduration=5.928116054 podStartE2EDuration="5.928116054s" podCreationTimestamp="2026-04-22 20:13:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:13:22.926619854 +0000 UTC m=+927.178119742" watchObservedRunningTime="2026-04-22 20:13:22.928116054 +0000 UTC m=+927.179615944" Apr 22 20:13:23.914362 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:23.914328 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-13f916-predictor-7f4b544bb-7sn97" podUID="7c279315-6c3a-4cdd-89fb-e644689ecff5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 22 20:13:25.415850 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:25.415826 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j" Apr 22 20:13:25.475457 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:25.475438 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c85295af-4265-48b5-9a4c-ac42f718f1b1-kserve-provision-location\") pod \"c85295af-4265-48b5-9a4c-ac42f718f1b1\" (UID: \"c85295af-4265-48b5-9a4c-ac42f718f1b1\") " Apr 22 20:13:25.475731 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:25.475709 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c85295af-4265-48b5-9a4c-ac42f718f1b1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c85295af-4265-48b5-9a4c-ac42f718f1b1" (UID: "c85295af-4265-48b5-9a4c-ac42f718f1b1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:13:25.576312 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:25.576264 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c85295af-4265-48b5-9a4c-ac42f718f1b1-kserve-provision-location\") on node \"ip-10-0-128-239.ec2.internal\" DevicePath \"\"" Apr 22 20:13:25.920698 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:25.920669 2575 generic.go:358] "Generic (PLEG): container finished" podID="c85295af-4265-48b5-9a4c-ac42f718f1b1" containerID="99748d4dc0ce96820b6157115bd173a8eb92440d330a416bfc1b1fd83f0f7326" exitCode=0 Apr 22 20:13:25.920847 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:25.920718 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j" event={"ID":"c85295af-4265-48b5-9a4c-ac42f718f1b1","Type":"ContainerDied","Data":"99748d4dc0ce96820b6157115bd173a8eb92440d330a416bfc1b1fd83f0f7326"} Apr 22 20:13:25.920847 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:25.920728 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j" Apr 22 20:13:25.920847 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:25.920751 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j" event={"ID":"c85295af-4265-48b5-9a4c-ac42f718f1b1","Type":"ContainerDied","Data":"56d6baa24b592d0df9c94d6cca5e8fc2cb5a7b0e132c0f997d3acbad2265d35b"} Apr 22 20:13:25.920847 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:25.920768 2575 scope.go:117] "RemoveContainer" containerID="99748d4dc0ce96820b6157115bd173a8eb92440d330a416bfc1b1fd83f0f7326" Apr 22 20:13:25.929531 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:25.929515 2575 scope.go:117] "RemoveContainer" containerID="21a29672e32f421d306037fc9ae5971a7dafad6ccc3318aac3675139ebe1a0bb" Apr 22 20:13:25.936495 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:25.936481 2575 scope.go:117] "RemoveContainer" containerID="99748d4dc0ce96820b6157115bd173a8eb92440d330a416bfc1b1fd83f0f7326" Apr 22 20:13:25.936728 ip-10-0-128-239 kubenswrapper[2575]: E0422 20:13:25.936708 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99748d4dc0ce96820b6157115bd173a8eb92440d330a416bfc1b1fd83f0f7326\": container with ID starting with 99748d4dc0ce96820b6157115bd173a8eb92440d330a416bfc1b1fd83f0f7326 not found: ID does not exist" containerID="99748d4dc0ce96820b6157115bd173a8eb92440d330a416bfc1b1fd83f0f7326" Apr 22 20:13:25.936776 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:25.936736 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99748d4dc0ce96820b6157115bd173a8eb92440d330a416bfc1b1fd83f0f7326"} err="failed to get container status \"99748d4dc0ce96820b6157115bd173a8eb92440d330a416bfc1b1fd83f0f7326\": rpc error: code = NotFound desc = could not find container \"99748d4dc0ce96820b6157115bd173a8eb92440d330a416bfc1b1fd83f0f7326\": container with ID starting with 99748d4dc0ce96820b6157115bd173a8eb92440d330a416bfc1b1fd83f0f7326 not found: ID does not exist" Apr 22 20:13:25.936776 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:25.936753 2575 scope.go:117] "RemoveContainer" containerID="21a29672e32f421d306037fc9ae5971a7dafad6ccc3318aac3675139ebe1a0bb" Apr 22 20:13:25.936961 ip-10-0-128-239 kubenswrapper[2575]: E0422 20:13:25.936944 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21a29672e32f421d306037fc9ae5971a7dafad6ccc3318aac3675139ebe1a0bb\": container with ID starting with 21a29672e32f421d306037fc9ae5971a7dafad6ccc3318aac3675139ebe1a0bb not found: ID does not exist" containerID="21a29672e32f421d306037fc9ae5971a7dafad6ccc3318aac3675139ebe1a0bb" Apr 22 20:13:25.937005 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:25.936965 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21a29672e32f421d306037fc9ae5971a7dafad6ccc3318aac3675139ebe1a0bb"} err="failed to get container status \"21a29672e32f421d306037fc9ae5971a7dafad6ccc3318aac3675139ebe1a0bb\": rpc error: code = NotFound desc = could not find container \"21a29672e32f421d306037fc9ae5971a7dafad6ccc3318aac3675139ebe1a0bb\": container with ID starting with 21a29672e32f421d306037fc9ae5971a7dafad6ccc3318aac3675139ebe1a0bb not found: ID does not exist" Apr 22 20:13:25.942146 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:25.942128 2575 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j"] Apr 22 20:13:25.946248 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:25.946226 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-45df6-predictor-854b479db7-7rn5j"] Apr 22 20:13:26.303623 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:26.303542 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c85295af-4265-48b5-9a4c-ac42f718f1b1" path="/var/lib/kubelet/pods/c85295af-4265-48b5-9a4c-ac42f718f1b1/volumes" Apr 22 20:13:33.914333 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:33.914289 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-13f916-predictor-7f4b544bb-7sn97" podUID="7c279315-6c3a-4cdd-89fb-e644689ecff5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 22 20:13:43.915165 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:43.915115 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-13f916-predictor-7f4b544bb-7sn97" podUID="7c279315-6c3a-4cdd-89fb-e644689ecff5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 22 20:13:53.914447 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:13:53.914404 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-13f916-predictor-7f4b544bb-7sn97" podUID="7c279315-6c3a-4cdd-89fb-e644689ecff5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 22 20:14:03.915178 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:03.915137 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-13f916-predictor-7f4b544bb-7sn97" podUID="7c279315-6c3a-4cdd-89fb-e644689ecff5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 22 20:14:13.915084 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:13.915023 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-13f916-predictor-7f4b544bb-7sn97" podUID="7c279315-6c3a-4cdd-89fb-e644689ecff5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 22 20:14:23.916171 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:23.916138 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-13f916-predictor-7f4b544bb-7sn97" Apr 22 20:14:27.694170 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:27.694142 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-13f916-predictor-d9cf89fc5-skn6q"] Apr 22 20:14:27.694600 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:27.694393 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c85295af-4265-48b5-9a4c-ac42f718f1b1" containerName="storage-initializer" Apr 22 20:14:27.694600 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:27.694404 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c85295af-4265-48b5-9a4c-ac42f718f1b1" containerName="storage-initializer" Apr 22 20:14:27.694600 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:27.694413 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="c85295af-4265-48b5-9a4c-ac42f718f1b1" containerName="kserve-container" Apr 22 20:14:27.694600 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:27.694419 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c85295af-4265-48b5-9a4c-ac42f718f1b1" containerName="kserve-container" Apr 22 20:14:27.694600 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:27.694474 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c85295af-4265-48b5-9a4c-ac42f718f1b1" containerName="kserve-container" Apr 22 20:14:27.697329 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:27.697310 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-13f916-predictor-d9cf89fc5-skn6q" Apr 22 20:14:27.699690 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:27.699667 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 22 20:14:27.699802 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:27.699707 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-13f916\"" Apr 22 20:14:27.700370 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:27.700349 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-13f916-dockercfg-rmglh\"" Apr 22 20:14:27.704861 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:27.704843 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-13f916-predictor-d9cf89fc5-skn6q"] Apr 22 20:14:27.805397 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:27.805376 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e98535e9-0e7b-41a9-b598-b082c4c7d137-kserve-provision-location\") pod \"isvc-secondary-13f916-predictor-d9cf89fc5-skn6q\" (UID: \"e98535e9-0e7b-41a9-b598-b082c4c7d137\") " pod="kserve-ci-e2e-test/isvc-secondary-13f916-predictor-d9cf89fc5-skn6q" Apr 22 20:14:27.805494 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:27.805414 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/e98535e9-0e7b-41a9-b598-b082c4c7d137-cabundle-cert\") pod \"isvc-secondary-13f916-predictor-d9cf89fc5-skn6q\" (UID: \"e98535e9-0e7b-41a9-b598-b082c4c7d137\") " pod="kserve-ci-e2e-test/isvc-secondary-13f916-predictor-d9cf89fc5-skn6q" Apr 22 20:14:27.905782 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:27.905758 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/e98535e9-0e7b-41a9-b598-b082c4c7d137-cabundle-cert\") pod \"isvc-secondary-13f916-predictor-d9cf89fc5-skn6q\" (UID: \"e98535e9-0e7b-41a9-b598-b082c4c7d137\") " pod="kserve-ci-e2e-test/isvc-secondary-13f916-predictor-d9cf89fc5-skn6q" Apr 22 20:14:27.905867 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:27.905814 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e98535e9-0e7b-41a9-b598-b082c4c7d137-kserve-provision-location\") pod \"isvc-secondary-13f916-predictor-d9cf89fc5-skn6q\" (UID: \"e98535e9-0e7b-41a9-b598-b082c4c7d137\") " pod="kserve-ci-e2e-test/isvc-secondary-13f916-predictor-d9cf89fc5-skn6q" Apr 22 20:14:27.906130 ip-10-0-128-239 
kubenswrapper[2575]: I0422 20:14:27.906114 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e98535e9-0e7b-41a9-b598-b082c4c7d137-kserve-provision-location\") pod \"isvc-secondary-13f916-predictor-d9cf89fc5-skn6q\" (UID: \"e98535e9-0e7b-41a9-b598-b082c4c7d137\") " pod="kserve-ci-e2e-test/isvc-secondary-13f916-predictor-d9cf89fc5-skn6q" Apr 22 20:14:27.906355 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:27.906340 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/e98535e9-0e7b-41a9-b598-b082c4c7d137-cabundle-cert\") pod \"isvc-secondary-13f916-predictor-d9cf89fc5-skn6q\" (UID: \"e98535e9-0e7b-41a9-b598-b082c4c7d137\") " pod="kserve-ci-e2e-test/isvc-secondary-13f916-predictor-d9cf89fc5-skn6q" Apr 22 20:14:28.009160 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:28.009106 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-13f916-predictor-d9cf89fc5-skn6q" Apr 22 20:14:28.122262 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:28.122233 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-13f916-predictor-d9cf89fc5-skn6q"] Apr 22 20:14:28.125407 ip-10-0-128-239 kubenswrapper[2575]: W0422 20:14:28.125376 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode98535e9_0e7b_41a9_b598_b082c4c7d137.slice/crio-34b8351116f9375dc5e9f79ec20ef67f6849bf4a19292e633c44d5ef86b91d2a WatchSource:0}: Error finding container 34b8351116f9375dc5e9f79ec20ef67f6849bf4a19292e633c44d5ef86b91d2a: Status 404 returned error can't find the container with id 34b8351116f9375dc5e9f79ec20ef67f6849bf4a19292e633c44d5ef86b91d2a Apr 22 20:14:28.127207 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:28.127190 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:14:29.081386 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:29.081352 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-13f916-predictor-d9cf89fc5-skn6q" event={"ID":"e98535e9-0e7b-41a9-b598-b082c4c7d137","Type":"ContainerStarted","Data":"1ec238127ff6f1a390eebf2263aa1f4cb33ff601a70057ce40628a9a16469095"} Apr 22 20:14:29.081386 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:29.081390 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-13f916-predictor-d9cf89fc5-skn6q" event={"ID":"e98535e9-0e7b-41a9-b598-b082c4c7d137","Type":"ContainerStarted","Data":"34b8351116f9375dc5e9f79ec20ef67f6849bf4a19292e633c44d5ef86b91d2a"} Apr 22 20:14:32.089926 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:32.089895 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-13f916-predictor-d9cf89fc5-skn6q_e98535e9-0e7b-41a9-b598-b082c4c7d137/storage-initializer/0.log" Apr 22 20:14:32.090307 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:32.089935 2575 generic.go:358] "Generic (PLEG): container finished" podID="e98535e9-0e7b-41a9-b598-b082c4c7d137" containerID="1ec238127ff6f1a390eebf2263aa1f4cb33ff601a70057ce40628a9a16469095" exitCode=1 Apr 22 20:14:32.090307 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:32.089962 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-13f916-predictor-d9cf89fc5-skn6q" 
event={"ID":"e98535e9-0e7b-41a9-b598-b082c4c7d137","Type":"ContainerDied","Data":"1ec238127ff6f1a390eebf2263aa1f4cb33ff601a70057ce40628a9a16469095"} Apr 22 20:14:33.094276 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:33.094248 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-13f916-predictor-d9cf89fc5-skn6q_e98535e9-0e7b-41a9-b598-b082c4c7d137/storage-initializer/0.log" Apr 22 20:14:33.094651 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:33.094332 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-13f916-predictor-d9cf89fc5-skn6q" event={"ID":"e98535e9-0e7b-41a9-b598-b082c4c7d137","Type":"ContainerStarted","Data":"a4afb91199dd65709ff08978708ea442b6960793e154e7f74e3ee48da2ee84aa"} Apr 22 20:14:38.108250 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:38.108178 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-13f916-predictor-d9cf89fc5-skn6q_e98535e9-0e7b-41a9-b598-b082c4c7d137/storage-initializer/1.log" Apr 22 20:14:38.108595 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:38.108535 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-13f916-predictor-d9cf89fc5-skn6q_e98535e9-0e7b-41a9-b598-b082c4c7d137/storage-initializer/0.log" Apr 22 20:14:38.108595 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:38.108569 2575 generic.go:358] "Generic (PLEG): container finished" podID="e98535e9-0e7b-41a9-b598-b082c4c7d137" containerID="a4afb91199dd65709ff08978708ea442b6960793e154e7f74e3ee48da2ee84aa" exitCode=1 Apr 22 20:14:38.108668 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:38.108612 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-13f916-predictor-d9cf89fc5-skn6q" event={"ID":"e98535e9-0e7b-41a9-b598-b082c4c7d137","Type":"ContainerDied","Data":"a4afb91199dd65709ff08978708ea442b6960793e154e7f74e3ee48da2ee84aa"} Apr 22 20:14:38.108668 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:38.108646 2575 scope.go:117] "RemoveContainer" containerID="1ec238127ff6f1a390eebf2263aa1f4cb33ff601a70057ce40628a9a16469095" Apr 22 20:14:38.109020 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:38.108999 2575 scope.go:117] "RemoveContainer" containerID="1ec238127ff6f1a390eebf2263aa1f4cb33ff601a70057ce40628a9a16469095" Apr 22 20:14:38.118735 ip-10-0-128-239 kubenswrapper[2575]: E0422 20:14:38.118697 2575 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-13f916-predictor-d9cf89fc5-skn6q_kserve-ci-e2e-test_e98535e9-0e7b-41a9-b598-b082c4c7d137_0 in pod sandbox 34b8351116f9375dc5e9f79ec20ef67f6849bf4a19292e633c44d5ef86b91d2a from index: no such id: '1ec238127ff6f1a390eebf2263aa1f4cb33ff601a70057ce40628a9a16469095'" containerID="1ec238127ff6f1a390eebf2263aa1f4cb33ff601a70057ce40628a9a16469095" Apr 22 20:14:38.118823 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:38.118746 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ec238127ff6f1a390eebf2263aa1f4cb33ff601a70057ce40628a9a16469095"} err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-13f916-predictor-d9cf89fc5-skn6q_kserve-ci-e2e-test_e98535e9-0e7b-41a9-b598-b082c4c7d137_0 in pod sandbox 34b8351116f9375dc5e9f79ec20ef67f6849bf4a19292e633c44d5ef86b91d2a from index: no such id: 
Apr 22 20:14:38.118823 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:38.118746 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ec238127ff6f1a390eebf2263aa1f4cb33ff601a70057ce40628a9a16469095"} err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-13f916-predictor-d9cf89fc5-skn6q_kserve-ci-e2e-test_e98535e9-0e7b-41a9-b598-b082c4c7d137_0 in pod sandbox 34b8351116f9375dc5e9f79ec20ef67f6849bf4a19292e633c44d5ef86b91d2a from index: no such id: '1ec238127ff6f1a390eebf2263aa1f4cb33ff601a70057ce40628a9a16469095'"
Apr 22 20:14:38.118945 ip-10-0-128-239 kubenswrapper[2575]: E0422 20:14:38.118925 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-13f916-predictor-d9cf89fc5-skn6q_kserve-ci-e2e-test(e98535e9-0e7b-41a9-b598-b082c4c7d137)\"" pod="kserve-ci-e2e-test/isvc-secondary-13f916-predictor-d9cf89fc5-skn6q" podUID="e98535e9-0e7b-41a9-b598-b082c4c7d137"
Apr 22 20:14:39.112803 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:39.112773 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-13f916-predictor-d9cf89fc5-skn6q_e98535e9-0e7b-41a9-b598-b082c4c7d137/storage-initializer/1.log"
Apr 22 20:14:43.755792 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:43.755760 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-13f916-predictor-d9cf89fc5-skn6q"]
Apr 22 20:14:43.818386 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:43.818356 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-13f916-predictor-7f4b544bb-7sn97"]
Apr 22 20:14:43.818700 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:43.818671 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-13f916-predictor-7f4b544bb-7sn97" podUID="7c279315-6c3a-4cdd-89fb-e644689ecff5" containerName="kserve-container" containerID="cri-o://79f1f551db9f0e3ae3259eefda85068c533b4f16d3dbb1468408a14f0d67c916" gracePeriod=30
Apr 22 20:14:43.857843 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:43.857817 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h"]
Apr 22 20:14:43.862577 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:43.862554 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h"
Apr 22 20:14:43.864658 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:43.864637 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-dcfe87\""
Apr 22 20:14:43.864768 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:43.864721 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-dcfe87-dockercfg-xrjhz\""
Apr 22 20:14:43.868328 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:43.868291 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h"]
Apr 22 20:14:43.893567 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:43.893552 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-13f916-predictor-d9cf89fc5-skn6q_e98535e9-0e7b-41a9-b598-b082c4c7d137/storage-initializer/1.log"
Apr 22 20:14:43.893657 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:43.893602 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-13f916-predictor-d9cf89fc5-skn6q"
Apr 22 20:14:43.914294 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:43.914269 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-13f916-predictor-7f4b544bb-7sn97" podUID="7c279315-6c3a-4cdd-89fb-e644689ecff5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 22 20:14:44.009247 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:44.009187 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/e98535e9-0e7b-41a9-b598-b082c4c7d137-cabundle-cert\") pod \"e98535e9-0e7b-41a9-b598-b082c4c7d137\" (UID: \"e98535e9-0e7b-41a9-b598-b082c4c7d137\") "
Apr 22 20:14:44.009247 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:44.009226 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e98535e9-0e7b-41a9-b598-b082c4c7d137-kserve-provision-location\") pod \"e98535e9-0e7b-41a9-b598-b082c4c7d137\" (UID: \"e98535e9-0e7b-41a9-b598-b082c4c7d137\") "
Apr 22 20:14:44.009406 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:44.009363 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d2b9873e-d0f0-4974-81f0-069756d60cd6-kserve-provision-location\") pod \"isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h\" (UID: \"d2b9873e-d0f0-4974-81f0-069756d60cd6\") " pod="kserve-ci-e2e-test/isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h"
Apr 22 20:14:44.009460 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:44.009419 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/d2b9873e-d0f0-4974-81f0-069756d60cd6-cabundle-cert\") pod \"isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h\" (UID: \"d2b9873e-d0f0-4974-81f0-069756d60cd6\") " pod="kserve-ci-e2e-test/isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h"
Apr 22 20:14:44.009516 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:44.009493 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e98535e9-0e7b-41a9-b598-b082c4c7d137-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e98535e9-0e7b-41a9-b598-b082c4c7d137" (UID: "e98535e9-0e7b-41a9-b598-b082c4c7d137"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:14:44.009568 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:44.009536 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e98535e9-0e7b-41a9-b598-b082c4c7d137-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "e98535e9-0e7b-41a9-b598-b082c4c7d137" (UID: "e98535e9-0e7b-41a9-b598-b082c4c7d137"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 20:14:44.110497 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:44.110464 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d2b9873e-d0f0-4974-81f0-069756d60cd6-kserve-provision-location\") pod \"isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h\" (UID: \"d2b9873e-d0f0-4974-81f0-069756d60cd6\") " pod="kserve-ci-e2e-test/isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h"
Apr 22 20:14:44.110632 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:44.110508 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/d2b9873e-d0f0-4974-81f0-069756d60cd6-cabundle-cert\") pod \"isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h\" (UID: \"d2b9873e-d0f0-4974-81f0-069756d60cd6\") " pod="kserve-ci-e2e-test/isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h"
Apr 22 20:14:44.110632 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:44.110566 2575 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/e98535e9-0e7b-41a9-b598-b082c4c7d137-cabundle-cert\") on node \"ip-10-0-128-239.ec2.internal\" DevicePath \"\""
Apr 22 20:14:44.110632 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:44.110578 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e98535e9-0e7b-41a9-b598-b082c4c7d137-kserve-provision-location\") on node \"ip-10-0-128-239.ec2.internal\" DevicePath \"\""
Apr 22 20:14:44.110841 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:44.110822 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d2b9873e-d0f0-4974-81f0-069756d60cd6-kserve-provision-location\") pod \"isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h\" (UID: \"d2b9873e-d0f0-4974-81f0-069756d60cd6\") " pod="kserve-ci-e2e-test/isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h"
Apr 22 20:14:44.111094 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:44.111078 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/d2b9873e-d0f0-4974-81f0-069756d60cd6-cabundle-cert\") pod \"isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h\" (UID: \"d2b9873e-d0f0-4974-81f0-069756d60cd6\") " pod="kserve-ci-e2e-test/isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h"
Apr 22 20:14:44.126618 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:44.126599 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-13f916-predictor-d9cf89fc5-skn6q_e98535e9-0e7b-41a9-b598-b082c4c7d137/storage-initializer/1.log"
Apr 22 20:14:44.126732 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:44.126642 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-13f916-predictor-d9cf89fc5-skn6q" event={"ID":"e98535e9-0e7b-41a9-b598-b082c4c7d137","Type":"ContainerDied","Data":"34b8351116f9375dc5e9f79ec20ef67f6849bf4a19292e633c44d5ef86b91d2a"}
Apr 22 20:14:44.126732 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:44.126680 2575 scope.go:117] "RemoveContainer" containerID="a4afb91199dd65709ff08978708ea442b6960793e154e7f74e3ee48da2ee84aa"
Apr 22 20:14:44.126732 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:44.126703 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-13f916-predictor-d9cf89fc5-skn6q"
Apr 22 20:14:44.161428 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:44.161398 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-13f916-predictor-d9cf89fc5-skn6q"]
Apr 22 20:14:44.165116 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:44.165096 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-13f916-predictor-d9cf89fc5-skn6q"]
Apr 22 20:14:44.173804 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:44.173786 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h"
Apr 22 20:14:44.303742 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:44.303701 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e98535e9-0e7b-41a9-b598-b082c4c7d137" path="/var/lib/kubelet/pods/e98535e9-0e7b-41a9-b598-b082c4c7d137/volumes"
Apr 22 20:14:44.304086 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:44.304065 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h"]
Apr 22 20:14:44.304357 ip-10-0-128-239 kubenswrapper[2575]: W0422 20:14:44.304329 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2b9873e_d0f0_4974_81f0_069756d60cd6.slice/crio-d03b4072ea223aae6b479e8df5e4bdd3ad23d0fb2813fc700a50bb60c2b7563b WatchSource:0}: Error finding container d03b4072ea223aae6b479e8df5e4bdd3ad23d0fb2813fc700a50bb60c2b7563b: Status 404 returned error can't find the container with id d03b4072ea223aae6b479e8df5e4bdd3ad23d0fb2813fc700a50bb60c2b7563b
Apr 22 20:14:45.131155 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:45.131122 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h" event={"ID":"d2b9873e-d0f0-4974-81f0-069756d60cd6","Type":"ContainerStarted","Data":"0d6138e85f9e178de1c2810452304a6fc484a4fdd33bb3d9b4013dcf6de16821"}
Apr 22 20:14:45.131155 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:45.131157 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h" event={"ID":"d2b9873e-d0f0-4974-81f0-069756d60cd6","Type":"ContainerStarted","Data":"d03b4072ea223aae6b479e8df5e4bdd3ad23d0fb2813fc700a50bb60c2b7563b"}
Apr 22 20:14:47.551642 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:47.551620 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-13f916-predictor-7f4b544bb-7sn97"
Apr 22 20:14:47.737365 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:47.737341 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c279315-6c3a-4cdd-89fb-e644689ecff5-kserve-provision-location\") pod \"7c279315-6c3a-4cdd-89fb-e644689ecff5\" (UID: \"7c279315-6c3a-4cdd-89fb-e644689ecff5\") "
Apr 22 20:14:47.737675 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:47.737650 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c279315-6c3a-4cdd-89fb-e644689ecff5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7c279315-6c3a-4cdd-89fb-e644689ecff5" (UID: "7c279315-6c3a-4cdd-89fb-e644689ecff5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:14:47.838106 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:47.838082 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c279315-6c3a-4cdd-89fb-e644689ecff5-kserve-provision-location\") on node \"ip-10-0-128-239.ec2.internal\" DevicePath \"\""
Apr 22 20:14:48.140246 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:48.140209 2575 generic.go:358] "Generic (PLEG): container finished" podID="7c279315-6c3a-4cdd-89fb-e644689ecff5" containerID="79f1f551db9f0e3ae3259eefda85068c533b4f16d3dbb1468408a14f0d67c916" exitCode=0
Apr 22 20:14:48.140387 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:48.140278 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-13f916-predictor-7f4b544bb-7sn97"
Apr 22 20:14:48.140387 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:48.140286 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-13f916-predictor-7f4b544bb-7sn97" event={"ID":"7c279315-6c3a-4cdd-89fb-e644689ecff5","Type":"ContainerDied","Data":"79f1f551db9f0e3ae3259eefda85068c533b4f16d3dbb1468408a14f0d67c916"}
Apr 22 20:14:48.140387 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:48.140327 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-13f916-predictor-7f4b544bb-7sn97" event={"ID":"7c279315-6c3a-4cdd-89fb-e644689ecff5","Type":"ContainerDied","Data":"1f025b0e584e1c0d441195b43404b9173ca594681f4dd75bb6dfb68ff37c031d"}
Apr 22 20:14:48.140387 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:48.140351 2575 scope.go:117] "RemoveContainer" containerID="79f1f551db9f0e3ae3259eefda85068c533b4f16d3dbb1468408a14f0d67c916"
Apr 22 20:14:48.141741 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:48.141706 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h_d2b9873e-d0f0-4974-81f0-069756d60cd6/storage-initializer/0.log"
Apr 22 20:14:48.141833 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:48.141744 2575 generic.go:358] "Generic (PLEG): container finished" podID="d2b9873e-d0f0-4974-81f0-069756d60cd6" containerID="0d6138e85f9e178de1c2810452304a6fc484a4fdd33bb3d9b4013dcf6de16821" exitCode=1
Apr 22 20:14:48.141833 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:48.141803 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h" event={"ID":"d2b9873e-d0f0-4974-81f0-069756d60cd6","Type":"ContainerDied","Data":"0d6138e85f9e178de1c2810452304a6fc484a4fdd33bb3d9b4013dcf6de16821"}
Apr 22 20:14:48.147969 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:48.147916 2575 scope.go:117] "RemoveContainer" containerID="4ba9def09a4d5c1174e8576f455302fa589688b191bf5121c41969f62e2b35d9"
Apr 22 20:14:48.154548 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:48.154531 2575 scope.go:117] "RemoveContainer" containerID="79f1f551db9f0e3ae3259eefda85068c533b4f16d3dbb1468408a14f0d67c916"
Apr 22 20:14:48.154799 ip-10-0-128-239 kubenswrapper[2575]: E0422 20:14:48.154784 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79f1f551db9f0e3ae3259eefda85068c533b4f16d3dbb1468408a14f0d67c916\": container with ID starting with 79f1f551db9f0e3ae3259eefda85068c533b4f16d3dbb1468408a14f0d67c916 not found: ID does not exist" containerID="79f1f551db9f0e3ae3259eefda85068c533b4f16d3dbb1468408a14f0d67c916"
Apr 22 20:14:48.154862 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:48.154805 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79f1f551db9f0e3ae3259eefda85068c533b4f16d3dbb1468408a14f0d67c916"} err="failed to get container status \"79f1f551db9f0e3ae3259eefda85068c533b4f16d3dbb1468408a14f0d67c916\": rpc error: code = NotFound desc = could not find container \"79f1f551db9f0e3ae3259eefda85068c533b4f16d3dbb1468408a14f0d67c916\": container with ID starting with 79f1f551db9f0e3ae3259eefda85068c533b4f16d3dbb1468408a14f0d67c916 not found: ID does not exist"
Apr 22 20:14:48.154862 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:48.154820 2575 scope.go:117] "RemoveContainer" containerID="4ba9def09a4d5c1174e8576f455302fa589688b191bf5121c41969f62e2b35d9"
Apr 22 20:14:48.155061 ip-10-0-128-239 kubenswrapper[2575]: E0422 20:14:48.155028 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ba9def09a4d5c1174e8576f455302fa589688b191bf5121c41969f62e2b35d9\": container with ID starting with 4ba9def09a4d5c1174e8576f455302fa589688b191bf5121c41969f62e2b35d9 not found: ID does not exist" containerID="4ba9def09a4d5c1174e8576f455302fa589688b191bf5121c41969f62e2b35d9"
Apr 22 20:14:48.155112 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:48.155066 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ba9def09a4d5c1174e8576f455302fa589688b191bf5121c41969f62e2b35d9"} err="failed to get container status \"4ba9def09a4d5c1174e8576f455302fa589688b191bf5121c41969f62e2b35d9\": rpc error: code = NotFound desc = could not find container \"4ba9def09a4d5c1174e8576f455302fa589688b191bf5121c41969f62e2b35d9\": container with ID starting with 4ba9def09a4d5c1174e8576f455302fa589688b191bf5121c41969f62e2b35d9 not found: ID does not exist"
Apr 22 20:14:48.169125 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:48.169102 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-13f916-predictor-7f4b544bb-7sn97"]
Apr 22 20:14:48.172322 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:48.172303 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-13f916-predictor-7f4b544bb-7sn97"]
Apr 22 20:14:48.303133 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:48.303102 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c279315-6c3a-4cdd-89fb-e644689ecff5" path="/var/lib/kubelet/pods/7c279315-6c3a-4cdd-89fb-e644689ecff5/volumes"
Apr 22 20:14:48.852798 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:48.852764 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h"]
Apr 22 20:14:48.972980 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:48.972942 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-2bb0d-predictor-6755c856c-4flnf"]
Apr 22 20:14:48.973305 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:48.973290 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7c279315-6c3a-4cdd-89fb-e644689ecff5" containerName="kserve-container"
Apr 22 20:14:48.973368 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:48.973307 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c279315-6c3a-4cdd-89fb-e644689ecff5" containerName="kserve-container"
Apr 22 20:14:48.973368 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:48.973325 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e98535e9-0e7b-41a9-b598-b082c4c7d137" containerName="storage-initializer"
Apr 22 20:14:48.973368 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:48.973334 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e98535e9-0e7b-41a9-b598-b082c4c7d137" containerName="storage-initializer"
Apr 22 20:14:48.973368 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:48.973346 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7c279315-6c3a-4cdd-89fb-e644689ecff5" containerName="storage-initializer"
Apr 22 20:14:48.973368 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:48.973355 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c279315-6c3a-4cdd-89fb-e644689ecff5" containerName="storage-initializer"
Apr 22 20:14:48.973549 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:48.973416 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e98535e9-0e7b-41a9-b598-b082c4c7d137" containerName="storage-initializer"
Apr 22 20:14:48.973549 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:48.973428 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="7c279315-6c3a-4cdd-89fb-e644689ecff5" containerName="kserve-container"
Apr 22 20:14:48.973549 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:48.973436 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e98535e9-0e7b-41a9-b598-b082c4c7d137" containerName="storage-initializer"
Apr 22 20:14:48.973549 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:48.973495 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e98535e9-0e7b-41a9-b598-b082c4c7d137" containerName="storage-initializer"
Apr 22 20:14:48.973549 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:48.973504 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e98535e9-0e7b-41a9-b598-b082c4c7d137" containerName="storage-initializer"
Apr 22 20:14:48.976726 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:48.976709 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-2bb0d-predictor-6755c856c-4flnf"
Apr 22 20:14:48.978945 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:48.978925 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-hjbmb\""
Apr 22 20:14:48.981609 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:48.981586 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-2bb0d-predictor-6755c856c-4flnf"]
Apr 22 20:14:49.146124 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:49.146103 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h_d2b9873e-d0f0-4974-81f0-069756d60cd6/storage-initializer/0.log"
Apr 22 20:14:49.146293 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:49.146206 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h" event={"ID":"d2b9873e-d0f0-4974-81f0-069756d60cd6","Type":"ContainerStarted","Data":"bd63d5576743a65cf7e17a2bbabfe8c2b9186d610e3ffd2f848ec998faac5332"}
Apr 22 20:14:49.146384 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:49.146357 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h" podUID="d2b9873e-d0f0-4974-81f0-069756d60cd6" containerName="storage-initializer" containerID="cri-o://bd63d5576743a65cf7e17a2bbabfe8c2b9186d610e3ffd2f848ec998faac5332" gracePeriod=30
Apr 22 20:14:49.148358 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:49.148340 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33314084-ed1b-44e7-b322-fcc1910f2ac4-kserve-provision-location\") pod \"raw-sklearn-2bb0d-predictor-6755c856c-4flnf\" (UID: \"33314084-ed1b-44e7-b322-fcc1910f2ac4\") " pod="kserve-ci-e2e-test/raw-sklearn-2bb0d-predictor-6755c856c-4flnf"
Apr 22 20:14:49.249129 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:49.249099 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33314084-ed1b-44e7-b322-fcc1910f2ac4-kserve-provision-location\") pod \"raw-sklearn-2bb0d-predictor-6755c856c-4flnf\" (UID: \"33314084-ed1b-44e7-b322-fcc1910f2ac4\") " pod="kserve-ci-e2e-test/raw-sklearn-2bb0d-predictor-6755c856c-4flnf"
Apr 22 20:14:49.249421 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:49.249404 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33314084-ed1b-44e7-b322-fcc1910f2ac4-kserve-provision-location\") pod \"raw-sklearn-2bb0d-predictor-6755c856c-4flnf\" (UID: \"33314084-ed1b-44e7-b322-fcc1910f2ac4\") " pod="kserve-ci-e2e-test/raw-sklearn-2bb0d-predictor-6755c856c-4flnf"
Apr 22 20:14:49.288837 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:49.288809 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-2bb0d-predictor-6755c856c-4flnf"
Apr 22 20:14:49.414853 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:49.414793 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-2bb0d-predictor-6755c856c-4flnf"]
Apr 22 20:14:49.417814 ip-10-0-128-239 kubenswrapper[2575]: W0422 20:14:49.417780 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33314084_ed1b_44e7_b322_fcc1910f2ac4.slice/crio-48c67c593dd656aa52e11a1a550720ab06af7164fc0fb457b977aa0af6e183ca WatchSource:0}: Error finding container 48c67c593dd656aa52e11a1a550720ab06af7164fc0fb457b977aa0af6e183ca: Status 404 returned error can't find the container with id 48c67c593dd656aa52e11a1a550720ab06af7164fc0fb457b977aa0af6e183ca
Apr 22 20:14:50.151412 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:50.151379 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-2bb0d-predictor-6755c856c-4flnf" event={"ID":"33314084-ed1b-44e7-b322-fcc1910f2ac4","Type":"ContainerStarted","Data":"83cedf90aefd5af37e8e57fc53117c6a08fa400b5004f5518c7228add347fcbd"}
Apr 22 20:14:50.151412 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:50.151417 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-2bb0d-predictor-6755c856c-4flnf" event={"ID":"33314084-ed1b-44e7-b322-fcc1910f2ac4","Type":"ContainerStarted","Data":"48c67c593dd656aa52e11a1a550720ab06af7164fc0fb457b977aa0af6e183ca"}
Apr 22 20:14:53.159518 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:53.159489 2575 generic.go:358] "Generic (PLEG): container finished" podID="33314084-ed1b-44e7-b322-fcc1910f2ac4" containerID="83cedf90aefd5af37e8e57fc53117c6a08fa400b5004f5518c7228add347fcbd" exitCode=0
Apr 22 20:14:53.159861 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:53.159559 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-2bb0d-predictor-6755c856c-4flnf" event={"ID":"33314084-ed1b-44e7-b322-fcc1910f2ac4","Type":"ContainerDied","Data":"83cedf90aefd5af37e8e57fc53117c6a08fa400b5004f5518c7228add347fcbd"}
Apr 22 20:14:53.160836 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:53.160811 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h_d2b9873e-d0f0-4974-81f0-069756d60cd6/storage-initializer/1.log"
Apr 22 20:14:53.161190 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:53.161175 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h_d2b9873e-d0f0-4974-81f0-069756d60cd6/storage-initializer/0.log"
Apr 22 20:14:53.161286 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:53.161204 2575 generic.go:358] "Generic (PLEG): container finished" podID="d2b9873e-d0f0-4974-81f0-069756d60cd6" containerID="bd63d5576743a65cf7e17a2bbabfe8c2b9186d610e3ffd2f848ec998faac5332" exitCode=1
Apr 22 20:14:53.161286 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:53.161263 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h" event={"ID":"d2b9873e-d0f0-4974-81f0-069756d60cd6","Type":"ContainerDied","Data":"bd63d5576743a65cf7e17a2bbabfe8c2b9186d610e3ffd2f848ec998faac5332"}
Apr 22 20:14:53.161423 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:53.161291 2575 scope.go:117] "RemoveContainer" containerID="0d6138e85f9e178de1c2810452304a6fc484a4fdd33bb3d9b4013dcf6de16821"
Apr 22 20:14:53.286617 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:53.286594 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h_d2b9873e-d0f0-4974-81f0-069756d60cd6/storage-initializer/1.log"
Apr 22 20:14:53.286726 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:53.286655 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h"
Apr 22 20:14:53.479152 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:53.479058 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d2b9873e-d0f0-4974-81f0-069756d60cd6-kserve-provision-location\") pod \"d2b9873e-d0f0-4974-81f0-069756d60cd6\" (UID: \"d2b9873e-d0f0-4974-81f0-069756d60cd6\") "
Apr 22 20:14:53.479152 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:53.479139 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/d2b9873e-d0f0-4974-81f0-069756d60cd6-cabundle-cert\") pod \"d2b9873e-d0f0-4974-81f0-069756d60cd6\" (UID: \"d2b9873e-d0f0-4974-81f0-069756d60cd6\") "
Apr 22 20:14:53.479377 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:53.479210 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2b9873e-d0f0-4974-81f0-069756d60cd6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d2b9873e-d0f0-4974-81f0-069756d60cd6" (UID: "d2b9873e-d0f0-4974-81f0-069756d60cd6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:14:53.479377 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:53.479274 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d2b9873e-d0f0-4974-81f0-069756d60cd6-kserve-provision-location\") on node \"ip-10-0-128-239.ec2.internal\" DevicePath \"\""
Apr 22 20:14:53.479547 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:53.479524 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2b9873e-d0f0-4974-81f0-069756d60cd6-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "d2b9873e-d0f0-4974-81f0-069756d60cd6" (UID: "d2b9873e-d0f0-4974-81f0-069756d60cd6"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 20:14:53.580479 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:53.580449 2575 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/d2b9873e-d0f0-4974-81f0-069756d60cd6-cabundle-cert\") on node \"ip-10-0-128-239.ec2.internal\" DevicePath \"\""
Apr 22 20:14:54.165728 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:54.165700 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h_d2b9873e-d0f0-4974-81f0-069756d60cd6/storage-initializer/1.log"
Apr 22 20:14:54.166264 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:54.165817 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h" event={"ID":"d2b9873e-d0f0-4974-81f0-069756d60cd6","Type":"ContainerDied","Data":"d03b4072ea223aae6b479e8df5e4bdd3ad23d0fb2813fc700a50bb60c2b7563b"}
Apr 22 20:14:54.166264 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:54.165845 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h"
Apr 22 20:14:54.166264 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:54.165853 2575 scope.go:117] "RemoveContainer" containerID="bd63d5576743a65cf7e17a2bbabfe8c2b9186d610e3ffd2f848ec998faac5332"
Apr 22 20:14:54.167658 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:54.167635 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-2bb0d-predictor-6755c856c-4flnf" event={"ID":"33314084-ed1b-44e7-b322-fcc1910f2ac4","Type":"ContainerStarted","Data":"99f29b54683752f40e7e5300be7503ec2f1e403cb619a1df6d42f2bdbb751fe6"}
Apr 22 20:14:54.167926 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:54.167903 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-2bb0d-predictor-6755c856c-4flnf"
Apr 22 20:14:54.169713 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:54.169688 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-2bb0d-predictor-6755c856c-4flnf" podUID="33314084-ed1b-44e7-b322-fcc1910f2ac4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused"
Apr 22 20:14:54.183722 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:54.183684 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-2bb0d-predictor-6755c856c-4flnf" podStartSLOduration=6.183673915 podStartE2EDuration="6.183673915s" podCreationTimestamp="2026-04-22 20:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:14:54.18270368 +0000 UTC m=+1018.434203572" watchObservedRunningTime="2026-04-22 20:14:54.183673915 +0000 UTC m=+1018.435173795"
Apr 22 20:14:54.203561 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:54.203540 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h"]
Apr 22 20:14:54.207100 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:54.207079 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-dcfe87-predictor-5b4466c647-ndp8h"]
Apr 22 20:14:54.303825 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:54.303794 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2b9873e-d0f0-4974-81f0-069756d60cd6" path="/var/lib/kubelet/pods/d2b9873e-d0f0-4974-81f0-069756d60cd6/volumes"
Apr 22 20:14:55.171619 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:14:55.171588 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-2bb0d-predictor-6755c856c-4flnf" podUID="33314084-ed1b-44e7-b322-fcc1910f2ac4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused"
Apr 22 20:15:05.172169 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:15:05.172127 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-2bb0d-predictor-6755c856c-4flnf" podUID="33314084-ed1b-44e7-b322-fcc1910f2ac4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused"
Apr 22 20:15:15.172417 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:15:15.172371 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-2bb0d-predictor-6755c856c-4flnf" podUID="33314084-ed1b-44e7-b322-fcc1910f2ac4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused"
Apr 22 20:15:25.172083 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:15:25.172028 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-2bb0d-predictor-6755c856c-4flnf" podUID="33314084-ed1b-44e7-b322-fcc1910f2ac4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused"
Apr 22 20:15:35.171925 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:15:35.171882 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-2bb0d-predictor-6755c856c-4flnf" podUID="33314084-ed1b-44e7-b322-fcc1910f2ac4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused"
Apr 22 20:15:45.172579 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:15:45.172481 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-2bb0d-predictor-6755c856c-4flnf" podUID="33314084-ed1b-44e7-b322-fcc1910f2ac4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused"
Apr 22 20:15:55.172985 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:15:55.172943 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-2bb0d-predictor-6755c856c-4flnf"
Apr 22 20:15:59.086464 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:15:59.086430 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-2bb0d-predictor-6755c856c-4flnf"]
Apr 22 20:15:59.086906 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:15:59.086720 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-2bb0d-predictor-6755c856c-4flnf" podUID="33314084-ed1b-44e7-b322-fcc1910f2ac4" containerName="kserve-container" containerID="cri-o://99f29b54683752f40e7e5300be7503ec2f1e403cb619a1df6d42f2bdbb751fe6" gracePeriod=30
Apr 22 20:15:59.134131 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:15:59.134103 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-e7bec-predictor-7ff77696d9-5jjbh"]
"RemoveStaleState: containerMap: removing container" podUID="d2b9873e-d0f0-4974-81f0-069756d60cd6" containerName="storage-initializer" Apr 22 20:15:59.134401 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:15:59.134341 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2b9873e-d0f0-4974-81f0-069756d60cd6" containerName="storage-initializer" Apr 22 20:15:59.134401 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:15:59.134394 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d2b9873e-d0f0-4974-81f0-069756d60cd6" containerName="storage-initializer" Apr 22 20:15:59.134401 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:15:59.134400 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d2b9873e-d0f0-4974-81f0-069756d60cd6" containerName="storage-initializer" Apr 22 20:15:59.134528 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:15:59.134438 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d2b9873e-d0f0-4974-81f0-069756d60cd6" containerName="storage-initializer" Apr 22 20:15:59.134528 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:15:59.134448 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2b9873e-d0f0-4974-81f0-069756d60cd6" containerName="storage-initializer" Apr 22 20:15:59.137158 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:15:59.137140 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-e7bec-predictor-7ff77696d9-5jjbh" Apr 22 20:15:59.144131 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:15:59.144110 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-e7bec-predictor-7ff77696d9-5jjbh"] Apr 22 20:15:59.231255 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:15:59.231219 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45a1b910-76ee-43e8-a2e9-4fe9f36611ec-kserve-provision-location\") pod \"raw-sklearn-runtime-e7bec-predictor-7ff77696d9-5jjbh\" (UID: \"45a1b910-76ee-43e8-a2e9-4fe9f36611ec\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-e7bec-predictor-7ff77696d9-5jjbh" Apr 22 20:15:59.331573 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:15:59.331536 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45a1b910-76ee-43e8-a2e9-4fe9f36611ec-kserve-provision-location\") pod \"raw-sklearn-runtime-e7bec-predictor-7ff77696d9-5jjbh\" (UID: \"45a1b910-76ee-43e8-a2e9-4fe9f36611ec\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-e7bec-predictor-7ff77696d9-5jjbh" Apr 22 20:15:59.331912 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:15:59.331894 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45a1b910-76ee-43e8-a2e9-4fe9f36611ec-kserve-provision-location\") pod \"raw-sklearn-runtime-e7bec-predictor-7ff77696d9-5jjbh\" (UID: \"45a1b910-76ee-43e8-a2e9-4fe9f36611ec\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-e7bec-predictor-7ff77696d9-5jjbh" Apr 22 20:15:59.448048 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:15:59.448007 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-e7bec-predictor-7ff77696d9-5jjbh" Apr 22 20:15:59.559731 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:15:59.559669 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-e7bec-predictor-7ff77696d9-5jjbh"] Apr 22 20:15:59.563179 ip-10-0-128-239 kubenswrapper[2575]: W0422 20:15:59.563146 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45a1b910_76ee_43e8_a2e9_4fe9f36611ec.slice/crio-41e8dc0c7da4a6909112dbc3058c6be5e1e5047209860abf24112bfbd722cab9 WatchSource:0}: Error finding container 41e8dc0c7da4a6909112dbc3058c6be5e1e5047209860abf24112bfbd722cab9: Status 404 returned error can't find the container with id 41e8dc0c7da4a6909112dbc3058c6be5e1e5047209860abf24112bfbd722cab9 Apr 22 20:16:00.340476 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:16:00.340440 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-e7bec-predictor-7ff77696d9-5jjbh" event={"ID":"45a1b910-76ee-43e8-a2e9-4fe9f36611ec","Type":"ContainerStarted","Data":"bf8f2d4fc4be1e765bffd98c347c67adcde110b129fa1559bb7beb4d3d117f9f"} Apr 22 20:16:00.340476 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:16:00.340475 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-e7bec-predictor-7ff77696d9-5jjbh" event={"ID":"45a1b910-76ee-43e8-a2e9-4fe9f36611ec","Type":"ContainerStarted","Data":"41e8dc0c7da4a6909112dbc3058c6be5e1e5047209860abf24112bfbd722cab9"} Apr 22 20:16:02.818399 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:16:02.818377 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-2bb0d-predictor-6755c856c-4flnf" Apr 22 20:16:02.956286 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:16:02.956214 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33314084-ed1b-44e7-b322-fcc1910f2ac4-kserve-provision-location\") pod \"33314084-ed1b-44e7-b322-fcc1910f2ac4\" (UID: \"33314084-ed1b-44e7-b322-fcc1910f2ac4\") " Apr 22 20:16:02.956515 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:16:02.956493 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33314084-ed1b-44e7-b322-fcc1910f2ac4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "33314084-ed1b-44e7-b322-fcc1910f2ac4" (UID: "33314084-ed1b-44e7-b322-fcc1910f2ac4"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:16:03.057614 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:16:03.057577 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33314084-ed1b-44e7-b322-fcc1910f2ac4-kserve-provision-location\") on node \"ip-10-0-128-239.ec2.internal\" DevicePath \"\"" Apr 22 20:16:03.349728 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:16:03.349652 2575 generic.go:358] "Generic (PLEG): container finished" podID="33314084-ed1b-44e7-b322-fcc1910f2ac4" containerID="99f29b54683752f40e7e5300be7503ec2f1e403cb619a1df6d42f2bdbb751fe6" exitCode=0 Apr 22 20:16:03.349728 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:16:03.349696 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-2bb0d-predictor-6755c856c-4flnf" event={"ID":"33314084-ed1b-44e7-b322-fcc1910f2ac4","Type":"ContainerDied","Data":"99f29b54683752f40e7e5300be7503ec2f1e403cb619a1df6d42f2bdbb751fe6"} Apr 22 20:16:03.349728 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:16:03.349713 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-2bb0d-predictor-6755c856c-4flnf" Apr 22 20:16:03.349728 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:16:03.349728 2575 scope.go:117] "RemoveContainer" containerID="99f29b54683752f40e7e5300be7503ec2f1e403cb619a1df6d42f2bdbb751fe6" Apr 22 20:16:03.350016 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:16:03.349717 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-2bb0d-predictor-6755c856c-4flnf" event={"ID":"33314084-ed1b-44e7-b322-fcc1910f2ac4","Type":"ContainerDied","Data":"48c67c593dd656aa52e11a1a550720ab06af7164fc0fb457b977aa0af6e183ca"} Apr 22 20:16:03.357662 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:16:03.357622 2575 scope.go:117] "RemoveContainer" containerID="83cedf90aefd5af37e8e57fc53117c6a08fa400b5004f5518c7228add347fcbd" Apr 22 20:16:03.364142 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:16:03.364117 2575 scope.go:117] "RemoveContainer" containerID="99f29b54683752f40e7e5300be7503ec2f1e403cb619a1df6d42f2bdbb751fe6" Apr 22 20:16:03.364370 ip-10-0-128-239 kubenswrapper[2575]: E0422 20:16:03.364351 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99f29b54683752f40e7e5300be7503ec2f1e403cb619a1df6d42f2bdbb751fe6\": container with ID starting with 99f29b54683752f40e7e5300be7503ec2f1e403cb619a1df6d42f2bdbb751fe6 not found: ID does not exist" containerID="99f29b54683752f40e7e5300be7503ec2f1e403cb619a1df6d42f2bdbb751fe6" Apr 22 20:16:03.364448 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:16:03.364384 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99f29b54683752f40e7e5300be7503ec2f1e403cb619a1df6d42f2bdbb751fe6"} err="failed to get container status \"99f29b54683752f40e7e5300be7503ec2f1e403cb619a1df6d42f2bdbb751fe6\": rpc error: code = NotFound desc = could not find container \"99f29b54683752f40e7e5300be7503ec2f1e403cb619a1df6d42f2bdbb751fe6\": container with ID starting with 99f29b54683752f40e7e5300be7503ec2f1e403cb619a1df6d42f2bdbb751fe6 not found: ID does not exist" Apr 22 20:16:03.364448 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:16:03.364408 2575 scope.go:117] "RemoveContainer" containerID="83cedf90aefd5af37e8e57fc53117c6a08fa400b5004f5518c7228add347fcbd" Apr 22 20:16:03.364652 ip-10-0-128-239 
kubenswrapper[2575]: E0422 20:16:03.364629 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83cedf90aefd5af37e8e57fc53117c6a08fa400b5004f5518c7228add347fcbd\": container with ID starting with 83cedf90aefd5af37e8e57fc53117c6a08fa400b5004f5518c7228add347fcbd not found: ID does not exist" containerID="83cedf90aefd5af37e8e57fc53117c6a08fa400b5004f5518c7228add347fcbd" Apr 22 20:16:03.364691 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:16:03.364663 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83cedf90aefd5af37e8e57fc53117c6a08fa400b5004f5518c7228add347fcbd"} err="failed to get container status \"83cedf90aefd5af37e8e57fc53117c6a08fa400b5004f5518c7228add347fcbd\": rpc error: code = NotFound desc = could not find container \"83cedf90aefd5af37e8e57fc53117c6a08fa400b5004f5518c7228add347fcbd\": container with ID starting with 83cedf90aefd5af37e8e57fc53117c6a08fa400b5004f5518c7228add347fcbd not found: ID does not exist" Apr 22 20:16:03.369239 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:16:03.369219 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-2bb0d-predictor-6755c856c-4flnf"] Apr 22 20:16:03.372397 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:16:03.372376 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-2bb0d-predictor-6755c856c-4flnf"] Apr 22 20:16:04.303995 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:16:04.303958 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33314084-ed1b-44e7-b322-fcc1910f2ac4" path="/var/lib/kubelet/pods/33314084-ed1b-44e7-b322-fcc1910f2ac4/volumes" Apr 22 20:16:04.353187 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:16:04.353163 2575 generic.go:358] "Generic (PLEG): container finished" podID="45a1b910-76ee-43e8-a2e9-4fe9f36611ec" containerID="bf8f2d4fc4be1e765bffd98c347c67adcde110b129fa1559bb7beb4d3d117f9f" exitCode=0 Apr 22 20:16:04.353320 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:16:04.353227 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-e7bec-predictor-7ff77696d9-5jjbh" event={"ID":"45a1b910-76ee-43e8-a2e9-4fe9f36611ec","Type":"ContainerDied","Data":"bf8f2d4fc4be1e765bffd98c347c67adcde110b129fa1559bb7beb4d3d117f9f"} Apr 22 20:16:05.363238 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:16:05.363205 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-e7bec-predictor-7ff77696d9-5jjbh" event={"ID":"45a1b910-76ee-43e8-a2e9-4fe9f36611ec","Type":"ContainerStarted","Data":"e0e1316f7677579a3b31d92c0000aa4d52fc0f06fb9d6a34d1da3eb2a5dbc649"} Apr 22 20:16:05.363643 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:16:05.363572 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-e7bec-predictor-7ff77696d9-5jjbh" Apr 22 20:16:05.364780 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:16:05.364756 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-e7bec-predictor-7ff77696d9-5jjbh" podUID="45a1b910-76ee-43e8-a2e9-4fe9f36611ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 22 20:16:05.380523 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:16:05.380485 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/raw-sklearn-runtime-e7bec-predictor-7ff77696d9-5jjbh" podStartSLOduration=6.380474479 podStartE2EDuration="6.380474479s" podCreationTimestamp="2026-04-22 20:15:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:16:05.379567966 +0000 UTC m=+1089.631067855" watchObservedRunningTime="2026-04-22 20:16:05.380474479 +0000 UTC m=+1089.631974367" Apr 22 20:16:06.366432 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:16:06.366386 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-e7bec-predictor-7ff77696d9-5jjbh" podUID="45a1b910-76ee-43e8-a2e9-4fe9f36611ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 22 20:16:16.366607 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:16:16.366564 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-e7bec-predictor-7ff77696d9-5jjbh" podUID="45a1b910-76ee-43e8-a2e9-4fe9f36611ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 22 20:16:26.366562 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:16:26.366523 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-e7bec-predictor-7ff77696d9-5jjbh" podUID="45a1b910-76ee-43e8-a2e9-4fe9f36611ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 22 20:16:36.366769 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:16:36.366729 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-e7bec-predictor-7ff77696d9-5jjbh" podUID="45a1b910-76ee-43e8-a2e9-4fe9f36611ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 22 20:16:46.366388 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:16:46.366351 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-e7bec-predictor-7ff77696d9-5jjbh" podUID="45a1b910-76ee-43e8-a2e9-4fe9f36611ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 22 20:16:56.367208 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:16:56.367172 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-e7bec-predictor-7ff77696d9-5jjbh" podUID="45a1b910-76ee-43e8-a2e9-4fe9f36611ec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 22 20:17:06.368241 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:06.368207 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-e7bec-predictor-7ff77696d9-5jjbh" Apr 22 20:17:09.252898 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:09.252867 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-e7bec-predictor-7ff77696d9-5jjbh"] Apr 22 20:17:09.253379 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:09.253134 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-e7bec-predictor-7ff77696d9-5jjbh" podUID="45a1b910-76ee-43e8-a2e9-4fe9f36611ec" containerName="kserve-container" 
containerID="cri-o://e0e1316f7677579a3b31d92c0000aa4d52fc0f06fb9d6a34d1da3eb2a5dbc649" gracePeriod=30 Apr 22 20:17:12.984382 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:12.984360 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-e7bec-predictor-7ff77696d9-5jjbh" Apr 22 20:17:13.021704 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:13.021641 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45a1b910-76ee-43e8-a2e9-4fe9f36611ec-kserve-provision-location\") pod \"45a1b910-76ee-43e8-a2e9-4fe9f36611ec\" (UID: \"45a1b910-76ee-43e8-a2e9-4fe9f36611ec\") " Apr 22 20:17:13.021941 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:13.021921 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45a1b910-76ee-43e8-a2e9-4fe9f36611ec-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "45a1b910-76ee-43e8-a2e9-4fe9f36611ec" (UID: "45a1b910-76ee-43e8-a2e9-4fe9f36611ec"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:17:13.122902 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:13.122872 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/45a1b910-76ee-43e8-a2e9-4fe9f36611ec-kserve-provision-location\") on node \"ip-10-0-128-239.ec2.internal\" DevicePath \"\"" Apr 22 20:17:13.532062 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:13.532017 2575 generic.go:358] "Generic (PLEG): container finished" podID="45a1b910-76ee-43e8-a2e9-4fe9f36611ec" containerID="e0e1316f7677579a3b31d92c0000aa4d52fc0f06fb9d6a34d1da3eb2a5dbc649" exitCode=0 Apr 22 20:17:13.532207 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:13.532088 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-e7bec-predictor-7ff77696d9-5jjbh" Apr 22 20:17:13.532207 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:13.532100 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-e7bec-predictor-7ff77696d9-5jjbh" event={"ID":"45a1b910-76ee-43e8-a2e9-4fe9f36611ec","Type":"ContainerDied","Data":"e0e1316f7677579a3b31d92c0000aa4d52fc0f06fb9d6a34d1da3eb2a5dbc649"} Apr 22 20:17:13.532207 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:13.532139 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-e7bec-predictor-7ff77696d9-5jjbh" event={"ID":"45a1b910-76ee-43e8-a2e9-4fe9f36611ec","Type":"ContainerDied","Data":"41e8dc0c7da4a6909112dbc3058c6be5e1e5047209860abf24112bfbd722cab9"} Apr 22 20:17:13.532207 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:13.532156 2575 scope.go:117] "RemoveContainer" containerID="e0e1316f7677579a3b31d92c0000aa4d52fc0f06fb9d6a34d1da3eb2a5dbc649" Apr 22 20:17:13.539949 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:13.539933 2575 scope.go:117] "RemoveContainer" containerID="bf8f2d4fc4be1e765bffd98c347c67adcde110b129fa1559bb7beb4d3d117f9f" Apr 22 20:17:13.546462 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:13.546440 2575 scope.go:117] "RemoveContainer" containerID="e0e1316f7677579a3b31d92c0000aa4d52fc0f06fb9d6a34d1da3eb2a5dbc649" Apr 22 20:17:13.546714 ip-10-0-128-239 kubenswrapper[2575]: E0422 20:17:13.546696 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0e1316f7677579a3b31d92c0000aa4d52fc0f06fb9d6a34d1da3eb2a5dbc649\": container with ID starting with e0e1316f7677579a3b31d92c0000aa4d52fc0f06fb9d6a34d1da3eb2a5dbc649 not found: ID does not exist" containerID="e0e1316f7677579a3b31d92c0000aa4d52fc0f06fb9d6a34d1da3eb2a5dbc649" Apr 22 20:17:13.546769 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:13.546722 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0e1316f7677579a3b31d92c0000aa4d52fc0f06fb9d6a34d1da3eb2a5dbc649"} err="failed to get container status \"e0e1316f7677579a3b31d92c0000aa4d52fc0f06fb9d6a34d1da3eb2a5dbc649\": rpc error: code = NotFound desc = could not find container \"e0e1316f7677579a3b31d92c0000aa4d52fc0f06fb9d6a34d1da3eb2a5dbc649\": container with ID starting with e0e1316f7677579a3b31d92c0000aa4d52fc0f06fb9d6a34d1da3eb2a5dbc649 not found: ID does not exist" Apr 22 20:17:13.546769 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:13.546741 2575 scope.go:117] "RemoveContainer" containerID="bf8f2d4fc4be1e765bffd98c347c67adcde110b129fa1559bb7beb4d3d117f9f" Apr 22 20:17:13.546964 ip-10-0-128-239 kubenswrapper[2575]: E0422 20:17:13.546945 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf8f2d4fc4be1e765bffd98c347c67adcde110b129fa1559bb7beb4d3d117f9f\": container with ID starting with bf8f2d4fc4be1e765bffd98c347c67adcde110b129fa1559bb7beb4d3d117f9f not found: ID does not exist" containerID="bf8f2d4fc4be1e765bffd98c347c67adcde110b129fa1559bb7beb4d3d117f9f" Apr 22 20:17:13.547003 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:13.546971 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf8f2d4fc4be1e765bffd98c347c67adcde110b129fa1559bb7beb4d3d117f9f"} err="failed to get container status 
\"bf8f2d4fc4be1e765bffd98c347c67adcde110b129fa1559bb7beb4d3d117f9f\": rpc error: code = NotFound desc = could not find container \"bf8f2d4fc4be1e765bffd98c347c67adcde110b129fa1559bb7beb4d3d117f9f\": container with ID starting with bf8f2d4fc4be1e765bffd98c347c67adcde110b129fa1559bb7beb4d3d117f9f not found: ID does not exist" Apr 22 20:17:13.551450 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:13.551428 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-e7bec-predictor-7ff77696d9-5jjbh"] Apr 22 20:17:13.555006 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:13.554984 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-e7bec-predictor-7ff77696d9-5jjbh"] Apr 22 20:17:14.303385 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:14.303352 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45a1b910-76ee-43e8-a2e9-4fe9f36611ec" path="/var/lib/kubelet/pods/45a1b910-76ee-43e8-a2e9-4fe9f36611ec/volumes" Apr 22 20:17:37.710459 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:37.710425 2575 ???:1] "http: TLS handshake error from 10.0.131.194:58876: EOF" Apr 22 20:17:37.712591 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:37.712570 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-d46mn_d02dd904-b70b-4d97-9033-5614a158edbf/global-pull-secret-syncer/0.log" Apr 22 20:17:37.874861 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:37.874834 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-9jzqm_7d736e66-a7dc-4492-9e64-68b254bc8afa/konnectivity-agent/0.log" Apr 22 20:17:37.998926 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:37.998866 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-239.ec2.internal_6653588b6ccc7c46306d530ef0bfbee3/haproxy/0.log" Apr 22 20:17:41.858463 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:41.858430 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-phnrd_201a3fbe-7126-4565-b184-2dbb1cc12e69/node-exporter/0.log" Apr 22 20:17:41.876967 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:41.876935 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-phnrd_201a3fbe-7126-4565-b184-2dbb1cc12e69/kube-rbac-proxy/0.log" Apr 22 20:17:41.896461 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:41.896442 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-phnrd_201a3fbe-7126-4565-b184-2dbb1cc12e69/init-textfile/0.log" Apr 22 20:17:44.957613 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:44.957579 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mv695/perf-node-gather-daemonset-2gvfs"] Apr 22 20:17:44.958110 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:44.957913 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45a1b910-76ee-43e8-a2e9-4fe9f36611ec" containerName="storage-initializer" Apr 22 20:17:44.958110 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:44.957930 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="45a1b910-76ee-43e8-a2e9-4fe9f36611ec" containerName="storage-initializer" Apr 22 20:17:44.958110 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:44.957944 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33314084-ed1b-44e7-b322-fcc1910f2ac4" 
containerName="kserve-container" Apr 22 20:17:44.958110 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:44.957952 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="33314084-ed1b-44e7-b322-fcc1910f2ac4" containerName="kserve-container" Apr 22 20:17:44.958110 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:44.957960 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33314084-ed1b-44e7-b322-fcc1910f2ac4" containerName="storage-initializer" Apr 22 20:17:44.958110 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:44.957967 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="33314084-ed1b-44e7-b322-fcc1910f2ac4" containerName="storage-initializer" Apr 22 20:17:44.958110 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:44.957985 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45a1b910-76ee-43e8-a2e9-4fe9f36611ec" containerName="kserve-container" Apr 22 20:17:44.958110 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:44.957993 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="45a1b910-76ee-43e8-a2e9-4fe9f36611ec" containerName="kserve-container" Apr 22 20:17:44.958110 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:44.958071 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="33314084-ed1b-44e7-b322-fcc1910f2ac4" containerName="kserve-container" Apr 22 20:17:44.958110 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:44.958084 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="45a1b910-76ee-43e8-a2e9-4fe9f36611ec" containerName="kserve-container" Apr 22 20:17:44.961116 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:44.961097 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mv695/perf-node-gather-daemonset-2gvfs" Apr 22 20:17:44.964332 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:44.964308 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mv695\"/\"openshift-service-ca.crt\"" Apr 22 20:17:44.965160 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:44.965146 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-mv695\"/\"default-dockercfg-kkmqp\"" Apr 22 20:17:44.965231 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:44.965204 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mv695\"/\"kube-root-ca.crt\"" Apr 22 20:17:44.975040 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:44.975012 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mv695/perf-node-gather-daemonset-2gvfs"] Apr 22 20:17:45.027796 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:45.027761 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ec95c2aa-b88b-488d-abd2-b88cf6442f32-proc\") pod \"perf-node-gather-daemonset-2gvfs\" (UID: \"ec95c2aa-b88b-488d-abd2-b88cf6442f32\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-2gvfs" Apr 22 20:17:45.027796 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:45.027798 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ec95c2aa-b88b-488d-abd2-b88cf6442f32-lib-modules\") pod \"perf-node-gather-daemonset-2gvfs\" (UID: \"ec95c2aa-b88b-488d-abd2-b88cf6442f32\") " 
pod="openshift-must-gather-mv695/perf-node-gather-daemonset-2gvfs" Apr 22 20:17:45.028064 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:45.027824 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk9ld\" (UniqueName: \"kubernetes.io/projected/ec95c2aa-b88b-488d-abd2-b88cf6442f32-kube-api-access-gk9ld\") pod \"perf-node-gather-daemonset-2gvfs\" (UID: \"ec95c2aa-b88b-488d-abd2-b88cf6442f32\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-2gvfs" Apr 22 20:17:45.028064 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:45.027882 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ec95c2aa-b88b-488d-abd2-b88cf6442f32-sys\") pod \"perf-node-gather-daemonset-2gvfs\" (UID: \"ec95c2aa-b88b-488d-abd2-b88cf6442f32\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-2gvfs" Apr 22 20:17:45.028064 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:45.027970 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ec95c2aa-b88b-488d-abd2-b88cf6442f32-podres\") pod \"perf-node-gather-daemonset-2gvfs\" (UID: \"ec95c2aa-b88b-488d-abd2-b88cf6442f32\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-2gvfs" Apr 22 20:17:45.128786 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:45.128759 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ec95c2aa-b88b-488d-abd2-b88cf6442f32-sys\") pod \"perf-node-gather-daemonset-2gvfs\" (UID: \"ec95c2aa-b88b-488d-abd2-b88cf6442f32\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-2gvfs" Apr 22 20:17:45.128884 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:45.128803 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ec95c2aa-b88b-488d-abd2-b88cf6442f32-podres\") pod \"perf-node-gather-daemonset-2gvfs\" (UID: \"ec95c2aa-b88b-488d-abd2-b88cf6442f32\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-2gvfs" Apr 22 20:17:45.128884 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:45.128845 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ec95c2aa-b88b-488d-abd2-b88cf6442f32-proc\") pod \"perf-node-gather-daemonset-2gvfs\" (UID: \"ec95c2aa-b88b-488d-abd2-b88cf6442f32\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-2gvfs" Apr 22 20:17:45.128884 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:45.128863 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ec95c2aa-b88b-488d-abd2-b88cf6442f32-lib-modules\") pod \"perf-node-gather-daemonset-2gvfs\" (UID: \"ec95c2aa-b88b-488d-abd2-b88cf6442f32\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-2gvfs" Apr 22 20:17:45.128884 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:45.128868 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ec95c2aa-b88b-488d-abd2-b88cf6442f32-sys\") pod \"perf-node-gather-daemonset-2gvfs\" (UID: \"ec95c2aa-b88b-488d-abd2-b88cf6442f32\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-2gvfs" Apr 22 20:17:45.128884 ip-10-0-128-239 kubenswrapper[2575]: I0422 
Apr 22 20:17:45.128884 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:45.128880 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gk9ld\" (UniqueName: \"kubernetes.io/projected/ec95c2aa-b88b-488d-abd2-b88cf6442f32-kube-api-access-gk9ld\") pod \"perf-node-gather-daemonset-2gvfs\" (UID: \"ec95c2aa-b88b-488d-abd2-b88cf6442f32\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-2gvfs"
Apr 22 20:17:45.129070 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:45.128995 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ec95c2aa-b88b-488d-abd2-b88cf6442f32-proc\") pod \"perf-node-gather-daemonset-2gvfs\" (UID: \"ec95c2aa-b88b-488d-abd2-b88cf6442f32\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-2gvfs"
Apr 22 20:17:45.129070 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:45.129025 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ec95c2aa-b88b-488d-abd2-b88cf6442f32-lib-modules\") pod \"perf-node-gather-daemonset-2gvfs\" (UID: \"ec95c2aa-b88b-488d-abd2-b88cf6442f32\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-2gvfs"
Apr 22 20:17:45.129070 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:45.129062 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ec95c2aa-b88b-488d-abd2-b88cf6442f32-podres\") pod \"perf-node-gather-daemonset-2gvfs\" (UID: \"ec95c2aa-b88b-488d-abd2-b88cf6442f32\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-2gvfs"
Apr 22 20:17:45.137244 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:45.137225 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk9ld\" (UniqueName: \"kubernetes.io/projected/ec95c2aa-b88b-488d-abd2-b88cf6442f32-kube-api-access-gk9ld\") pod \"perf-node-gather-daemonset-2gvfs\" (UID: \"ec95c2aa-b88b-488d-abd2-b88cf6442f32\") " pod="openshift-must-gather-mv695/perf-node-gather-daemonset-2gvfs"
Need to start a new one" pod="openshift-must-gather-mv695/perf-node-gather-daemonset-2gvfs" Apr 22 20:17:45.381207 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:45.381176 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mv695/perf-node-gather-daemonset-2gvfs"] Apr 22 20:17:45.383912 ip-10-0-128-239 kubenswrapper[2575]: W0422 20:17:45.383883 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podec95c2aa_b88b_488d_abd2_b88cf6442f32.slice/crio-992c4e19d83c5c5e12974031db05a841d1e34614a6c148ddd54bda85262e6cdd WatchSource:0}: Error finding container 992c4e19d83c5c5e12974031db05a841d1e34614a6c148ddd54bda85262e6cdd: Status 404 returned error can't find the container with id 992c4e19d83c5c5e12974031db05a841d1e34614a6c148ddd54bda85262e6cdd Apr 22 20:17:45.523592 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:45.523539 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mdzds_7cb084c1-c4af-443e-afce-025ccb08ba3f/dns/0.log" Apr 22 20:17:45.542458 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:45.542436 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mdzds_7cb084c1-c4af-443e-afce-025ccb08ba3f/kube-rbac-proxy/0.log" Apr 22 20:17:45.622118 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:45.622092 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mv695/perf-node-gather-daemonset-2gvfs" event={"ID":"ec95c2aa-b88b-488d-abd2-b88cf6442f32","Type":"ContainerStarted","Data":"d003a90d4468b4911cde023e2b29066d6cc1ec19b8e9ec6a34e1555676861e28"} Apr 22 20:17:45.622206 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:45.622124 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mv695/perf-node-gather-daemonset-2gvfs" event={"ID":"ec95c2aa-b88b-488d-abd2-b88cf6442f32","Type":"ContainerStarted","Data":"992c4e19d83c5c5e12974031db05a841d1e34614a6c148ddd54bda85262e6cdd"} Apr 22 20:17:45.622247 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:45.622212 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-mv695/perf-node-gather-daemonset-2gvfs" Apr 22 20:17:45.628970 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:45.628944 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mht87_1e73951e-095f-4406-93d0-afb41cb12c4b/dns-node-resolver/0.log" Apr 22 20:17:45.639358 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:45.639324 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mv695/perf-node-gather-daemonset-2gvfs" podStartSLOduration=1.639313419 podStartE2EDuration="1.639313419s" podCreationTimestamp="2026-04-22 20:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:17:45.638150639 +0000 UTC m=+1189.889650527" watchObservedRunningTime="2026-04-22 20:17:45.639313419 +0000 UTC m=+1189.890813308" Apr 22 20:17:46.012142 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:46.012118 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7c8cf6cfd5-wrpnx_14040862-cf36-42dc-928c-1c43754396db/registry/0.log" Apr 22 20:17:46.050386 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:46.050362 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-image-registry_node-ca-nhz6m_658fb26d-962e-43ef-9be4-c89b573ecb41/node-ca/0.log" Apr 22 20:17:47.094107 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:47.094076 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-vjxnz_85f994f1-e18a-4886-bc92-4b762096129a/serve-healthcheck-canary/0.log" Apr 22 20:17:47.604540 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:47.604503 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jldrx_36d813c0-2bc0-48c5-a696-ae872c3dcca4/kube-rbac-proxy/0.log" Apr 22 20:17:47.623004 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:47.622972 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jldrx_36d813c0-2bc0-48c5-a696-ae872c3dcca4/exporter/0.log" Apr 22 20:17:47.643437 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:47.643413 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jldrx_36d813c0-2bc0-48c5-a696-ae872c3dcca4/extractor/0.log" Apr 22 20:17:49.684968 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:49.684932 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-v7jgh_82ba789a-2f8a-44e1-99a9-e31c9d915ded/manager/0.log" Apr 22 20:17:51.633425 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:51.633391 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-mv695/perf-node-gather-daemonset-2gvfs" Apr 22 20:17:54.649022 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:54.648996 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m69k4_6e1ef928-0940-490e-89a9-e75af398fadb/kube-multus-additional-cni-plugins/0.log" Apr 22 20:17:54.670134 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:54.670111 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m69k4_6e1ef928-0940-490e-89a9-e75af398fadb/egress-router-binary-copy/0.log" Apr 22 20:17:54.695461 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:54.695437 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m69k4_6e1ef928-0940-490e-89a9-e75af398fadb/cni-plugins/0.log" Apr 22 20:17:54.714503 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:54.714485 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m69k4_6e1ef928-0940-490e-89a9-e75af398fadb/bond-cni-plugin/0.log" Apr 22 20:17:54.734651 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:54.734634 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m69k4_6e1ef928-0940-490e-89a9-e75af398fadb/routeoverride-cni/0.log" Apr 22 20:17:54.754650 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:54.754630 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m69k4_6e1ef928-0940-490e-89a9-e75af398fadb/whereabouts-cni-bincopy/0.log" Apr 22 20:17:54.779264 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:54.779249 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m69k4_6e1ef928-0940-490e-89a9-e75af398fadb/whereabouts-cni/0.log" Apr 22 20:17:55.138938 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:55.138914 2575 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-multus_multus-hrx7n_bc9dd9ba-6ee7-42ef-bef4-834abd02ac13/kube-multus/0.log" Apr 22 20:17:55.230640 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:55.230615 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gptpd_80c4f1f9-74db-4e1b-bb9c-05d1618ca285/network-metrics-daemon/0.log" Apr 22 20:17:55.248396 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:55.248376 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gptpd_80c4f1f9-74db-4e1b-bb9c-05d1618ca285/kube-rbac-proxy/0.log" Apr 22 20:17:56.337503 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:56.337479 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f7mbc_a6c5f956-034b-486c-80ee-8f8ff4328b7f/ovn-controller/0.log" Apr 22 20:17:56.364298 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:56.364277 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f7mbc_a6c5f956-034b-486c-80ee-8f8ff4328b7f/ovn-acl-logging/0.log" Apr 22 20:17:56.379965 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:56.379948 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f7mbc_a6c5f956-034b-486c-80ee-8f8ff4328b7f/kube-rbac-proxy-node/0.log" Apr 22 20:17:56.401166 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:56.401143 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f7mbc_a6c5f956-034b-486c-80ee-8f8ff4328b7f/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 20:17:56.423435 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:56.423420 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f7mbc_a6c5f956-034b-486c-80ee-8f8ff4328b7f/northd/0.log" Apr 22 20:17:56.447110 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:56.447087 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f7mbc_a6c5f956-034b-486c-80ee-8f8ff4328b7f/nbdb/0.log" Apr 22 20:17:56.467689 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:56.467672 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f7mbc_a6c5f956-034b-486c-80ee-8f8ff4328b7f/sbdb/0.log" Apr 22 20:17:56.555576 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:56.555525 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f7mbc_a6c5f956-034b-486c-80ee-8f8ff4328b7f/ovnkube-controller/0.log" Apr 22 20:17:57.902180 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:57.902145 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-v2xhn_a0ee49c5-2186-4f09-9a83-ceabc81a52e5/network-check-target-container/0.log" Apr 22 20:17:58.797507 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:58.797476 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-sflf8_e5e122d2-2065-483b-a689-7d9721ed4c07/iptables-alerter/0.log" Apr 22 20:17:59.463719 ip-10-0-128-239 kubenswrapper[2575]: I0422 20:17:59.463689 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-lqzhq_46001a3b-bba4-48eb-b1ca-5e0377ba88fd/tuned/0.log"