Apr 22 19:57:43.059606 ip-10-0-133-60 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 19:57:43.513671 ip-10-0-133-60 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:57:43.513671 ip-10-0-133-60 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 19:57:43.513671 ip-10-0-133-60 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:57:43.513671 ip-10-0-133-60 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 19:57:43.513671 ip-10-0-133-60 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:57:43.516119 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.516030 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 19:57:43.525040 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525015 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:57:43.525040 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525036 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:57:43.525040 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525040 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:57:43.525040 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525043 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:57:43.525040 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525046 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:57:43.525040 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525049 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:57:43.525277 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525052 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:57:43.525277 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525056 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:57:43.525277 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525059 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:57:43.525277 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525061 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:57:43.525277 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525064 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:57:43.525277 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525067 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:57:43.525277 ip-10-0-133-60
kubenswrapper[2575]: W0422 19:57:43.525070 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:57:43.525277 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525073 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:57:43.525277 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525075 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:57:43.525277 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525078 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:57:43.525277 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525082 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:57:43.525277 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525085 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:57:43.525277 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525088 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:57:43.525277 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525090 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:57:43.525277 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525093 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:57:43.525277 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525096 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:57:43.525277 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525099 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:57:43.525277 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525101 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:57:43.525277 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525104 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:57:43.525277 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525106 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:57:43.525782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525109 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:57:43.525782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525111 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:57:43.525782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525114 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:57:43.525782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525117 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:57:43.525782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525120 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:57:43.525782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525122 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:57:43.525782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525125 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:57:43.525782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525128 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:57:43.525782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525130 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:57:43.525782 
ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525133 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:57:43.525782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525135 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:57:43.525782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525138 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:57:43.525782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525140 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:57:43.525782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525144 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:57:43.525782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525147 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:57:43.525782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525149 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:57:43.525782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525152 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:57:43.525782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525156 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 19:57:43.525782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525160 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:57:43.525782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525163 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:57:43.526306 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525166 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:57:43.526306 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525169 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:57:43.526306 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525172 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:57:43.526306 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525175 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:57:43.526306 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525177 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:57:43.526306 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525180 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:57:43.526306 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525184 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 19:57:43.526306 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525189 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:57:43.526306 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525191 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:57:43.526306 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525195 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:57:43.526306 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525197 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:57:43.526306 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525200 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:57:43.526306 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525203 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:57:43.526306 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525206 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:57:43.526306 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525208 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:57:43.526306 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525211 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:57:43.526306 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525213 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:57:43.526306 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525216 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:57:43.526306 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525218 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:57:43.526802 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525221 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:57:43.526802 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525223 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:57:43.526802 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525226 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:57:43.526802 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525228 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:57:43.526802 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525231 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:57:43.526802 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525233 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:57:43.526802 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525237 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:57:43.526802 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525240 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:57:43.526802 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525242 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:57:43.526802 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525245 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:57:43.526802 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525248 2575 
feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:57:43.526802 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525250 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:57:43.526802 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525253 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:57:43.526802 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525257 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:57:43.526802 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525261 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:57:43.526802 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525264 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:57:43.526802 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525267 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:57:43.526802 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525269 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:57:43.526802 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525272 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:57:43.527261 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525274 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:57:43.527261 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525277 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:57:43.527261 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525731 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:57:43.527261 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525738 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:57:43.527261 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525742 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:57:43.527261 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525745 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:57:43.527261 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525748 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:57:43.527261 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525751 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:57:43.527261 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525754 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:57:43.527261 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525757 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:57:43.527261 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525760 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:57:43.527261 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525762 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:57:43.527261 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525765 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:57:43.527261 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525768 2575 feature_gate.go:328] unrecognized feature gate: 
InsightsConfigAPI Apr 22 19:57:43.527261 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525771 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:57:43.527261 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525774 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:57:43.527261 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525776 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:57:43.527261 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525779 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:57:43.527261 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525782 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:57:43.527261 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525785 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:57:43.527770 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525788 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:57:43.527770 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525791 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:57:43.527770 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525794 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:57:43.527770 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525796 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:57:43.527770 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525799 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:57:43.527770 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525801 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:57:43.527770 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525805 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:57:43.527770 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525807 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:57:43.527770 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525810 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:57:43.527770 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525812 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:57:43.527770 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525815 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:57:43.527770 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525819 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 19:57:43.527770 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525823 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:57:43.527770 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525825 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:57:43.527770 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525828 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:57:43.527770 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525831 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:57:43.527770 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525834 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:57:43.527770 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525836 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:57:43.527770 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525839 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:57:43.528275 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525841 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:57:43.528275 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525844 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:57:43.528275 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525847 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:57:43.528275 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525849 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:57:43.528275 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525852 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:57:43.528275 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525854 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:57:43.528275 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525857 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:57:43.528275 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525859 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:57:43.528275 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525862 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:57:43.528275 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525866 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:57:43.528275 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525869 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:57:43.528275 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525871 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:57:43.528275 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525875 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:57:43.528275 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525878 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:57:43.528275 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525880 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 
22 19:57:43.528275 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525883 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:57:43.528275 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525886 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:57:43.528275 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525888 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:57:43.528275 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525891 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:57:43.528749 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525894 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:57:43.528749 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525897 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:57:43.528749 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525899 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:57:43.528749 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525902 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:57:43.528749 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525905 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:57:43.528749 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525907 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:57:43.528749 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525910 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:57:43.528749 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525912 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:57:43.528749 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525917 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 19:57:43.528749 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525920 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:57:43.528749 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525923 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:57:43.528749 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525925 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:57:43.528749 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525928 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:57:43.528749 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525930 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:57:43.528749 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525933 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:57:43.528749 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525935 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:57:43.528749 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525938 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:57:43.528749 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525940 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:57:43.528749 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525943 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:57:43.528749 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525945 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:57:43.529242 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525948 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:57:43.529242 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525950 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:57:43.529242 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525953 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:57:43.529242 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525956 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:57:43.529242 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525958 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:57:43.529242 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525961 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:57:43.529242 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525964 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:57:43.529242 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525966 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:57:43.529242 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525969 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:57:43.529242 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.525972 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:57:43.529242 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527489 2575 flags.go:64] FLAG: --address="0.0.0.0" Apr 22 19:57:43.529242 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527501 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 22 19:57:43.529242 ip-10-0-133-60 
kubenswrapper[2575]: I0422 19:57:43.527511 2575 flags.go:64] FLAG: --anonymous-auth="true" Apr 22 19:57:43.529242 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527516 2575 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 22 19:57:43.529242 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527521 2575 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 22 19:57:43.529242 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527525 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 22 19:57:43.529242 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527529 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 22 19:57:43.529242 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527536 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 22 19:57:43.529242 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527540 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 22 19:57:43.529242 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527543 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 22 19:57:43.529242 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527547 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 22 19:57:43.529242 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527550 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 22 19:57:43.529804 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527553 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 22 19:57:43.529804 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527556 2575 flags.go:64] FLAG: --cgroup-root="" Apr 22 19:57:43.529804 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527559 2575 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 22 19:57:43.529804 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527562 2575 flags.go:64] FLAG: --client-ca-file="" Apr 22 19:57:43.529804 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527565 2575 flags.go:64] FLAG: --cloud-config="" Apr 22 19:57:43.529804 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527568 2575 flags.go:64] FLAG: --cloud-provider="external" Apr 22 19:57:43.529804 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527571 2575 flags.go:64] FLAG: --cluster-dns="[]" Apr 22 19:57:43.529804 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527576 2575 flags.go:64] FLAG: --cluster-domain="" Apr 22 19:57:43.529804 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527579 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 22 19:57:43.529804 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527583 2575 flags.go:64] FLAG: --config-dir="" Apr 22 19:57:43.529804 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527586 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 22 19:57:43.529804 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527589 2575 flags.go:64] FLAG: --container-log-max-files="5" Apr 22 19:57:43.529804 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527593 2575 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 22 19:57:43.529804 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527597 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 22 19:57:43.529804 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527600 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 22 19:57:43.529804 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527603 2575 
flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 22 19:57:43.529804 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527607 2575 flags.go:64] FLAG: --contention-profiling="false" Apr 22 19:57:43.529804 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527610 2575 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 22 19:57:43.529804 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527613 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 22 19:57:43.529804 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527616 2575 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 22 19:57:43.529804 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527619 2575 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 22 19:57:43.529804 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527623 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 22 19:57:43.529804 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527626 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 22 19:57:43.529804 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527629 2575 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 22 19:57:43.529804 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527632 2575 flags.go:64] FLAG: --enable-load-reader="false" Apr 22 19:57:43.530427 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527636 2575 flags.go:64] FLAG: --enable-server="true" Apr 22 19:57:43.530427 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527639 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 22 19:57:43.530427 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527643 2575 flags.go:64] FLAG: --event-burst="100" Apr 22 19:57:43.530427 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527647 2575 flags.go:64] FLAG: --event-qps="50" Apr 22 19:57:43.530427 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527650 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 22 19:57:43.530427 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527653 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 22 19:57:43.530427 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527656 2575 flags.go:64] FLAG: --eviction-hard="" Apr 22 19:57:43.530427 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527659 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 22 19:57:43.530427 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527662 2575 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 22 19:57:43.530427 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527665 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 22 19:57:43.530427 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527668 2575 flags.go:64] FLAG: --eviction-soft="" Apr 22 19:57:43.530427 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527671 2575 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 22 19:57:43.530427 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527674 2575 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 22 19:57:43.530427 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527677 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 22 19:57:43.530427 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527681 2575 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 19:57:43.530427 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527684 2575 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 19:57:43.530427 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527687 2575 
flags.go:64] FLAG: --fail-swap-on="true" Apr 22 19:57:43.530427 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527690 2575 flags.go:64] FLAG: --feature-gates="" Apr 22 19:57:43.530427 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527694 2575 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 19:57:43.530427 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527697 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 19:57:43.530427 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527700 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 19:57:43.530427 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527703 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 19:57:43.530427 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527707 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 22 19:57:43.530427 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527710 2575 flags.go:64] FLAG: --help="false" Apr 22 19:57:43.530427 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527713 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-133-60.ec2.internal" Apr 22 19:57:43.531035 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527716 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 19:57:43.531035 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527719 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 19:57:43.531035 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527722 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 19:57:43.531035 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527725 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 19:57:43.531035 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527729 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 19:57:43.531035 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527732 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 19:57:43.531035 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527735 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 19:57:43.531035 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527738 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 19:57:43.531035 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527741 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 19:57:43.531035 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527745 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 19:57:43.531035 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527748 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 19:57:43.531035 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527751 2575 flags.go:64] FLAG: --kube-reserved="" Apr 22 19:57:43.531035 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527754 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 19:57:43.531035 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527757 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 19:57:43.531035 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527760 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 19:57:43.531035 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527763 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 19:57:43.531035 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527766 2575 flags.go:64] FLAG: 
--lock-file="" Apr 22 19:57:43.531035 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527769 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 19:57:43.531035 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527772 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 19:57:43.531035 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527775 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 19:57:43.531035 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527781 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 19:57:43.531035 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527784 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 19:57:43.531035 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527787 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 19:57:43.531035 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527790 2575 flags.go:64] FLAG: --logging-format="text" Apr 22 19:57:43.531689 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527793 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 19:57:43.531689 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527797 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 19:57:43.531689 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527801 2575 flags.go:64] FLAG: --manifest-url="" Apr 22 19:57:43.531689 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527804 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 22 19:57:43.531689 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527808 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 19:57:43.531689 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527811 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 19:57:43.531689 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527815 2575 flags.go:64] FLAG: --max-pods="110" Apr 22 19:57:43.531689 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527819 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 19:57:43.531689 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527822 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 19:57:43.531689 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527825 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 19:57:43.531689 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527828 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 19:57:43.531689 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527831 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 19:57:43.531689 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527834 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 19:57:43.531689 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527837 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 19:57:43.531689 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527844 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 19:57:43.531689 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527848 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 19:57:43.531689 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527851 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 19:57:43.531689 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527854 2575 flags.go:64] FLAG: --pod-cidr="" Apr 22 19:57:43.531689 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527857 2575 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 19:57:43.531689 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527863 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 19:57:43.531689 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527866 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 19:57:43.531689 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527869 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 22 19:57:43.531689 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527872 2575 flags.go:64] FLAG: --port="10250" Apr 22 19:57:43.531689 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527875 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 19:57:43.532257 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527878 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-00f2c22303b6dbdc7" Apr 22 19:57:43.532257 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527881 2575 flags.go:64] FLAG: --qos-reserved="" Apr 22 19:57:43.532257 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527884 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 22 19:57:43.532257 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527887 2575 flags.go:64] FLAG: --register-node="true" Apr 22 19:57:43.532257 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527890 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 22 19:57:43.532257 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527893 2575 flags.go:64] FLAG: --register-with-taints="" Apr 22 19:57:43.532257 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527897 2575 flags.go:64] FLAG: --registry-burst="10" Apr 22 19:57:43.532257 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527900 2575 flags.go:64] FLAG: --registry-qps="5" Apr 22 19:57:43.532257 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527903 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 22 19:57:43.532257 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527905 2575 flags.go:64] FLAG: --reserved-memory="" Apr 22 19:57:43.532257 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527910 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 19:57:43.532257 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527913 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 19:57:43.532257 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527916 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 19:57:43.532257 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527919 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 19:57:43.532257 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527922 2575 flags.go:64] FLAG: --runonce="false" Apr 22 19:57:43.532257 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527925 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 19:57:43.532257 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527928 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 19:57:43.532257 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527931 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 22 19:57:43.532257 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527934 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 19:57:43.532257 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527936 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 19:57:43.532257 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527940 2575 
flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 19:57:43.532257 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527943 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 19:57:43.532257 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527946 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 19:57:43.532257 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527949 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 19:57:43.532257 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527952 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 19:57:43.532257 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527955 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 19:57:43.532949 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527958 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 19:57:43.532949 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527962 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 19:57:43.532949 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527965 2575 flags.go:64] FLAG: --system-cgroups="" Apr 22 19:57:43.532949 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527967 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 19:57:43.532949 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527973 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 19:57:43.532949 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527975 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 22 19:57:43.532949 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527978 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 19:57:43.532949 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527983 2575 flags.go:64] FLAG: --tls-min-version="" Apr 22 19:57:43.532949 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527985 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 19:57:43.532949 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527988 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 19:57:43.532949 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527991 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 19:57:43.532949 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527994 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 19:57:43.532949 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.527997 2575 flags.go:64] FLAG: --v="2" Apr 22 19:57:43.532949 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.528002 2575 flags.go:64] FLAG: --version="false" Apr 22 19:57:43.532949 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.528006 2575 flags.go:64] FLAG: --vmodule="" Apr 22 19:57:43.532949 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.528011 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 19:57:43.532949 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.528014 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 19:57:43.532949 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528116 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:57:43.532949 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528120 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:57:43.532949 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528123 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:57:43.532949 ip-10-0-133-60 kubenswrapper[2575]: W0422 
19:57:43.528126 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:57:43.532949 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528128 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:57:43.532949 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528131 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:57:43.532949 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528133 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:57:43.533573 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528136 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:57:43.533573 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528138 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:57:43.533573 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528141 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:57:43.533573 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528143 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:57:43.533573 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528146 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:57:43.533573 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528149 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:57:43.533573 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528151 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:57:43.533573 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528154 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:57:43.533573 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528156 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:57:43.533573 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528159 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:57:43.533573 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528162 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:57:43.533573 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528164 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:57:43.533573 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528167 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:57:43.533573 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528170 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:57:43.533573 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528172 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:57:43.533573 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528175 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:57:43.533573 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528177 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:57:43.533573 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528180 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:57:43.533573 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528182 2575 feature_gate.go:328] unrecognized feature gate: 
ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:57:43.533573 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528185 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:57:43.534147 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528188 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:57:43.534147 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528190 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:57:43.534147 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528193 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:57:43.534147 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528197 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 19:57:43.534147 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528201 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:57:43.534147 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528203 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:57:43.534147 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528206 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:57:43.534147 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528209 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:57:43.534147 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528211 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:57:43.534147 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528214 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:57:43.534147 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528216 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:57:43.534147 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528220 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 19:57:43.534147 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528224 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:57:43.534147 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528227 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:57:43.534147 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528230 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:57:43.534147 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528233 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:57:43.534147 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528235 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:57:43.534147 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528238 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:57:43.534147 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528241 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:57:43.534639 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528244 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:57:43.534639 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528247 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:57:43.534639 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528249 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:57:43.534639 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528253 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:57:43.534639 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528255 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:57:43.534639 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528258 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:57:43.534639 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528260 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:57:43.534639 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528263 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:57:43.534639 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528266 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:57:43.534639 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528268 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:57:43.534639 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528271 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:57:43.534639 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528273 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:57:43.534639 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528279 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:57:43.534639 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528281 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:57:43.534639 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528284 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:57:43.534639 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528287 2575 feature_gate.go:328] 
unrecognized feature gate: AzureDedicatedHosts Apr 22 19:57:43.534639 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528289 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:57:43.534639 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528292 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:57:43.534639 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528294 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:57:43.534639 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528297 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:57:43.535121 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528300 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:57:43.535121 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528302 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:57:43.535121 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528305 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:57:43.535121 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528307 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:57:43.535121 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528310 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:57:43.535121 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528326 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:57:43.535121 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528330 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:57:43.535121 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528332 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:57:43.535121 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528335 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:57:43.535121 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528338 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:57:43.535121 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528340 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:57:43.535121 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528343 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:57:43.535121 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528346 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:57:43.535121 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528348 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:57:43.535121 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528351 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:57:43.535121 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528353 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:57:43.535121 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528356 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:57:43.535121 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528359 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:57:43.535121 ip-10-0-133-60 
kubenswrapper[2575]: W0422 19:57:43.528362 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:57:43.535121 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.528368 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:57:43.535650 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.528373 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 19:57:43.535650 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.534799 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 19:57:43.535650 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.534816 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 19:57:43.535650 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534868 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:57:43.535650 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534872 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:57:43.535650 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534876 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:57:43.535650 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534879 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:57:43.535650 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534881 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:57:43.535650 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534884 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:57:43.535650 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534886 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:57:43.535650 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534889 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:57:43.535650 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534892 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:57:43.535650 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534895 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:57:43.535650 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534899 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 19:57:43.535650 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534903 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:57:43.536029 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534906 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:57:43.536029 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534909 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:57:43.536029 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534912 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:57:43.536029 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534915 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:57:43.536029 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534917 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:57:43.536029 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534920 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:57:43.536029 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534923 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:57:43.536029 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534926 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:57:43.536029 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534928 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:57:43.536029 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534931 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:57:43.536029 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534933 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:57:43.536029 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534936 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:57:43.536029 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534939 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:57:43.536029 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534942 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:57:43.536029 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534945 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:57:43.536029 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534947 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:57:43.536029 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534950 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:57:43.536029 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534952 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:57:43.536029 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534955 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:57:43.536029 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534959 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:57:43.536563 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534962 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:57:43.536563 ip-10-0-133-60 kubenswrapper[2575]: W0422 
19:57:43.534964 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:57:43.536563 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534967 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:57:43.536563 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534970 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:57:43.536563 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534972 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:57:43.536563 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534975 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:57:43.536563 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534977 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:57:43.536563 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534980 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:57:43.536563 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534982 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:57:43.536563 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534985 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:57:43.536563 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534988 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:57:43.536563 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534991 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:57:43.536563 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534994 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:57:43.536563 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534996 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:57:43.536563 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.534999 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:57:43.536563 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535002 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:57:43.536563 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535004 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:57:43.536563 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535007 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:57:43.536563 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535010 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:57:43.537026 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535012 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:57:43.537026 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535015 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:57:43.537026 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535017 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:57:43.537026 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535019 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:57:43.537026 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535022 2575 feature_gate.go:328] unrecognized feature gate: 
IngressControllerLBSubnetsAWS Apr 22 19:57:43.537026 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535025 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:57:43.537026 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535027 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:57:43.537026 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535030 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:57:43.537026 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535033 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:57:43.537026 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535036 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:57:43.537026 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535038 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:57:43.537026 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535041 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:57:43.537026 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535044 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:57:43.537026 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535047 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:57:43.537026 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535049 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:57:43.537026 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535052 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:57:43.537026 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535054 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:57:43.537026 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535057 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:57:43.537026 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535059 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:57:43.537026 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535062 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:57:43.537535 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535064 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:57:43.537535 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535067 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:57:43.537535 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535069 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:57:43.537535 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535077 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:57:43.537535 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535080 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:57:43.537535 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535082 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:57:43.537535 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535085 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:57:43.537535 ip-10-0-133-60 kubenswrapper[2575]: W0422 
19:57:43.535088 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:57:43.537535 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535090 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:57:43.537535 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535094 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 19:57:43.537535 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535098 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:57:43.537535 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535101 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:57:43.537535 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535103 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:57:43.537535 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535106 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:57:43.537535 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535109 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:57:43.537910 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.535114 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 19:57:43.537910 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535210 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:57:43.537910 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535215 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:57:43.537910 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535218 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:57:43.537910 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535222 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 19:57:43.537910 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535238 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:57:43.537910 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535244 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:57:43.537910 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535247 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:57:43.537910 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535250 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:57:43.537910 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535253 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:57:43.537910 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535256 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:57:43.537910 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535259 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:57:43.537910 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535262 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:57:43.537910 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535265 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:57:43.537910 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535268 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:57:43.538286 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535271 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:57:43.538286 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535273 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:57:43.538286 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535276 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:57:43.538286 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535278 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:57:43.538286 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535281 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:57:43.538286 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535283 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:57:43.538286 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535286 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:57:43.538286 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535289 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:57:43.538286 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535291 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:57:43.538286 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535294 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:57:43.538286 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535296 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:57:43.538286 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535299 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:57:43.538286 ip-10-0-133-60 kubenswrapper[2575]: 
W0422 19:57:43.535301 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:57:43.538286 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535304 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:57:43.538286 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535306 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:57:43.538286 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535308 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:57:43.538286 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535325 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:57:43.538286 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535330 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:57:43.538286 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535333 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:57:43.538286 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535336 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:57:43.538782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535339 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:57:43.538782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535341 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:57:43.538782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535344 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:57:43.538782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535347 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:57:43.538782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535350 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:57:43.538782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535352 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:57:43.538782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535355 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:57:43.538782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535357 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:57:43.538782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535360 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:57:43.538782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535363 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:57:43.538782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535365 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:57:43.538782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535368 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:57:43.538782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535370 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:57:43.538782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535373 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:57:43.538782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535375 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:57:43.538782 
ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535378 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:57:43.538782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535380 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:57:43.538782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535383 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:57:43.538782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535385 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:57:43.538782 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535388 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:57:43.539290 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535390 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:57:43.539290 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535393 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:57:43.539290 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535395 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:57:43.539290 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535398 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:57:43.539290 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535400 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:57:43.539290 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535403 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:57:43.539290 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535405 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:57:43.539290 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535407 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:57:43.539290 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535410 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:57:43.539290 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535412 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:57:43.539290 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535415 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:57:43.539290 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535418 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:57:43.539290 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535420 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:57:43.539290 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535423 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:57:43.539290 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535425 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:57:43.539290 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535428 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:57:43.539290 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535431 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:57:43.539290 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535433 2575 feature_gate.go:328] unrecognized feature gate: 
ClusterAPIInstallIBMCloud Apr 22 19:57:43.539290 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535436 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:57:43.539290 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535439 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:57:43.539796 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535441 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:57:43.539796 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535444 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:57:43.539796 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535447 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:57:43.539796 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535449 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:57:43.539796 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535452 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:57:43.539796 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535455 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:57:43.539796 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535459 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 19:57:43.539796 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535462 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:57:43.539796 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535465 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:57:43.539796 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535467 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:57:43.539796 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535470 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:57:43.539796 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:43.535472 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:57:43.539796 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.535477 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 19:57:43.539796 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.536203 2575 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 19:57:43.540144 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.538158 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 19:57:43.540144 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.539268 2575 server.go:1019] "Starting client certificate rotation" Apr 22 19:57:43.540144 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.539364 2575 certificate_manager.go:422] "Certificate rotation is enabled" 
logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 19:57:43.540144 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.539406 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 19:57:43.566004 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.565986 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 19:57:43.567810 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.567793 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 19:57:43.583237 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.583205 2575 log.go:25] "Validated CRI v1 runtime API" Apr 22 19:57:43.589020 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.589003 2575 log.go:25] "Validated CRI v1 image API" Apr 22 19:57:43.590134 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.590116 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 19:57:43.592521 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.592504 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 19:57:43.594025 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.594001 2575 fs.go:135] Filesystem UUIDs: map[1052dccb-e655-4f04-939c-60db7b793362:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 9e375222-ad13-428c-8179-cee20b2936d8:/dev/nvme0n1p3] Apr 22 19:57:43.594094 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.594024 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 19:57:43.599770 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.599660 2575 manager.go:217] Machine: {Timestamp:2026-04-22 19:57:43.597743516 +0000 UTC m=+0.411128692 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:2500004 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec24fe7655c6d9d1199d35ca4014f9f1 SystemUUID:ec24fe76-55c6-d9d1-199d-35ca4014f9f1 BootID:c325a48d-d777-4654-ad0d-91450b8ac8d9 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] 
NetworkDevices:[{Name:br-ex MacAddress:02:0d:f3:ce:e9:d9 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:0d:f3:ce:e9:d9 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:56:c9:d7:e1:d3:b0 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 19:57:43.599770 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.599765 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 22 19:57:43.599881 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.599853 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 19:57:43.601067 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.601040 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 19:57:43.601206 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.601069 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-10-0-133-60.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 19:57:43.601254 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.601215 2575 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 19:57:43.601254 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.601224 2575 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 19:57:43.601254 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.601237 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:57:43.601983 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.601971 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:57:43.603385 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.603374 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:57:43.603672 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.603661 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 19:57:43.605980 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.605970 2575 kubelet.go:491] "Attempting to sync node with API server" Apr 22 19:57:43.606055 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.605988 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 19:57:43.606055 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.606000 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 19:57:43.606055 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.606011 2575 kubelet.go:397] "Adding apiserver pod source" Apr 22 19:57:43.606055 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.606021 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 19:57:43.607286 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.607263 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:57:43.607286 
ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.607282 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:57:43.610357 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.610341 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 19:57:43.611579 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.611565 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 19:57:43.613097 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.613085 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 19:57:43.613144 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.613112 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 19:57:43.613144 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.613119 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 19:57:43.613144 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.613124 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 19:57:43.613144 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.613130 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 19:57:43.613144 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.613136 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 19:57:43.613144 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.613142 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 19:57:43.613144 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.613147 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 19:57:43.613331 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.613154 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 19:57:43.613331 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.613160 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 19:57:43.613331 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.613172 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 19:57:43.613331 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.613180 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 19:57:43.614674 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.614663 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 19:57:43.614712 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.614675 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 19:57:43.618507 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.618493 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 19:57:43.618593 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.618531 2575 server.go:1295] "Started kubelet" Apr 22 19:57:43.618656 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.618607 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 19:57:43.618707 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.618666 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 19:57:43.618755 ip-10-0-133-60 kubenswrapper[2575]: I0422 
19:57:43.618716 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 19:57:43.619977 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.619953 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-60.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 19:57:43.619977 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.619964 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 19:57:43.620121 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:43.620031 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 19:57:43.620272 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:43.620254 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-60.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 19:57:43.620267 ip-10-0-133-60 systemd[1]: Started Kubernetes Kubelet. Apr 22 19:57:43.622795 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.622775 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 22 19:57:43.628865 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.628845 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 19:57:43.629043 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.629021 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 19:57:43.629658 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:43.629635 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-60.ec2.internal\" not found" Apr 22 19:57:43.629845 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.629822 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 19:57:43.629845 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.629844 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 19:57:43.629985 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:43.628829 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-60.ec2.internal.18a8c618fcaf0a7a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-60.ec2.internal,UID:ip-10-0-133-60.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-133-60.ec2.internal,},FirstTimestamp:2026-04-22 19:57:43.618505338 +0000 UTC m=+0.431890509,LastTimestamp:2026-04-22 19:57:43.618505338 +0000 UTC m=+0.431890509,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-60.ec2.internal,}" Apr 22 19:57:43.629985 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.629875 2575 
desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 19:57:43.629985 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.629908 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 22 19:57:43.629985 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.629919 2575 reconciler.go:26] "Reconciler: start to sync state" Apr 22 19:57:43.630484 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.630458 2575 factory.go:55] Registering systemd factory Apr 22 19:57:43.630567 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.630510 2575 factory.go:223] Registration of the systemd container factory successfully Apr 22 19:57:43.630644 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:43.630620 2575 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 19:57:43.630747 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.630693 2575 factory.go:153] Registering CRI-O factory Apr 22 19:57:43.630747 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.630705 2575 factory.go:223] Registration of the crio container factory successfully Apr 22 19:57:43.630747 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.630746 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 19:57:43.630896 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.630768 2575 factory.go:103] Registering Raw factory Apr 22 19:57:43.630896 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.630786 2575 manager.go:1196] Started watching for new ooms in manager Apr 22 19:57:43.631131 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.631118 2575 manager.go:319] Starting recovery of all containers Apr 22 19:57:43.640486 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.640353 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-72xq8" Apr 22 19:57:43.640553 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.640539 2575 manager.go:324] Recovery completed Apr 22 19:57:43.640899 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:43.640876 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 19:57:43.640899 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:43.640879 2575 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-133-60.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 19:57:43.644663 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.644647 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:57:43.645342 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.645327 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-72xq8" Apr 22 19:57:43.647719 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.647704 2575 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-60.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:57:43.647813 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.647735 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-60.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:57:43.647813 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.647750 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-60.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:57:43.648255 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.648241 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 19:57:43.648255 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.648254 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 19:57:43.648362 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.648269 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:57:43.650293 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.650282 2575 policy_none.go:49] "None policy: Start" Apr 22 19:57:43.650363 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.650297 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 19:57:43.650363 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.650306 2575 state_mem.go:35] "Initializing new in-memory state store" Apr 22 19:57:43.689955 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.689937 2575 manager.go:341] "Starting Device Plugin manager" Apr 22 19:57:43.707087 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:43.689975 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 19:57:43.707087 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.689988 2575 server.go:85] "Starting device plugin registration server" Apr 22 19:57:43.707087 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.690270 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 19:57:43.707087 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.690283 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 19:57:43.707087 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.690394 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 19:57:43.707087 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.690474 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 19:57:43.707087 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.690485 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 19:57:43.707087 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:43.690989 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 19:57:43.707087 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:43.691033 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-60.ec2.internal\" not found" Apr 22 19:57:43.724985 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.724956 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 19:57:43.726061 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.726039 2575 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 22 19:57:43.726147 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.726068 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 19:57:43.726147 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.726085 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 19:57:43.726147 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.726092 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 19:57:43.726147 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:43.726126 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 19:57:43.729950 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.729931 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:57:43.790997 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.790920 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:57:43.791898 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.791882 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-60.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:57:43.791985 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.791911 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-60.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:57:43.791985 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.791924 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-60.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:57:43.791985 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.791957 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-60.ec2.internal" Apr 22 19:57:43.800608 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.800593 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-60.ec2.internal" Apr 22 19:57:43.800653 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:43.800614 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-60.ec2.internal\": node \"ip-10-0-133-60.ec2.internal\" not found" Apr 22 19:57:43.826659 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.826619 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-60.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-60.ec2.internal"] Apr 22 19:57:43.826766 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.826700 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:57:43.827473 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.827460 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-60.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:57:43.827527 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.827486 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-60.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:57:43.827527 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.827498 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-60.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:57:43.829811 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.829799 2575 
kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:57:43.830449 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.830434 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-60.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:57:43.830520 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.830459 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-60.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:57:43.830520 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.830469 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-60.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:57:43.830601 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.830586 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-60.ec2.internal" Apr 22 19:57:43.830646 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.830618 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:57:43.831297 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.831286 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-60.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:57:43.831368 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.831308 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-60.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:57:43.831368 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.831337 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-60.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:57:43.832667 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.832654 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-60.ec2.internal" Apr 22 19:57:43.832714 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.832677 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:57:43.833284 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.833269 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-60.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:57:43.833383 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.833291 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-60.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:57:43.833383 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.833301 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-60.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:57:43.844290 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:43.844273 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-60.ec2.internal\" not found" Apr 22 19:57:43.860024 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:43.860006 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-60.ec2.internal\" not found" node="ip-10-0-133-60.ec2.internal" Apr 22 19:57:43.864391 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:43.864376 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-60.ec2.internal\" not found" node="ip-10-0-133-60.ec2.internal" Apr 22 19:57:43.930728 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.930689 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cd407a21e2749828c73fd67e5ee13311-config\") pod \"kube-apiserver-proxy-ip-10-0-133-60.ec2.internal\" (UID: \"cd407a21e2749828c73fd67e5ee13311\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-60.ec2.internal" Apr 22 19:57:43.930728 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.930731 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/36565672596d30113df6124742f498e5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-60.ec2.internal\" (UID: \"36565672596d30113df6124742f498e5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-60.ec2.internal" Apr 22 19:57:43.930891 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:43.930751 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/36565672596d30113df6124742f498e5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-60.ec2.internal\" (UID: \"36565672596d30113df6124742f498e5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-60.ec2.internal" Apr 22 19:57:43.944801 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:43.944783 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-60.ec2.internal\" not found" Apr 22 19:57:44.031188 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:44.031163 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/36565672596d30113df6124742f498e5-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-133-60.ec2.internal\" (UID: \"36565672596d30113df6124742f498e5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-60.ec2.internal" Apr 22 19:57:44.031272 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:44.031114 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/36565672596d30113df6124742f498e5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-60.ec2.internal\" (UID: \"36565672596d30113df6124742f498e5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-60.ec2.internal" Apr 22 19:57:44.031272 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:44.031225 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/36565672596d30113df6124742f498e5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-60.ec2.internal\" (UID: \"36565672596d30113df6124742f498e5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-60.ec2.internal" Apr 22 19:57:44.031272 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:44.031251 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cd407a21e2749828c73fd67e5ee13311-config\") pod \"kube-apiserver-proxy-ip-10-0-133-60.ec2.internal\" (UID: \"cd407a21e2749828c73fd67e5ee13311\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-60.ec2.internal" Apr 22 19:57:44.031417 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:44.031279 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cd407a21e2749828c73fd67e5ee13311-config\") pod \"kube-apiserver-proxy-ip-10-0-133-60.ec2.internal\" (UID: \"cd407a21e2749828c73fd67e5ee13311\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-60.ec2.internal" Apr 22 19:57:44.031417 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:44.031291 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/36565672596d30113df6124742f498e5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-60.ec2.internal\" (UID: \"36565672596d30113df6124742f498e5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-60.ec2.internal" Apr 22 19:57:44.045266 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:44.045209 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-60.ec2.internal\" not found" Apr 22 19:57:44.146031 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:44.145997 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-60.ec2.internal\" not found" Apr 22 19:57:44.162179 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:44.162157 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-60.ec2.internal" Apr 22 19:57:44.166923 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:44.166905 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-60.ec2.internal" Apr 22 19:57:44.247098 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:44.247048 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-60.ec2.internal\" not found" Apr 22 19:57:44.347611 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:44.347521 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-60.ec2.internal\" not found" Apr 22 19:57:44.448122 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:44.448077 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-60.ec2.internal\" not found" Apr 22 19:57:44.538733 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:44.538703 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 19:57:44.539243 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:44.538859 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 19:57:44.548847 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:44.548827 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-60.ec2.internal\" not found" Apr 22 19:57:44.629576 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:44.629503 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 19:57:44.642647 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:44.642625 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 19:57:44.647730 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:44.647706 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 19:52:43 +0000 UTC" deadline="2027-12-08 06:16:31.54667147 +0000 UTC" Apr 22 19:57:44.647730 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:44.647728 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14266h18m46.898946239s" Apr 22 19:57:44.649851 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:44.649836 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-60.ec2.internal\" not found" Apr 22 19:57:44.666501 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:44.666482 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-spw5c" Apr 22 19:57:44.678433 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:44.678409 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-spw5c" Apr 22 19:57:44.711524 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:44.711507 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:57:44.729724 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:44.729703 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-60.ec2.internal" Apr 22 19:57:44.743885 
ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:44.743863 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 19:57:44.745738 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:44.745725 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-60.ec2.internal" Apr 22 19:57:44.753917 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:44.753893 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 19:57:44.811551 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:44.811528 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:57:44.833381 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:44.833351 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36565672596d30113df6124742f498e5.slice/crio-a55ca1362d35350c2d18cbd8a7c37b9642a53ac6e8f622f410bd7fd37e78c4a1 WatchSource:0}: Error finding container a55ca1362d35350c2d18cbd8a7c37b9642a53ac6e8f622f410bd7fd37e78c4a1: Status 404 returned error can't find the container with id a55ca1362d35350c2d18cbd8a7c37b9642a53ac6e8f622f410bd7fd37e78c4a1 Apr 22 19:57:44.833749 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:44.833736 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd407a21e2749828c73fd67e5ee13311.slice/crio-cbe4b8c460caaa22c2a3bb43a97e2d83d5bd78b187d0794994d9b0f0328551c3 WatchSource:0}: Error finding container cbe4b8c460caaa22c2a3bb43a97e2d83d5bd78b187d0794994d9b0f0328551c3: Status 404 returned error can't find the container with id cbe4b8c460caaa22c2a3bb43a97e2d83d5bd78b187d0794994d9b0f0328551c3 Apr 22 19:57:44.837533 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:44.837520 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:57:44.873653 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:44.873593 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:57:45.608002 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.607969 2575 apiserver.go:52] "Watching apiserver" Apr 22 19:57:45.616736 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.616703 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 19:57:45.617984 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.617952 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-133-60.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72j5f","openshift-cluster-node-tuning-operator/tuned-v28mk","openshift-image-registry/node-ca-lnz7b","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-60.ec2.internal","openshift-multus/multus-additional-cni-plugins-cfjtn","openshift-multus/network-metrics-daemon-v8fph","kube-system/global-pull-secret-syncer-cvg2s","openshift-dns/node-resolver-jlns8","openshift-multus/multus-7qp87","openshift-network-diagnostics/network-check-target-ncvmn","openshift-network-operator/iptables-alerter-f8qlf","openshift-ovn-kubernetes/ovnkube-node-6jf5c","kube-system/konnectivity-agent-w7bfn"] Apr 22 
19:57:45.622634 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.622610 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v8fph" Apr 22 19:57:45.622761 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:45.622704 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v8fph" podUID="d7bee1d4-9229-4b17-8ec5-e19b53d61c9d" Apr 22 19:57:45.624746 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.624721 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-w7bfn" Apr 22 19:57:45.627036 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.627017 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.627971 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.627648 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 19:57:45.627971 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.627692 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-x4jwj\"" Apr 22 19:57:45.627971 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.627844 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 19:57:45.629666 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.629645 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.630874 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.630482 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 19:57:45.630874 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.630521 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-88jdn\"" Apr 22 19:57:45.630874 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.630740 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:57:45.632835 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.632238 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-lnz7b" Apr 22 19:57:45.632835 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.632704 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 19:57:45.634123 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.633486 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 19:57:45.634123 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.633562 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 19:57:45.634123 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.633830 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 19:57:45.635214 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.634555 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cfjtn" Apr 22 19:57:45.635214 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.634662 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 19:57:45.635214 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.634902 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-2h9bq\"" Apr 22 19:57:45.635214 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.635104 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 19:57:45.635214 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.635121 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 19:57:45.635535 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.635429 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:57:45.636111 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.635666 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 19:57:45.636111 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.635674 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-txvmq\"" Apr 22 19:57:45.637098 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.637074 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72j5f" Apr 22 19:57:45.637976 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.637958 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 19:57:45.638476 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.638457 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 19:57:45.638820 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.638803 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-lkmrr\"" Apr 22 19:57:45.639101 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.639083 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 19:57:45.639252 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.639212 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 19:57:45.639533 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.639509 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 19:57:45.639610 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.639593 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 19:57:45.639764 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.639752 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 19:57:45.639848 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.639832 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-4rlrc\"" Apr 22 19:57:45.639886 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.639849 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/153f74bd-9a2c-4a02-88c9-243b60b35439-os-release\") pod \"multus-additional-cni-plugins-cfjtn\" (UID: \"153f74bd-9a2c-4a02-88c9-243b60b35439\") " pod="openshift-multus/multus-additional-cni-plugins-cfjtn" Apr 22 19:57:45.639922 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.639890 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0961e7e6-a8cb-43cc-82a5-7a080e47aae5-konnectivity-ca\") pod \"konnectivity-agent-w7bfn\" (UID: \"0961e7e6-a8cb-43cc-82a5-7a080e47aae5\") " pod="kube-system/konnectivity-agent-w7bfn" Apr 22 19:57:45.639922 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.639899 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 19:57:45.639979 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.639923 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ae1657a9-83bc-4dfc-8385-1b003298717a-etc-sysconfig\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " 
pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.639979 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.639966 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ae1657a9-83bc-4dfc-8385-1b003298717a-etc-sysctl-d\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.640055 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.639998 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-systemd-units\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.640055 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.640030 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-etc-openvswitch\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.640120 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.640063 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/153f74bd-9a2c-4a02-88c9-243b60b35439-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cfjtn\" (UID: \"153f74bd-9a2c-4a02-88c9-243b60b35439\") " pod="openshift-multus/multus-additional-cni-plugins-cfjtn" Apr 22 19:57:45.640153 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.640123 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ae1657a9-83bc-4dfc-8385-1b003298717a-etc-systemd\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.640184 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.640164 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ae1657a9-83bc-4dfc-8385-1b003298717a-sys\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.640239 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.640212 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae1657a9-83bc-4dfc-8385-1b003298717a-host\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.640285 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.640247 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-host-run-netns\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.640371 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.640356 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-jlns8" Apr 22 19:57:45.640442 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.640414 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knt6n\" (UniqueName: \"kubernetes.io/projected/153f74bd-9a2c-4a02-88c9-243b60b35439-kube-api-access-knt6n\") pod \"multus-additional-cni-plugins-cfjtn\" (UID: \"153f74bd-9a2c-4a02-88c9-243b60b35439\") " pod="openshift-multus/multus-additional-cni-plugins-cfjtn" Apr 22 19:57:45.640502 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.640451 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7bee1d4-9229-4b17-8ec5-e19b53d61c9d-metrics-certs\") pod \"network-metrics-daemon-v8fph\" (UID: \"d7bee1d4-9229-4b17-8ec5-e19b53d61c9d\") " pod="openshift-multus/network-metrics-daemon-v8fph" Apr 22 19:57:45.640502 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.640485 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-695p6\" (UniqueName: \"kubernetes.io/projected/d7bee1d4-9229-4b17-8ec5-e19b53d61c9d-kube-api-access-695p6\") pod \"network-metrics-daemon-v8fph\" (UID: \"d7bee1d4-9229-4b17-8ec5-e19b53d61c9d\") " pod="openshift-multus/network-metrics-daemon-v8fph" Apr 22 19:57:45.640698 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.640517 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0961e7e6-a8cb-43cc-82a5-7a080e47aae5-agent-certs\") pod \"konnectivity-agent-w7bfn\" (UID: \"0961e7e6-a8cb-43cc-82a5-7a080e47aae5\") " pod="kube-system/konnectivity-agent-w7bfn" Apr 22 19:57:45.640755 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.640707 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-host-slash\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.640755 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.640745 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-host-run-ovn-kubernetes\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.640876 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.640778 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/153f74bd-9a2c-4a02-88c9-243b60b35439-cni-binary-copy\") pod \"multus-additional-cni-plugins-cfjtn\" (UID: \"153f74bd-9a2c-4a02-88c9-243b60b35439\") " pod="openshift-multus/multus-additional-cni-plugins-cfjtn" Apr 22 19:57:45.640876 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.640809 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/153f74bd-9a2c-4a02-88c9-243b60b35439-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cfjtn\" (UID: \"153f74bd-9a2c-4a02-88c9-243b60b35439\") " 
pod="openshift-multus/multus-additional-cni-plugins-cfjtn" Apr 22 19:57:45.640876 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.640844 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c63ee2c0-b298-42a6-bcd1-b05ffc7971f2-host\") pod \"node-ca-lnz7b\" (UID: \"c63ee2c0-b298-42a6-bcd1-b05ffc7971f2\") " pod="openshift-image-registry/node-ca-lnz7b" Apr 22 19:57:45.641032 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.640879 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c63ee2c0-b298-42a6-bcd1-b05ffc7971f2-serviceca\") pod \"node-ca-lnz7b\" (UID: \"c63ee2c0-b298-42a6-bcd1-b05ffc7971f2\") " pod="openshift-image-registry/node-ca-lnz7b" Apr 22 19:57:45.641032 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.640968 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gmnn\" (UniqueName: \"kubernetes.io/projected/c63ee2c0-b298-42a6-bcd1-b05ffc7971f2-kube-api-access-4gmnn\") pod \"node-ca-lnz7b\" (UID: \"c63ee2c0-b298-42a6-bcd1-b05ffc7971f2\") " pod="openshift-image-registry/node-ca-lnz7b" Apr 22 19:57:45.641032 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.640989 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 19:57:45.641190 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.641059 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-run-ovn\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.641190 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.641076 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-host-cni-netd\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.641190 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.641095 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ae1657a9-83bc-4dfc-8385-1b003298717a-etc-sysctl-conf\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.641381 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.641202 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-run-openvswitch\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.641381 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.641286 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-node-log\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.641381 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.641337 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-host-cni-bin\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.641559 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.641402 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.641559 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.641429 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e92d8d1a-4d78-4b32-8e69-32db4468f373-ovnkube-config\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.641559 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.641450 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvq8l\" (UniqueName: \"kubernetes.io/projected/e92d8d1a-4d78-4b32-8e69-32db4468f373-kube-api-access-cvq8l\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.641559 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.641493 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ae1657a9-83bc-4dfc-8385-1b003298717a-etc-modprobe-d\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.641559 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.641540 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ae1657a9-83bc-4dfc-8385-1b003298717a-etc-kubernetes\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.641826 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.641568 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ae1657a9-83bc-4dfc-8385-1b003298717a-run\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.641826 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.641610 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ae1657a9-83bc-4dfc-8385-1b003298717a-var-lib-kubelet\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.641826 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.641647 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fk7k\" (UniqueName: \"kubernetes.io/projected/ae1657a9-83bc-4dfc-8385-1b003298717a-kube-api-access-5fk7k\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.641826 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.641739 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e92d8d1a-4d78-4b32-8e69-32db4468f373-env-overrides\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.641826 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.641781 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e92d8d1a-4d78-4b32-8e69-32db4468f373-ovn-node-metrics-cert\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.642040 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.641832 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/153f74bd-9a2c-4a02-88c9-243b60b35439-system-cni-dir\") pod \"multus-additional-cni-plugins-cfjtn\" (UID: \"153f74bd-9a2c-4a02-88c9-243b60b35439\") " pod="openshift-multus/multus-additional-cni-plugins-cfjtn" Apr 22 19:57:45.642040 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.641891 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-host-kubelet\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.642040 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.641920 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-var-lib-openvswitch\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.642040 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.641941 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/153f74bd-9a2c-4a02-88c9-243b60b35439-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cfjtn\" (UID: \"153f74bd-9a2c-4a02-88c9-243b60b35439\") " pod="openshift-multus/multus-additional-cni-plugins-cfjtn" Apr 22 19:57:45.642040 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.641970 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ae1657a9-83bc-4dfc-8385-1b003298717a-lib-modules\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.642298 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.642023 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ae1657a9-83bc-4dfc-8385-1b003298717a-etc-tuned\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.642298 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.642126 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ae1657a9-83bc-4dfc-8385-1b003298717a-tmp\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.642298 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.642159 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-run-systemd\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.642298 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.642192 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-log-socket\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.642298 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.642252 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e92d8d1a-4d78-4b32-8e69-32db4468f373-ovnkube-script-lib\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.642298 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.642288 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/153f74bd-9a2c-4a02-88c9-243b60b35439-cnibin\") pod \"multus-additional-cni-plugins-cfjtn\" (UID: \"153f74bd-9a2c-4a02-88c9-243b60b35439\") " pod="openshift-multus/multus-additional-cni-plugins-cfjtn" Apr 22 19:57:45.643042 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.642404 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-4fjn6\"" Apr 22 19:57:45.643239 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.642639 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 19:57:45.643357 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.642654 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 19:57:45.644531 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.644512 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.646548 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.646527 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 19:57:45.646652 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.646595 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-f6lmc\"" Apr 22 19:57:45.646993 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.646974 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncvmn" Apr 22 19:57:45.647092 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:45.647052 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ncvmn" podUID="f55cb5a5-4e42-4775-bf9f-5f92344b63ff" Apr 22 19:57:45.650541 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.650513 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-f8qlf" Apr 22 19:57:45.652621 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.652555 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:57:45.652621 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.652580 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 19:57:45.652899 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.652758 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-tsf74\"" Apr 22 19:57:45.653026 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.653009 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cvg2s" Apr 22 19:57:45.653100 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.653048 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 19:57:45.653100 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:45.653073 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-cvg2s" podUID="cba227e9-dcf5-4cf7-9c4b-83013a0b20fb" Apr 22 19:57:45.679736 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.679661 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:52:44 +0000 UTC" deadline="2027-10-04 10:53:31.095048037 +0000 UTC" Apr 22 19:57:45.679736 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.679695 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12710h55m45.415357375s" Apr 22 19:57:45.730705 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.730654 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-60.ec2.internal" event={"ID":"cd407a21e2749828c73fd67e5ee13311","Type":"ContainerStarted","Data":"cbe4b8c460caaa22c2a3bb43a97e2d83d5bd78b187d0794994d9b0f0328551c3"} Apr 22 19:57:45.730948 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.730933 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 19:57:45.731919 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.731896 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-60.ec2.internal" event={"ID":"36565672596d30113df6124742f498e5","Type":"ContainerStarted","Data":"a55ca1362d35350c2d18cbd8a7c37b9642a53ac6e8f622f410bd7fd37e78c4a1"} Apr 22 19:57:45.743164 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.743134 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/153f74bd-9a2c-4a02-88c9-243b60b35439-system-cni-dir\") pod \"multus-additional-cni-plugins-cfjtn\" (UID: \"153f74bd-9a2c-4a02-88c9-243b60b35439\") " pod="openshift-multus/multus-additional-cni-plugins-cfjtn" Apr 22 19:57:45.743290 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.743169 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/153f74bd-9a2c-4a02-88c9-243b60b35439-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cfjtn\" (UID: \"153f74bd-9a2c-4a02-88c9-243b60b35439\") " pod="openshift-multus/multus-additional-cni-plugins-cfjtn" Apr 22 19:57:45.743290 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.743190 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0a9f8642-7eca-4b0a-bf48-893987e02188-hosts-file\") pod \"node-resolver-jlns8\" (UID: \"0a9f8642-7eca-4b0a-bf48-893987e02188\") " pod="openshift-dns/node-resolver-jlns8" Apr 22 19:57:45.743290 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.743218 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-host-run-multus-certs\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.743290 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.743256 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/153f74bd-9a2c-4a02-88c9-243b60b35439-system-cni-dir\") pod \"multus-additional-cni-plugins-cfjtn\" (UID: 
\"153f74bd-9a2c-4a02-88c9-243b60b35439\") " pod="openshift-multus/multus-additional-cni-plugins-cfjtn" Apr 22 19:57:45.743290 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.743278 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-host-kubelet\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.743565 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.743334 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-var-lib-openvswitch\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.743565 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.743367 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-host-kubelet\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.743565 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.743426 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d13c3e6a-67cb-4dd5-acc1-44cced31f8d2-socket-dir\") pod \"aws-ebs-csi-driver-node-72j5f\" (UID: \"d13c3e6a-67cb-4dd5-acc1-44cced31f8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72j5f" Apr 22 19:57:45.743565 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.743442 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-var-lib-openvswitch\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.743565 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.743459 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-multus-socket-dir-parent\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.743565 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.743475 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-hostroot\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.743565 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.743503 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/48059c77-1229-43a7-a9b7-366463148f62-iptables-alerter-script\") pod \"iptables-alerter-f8qlf\" (UID: \"48059c77-1229-43a7-a9b7-366463148f62\") " pod="openshift-network-operator/iptables-alerter-f8qlf" Apr 22 19:57:45.743565 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.743518 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/48059c77-1229-43a7-a9b7-366463148f62-host-slash\") pod \"iptables-alerter-f8qlf\" (UID: \"48059c77-1229-43a7-a9b7-366463148f62\") " pod="openshift-network-operator/iptables-alerter-f8qlf" Apr 22 19:57:45.743565 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.743536 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ae1657a9-83bc-4dfc-8385-1b003298717a-lib-modules\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.743565 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.743552 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ae1657a9-83bc-4dfc-8385-1b003298717a-etc-tuned\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.744016 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.743677 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/153f74bd-9a2c-4a02-88c9-243b60b35439-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cfjtn\" (UID: \"153f74bd-9a2c-4a02-88c9-243b60b35439\") " pod="openshift-multus/multus-additional-cni-plugins-cfjtn" Apr 22 19:57:45.744016 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.743695 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-log-socket\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.744016 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.743727 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e92d8d1a-4d78-4b32-8e69-32db4468f373-ovnkube-script-lib\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.744016 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.743755 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/153f74bd-9a2c-4a02-88c9-243b60b35439-cnibin\") pod \"multus-additional-cni-plugins-cfjtn\" (UID: \"153f74bd-9a2c-4a02-88c9-243b60b35439\") " pod="openshift-multus/multus-additional-cni-plugins-cfjtn" Apr 22 19:57:45.744016 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.743701 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ae1657a9-83bc-4dfc-8385-1b003298717a-lib-modules\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.744016 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.743802 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/153f74bd-9a2c-4a02-88c9-243b60b35439-os-release\") pod \"multus-additional-cni-plugins-cfjtn\" (UID: \"153f74bd-9a2c-4a02-88c9-243b60b35439\") " pod="openshift-multus/multus-additional-cni-plugins-cfjtn" 
Apr 22 19:57:45.744016 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.743827 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-log-socket\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.744016 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.743834 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4vz7\" (UniqueName: \"kubernetes.io/projected/48059c77-1229-43a7-a9b7-366463148f62-kube-api-access-x4vz7\") pod \"iptables-alerter-f8qlf\" (UID: \"48059c77-1229-43a7-a9b7-366463148f62\") " pod="openshift-network-operator/iptables-alerter-f8qlf" Apr 22 19:57:45.744016 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.743881 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0961e7e6-a8cb-43cc-82a5-7a080e47aae5-konnectivity-ca\") pod \"konnectivity-agent-w7bfn\" (UID: \"0961e7e6-a8cb-43cc-82a5-7a080e47aae5\") " pod="kube-system/konnectivity-agent-w7bfn" Apr 22 19:57:45.744016 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.743911 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ae1657a9-83bc-4dfc-8385-1b003298717a-etc-sysconfig\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.744016 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.743914 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 19:57:45.744016 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.743959 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-systemd-units\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.744016 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.743985 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-etc-openvswitch\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.744016 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.744002 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/153f74bd-9a2c-4a02-88c9-243b60b35439-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cfjtn\" (UID: \"153f74bd-9a2c-4a02-88c9-243b60b35439\") " pod="openshift-multus/multus-additional-cni-plugins-cfjtn" Apr 22 19:57:45.744016 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.744019 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-multus-conf-dir\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.744710 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.744038 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/153f74bd-9a2c-4a02-88c9-243b60b35439-cnibin\") pod \"multus-additional-cni-plugins-cfjtn\" (UID: \"153f74bd-9a2c-4a02-88c9-243b60b35439\") " pod="openshift-multus/multus-additional-cni-plugins-cfjtn" Apr 22 19:57:45.744710 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.744070 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae1657a9-83bc-4dfc-8385-1b003298717a-host\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.744710 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.744095 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-host-run-netns\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.744710 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.744121 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/cba227e9-dcf5-4cf7-9c4b-83013a0b20fb-dbus\") pod \"global-pull-secret-syncer-cvg2s\" (UID: \"cba227e9-dcf5-4cf7-9c4b-83013a0b20fb\") " pod="kube-system/global-pull-secret-syncer-cvg2s" Apr 22 19:57:45.744710 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.744148 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cba227e9-dcf5-4cf7-9c4b-83013a0b20fb-original-pull-secret\") pod \"global-pull-secret-syncer-cvg2s\" (UID: \"cba227e9-dcf5-4cf7-9c4b-83013a0b20fb\") " pod="kube-system/global-pull-secret-syncer-cvg2s" Apr 22 19:57:45.744710 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.744173 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7bee1d4-9229-4b17-8ec5-e19b53d61c9d-metrics-certs\") pod \"network-metrics-daemon-v8fph\" (UID: \"d7bee1d4-9229-4b17-8ec5-e19b53d61c9d\") " pod="openshift-multus/network-metrics-daemon-v8fph" Apr 22 19:57:45.744710 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.744189 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0961e7e6-a8cb-43cc-82a5-7a080e47aae5-agent-certs\") pod \"konnectivity-agent-w7bfn\" (UID: \"0961e7e6-a8cb-43cc-82a5-7a080e47aae5\") " pod="kube-system/konnectivity-agent-w7bfn" Apr 22 19:57:45.744710 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.744268 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/153f74bd-9a2c-4a02-88c9-243b60b35439-cni-binary-copy\") pod \"multus-additional-cni-plugins-cfjtn\" (UID: \"153f74bd-9a2c-4a02-88c9-243b60b35439\") " pod="openshift-multus/multus-additional-cni-plugins-cfjtn" Apr 22 19:57:45.744710 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.744335 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/153f74bd-9a2c-4a02-88c9-243b60b35439-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cfjtn\" (UID: \"153f74bd-9a2c-4a02-88c9-243b60b35439\") " pod="openshift-multus/multus-additional-cni-plugins-cfjtn" Apr 22 19:57:45.744710 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.744370 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d13c3e6a-67cb-4dd5-acc1-44cced31f8d2-device-dir\") pod \"aws-ebs-csi-driver-node-72j5f\" (UID: \"d13c3e6a-67cb-4dd5-acc1-44cced31f8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72j5f" Apr 22 19:57:45.744710 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.744396 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-host-var-lib-cni-bin\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.744710 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.744416 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gmnn\" (UniqueName: \"kubernetes.io/projected/c63ee2c0-b298-42a6-bcd1-b05ffc7971f2-kube-api-access-4gmnn\") pod \"node-ca-lnz7b\" (UID: \"c63ee2c0-b298-42a6-bcd1-b05ffc7971f2\") " pod="openshift-image-registry/node-ca-lnz7b" Apr 22 19:57:45.744710 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.744432 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76f4k\" (UniqueName: \"kubernetes.io/projected/f55cb5a5-4e42-4775-bf9f-5f92344b63ff-kube-api-access-76f4k\") pod \"network-check-target-ncvmn\" 
(UID: \"f55cb5a5-4e42-4775-bf9f-5f92344b63ff\") " pod="openshift-network-diagnostics/network-check-target-ncvmn" Apr 22 19:57:45.744710 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.744441 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e92d8d1a-4d78-4b32-8e69-32db4468f373-ovnkube-script-lib\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.744710 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.744461 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-cnibin\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.744710 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.744485 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ae1657a9-83bc-4dfc-8385-1b003298717a-etc-sysctl-conf\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.744710 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.744504 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-run-openvswitch\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.745498 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.744512 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/153f74bd-9a2c-4a02-88c9-243b60b35439-os-release\") pod \"multus-additional-cni-plugins-cfjtn\" (UID: \"153f74bd-9a2c-4a02-88c9-243b60b35439\") " pod="openshift-multus/multus-additional-cni-plugins-cfjtn" Apr 22 19:57:45.745498 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.744521 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvq8l\" (UniqueName: \"kubernetes.io/projected/e92d8d1a-4d78-4b32-8e69-32db4468f373-kube-api-access-cvq8l\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.745498 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.744551 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d13c3e6a-67cb-4dd5-acc1-44cced31f8d2-registration-dir\") pod \"aws-ebs-csi-driver-node-72j5f\" (UID: \"d13c3e6a-67cb-4dd5-acc1-44cced31f8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72j5f" Apr 22 19:57:45.745498 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.744581 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0961e7e6-a8cb-43cc-82a5-7a080e47aae5-konnectivity-ca\") pod \"konnectivity-agent-w7bfn\" (UID: \"0961e7e6-a8cb-43cc-82a5-7a080e47aae5\") " pod="kube-system/konnectivity-agent-w7bfn" Apr 22 19:57:45.745498 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.744651 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" 
(UniqueName: \"kubernetes.io/host-path/ae1657a9-83bc-4dfc-8385-1b003298717a-etc-sysconfig\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.745498 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.744856 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae1657a9-83bc-4dfc-8385-1b003298717a-host\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.745498 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.744890 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-systemd-units\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.745498 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.744913 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-etc-openvswitch\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.745498 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.744969 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/153f74bd-9a2c-4a02-88c9-243b60b35439-cni-binary-copy\") pod \"multus-additional-cni-plugins-cfjtn\" (UID: \"153f74bd-9a2c-4a02-88c9-243b60b35439\") " pod="openshift-multus/multus-additional-cni-plugins-cfjtn" Apr 22 19:57:45.745498 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.745012 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-host-run-netns\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.745498 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.745029 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/153f74bd-9a2c-4a02-88c9-243b60b35439-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cfjtn\" (UID: \"153f74bd-9a2c-4a02-88c9-243b60b35439\") " pod="openshift-multus/multus-additional-cni-plugins-cfjtn" Apr 22 19:57:45.745498 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.745039 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-host-var-lib-kubelet\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.745498 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.745051 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-host-run-netns\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.745498 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.745071 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ae1657a9-83bc-4dfc-8385-1b003298717a-run\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.745498 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.745097 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ae1657a9-83bc-4dfc-8385-1b003298717a-var-lib-kubelet\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.745498 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:45.745111 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:45.745498 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.745123 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e92d8d1a-4d78-4b32-8e69-32db4468f373-env-overrides\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.746270 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:45.745182 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7bee1d4-9229-4b17-8ec5-e19b53d61c9d-metrics-certs podName:d7bee1d4-9229-4b17-8ec5-e19b53d61c9d nodeName:}" failed. No retries permitted until 2026-04-22 19:57:46.245151863 +0000 UTC m=+3.058537024 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d7bee1d4-9229-4b17-8ec5-e19b53d61c9d-metrics-certs") pod "network-metrics-daemon-v8fph" (UID: "d7bee1d4-9229-4b17-8ec5-e19b53d61c9d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:45.746270 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.745204 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0a9f8642-7eca-4b0a-bf48-893987e02188-tmp-dir\") pod \"node-resolver-jlns8\" (UID: \"0a9f8642-7eca-4b0a-bf48-893987e02188\") " pod="openshift-dns/node-resolver-jlns8" Apr 22 19:57:45.746270 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.745508 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e92d8d1a-4d78-4b32-8e69-32db4468f373-env-overrides\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.746270 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.745627 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ae1657a9-83bc-4dfc-8385-1b003298717a-etc-sysctl-conf\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.746270 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.745662 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-run-openvswitch\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.746270 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.745715 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5n5j\" (UniqueName: \"kubernetes.io/projected/d13c3e6a-67cb-4dd5-acc1-44cced31f8d2-kube-api-access-r5n5j\") pod \"aws-ebs-csi-driver-node-72j5f\" (UID: \"d13c3e6a-67cb-4dd5-acc1-44cced31f8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72j5f" Apr 22 19:57:45.746270 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.745757 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-multus-cni-dir\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.746270 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.745783 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8a3063cb-ff79-4232-8126-f9de4b63a839-cni-binary-copy\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.746270 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.745814 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x756g\" (UniqueName: \"kubernetes.io/projected/8a3063cb-ff79-4232-8126-f9de4b63a839-kube-api-access-x756g\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.746270 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.745840 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8a3063cb-ff79-4232-8126-f9de4b63a839-multus-daemon-config\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.746270 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.745870 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ae1657a9-83bc-4dfc-8385-1b003298717a-tmp\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.746270 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.745977 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/153f74bd-9a2c-4a02-88c9-243b60b35439-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cfjtn\" (UID: \"153f74bd-9a2c-4a02-88c9-243b60b35439\") " pod="openshift-multus/multus-additional-cni-plugins-cfjtn" Apr 22 19:57:45.746270 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.746050 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ae1657a9-83bc-4dfc-8385-1b003298717a-run\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.746270 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.746095 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/ae1657a9-83bc-4dfc-8385-1b003298717a-var-lib-kubelet\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.746270 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.746120 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-run-systemd\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.746270 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.746163 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d13c3e6a-67cb-4dd5-acc1-44cced31f8d2-sys-fs\") pod \"aws-ebs-csi-driver-node-72j5f\" (UID: \"d13c3e6a-67cb-4dd5-acc1-44cced31f8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72j5f" Apr 22 19:57:45.747222 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.746190 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-os-release\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.747222 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.746232 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-host-run-k8s-cni-cncf-io\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.747222 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.746258 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-etc-kubernetes\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.747222 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.746288 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ae1657a9-83bc-4dfc-8385-1b003298717a-etc-sysctl-d\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.747222 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.746359 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d13c3e6a-67cb-4dd5-acc1-44cced31f8d2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-72j5f\" (UID: \"d13c3e6a-67cb-4dd5-acc1-44cced31f8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72j5f" Apr 22 19:57:45.747222 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.746869 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ae1657a9-83bc-4dfc-8385-1b003298717a-etc-systemd\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.747222 ip-10-0-133-60 kubenswrapper[2575]: I0422 
19:57:45.746937 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ae1657a9-83bc-4dfc-8385-1b003298717a-sys\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.747222 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.746977 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-knt6n\" (UniqueName: \"kubernetes.io/projected/153f74bd-9a2c-4a02-88c9-243b60b35439-kube-api-access-knt6n\") pod \"multus-additional-cni-plugins-cfjtn\" (UID: \"153f74bd-9a2c-4a02-88c9-243b60b35439\") " pod="openshift-multus/multus-additional-cni-plugins-cfjtn" Apr 22 19:57:45.747222 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.747016 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-695p6\" (UniqueName: \"kubernetes.io/projected/d7bee1d4-9229-4b17-8ec5-e19b53d61c9d-kube-api-access-695p6\") pod \"network-metrics-daemon-v8fph\" (UID: \"d7bee1d4-9229-4b17-8ec5-e19b53d61c9d\") " pod="openshift-multus/network-metrics-daemon-v8fph" Apr 22 19:57:45.747222 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.747050 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-host-slash\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.747789 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.747337 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-host-run-ovn-kubernetes\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.747789 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.747377 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-system-cni-dir\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.747789 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.747410 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-host-var-lib-cni-multus\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.747789 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.747446 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c63ee2c0-b298-42a6-bcd1-b05ffc7971f2-host\") pod \"node-ca-lnz7b\" (UID: \"c63ee2c0-b298-42a6-bcd1-b05ffc7971f2\") " pod="openshift-image-registry/node-ca-lnz7b" Apr 22 19:57:45.747789 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.747505 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c63ee2c0-b298-42a6-bcd1-b05ffc7971f2-host\") pod \"node-ca-lnz7b\" (UID: \"c63ee2c0-b298-42a6-bcd1-b05ffc7971f2\") " pod="openshift-image-registry/node-ca-lnz7b" 
Apr 22 19:57:45.747789 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.747546 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-host-slash\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.747789 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.747604 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-run-systemd\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.747789 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.747118 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ae1657a9-83bc-4dfc-8385-1b003298717a-etc-sysctl-d\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.747789 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.747667 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ae1657a9-83bc-4dfc-8385-1b003298717a-etc-systemd\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.747789 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.747715 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ae1657a9-83bc-4dfc-8385-1b003298717a-sys\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.748253 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.747793 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ae1657a9-83bc-4dfc-8385-1b003298717a-etc-tuned\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.748253 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.747955 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c63ee2c0-b298-42a6-bcd1-b05ffc7971f2-serviceca\") pod \"node-ca-lnz7b\" (UID: \"c63ee2c0-b298-42a6-bcd1-b05ffc7971f2\") " pod="openshift-image-registry/node-ca-lnz7b" Apr 22 19:57:45.748253 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.747974 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-host-run-ovn-kubernetes\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.748253 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.748030 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-run-ovn\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.748468 ip-10-0-133-60 kubenswrapper[2575]: I0422 
19:57:45.748373 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-host-cni-netd\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.748468 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.748408 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jkjh\" (UniqueName: \"kubernetes.io/projected/0a9f8642-7eca-4b0a-bf48-893987e02188-kube-api-access-9jkjh\") pod \"node-resolver-jlns8\" (UID: \"0a9f8642-7eca-4b0a-bf48-893987e02188\") " pod="openshift-dns/node-resolver-jlns8" Apr 22 19:57:45.748468 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.748463 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-node-log\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.748617 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.748495 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-host-cni-bin\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.748617 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.748529 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.748752 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.748675 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e92d8d1a-4d78-4b32-8e69-32db4468f373-ovnkube-config\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.748752 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.748709 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d13c3e6a-67cb-4dd5-acc1-44cced31f8d2-etc-selinux\") pod \"aws-ebs-csi-driver-node-72j5f\" (UID: \"d13c3e6a-67cb-4dd5-acc1-44cced31f8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72j5f" Apr 22 19:57:45.748752 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.748741 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/cba227e9-dcf5-4cf7-9c4b-83013a0b20fb-kubelet-config\") pod \"global-pull-secret-syncer-cvg2s\" (UID: \"cba227e9-dcf5-4cf7-9c4b-83013a0b20fb\") " pod="kube-system/global-pull-secret-syncer-cvg2s" Apr 22 19:57:45.748895 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.748770 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/ae1657a9-83bc-4dfc-8385-1b003298717a-etc-modprobe-d\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.748895 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.748803 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ae1657a9-83bc-4dfc-8385-1b003298717a-etc-kubernetes\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.748895 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.748837 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5fk7k\" (UniqueName: \"kubernetes.io/projected/ae1657a9-83bc-4dfc-8385-1b003298717a-kube-api-access-5fk7k\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.748895 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.748867 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e92d8d1a-4d78-4b32-8e69-32db4468f373-ovn-node-metrics-cert\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.749215 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.749196 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ae1657a9-83bc-4dfc-8385-1b003298717a-tmp\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.749715 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.749671 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e92d8d1a-4d78-4b32-8e69-32db4468f373-ovnkube-config\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.749806 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.749727 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.749806 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.749783 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-run-ovn\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.749913 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.749817 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-host-cni-netd\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.749913 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.749859 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-node-log\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.749913 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.749900 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e92d8d1a-4d78-4b32-8e69-32db4468f373-host-cni-bin\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.750256 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.749921 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c63ee2c0-b298-42a6-bcd1-b05ffc7971f2-serviceca\") pod \"node-ca-lnz7b\" (UID: \"c63ee2c0-b298-42a6-bcd1-b05ffc7971f2\") " pod="openshift-image-registry/node-ca-lnz7b" Apr 22 19:57:45.750256 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.750131 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ae1657a9-83bc-4dfc-8385-1b003298717a-etc-kubernetes\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.750256 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.750134 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ae1657a9-83bc-4dfc-8385-1b003298717a-etc-modprobe-d\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.752160 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.752097 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e92d8d1a-4d78-4b32-8e69-32db4468f373-ovn-node-metrics-cert\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.752285 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.752243 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0961e7e6-a8cb-43cc-82a5-7a080e47aae5-agent-certs\") pod \"konnectivity-agent-w7bfn\" (UID: \"0961e7e6-a8cb-43cc-82a5-7a080e47aae5\") " pod="kube-system/konnectivity-agent-w7bfn" Apr 22 19:57:45.753505 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.753487 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvq8l\" (UniqueName: \"kubernetes.io/projected/e92d8d1a-4d78-4b32-8e69-32db4468f373-kube-api-access-cvq8l\") pod \"ovnkube-node-6jf5c\" (UID: \"e92d8d1a-4d78-4b32-8e69-32db4468f373\") " pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.754697 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.754673 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gmnn\" (UniqueName: \"kubernetes.io/projected/c63ee2c0-b298-42a6-bcd1-b05ffc7971f2-kube-api-access-4gmnn\") pod \"node-ca-lnz7b\" (UID: \"c63ee2c0-b298-42a6-bcd1-b05ffc7971f2\") " pod="openshift-image-registry/node-ca-lnz7b" Apr 22 19:57:45.756466 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.756425 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-knt6n\" (UniqueName: \"kubernetes.io/projected/153f74bd-9a2c-4a02-88c9-243b60b35439-kube-api-access-knt6n\") pod \"multus-additional-cni-plugins-cfjtn\" (UID: \"153f74bd-9a2c-4a02-88c9-243b60b35439\") " pod="openshift-multus/multus-additional-cni-plugins-cfjtn" Apr 22 19:57:45.756665 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.756644 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-695p6\" (UniqueName: \"kubernetes.io/projected/d7bee1d4-9229-4b17-8ec5-e19b53d61c9d-kube-api-access-695p6\") pod \"network-metrics-daemon-v8fph\" (UID: \"d7bee1d4-9229-4b17-8ec5-e19b53d61c9d\") " pod="openshift-multus/network-metrics-daemon-v8fph" Apr 22 19:57:45.760384 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.760360 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fk7k\" (UniqueName: \"kubernetes.io/projected/ae1657a9-83bc-4dfc-8385-1b003298717a-kube-api-access-5fk7k\") pod \"tuned-v28mk\" (UID: \"ae1657a9-83bc-4dfc-8385-1b003298717a\") " pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.849914 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.849876 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-multus-cni-dir\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.849914 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.849918 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8a3063cb-ff79-4232-8126-f9de4b63a839-cni-binary-copy\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.850143 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.849943 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x756g\" (UniqueName: \"kubernetes.io/projected/8a3063cb-ff79-4232-8126-f9de4b63a839-kube-api-access-x756g\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.850143 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.849964 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8a3063cb-ff79-4232-8126-f9de4b63a839-multus-daemon-config\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.850143 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.849986 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d13c3e6a-67cb-4dd5-acc1-44cced31f8d2-sys-fs\") pod \"aws-ebs-csi-driver-node-72j5f\" (UID: \"d13c3e6a-67cb-4dd5-acc1-44cced31f8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72j5f" Apr 22 19:57:45.850143 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850000 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-multus-cni-dir\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.850143 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850005 
2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-os-release\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.850143 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850109 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-os-release\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.850143 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850059 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d13c3e6a-67cb-4dd5-acc1-44cced31f8d2-sys-fs\") pod \"aws-ebs-csi-driver-node-72j5f\" (UID: \"d13c3e6a-67cb-4dd5-acc1-44cced31f8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72j5f" Apr 22 19:57:45.850143 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850140 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-host-run-k8s-cni-cncf-io\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.850489 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850167 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-etc-kubernetes\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.850489 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850195 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d13c3e6a-67cb-4dd5-acc1-44cced31f8d2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-72j5f\" (UID: \"d13c3e6a-67cb-4dd5-acc1-44cced31f8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72j5f" Apr 22 19:57:45.850489 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850224 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-system-cni-dir\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.850489 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850249 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-host-var-lib-cni-multus\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.850489 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850261 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-etc-kubernetes\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.850489 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850279 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9jkjh\" (UniqueName: \"kubernetes.io/projected/0a9f8642-7eca-4b0a-bf48-893987e02188-kube-api-access-9jkjh\") pod \"node-resolver-jlns8\" (UID: \"0a9f8642-7eca-4b0a-bf48-893987e02188\") " pod="openshift-dns/node-resolver-jlns8" Apr 22 19:57:45.850489 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850297 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d13c3e6a-67cb-4dd5-acc1-44cced31f8d2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-72j5f\" (UID: \"d13c3e6a-67cb-4dd5-acc1-44cced31f8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72j5f" Apr 22 19:57:45.850489 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850361 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-system-cni-dir\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.850489 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850363 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-host-var-lib-cni-multus\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.850489 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850385 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d13c3e6a-67cb-4dd5-acc1-44cced31f8d2-etc-selinux\") pod \"aws-ebs-csi-driver-node-72j5f\" (UID: \"d13c3e6a-67cb-4dd5-acc1-44cced31f8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72j5f" Apr 22 19:57:45.850489 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850410 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-host-run-k8s-cni-cncf-io\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.850489 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850413 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/cba227e9-dcf5-4cf7-9c4b-83013a0b20fb-kubelet-config\") pod \"global-pull-secret-syncer-cvg2s\" (UID: \"cba227e9-dcf5-4cf7-9c4b-83013a0b20fb\") " pod="kube-system/global-pull-secret-syncer-cvg2s" Apr 22 19:57:45.850489 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850452 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/cba227e9-dcf5-4cf7-9c4b-83013a0b20fb-kubelet-config\") pod \"global-pull-secret-syncer-cvg2s\" (UID: \"cba227e9-dcf5-4cf7-9c4b-83013a0b20fb\") " pod="kube-system/global-pull-secret-syncer-cvg2s" Apr 22 19:57:45.850489 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850451 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0a9f8642-7eca-4b0a-bf48-893987e02188-hosts-file\") pod \"node-resolver-jlns8\" (UID: \"0a9f8642-7eca-4b0a-bf48-893987e02188\") " pod="openshift-dns/node-resolver-jlns8" Apr 22 19:57:45.850489 ip-10-0-133-60 
kubenswrapper[2575]: I0422 19:57:45.850490 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-host-run-multus-certs\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.851148 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850509 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8a3063cb-ff79-4232-8126-f9de4b63a839-cni-binary-copy\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.851148 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850519 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d13c3e6a-67cb-4dd5-acc1-44cced31f8d2-socket-dir\") pod \"aws-ebs-csi-driver-node-72j5f\" (UID: \"d13c3e6a-67cb-4dd5-acc1-44cced31f8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72j5f" Apr 22 19:57:45.851148 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850490 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0a9f8642-7eca-4b0a-bf48-893987e02188-hosts-file\") pod \"node-resolver-jlns8\" (UID: \"0a9f8642-7eca-4b0a-bf48-893987e02188\") " pod="openshift-dns/node-resolver-jlns8" Apr 22 19:57:45.851148 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850547 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-multus-socket-dir-parent\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.851148 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850570 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-hostroot\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.851148 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850582 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-host-run-multus-certs\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.851148 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850595 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/48059c77-1229-43a7-a9b7-366463148f62-iptables-alerter-script\") pod \"iptables-alerter-f8qlf\" (UID: \"48059c77-1229-43a7-a9b7-366463148f62\") " pod="openshift-network-operator/iptables-alerter-f8qlf" Apr 22 19:57:45.851148 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850605 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d13c3e6a-67cb-4dd5-acc1-44cced31f8d2-etc-selinux\") pod \"aws-ebs-csi-driver-node-72j5f\" (UID: \"d13c3e6a-67cb-4dd5-acc1-44cced31f8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72j5f" Apr 22 
19:57:45.851148 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850620 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/48059c77-1229-43a7-a9b7-366463148f62-host-slash\") pod \"iptables-alerter-f8qlf\" (UID: \"48059c77-1229-43a7-a9b7-366463148f62\") " pod="openshift-network-operator/iptables-alerter-f8qlf" Apr 22 19:57:45.851148 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850636 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-multus-socket-dir-parent\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.851148 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850646 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d13c3e6a-67cb-4dd5-acc1-44cced31f8d2-socket-dir\") pod \"aws-ebs-csi-driver-node-72j5f\" (UID: \"d13c3e6a-67cb-4dd5-acc1-44cced31f8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72j5f" Apr 22 19:57:45.851148 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850653 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x4vz7\" (UniqueName: \"kubernetes.io/projected/48059c77-1229-43a7-a9b7-366463148f62-kube-api-access-x4vz7\") pod \"iptables-alerter-f8qlf\" (UID: \"48059c77-1229-43a7-a9b7-366463148f62\") " pod="openshift-network-operator/iptables-alerter-f8qlf" Apr 22 19:57:45.851148 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850662 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-hostroot\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.851148 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850680 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/48059c77-1229-43a7-a9b7-366463148f62-host-slash\") pod \"iptables-alerter-f8qlf\" (UID: \"48059c77-1229-43a7-a9b7-366463148f62\") " pod="openshift-network-operator/iptables-alerter-f8qlf" Apr 22 19:57:45.851148 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850681 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-multus-conf-dir\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.851148 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850707 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/cba227e9-dcf5-4cf7-9c4b-83013a0b20fb-dbus\") pod \"global-pull-secret-syncer-cvg2s\" (UID: \"cba227e9-dcf5-4cf7-9c4b-83013a0b20fb\") " pod="kube-system/global-pull-secret-syncer-cvg2s" Apr 22 19:57:45.851148 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850733 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cba227e9-dcf5-4cf7-9c4b-83013a0b20fb-original-pull-secret\") pod \"global-pull-secret-syncer-cvg2s\" (UID: \"cba227e9-dcf5-4cf7-9c4b-83013a0b20fb\") " 
pod="kube-system/global-pull-secret-syncer-cvg2s" Apr 22 19:57:45.851148 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850783 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-multus-conf-dir\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.851826 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850809 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d13c3e6a-67cb-4dd5-acc1-44cced31f8d2-device-dir\") pod \"aws-ebs-csi-driver-node-72j5f\" (UID: \"d13c3e6a-67cb-4dd5-acc1-44cced31f8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72j5f" Apr 22 19:57:45.851826 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850825 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-host-var-lib-cni-bin\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.851826 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:45.850832 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:57:45.851826 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850844 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76f4k\" (UniqueName: \"kubernetes.io/projected/f55cb5a5-4e42-4775-bf9f-5f92344b63ff-kube-api-access-76f4k\") pod \"network-check-target-ncvmn\" (UID: \"f55cb5a5-4e42-4775-bf9f-5f92344b63ff\") " pod="openshift-network-diagnostics/network-check-target-ncvmn" Apr 22 19:57:45.851826 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850862 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-cnibin\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.851826 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:45.850895 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cba227e9-dcf5-4cf7-9c4b-83013a0b20fb-original-pull-secret podName:cba227e9-dcf5-4cf7-9c4b-83013a0b20fb nodeName:}" failed. No retries permitted until 2026-04-22 19:57:46.350878943 +0000 UTC m=+3.164264121 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/cba227e9-dcf5-4cf7-9c4b-83013a0b20fb-original-pull-secret") pod "global-pull-secret-syncer-cvg2s" (UID: "cba227e9-dcf5-4cf7-9c4b-83013a0b20fb") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:57:45.851826 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850902 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-cnibin\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.851826 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850903 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-host-var-lib-cni-bin\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.851826 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850908 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8a3063cb-ff79-4232-8126-f9de4b63a839-multus-daemon-config\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.851826 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850914 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d13c3e6a-67cb-4dd5-acc1-44cced31f8d2-registration-dir\") pod \"aws-ebs-csi-driver-node-72j5f\" (UID: \"d13c3e6a-67cb-4dd5-acc1-44cced31f8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72j5f" Apr 22 19:57:45.851826 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850941 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-host-run-netns\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.851826 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850956 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d13c3e6a-67cb-4dd5-acc1-44cced31f8d2-device-dir\") pod \"aws-ebs-csi-driver-node-72j5f\" (UID: \"d13c3e6a-67cb-4dd5-acc1-44cced31f8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72j5f" Apr 22 19:57:45.851826 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850963 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-host-var-lib-kubelet\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.851826 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850997 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-host-var-lib-kubelet\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.851826 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.850998 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0a9f8642-7eca-4b0a-bf48-893987e02188-tmp-dir\") pod \"node-resolver-jlns8\" (UID: \"0a9f8642-7eca-4b0a-bf48-893987e02188\") " pod="openshift-dns/node-resolver-jlns8" Apr 22 19:57:45.851826 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.851026 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5n5j\" (UniqueName: \"kubernetes.io/projected/d13c3e6a-67cb-4dd5-acc1-44cced31f8d2-kube-api-access-r5n5j\") pod \"aws-ebs-csi-driver-node-72j5f\" (UID: \"d13c3e6a-67cb-4dd5-acc1-44cced31f8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72j5f" Apr 22 19:57:45.851826 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.851042 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/cba227e9-dcf5-4cf7-9c4b-83013a0b20fb-dbus\") pod \"global-pull-secret-syncer-cvg2s\" (UID: \"cba227e9-dcf5-4cf7-9c4b-83013a0b20fb\") " pod="kube-system/global-pull-secret-syncer-cvg2s" Apr 22 19:57:45.852411 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.851047 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/48059c77-1229-43a7-a9b7-366463148f62-iptables-alerter-script\") pod \"iptables-alerter-f8qlf\" (UID: \"48059c77-1229-43a7-a9b7-366463148f62\") " pod="openshift-network-operator/iptables-alerter-f8qlf" Apr 22 19:57:45.852411 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.851094 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d13c3e6a-67cb-4dd5-acc1-44cced31f8d2-registration-dir\") pod \"aws-ebs-csi-driver-node-72j5f\" (UID: \"d13c3e6a-67cb-4dd5-acc1-44cced31f8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72j5f" Apr 22 19:57:45.852411 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.851118 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8a3063cb-ff79-4232-8126-f9de4b63a839-host-run-netns\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.852411 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.851352 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0a9f8642-7eca-4b0a-bf48-893987e02188-tmp-dir\") pod \"node-resolver-jlns8\" (UID: \"0a9f8642-7eca-4b0a-bf48-893987e02188\") " pod="openshift-dns/node-resolver-jlns8" Apr 22 19:57:45.857788 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:45.857764 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:57:45.857788 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:45.857783 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:57:45.857788 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:45.857792 2575 projected.go:194] Error preparing data for projected volume kube-api-access-76f4k for pod openshift-network-diagnostics/network-check-target-ncvmn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:45.858029 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:45.857846 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f55cb5a5-4e42-4775-bf9f-5f92344b63ff-kube-api-access-76f4k podName:f55cb5a5-4e42-4775-bf9f-5f92344b63ff nodeName:}" failed. No retries permitted until 2026-04-22 19:57:46.357832564 +0000 UTC m=+3.171217726 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-76f4k" (UniqueName: "kubernetes.io/projected/f55cb5a5-4e42-4775-bf9f-5f92344b63ff-kube-api-access-76f4k") pod "network-check-target-ncvmn" (UID: "f55cb5a5-4e42-4775-bf9f-5f92344b63ff") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:45.860394 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.860370 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jkjh\" (UniqueName: \"kubernetes.io/projected/0a9f8642-7eca-4b0a-bf48-893987e02188-kube-api-access-9jkjh\") pod \"node-resolver-jlns8\" (UID: \"0a9f8642-7eca-4b0a-bf48-893987e02188\") " pod="openshift-dns/node-resolver-jlns8" Apr 22 19:57:45.860394 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.860395 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x756g\" (UniqueName: \"kubernetes.io/projected/8a3063cb-ff79-4232-8126-f9de4b63a839-kube-api-access-x756g\") pod \"multus-7qp87\" (UID: \"8a3063cb-ff79-4232-8126-f9de4b63a839\") " pod="openshift-multus/multus-7qp87" Apr 22 19:57:45.860807 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.860788 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4vz7\" (UniqueName: \"kubernetes.io/projected/48059c77-1229-43a7-a9b7-366463148f62-kube-api-access-x4vz7\") pod \"iptables-alerter-f8qlf\" (UID: \"48059c77-1229-43a7-a9b7-366463148f62\") " pod="openshift-network-operator/iptables-alerter-f8qlf" Apr 22 19:57:45.860807 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.860798 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5n5j\" (UniqueName: \"kubernetes.io/projected/d13c3e6a-67cb-4dd5-acc1-44cced31f8d2-kube-api-access-r5n5j\") pod \"aws-ebs-csi-driver-node-72j5f\" (UID: \"d13c3e6a-67cb-4dd5-acc1-44cced31f8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72j5f" Apr 22 19:57:45.940787 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.940754 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-w7bfn" Apr 22 19:57:45.948623 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.948597 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-v28mk" Apr 22 19:57:45.957418 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.957387 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:57:45.964001 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.963970 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-lnz7b" Apr 22 19:57:45.971555 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.971532 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cfjtn" Apr 22 19:57:45.985929 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.985905 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72j5f" Apr 22 19:57:45.993512 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.993483 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jlns8" Apr 22 19:57:45.999099 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:45.999078 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7qp87" Apr 22 19:57:46.006658 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:46.006635 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-f8qlf" Apr 22 19:57:46.254737 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:46.254705 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7bee1d4-9229-4b17-8ec5-e19b53d61c9d-metrics-certs\") pod \"network-metrics-daemon-v8fph\" (UID: \"d7bee1d4-9229-4b17-8ec5-e19b53d61c9d\") " pod="openshift-multus/network-metrics-daemon-v8fph" Apr 22 19:57:46.254896 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:46.254869 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:46.254972 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:46.254951 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7bee1d4-9229-4b17-8ec5-e19b53d61c9d-metrics-certs podName:d7bee1d4-9229-4b17-8ec5-e19b53d61c9d nodeName:}" failed. No retries permitted until 2026-04-22 19:57:47.254931102 +0000 UTC m=+4.068316266 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d7bee1d4-9229-4b17-8ec5-e19b53d61c9d-metrics-certs") pod "network-metrics-daemon-v8fph" (UID: "d7bee1d4-9229-4b17-8ec5-e19b53d61c9d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:46.355332 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:46.355282 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cba227e9-dcf5-4cf7-9c4b-83013a0b20fb-original-pull-secret\") pod \"global-pull-secret-syncer-cvg2s\" (UID: \"cba227e9-dcf5-4cf7-9c4b-83013a0b20fb\") " pod="kube-system/global-pull-secret-syncer-cvg2s" Apr 22 19:57:46.355495 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:46.355434 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:57:46.355558 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:46.355517 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cba227e9-dcf5-4cf7-9c4b-83013a0b20fb-original-pull-secret podName:cba227e9-dcf5-4cf7-9c4b-83013a0b20fb nodeName:}" failed. No retries permitted until 2026-04-22 19:57:47.355496496 +0000 UTC m=+4.168881657 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/cba227e9-dcf5-4cf7-9c4b-83013a0b20fb-original-pull-secret") pod "global-pull-secret-syncer-cvg2s" (UID: "cba227e9-dcf5-4cf7-9c4b-83013a0b20fb") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:57:46.456570 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:46.456516 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76f4k\" (UniqueName: \"kubernetes.io/projected/f55cb5a5-4e42-4775-bf9f-5f92344b63ff-kube-api-access-76f4k\") pod \"network-check-target-ncvmn\" (UID: \"f55cb5a5-4e42-4775-bf9f-5f92344b63ff\") " pod="openshift-network-diagnostics/network-check-target-ncvmn" Apr 22 19:57:46.456739 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:46.456698 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:57:46.456739 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:46.456719 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:57:46.456739 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:46.456731 2575 projected.go:194] Error preparing data for projected volume kube-api-access-76f4k for pod openshift-network-diagnostics/network-check-target-ncvmn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:46.456901 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:46.456784 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f55cb5a5-4e42-4775-bf9f-5f92344b63ff-kube-api-access-76f4k podName:f55cb5a5-4e42-4775-bf9f-5f92344b63ff nodeName:}" failed. No retries permitted until 2026-04-22 19:57:47.456769432 +0000 UTC m=+4.270154590 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-76f4k" (UniqueName: "kubernetes.io/projected/f55cb5a5-4e42-4775-bf9f-5f92344b63ff-kube-api-access-76f4k") pod "network-check-target-ncvmn" (UID: "f55cb5a5-4e42-4775-bf9f-5f92344b63ff") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:46.570133 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:46.570104 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc63ee2c0_b298_42a6_bcd1_b05ffc7971f2.slice/crio-9c307ce30379ed248e901cd668edad51ba875c9dca4ed2a38d1ed6bb395895e8 WatchSource:0}: Error finding container 9c307ce30379ed248e901cd668edad51ba875c9dca4ed2a38d1ed6bb395895e8: Status 404 returned error can't find the container with id 9c307ce30379ed248e901cd668edad51ba875c9dca4ed2a38d1ed6bb395895e8 Apr 22 19:57:46.575719 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:46.575690 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod153f74bd_9a2c_4a02_88c9_243b60b35439.slice/crio-0c7b8a670ac620dba697dd772c3188747fb9259e0d0292cd6c910a27a3cafb2f WatchSource:0}: Error finding container 0c7b8a670ac620dba697dd772c3188747fb9259e0d0292cd6c910a27a3cafb2f: Status 404 returned error can't find the container with id 0c7b8a670ac620dba697dd772c3188747fb9259e0d0292cd6c910a27a3cafb2f Apr 22 19:57:46.576651 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:46.576629 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode92d8d1a_4d78_4b32_8e69_32db4468f373.slice/crio-511082f0f152434eb8677fef634104adbe404ac2d6a64abc1846c24d3f7f02fb WatchSource:0}: Error finding container 511082f0f152434eb8677fef634104adbe404ac2d6a64abc1846c24d3f7f02fb: Status 404 returned error can't find the container with id 511082f0f152434eb8677fef634104adbe404ac2d6a64abc1846c24d3f7f02fb Apr 22 19:57:46.577664 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:46.577631 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae1657a9_83bc_4dfc_8385_1b003298717a.slice/crio-7ce7866f02d0162c73c119c49fff4de5d4379878058f59daf8f03915a452b9de WatchSource:0}: Error finding container 7ce7866f02d0162c73c119c49fff4de5d4379878058f59daf8f03915a452b9de: Status 404 returned error can't find the container with id 7ce7866f02d0162c73c119c49fff4de5d4379878058f59daf8f03915a452b9de Apr 22 19:57:46.578822 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:46.578802 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48059c77_1229_43a7_a9b7_366463148f62.slice/crio-29ad9896d7d4b235dfeb7675c593380739c0eb74a3369dcf9f2f781bf22bafdd WatchSource:0}: Error finding container 29ad9896d7d4b235dfeb7675c593380739c0eb74a3369dcf9f2f781bf22bafdd: Status 404 returned error can't find the container with id 29ad9896d7d4b235dfeb7675c593380739c0eb74a3369dcf9f2f781bf22bafdd Apr 22 19:57:46.580284 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:46.580244 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a3063cb_ff79_4232_8126_f9de4b63a839.slice/crio-6673f760cafbc1a72ba3fedeecb955968fa7f5b478a4397c5806676576a86840 WatchSource:0}: Error finding 
container 6673f760cafbc1a72ba3fedeecb955968fa7f5b478a4397c5806676576a86840: Status 404 returned error can't find the container with id 6673f760cafbc1a72ba3fedeecb955968fa7f5b478a4397c5806676576a86840 Apr 22 19:57:46.583236 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:46.583204 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd13c3e6a_67cb_4dd5_acc1_44cced31f8d2.slice/crio-e74f0ef4ca489c9ff20276d7d34dda44b1b42b66267b2378b0eab7a44e589f0d WatchSource:0}: Error finding container e74f0ef4ca489c9ff20276d7d34dda44b1b42b66267b2378b0eab7a44e589f0d: Status 404 returned error can't find the container with id e74f0ef4ca489c9ff20276d7d34dda44b1b42b66267b2378b0eab7a44e589f0d Apr 22 19:57:46.584114 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:57:46.584076 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0961e7e6_a8cb_43cc_82a5_7a080e47aae5.slice/crio-10333b32f18a1af57162ea435964f84cbdbad8fed3f2f0a86d73e3361bc2b002 WatchSource:0}: Error finding container 10333b32f18a1af57162ea435964f84cbdbad8fed3f2f0a86d73e3361bc2b002: Status 404 returned error can't find the container with id 10333b32f18a1af57162ea435964f84cbdbad8fed3f2f0a86d73e3361bc2b002 Apr 22 19:57:46.680014 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:46.679844 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:52:44 +0000 UTC" deadline="2027-10-18 02:23:22.27160402 +0000 UTC" Apr 22 19:57:46.680014 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:46.680011 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13038h25m35.591596705s" Apr 22 19:57:46.734331 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:46.734287 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72j5f" event={"ID":"d13c3e6a-67cb-4dd5-acc1-44cced31f8d2","Type":"ContainerStarted","Data":"e74f0ef4ca489c9ff20276d7d34dda44b1b42b66267b2378b0eab7a44e589f0d"} Apr 22 19:57:46.735224 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:46.735198 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-f8qlf" event={"ID":"48059c77-1229-43a7-a9b7-366463148f62","Type":"ContainerStarted","Data":"29ad9896d7d4b235dfeb7675c593380739c0eb74a3369dcf9f2f781bf22bafdd"} Apr 22 19:57:46.736122 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:46.736099 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-v28mk" event={"ID":"ae1657a9-83bc-4dfc-8385-1b003298717a","Type":"ContainerStarted","Data":"7ce7866f02d0162c73c119c49fff4de5d4379878058f59daf8f03915a452b9de"} Apr 22 19:57:46.737047 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:46.737022 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" event={"ID":"e92d8d1a-4d78-4b32-8e69-32db4468f373","Type":"ContainerStarted","Data":"511082f0f152434eb8677fef634104adbe404ac2d6a64abc1846c24d3f7f02fb"} Apr 22 19:57:46.737989 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:46.737964 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lnz7b" event={"ID":"c63ee2c0-b298-42a6-bcd1-b05ffc7971f2","Type":"ContainerStarted","Data":"9c307ce30379ed248e901cd668edad51ba875c9dca4ed2a38d1ed6bb395895e8"} Apr 22 
19:57:46.739447 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:46.739426 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-60.ec2.internal" event={"ID":"cd407a21e2749828c73fd67e5ee13311","Type":"ContainerStarted","Data":"013095aedbcfd73f7b74933877982ba2158bd293c60690d395859e4952dfca59"} Apr 22 19:57:46.740425 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:46.740405 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jlns8" event={"ID":"0a9f8642-7eca-4b0a-bf48-893987e02188","Type":"ContainerStarted","Data":"8d9101f36583609d3e9472091356abc4bfd2908a43f495e1887cbc9fe99b0cda"} Apr 22 19:57:46.741300 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:46.741283 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7qp87" event={"ID":"8a3063cb-ff79-4232-8126-f9de4b63a839","Type":"ContainerStarted","Data":"6673f760cafbc1a72ba3fedeecb955968fa7f5b478a4397c5806676576a86840"} Apr 22 19:57:46.742276 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:46.742255 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cfjtn" event={"ID":"153f74bd-9a2c-4a02-88c9-243b60b35439","Type":"ContainerStarted","Data":"0c7b8a670ac620dba697dd772c3188747fb9259e0d0292cd6c910a27a3cafb2f"} Apr 22 19:57:46.743272 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:46.743251 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-w7bfn" event={"ID":"0961e7e6-a8cb-43cc-82a5-7a080e47aae5","Type":"ContainerStarted","Data":"10333b32f18a1af57162ea435964f84cbdbad8fed3f2f0a86d73e3361bc2b002"} Apr 22 19:57:47.264050 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:47.264010 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7bee1d4-9229-4b17-8ec5-e19b53d61c9d-metrics-certs\") pod \"network-metrics-daemon-v8fph\" (UID: \"d7bee1d4-9229-4b17-8ec5-e19b53d61c9d\") " pod="openshift-multus/network-metrics-daemon-v8fph" Apr 22 19:57:47.264233 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:47.264193 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:47.264299 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:47.264259 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7bee1d4-9229-4b17-8ec5-e19b53d61c9d-metrics-certs podName:d7bee1d4-9229-4b17-8ec5-e19b53d61c9d nodeName:}" failed. No retries permitted until 2026-04-22 19:57:49.264240455 +0000 UTC m=+6.077625629 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d7bee1d4-9229-4b17-8ec5-e19b53d61c9d-metrics-certs") pod "network-metrics-daemon-v8fph" (UID: "d7bee1d4-9229-4b17-8ec5-e19b53d61c9d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:47.364983 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:47.364934 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cba227e9-dcf5-4cf7-9c4b-83013a0b20fb-original-pull-secret\") pod \"global-pull-secret-syncer-cvg2s\" (UID: \"cba227e9-dcf5-4cf7-9c4b-83013a0b20fb\") " pod="kube-system/global-pull-secret-syncer-cvg2s" Apr 22 19:57:47.365157 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:47.365108 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:57:47.365219 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:47.365171 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cba227e9-dcf5-4cf7-9c4b-83013a0b20fb-original-pull-secret podName:cba227e9-dcf5-4cf7-9c4b-83013a0b20fb nodeName:}" failed. No retries permitted until 2026-04-22 19:57:49.365153802 +0000 UTC m=+6.178538967 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/cba227e9-dcf5-4cf7-9c4b-83013a0b20fb-original-pull-secret") pod "global-pull-secret-syncer-cvg2s" (UID: "cba227e9-dcf5-4cf7-9c4b-83013a0b20fb") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:57:47.468074 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:47.467485 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76f4k\" (UniqueName: \"kubernetes.io/projected/f55cb5a5-4e42-4775-bf9f-5f92344b63ff-kube-api-access-76f4k\") pod \"network-check-target-ncvmn\" (UID: \"f55cb5a5-4e42-4775-bf9f-5f92344b63ff\") " pod="openshift-network-diagnostics/network-check-target-ncvmn" Apr 22 19:57:47.468074 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:47.467655 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:57:47.468074 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:47.467674 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:57:47.468074 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:47.467686 2575 projected.go:194] Error preparing data for projected volume kube-api-access-76f4k for pod openshift-network-diagnostics/network-check-target-ncvmn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:47.468074 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:47.467744 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f55cb5a5-4e42-4775-bf9f-5f92344b63ff-kube-api-access-76f4k podName:f55cb5a5-4e42-4775-bf9f-5f92344b63ff nodeName:}" failed. No retries permitted until 2026-04-22 19:57:49.46772763 +0000 UTC m=+6.281112793 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-76f4k" (UniqueName: "kubernetes.io/projected/f55cb5a5-4e42-4775-bf9f-5f92344b63ff-kube-api-access-76f4k") pod "network-check-target-ncvmn" (UID: "f55cb5a5-4e42-4775-bf9f-5f92344b63ff") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:47.727566 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:47.727465 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncvmn" Apr 22 19:57:47.728010 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:47.727622 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v8fph" Apr 22 19:57:47.728010 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:47.727747 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v8fph" podUID="d7bee1d4-9229-4b17-8ec5-e19b53d61c9d" Apr 22 19:57:47.728010 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:47.727615 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ncvmn" podUID="f55cb5a5-4e42-4775-bf9f-5f92344b63ff" Apr 22 19:57:47.728010 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:47.727481 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cvg2s" Apr 22 19:57:47.728010 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:47.727851 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-cvg2s" podUID="cba227e9-dcf5-4cf7-9c4b-83013a0b20fb" Apr 22 19:57:47.765133 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:47.765098 2575 generic.go:358] "Generic (PLEG): container finished" podID="36565672596d30113df6124742f498e5" containerID="3e57065f05d849cb4d0d49065ce9a0fc5b1487cd234ee969d0a11f2514c85908" exitCode=0 Apr 22 19:57:47.766081 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:47.766055 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-60.ec2.internal" event={"ID":"36565672596d30113df6124742f498e5","Type":"ContainerDied","Data":"3e57065f05d849cb4d0d49065ce9a0fc5b1487cd234ee969d0a11f2514c85908"} Apr 22 19:57:47.784310 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:47.784260 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-60.ec2.internal" podStartSLOduration=3.7842428630000002 podStartE2EDuration="3.784242863s" podCreationTimestamp="2026-04-22 19:57:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:57:46.753876596 +0000 UTC m=+3.567261777" watchObservedRunningTime="2026-04-22 19:57:47.784242863 +0000 UTC m=+4.597628045" Apr 22 19:57:48.782335 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:48.780595 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-60.ec2.internal" event={"ID":"36565672596d30113df6124742f498e5","Type":"ContainerStarted","Data":"97dfbfedf29a46ad689601a35e68f41b10e2f1a250c61d97caed2733c08014fe"} Apr 22 19:57:49.284662 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:49.284040 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7bee1d4-9229-4b17-8ec5-e19b53d61c9d-metrics-certs\") pod \"network-metrics-daemon-v8fph\" (UID: \"d7bee1d4-9229-4b17-8ec5-e19b53d61c9d\") " pod="openshift-multus/network-metrics-daemon-v8fph" Apr 22 19:57:49.284662 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:49.284218 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:49.284662 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:49.284280 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7bee1d4-9229-4b17-8ec5-e19b53d61c9d-metrics-certs podName:d7bee1d4-9229-4b17-8ec5-e19b53d61c9d nodeName:}" failed. No retries permitted until 2026-04-22 19:57:53.284263033 +0000 UTC m=+10.097648205 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d7bee1d4-9229-4b17-8ec5-e19b53d61c9d-metrics-certs") pod "network-metrics-daemon-v8fph" (UID: "d7bee1d4-9229-4b17-8ec5-e19b53d61c9d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:49.384848 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:49.384809 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cba227e9-dcf5-4cf7-9c4b-83013a0b20fb-original-pull-secret\") pod \"global-pull-secret-syncer-cvg2s\" (UID: \"cba227e9-dcf5-4cf7-9c4b-83013a0b20fb\") " pod="kube-system/global-pull-secret-syncer-cvg2s" Apr 22 19:57:49.385022 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:49.384983 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:57:49.385077 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:49.385045 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cba227e9-dcf5-4cf7-9c4b-83013a0b20fb-original-pull-secret podName:cba227e9-dcf5-4cf7-9c4b-83013a0b20fb nodeName:}" failed. No retries permitted until 2026-04-22 19:57:53.385027505 +0000 UTC m=+10.198412679 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/cba227e9-dcf5-4cf7-9c4b-83013a0b20fb-original-pull-secret") pod "global-pull-secret-syncer-cvg2s" (UID: "cba227e9-dcf5-4cf7-9c4b-83013a0b20fb") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:57:49.485951 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:49.485905 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76f4k\" (UniqueName: \"kubernetes.io/projected/f55cb5a5-4e42-4775-bf9f-5f92344b63ff-kube-api-access-76f4k\") pod \"network-check-target-ncvmn\" (UID: \"f55cb5a5-4e42-4775-bf9f-5f92344b63ff\") " pod="openshift-network-diagnostics/network-check-target-ncvmn" Apr 22 19:57:49.486135 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:49.486083 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:57:49.486135 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:49.486108 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:57:49.486135 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:49.486120 2575 projected.go:194] Error preparing data for projected volume kube-api-access-76f4k for pod openshift-network-diagnostics/network-check-target-ncvmn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:49.486364 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:49.486184 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f55cb5a5-4e42-4775-bf9f-5f92344b63ff-kube-api-access-76f4k podName:f55cb5a5-4e42-4775-bf9f-5f92344b63ff nodeName:}" failed. No retries permitted until 2026-04-22 19:57:53.486165247 +0000 UTC m=+10.299550425 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-76f4k" (UniqueName: "kubernetes.io/projected/f55cb5a5-4e42-4775-bf9f-5f92344b63ff-kube-api-access-76f4k") pod "network-check-target-ncvmn" (UID: "f55cb5a5-4e42-4775-bf9f-5f92344b63ff") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:49.729166 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:49.728762 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncvmn" Apr 22 19:57:49.729166 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:49.728895 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ncvmn" podUID="f55cb5a5-4e42-4775-bf9f-5f92344b63ff" Apr 22 19:57:49.729166 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:49.728983 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cvg2s" Apr 22 19:57:49.730252 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:49.730000 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cvg2s" podUID="cba227e9-dcf5-4cf7-9c4b-83013a0b20fb" Apr 22 19:57:49.731600 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:49.731574 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v8fph" Apr 22 19:57:49.731811 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:49.731738 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v8fph" podUID="d7bee1d4-9229-4b17-8ec5-e19b53d61c9d" Apr 22 19:57:51.726581 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:51.726544 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cvg2s" Apr 22 19:57:51.727069 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:51.726685 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cvg2s" podUID="cba227e9-dcf5-4cf7-9c4b-83013a0b20fb" Apr 22 19:57:51.727069 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:51.726771 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v8fph" Apr 22 19:57:51.727069 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:51.726856 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v8fph" podUID="d7bee1d4-9229-4b17-8ec5-e19b53d61c9d" Apr 22 19:57:51.727069 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:51.726932 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncvmn" Apr 22 19:57:51.727069 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:51.727035 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ncvmn" podUID="f55cb5a5-4e42-4775-bf9f-5f92344b63ff" Apr 22 19:57:53.317659 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:53.317608 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7bee1d4-9229-4b17-8ec5-e19b53d61c9d-metrics-certs\") pod \"network-metrics-daemon-v8fph\" (UID: \"d7bee1d4-9229-4b17-8ec5-e19b53d61c9d\") " pod="openshift-multus/network-metrics-daemon-v8fph" Apr 22 19:57:53.318179 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:53.317783 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:53.318179 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:53.317863 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7bee1d4-9229-4b17-8ec5-e19b53d61c9d-metrics-certs podName:d7bee1d4-9229-4b17-8ec5-e19b53d61c9d nodeName:}" failed. No retries permitted until 2026-04-22 19:58:01.317843268 +0000 UTC m=+18.131228445 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d7bee1d4-9229-4b17-8ec5-e19b53d61c9d-metrics-certs") pod "network-metrics-daemon-v8fph" (UID: "d7bee1d4-9229-4b17-8ec5-e19b53d61c9d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:57:53.418161 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:53.418123 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cba227e9-dcf5-4cf7-9c4b-83013a0b20fb-original-pull-secret\") pod \"global-pull-secret-syncer-cvg2s\" (UID: \"cba227e9-dcf5-4cf7-9c4b-83013a0b20fb\") " pod="kube-system/global-pull-secret-syncer-cvg2s" Apr 22 19:57:53.418419 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:53.418271 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:57:53.418419 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:53.418398 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cba227e9-dcf5-4cf7-9c4b-83013a0b20fb-original-pull-secret podName:cba227e9-dcf5-4cf7-9c4b-83013a0b20fb nodeName:}" failed. 
No retries permitted until 2026-04-22 19:58:01.418384144 +0000 UTC m=+18.231769308 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/cba227e9-dcf5-4cf7-9c4b-83013a0b20fb-original-pull-secret") pod "global-pull-secret-syncer-cvg2s" (UID: "cba227e9-dcf5-4cf7-9c4b-83013a0b20fb") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:57:53.519206 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:53.519168 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76f4k\" (UniqueName: \"kubernetes.io/projected/f55cb5a5-4e42-4775-bf9f-5f92344b63ff-kube-api-access-76f4k\") pod \"network-check-target-ncvmn\" (UID: \"f55cb5a5-4e42-4775-bf9f-5f92344b63ff\") " pod="openshift-network-diagnostics/network-check-target-ncvmn" Apr 22 19:57:53.519421 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:53.519377 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:57:53.519421 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:53.519396 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:57:53.519421 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:53.519409 2575 projected.go:194] Error preparing data for projected volume kube-api-access-76f4k for pod openshift-network-diagnostics/network-check-target-ncvmn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:53.519577 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:53.519466 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f55cb5a5-4e42-4775-bf9f-5f92344b63ff-kube-api-access-76f4k podName:f55cb5a5-4e42-4775-bf9f-5f92344b63ff nodeName:}" failed. No retries permitted until 2026-04-22 19:58:01.519447835 +0000 UTC m=+18.332833001 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-76f4k" (UniqueName: "kubernetes.io/projected/f55cb5a5-4e42-4775-bf9f-5f92344b63ff-kube-api-access-76f4k") pod "network-check-target-ncvmn" (UID: "f55cb5a5-4e42-4775-bf9f-5f92344b63ff") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:57:53.727147 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:53.727074 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncvmn" Apr 22 19:57:53.727147 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:53.727087 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cvg2s" Apr 22 19:57:53.728160 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:53.728128 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ncvmn" podUID="f55cb5a5-4e42-4775-bf9f-5f92344b63ff" Apr 22 19:57:53.728337 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:53.728304 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v8fph" Apr 22 19:57:53.728492 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:53.728470 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v8fph" podUID="d7bee1d4-9229-4b17-8ec5-e19b53d61c9d" Apr 22 19:57:53.728583 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:53.728560 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cvg2s" podUID="cba227e9-dcf5-4cf7-9c4b-83013a0b20fb" Apr 22 19:57:55.726932 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:55.726898 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v8fph" Apr 22 19:57:55.727357 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:55.727041 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncvmn" Apr 22 19:57:55.727357 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:55.727085 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cvg2s" Apr 22 19:57:55.727357 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:55.727045 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v8fph" podUID="d7bee1d4-9229-4b17-8ec5-e19b53d61c9d" Apr 22 19:57:55.727357 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:55.727163 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ncvmn" podUID="f55cb5a5-4e42-4775-bf9f-5f92344b63ff" Apr 22 19:57:55.727357 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:55.727261 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cvg2s" podUID="cba227e9-dcf5-4cf7-9c4b-83013a0b20fb" Apr 22 19:57:57.726341 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:57.726290 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncvmn" Apr 22 19:57:57.726747 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:57.726435 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ncvmn" podUID="f55cb5a5-4e42-4775-bf9f-5f92344b63ff" Apr 22 19:57:57.726747 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:57.726508 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cvg2s" Apr 22 19:57:57.726747 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:57.726595 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cvg2s" podUID="cba227e9-dcf5-4cf7-9c4b-83013a0b20fb" Apr 22 19:57:57.726747 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:57.726637 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v8fph" Apr 22 19:57:57.726747 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:57.726715 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v8fph" podUID="d7bee1d4-9229-4b17-8ec5-e19b53d61c9d" Apr 22 19:57:59.726916 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:59.726885 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cvg2s" Apr 22 19:57:59.726916 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:59.726911 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v8fph" Apr 22 19:57:59.727444 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:57:59.726886 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncvmn" Apr 22 19:57:59.727444 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:59.726992 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cvg2s" podUID="cba227e9-dcf5-4cf7-9c4b-83013a0b20fb" Apr 22 19:57:59.727444 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:59.727060 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ncvmn" podUID="f55cb5a5-4e42-4775-bf9f-5f92344b63ff" Apr 22 19:57:59.727444 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:57:59.727138 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v8fph" podUID="d7bee1d4-9229-4b17-8ec5-e19b53d61c9d" Apr 22 19:58:01.376725 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:01.376695 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7bee1d4-9229-4b17-8ec5-e19b53d61c9d-metrics-certs\") pod \"network-metrics-daemon-v8fph\" (UID: \"d7bee1d4-9229-4b17-8ec5-e19b53d61c9d\") " pod="openshift-multus/network-metrics-daemon-v8fph" Apr 22 19:58:01.377191 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:01.376829 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:01.377191 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:01.376885 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7bee1d4-9229-4b17-8ec5-e19b53d61c9d-metrics-certs podName:d7bee1d4-9229-4b17-8ec5-e19b53d61c9d nodeName:}" failed. No retries permitted until 2026-04-22 19:58:17.376871067 +0000 UTC m=+34.190256225 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d7bee1d4-9229-4b17-8ec5-e19b53d61c9d-metrics-certs") pod "network-metrics-daemon-v8fph" (UID: "d7bee1d4-9229-4b17-8ec5-e19b53d61c9d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:01.477062 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:01.477026 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cba227e9-dcf5-4cf7-9c4b-83013a0b20fb-original-pull-secret\") pod \"global-pull-secret-syncer-cvg2s\" (UID: \"cba227e9-dcf5-4cf7-9c4b-83013a0b20fb\") " pod="kube-system/global-pull-secret-syncer-cvg2s" Apr 22 19:58:01.477234 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:01.477185 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:01.477298 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:01.477255 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cba227e9-dcf5-4cf7-9c4b-83013a0b20fb-original-pull-secret podName:cba227e9-dcf5-4cf7-9c4b-83013a0b20fb nodeName:}" failed. No retries permitted until 2026-04-22 19:58:17.477240206 +0000 UTC m=+34.290625364 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/cba227e9-dcf5-4cf7-9c4b-83013a0b20fb-original-pull-secret") pod "global-pull-secret-syncer-cvg2s" (UID: "cba227e9-dcf5-4cf7-9c4b-83013a0b20fb") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:01.577915 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:01.577876 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76f4k\" (UniqueName: \"kubernetes.io/projected/f55cb5a5-4e42-4775-bf9f-5f92344b63ff-kube-api-access-76f4k\") pod \"network-check-target-ncvmn\" (UID: \"f55cb5a5-4e42-4775-bf9f-5f92344b63ff\") " pod="openshift-network-diagnostics/network-check-target-ncvmn" Apr 22 19:58:01.578093 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:01.578014 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:58:01.578093 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:01.578031 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:58:01.578093 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:01.578040 2575 projected.go:194] Error preparing data for projected volume kube-api-access-76f4k for pod openshift-network-diagnostics/network-check-target-ncvmn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:01.578093 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:01.578092 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f55cb5a5-4e42-4775-bf9f-5f92344b63ff-kube-api-access-76f4k podName:f55cb5a5-4e42-4775-bf9f-5f92344b63ff nodeName:}" failed. No retries permitted until 2026-04-22 19:58:17.578080168 +0000 UTC m=+34.391465331 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-76f4k" (UniqueName: "kubernetes.io/projected/f55cb5a5-4e42-4775-bf9f-5f92344b63ff-kube-api-access-76f4k") pod "network-check-target-ncvmn" (UID: "f55cb5a5-4e42-4775-bf9f-5f92344b63ff") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:01.727082 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:01.726990 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncvmn" Apr 22 19:58:01.727258 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:01.726990 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cvg2s" Apr 22 19:58:01.727258 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:01.727133 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ncvmn" podUID="f55cb5a5-4e42-4775-bf9f-5f92344b63ff" Apr 22 19:58:01.727258 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:01.727003 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-v8fph" Apr 22 19:58:01.727258 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:01.727212 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cvg2s" podUID="cba227e9-dcf5-4cf7-9c4b-83013a0b20fb" Apr 22 19:58:01.727472 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:01.727295 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v8fph" podUID="d7bee1d4-9229-4b17-8ec5-e19b53d61c9d" Apr 22 19:58:03.727862 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:03.727831 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cvg2s" Apr 22 19:58:03.728198 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:03.727920 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cvg2s" podUID="cba227e9-dcf5-4cf7-9c4b-83013a0b20fb" Apr 22 19:58:03.728198 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:03.728013 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v8fph" Apr 22 19:58:03.728198 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:03.728133 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v8fph" podUID="d7bee1d4-9229-4b17-8ec5-e19b53d61c9d" Apr 22 19:58:03.728198 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:03.728180 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncvmn" Apr 22 19:58:03.728368 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:03.728251 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ncvmn" podUID="f55cb5a5-4e42-4775-bf9f-5f92344b63ff" Apr 22 19:58:04.810058 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:04.809677 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jlns8" event={"ID":"0a9f8642-7eca-4b0a-bf48-893987e02188","Type":"ContainerStarted","Data":"a4d2ba10f62877b1bdcd48b8fae56d9c6dcaa62262f6f5ed59417b4ead8e25a0"} Apr 22 19:58:04.811222 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:04.811188 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7qp87" event={"ID":"8a3063cb-ff79-4232-8126-f9de4b63a839","Type":"ContainerStarted","Data":"7e359a836952c2f38adc9c7946cd7c29cf12167497877e453a1d42bc535f7864"} Apr 22 19:58:04.812739 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:04.812710 2575 generic.go:358] "Generic (PLEG): container finished" podID="153f74bd-9a2c-4a02-88c9-243b60b35439" containerID="5c5ce13f4937858459a70dd97d5908d7f8480277a928402d0b298b338ed4ebcc" exitCode=0 Apr 22 19:58:04.812871 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:04.812798 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cfjtn" event={"ID":"153f74bd-9a2c-4a02-88c9-243b60b35439","Type":"ContainerDied","Data":"5c5ce13f4937858459a70dd97d5908d7f8480277a928402d0b298b338ed4ebcc"} Apr 22 19:58:04.814297 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:04.814272 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-w7bfn" event={"ID":"0961e7e6-a8cb-43cc-82a5-7a080e47aae5","Type":"ContainerStarted","Data":"2b0b4229b97f0d4784d34bd547e0f5275ff1eea3435a22a03fabc1c76b993aa3"} Apr 22 19:58:04.815845 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:04.815825 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72j5f" event={"ID":"d13c3e6a-67cb-4dd5-acc1-44cced31f8d2","Type":"ContainerStarted","Data":"3370195626b4dd7a25e09f460a6a05eee38e35d5a86fdf55036ef85faa6d08c4"} Apr 22 19:58:04.817240 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:04.817208 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-v28mk" event={"ID":"ae1657a9-83bc-4dfc-8385-1b003298717a","Type":"ContainerStarted","Data":"e0338c75a1f2740b172225d8bf5bc9f058992459a1d0a5352c21ee54378ff064"} Apr 22 19:58:04.820046 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:04.820021 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" event={"ID":"e92d8d1a-4d78-4b32-8e69-32db4468f373","Type":"ContainerStarted","Data":"67c214acf573e573b6b70a7798eaaeee9d8a5536cfb916e6469c08506d28fc54"} Apr 22 19:58:04.820133 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:04.820048 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" event={"ID":"e92d8d1a-4d78-4b32-8e69-32db4468f373","Type":"ContainerStarted","Data":"db81b2ad48212bd6fc5f4590e70ec1f88d0770f46ac279583910714f8a61616c"} Apr 22 19:58:04.820133 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:04.820063 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" event={"ID":"e92d8d1a-4d78-4b32-8e69-32db4468f373","Type":"ContainerStarted","Data":"98edb1e96a3ccf94bc544bdb948f18bdbc7d46590b32b9113a1a5c01bfc5a4c4"} Apr 22 19:58:04.820133 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:04.820075 2575 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" event={"ID":"e92d8d1a-4d78-4b32-8e69-32db4468f373","Type":"ContainerStarted","Data":"3f11a3b3d938f8a0612162f8c1996a1d2431c665d62b57365175b33ccfbabec0"} Apr 22 19:58:04.820133 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:04.820087 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" event={"ID":"e92d8d1a-4d78-4b32-8e69-32db4468f373","Type":"ContainerStarted","Data":"f57c53c76eb9d28fd7a6da149c829d06268dd73035d343da49fd04c5da3ccb8a"} Apr 22 19:58:04.820133 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:04.820096 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" event={"ID":"e92d8d1a-4d78-4b32-8e69-32db4468f373","Type":"ContainerStarted","Data":"ada119001a642afdd39150f2c44f434721a893b67e8fb5ecc944b28f08fde1da"} Apr 22 19:58:04.821300 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:04.821276 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lnz7b" event={"ID":"c63ee2c0-b298-42a6-bcd1-b05ffc7971f2","Type":"ContainerStarted","Data":"d0703ab9e970ef538cf01fc53ede8c788814da477c3b8f2a974a43228cd709bf"} Apr 22 19:58:04.824497 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:04.824457 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-60.ec2.internal" podStartSLOduration=20.824433221 podStartE2EDuration="20.824433221s" podCreationTimestamp="2026-04-22 19:57:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:57:48.795961669 +0000 UTC m=+5.609346851" watchObservedRunningTime="2026-04-22 19:58:04.824433221 +0000 UTC m=+21.637818407" Apr 22 19:58:04.824773 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:04.824745 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-jlns8" podStartSLOduration=3.907209323 podStartE2EDuration="20.824737265s" podCreationTimestamp="2026-04-22 19:57:44 +0000 UTC" firstStartedPulling="2026-04-22 19:57:46.591342239 +0000 UTC m=+3.404727398" lastFinishedPulling="2026-04-22 19:58:03.508870179 +0000 UTC m=+20.322255340" observedRunningTime="2026-04-22 19:58:04.824346926 +0000 UTC m=+21.637732107" watchObservedRunningTime="2026-04-22 19:58:04.824737265 +0000 UTC m=+21.638122448" Apr 22 19:58:04.838934 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:04.838889 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-w7bfn" podStartSLOduration=4.605610803 podStartE2EDuration="21.838878316s" podCreationTimestamp="2026-04-22 19:57:43 +0000 UTC" firstStartedPulling="2026-04-22 19:57:46.588005672 +0000 UTC m=+3.401390837" lastFinishedPulling="2026-04-22 19:58:03.821273188 +0000 UTC m=+20.634658350" observedRunningTime="2026-04-22 19:58:04.8385546 +0000 UTC m=+21.651939780" watchObservedRunningTime="2026-04-22 19:58:04.838878316 +0000 UTC m=+21.652263491" Apr 22 19:58:04.855565 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:04.855517 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-v28mk" podStartSLOduration=4.6147921069999995 podStartE2EDuration="21.855507408s" podCreationTimestamp="2026-04-22 19:57:43 +0000 UTC" firstStartedPulling="2026-04-22 19:57:46.580492543 +0000 UTC m=+3.393877707" 
lastFinishedPulling="2026-04-22 19:58:03.821207836 +0000 UTC m=+20.634593008" observedRunningTime="2026-04-22 19:58:04.854914381 +0000 UTC m=+21.668299563" watchObservedRunningTime="2026-04-22 19:58:04.855507408 +0000 UTC m=+21.668892592" Apr 22 19:58:04.871450 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:04.871404 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-7qp87" podStartSLOduration=3.573971662 podStartE2EDuration="20.871390606s" podCreationTimestamp="2026-04-22 19:57:44 +0000 UTC" firstStartedPulling="2026-04-22 19:57:46.583398608 +0000 UTC m=+3.396783767" lastFinishedPulling="2026-04-22 19:58:03.880817541 +0000 UTC m=+20.694202711" observedRunningTime="2026-04-22 19:58:04.870899899 +0000 UTC m=+21.684285081" watchObservedRunningTime="2026-04-22 19:58:04.871390606 +0000 UTC m=+21.684775786" Apr 22 19:58:04.891058 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:04.891018 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-lnz7b" podStartSLOduration=4.954552443 podStartE2EDuration="21.89100545s" podCreationTimestamp="2026-04-22 19:57:43 +0000 UTC" firstStartedPulling="2026-04-22 19:57:46.572299354 +0000 UTC m=+3.385684516" lastFinishedPulling="2026-04-22 19:58:03.508752352 +0000 UTC m=+20.322137523" observedRunningTime="2026-04-22 19:58:04.890961032 +0000 UTC m=+21.704346213" watchObservedRunningTime="2026-04-22 19:58:04.89100545 +0000 UTC m=+21.704390631" Apr 22 19:58:05.061514 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:05.061489 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 19:58:05.701493 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:05.701383 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T19:58:05.061509636Z","UUID":"f52ce8ba-691f-49e9-93c6-fdb1dfff337b","Handler":null,"Name":"","Endpoint":""} Apr 22 19:58:05.703264 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:05.703227 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 19:58:05.703264 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:05.703269 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 19:58:05.726712 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:05.726686 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncvmn" Apr 22 19:58:05.726712 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:05.726709 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v8fph" Apr 22 19:58:05.726916 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:05.726692 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-cvg2s" Apr 22 19:58:05.726916 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:05.726788 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ncvmn" podUID="f55cb5a5-4e42-4775-bf9f-5f92344b63ff" Apr 22 19:58:05.726916 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:05.726861 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cvg2s" podUID="cba227e9-dcf5-4cf7-9c4b-83013a0b20fb" Apr 22 19:58:05.727068 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:05.726959 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v8fph" podUID="d7bee1d4-9229-4b17-8ec5-e19b53d61c9d" Apr 22 19:58:05.824544 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:05.824510 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72j5f" event={"ID":"d13c3e6a-67cb-4dd5-acc1-44cced31f8d2","Type":"ContainerStarted","Data":"26e3ecfcf13cbb91b138c0daec7675750a8b706a8f7a601e8ec8d69357e44e7f"} Apr 22 19:58:05.825950 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:05.825924 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-f8qlf" event={"ID":"48059c77-1229-43a7-a9b7-366463148f62","Type":"ContainerStarted","Data":"fa285d2c0c7ec431a0a5694e4007e7a0f34248cd32b5c9e6a0e9558f4747a531"} Apr 22 19:58:06.395105 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:06.395069 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-w7bfn" Apr 22 19:58:06.829980 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:06.829945 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72j5f" event={"ID":"d13c3e6a-67cb-4dd5-acc1-44cced31f8d2","Type":"ContainerStarted","Data":"26b13e54e206f7550301eb3f948e7b7d8c73b206151447b0cca9dd4853b2c922"} Apr 22 19:58:06.833813 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:06.833786 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" event={"ID":"e92d8d1a-4d78-4b32-8e69-32db4468f373","Type":"ContainerStarted","Data":"b4468b94726312fca5d587cf461dfc5f179d61143dd3f451a4684bfdb19bac47"} Apr 22 19:58:06.850200 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:06.850154 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-f8qlf" podStartSLOduration=5.610467807 podStartE2EDuration="22.850140559s" podCreationTimestamp="2026-04-22 19:57:44 +0000 UTC" firstStartedPulling="2026-04-22 19:57:46.581533217 +0000 UTC m=+3.394918376" lastFinishedPulling="2026-04-22 
19:58:03.821205955 +0000 UTC m=+20.634591128" observedRunningTime="2026-04-22 19:58:05.841864523 +0000 UTC m=+22.655249705" watchObservedRunningTime="2026-04-22 19:58:06.850140559 +0000 UTC m=+23.663525743" Apr 22 19:58:06.850425 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:06.850402 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72j5f" podStartSLOduration=3.566237669 podStartE2EDuration="22.850395455s" podCreationTimestamp="2026-04-22 19:57:44 +0000 UTC" firstStartedPulling="2026-04-22 19:57:46.587981896 +0000 UTC m=+3.401367064" lastFinishedPulling="2026-04-22 19:58:05.87213969 +0000 UTC m=+22.685524850" observedRunningTime="2026-04-22 19:58:06.849748344 +0000 UTC m=+23.663133525" watchObservedRunningTime="2026-04-22 19:58:06.850395455 +0000 UTC m=+23.663780787" Apr 22 19:58:07.726780 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:07.726528 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cvg2s" Apr 22 19:58:07.726945 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:07.726533 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v8fph" Apr 22 19:58:07.726945 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:07.726886 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cvg2s" podUID="cba227e9-dcf5-4cf7-9c4b-83013a0b20fb" Apr 22 19:58:07.727050 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:07.726965 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v8fph" podUID="d7bee1d4-9229-4b17-8ec5-e19b53d61c9d" Apr 22 19:58:07.727050 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:07.726533 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncvmn" Apr 22 19:58:07.727141 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:07.727053 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ncvmn" podUID="f55cb5a5-4e42-4775-bf9f-5f92344b63ff" Apr 22 19:58:07.935123 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:07.935094 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-w7bfn" Apr 22 19:58:07.935964 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:07.935944 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-w7bfn" Apr 22 19:58:08.837515 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:08.837491 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-w7bfn" Apr 22 19:58:09.726486 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:09.726282 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v8fph" Apr 22 19:58:09.727329 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:09.726351 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncvmn" Apr 22 19:58:09.727329 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:09.726583 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v8fph" podUID="d7bee1d4-9229-4b17-8ec5-e19b53d61c9d" Apr 22 19:58:09.727329 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:09.726368 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cvg2s" Apr 22 19:58:09.727329 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:09.726631 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ncvmn" podUID="f55cb5a5-4e42-4775-bf9f-5f92344b63ff" Apr 22 19:58:09.727329 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:09.726692 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-cvg2s" podUID="cba227e9-dcf5-4cf7-9c4b-83013a0b20fb" Apr 22 19:58:09.841776 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:09.841742 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" event={"ID":"e92d8d1a-4d78-4b32-8e69-32db4468f373","Type":"ContainerStarted","Data":"a57100572793106adb519b4fb85884d0afc102468e0a9bf73c3da2435a6ff3a8"} Apr 22 19:58:09.842024 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:09.841999 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:58:09.842130 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:09.842039 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:58:09.842130 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:09.842055 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:58:09.843720 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:09.843696 2575 generic.go:358] "Generic (PLEG): container finished" podID="153f74bd-9a2c-4a02-88c9-243b60b35439" containerID="5055fa5a5046b635cd3fbe6db9662f98f790feb8ca24756e6c741df9b2d46d4b" exitCode=0 Apr 22 19:58:09.843823 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:09.843774 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cfjtn" event={"ID":"153f74bd-9a2c-4a02-88c9-243b60b35439","Type":"ContainerDied","Data":"5055fa5a5046b635cd3fbe6db9662f98f790feb8ca24756e6c741df9b2d46d4b"} Apr 22 19:58:09.856520 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:09.856499 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:58:09.860675 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:09.860659 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:58:09.869104 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:09.869064 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" podStartSLOduration=8.330492646 podStartE2EDuration="25.869053273s" podCreationTimestamp="2026-04-22 19:57:44 +0000 UTC" firstStartedPulling="2026-04-22 19:57:46.578870745 +0000 UTC m=+3.392255918" lastFinishedPulling="2026-04-22 19:58:04.117431383 +0000 UTC m=+20.930816545" observedRunningTime="2026-04-22 19:58:09.868621369 +0000 UTC m=+26.682006561" watchObservedRunningTime="2026-04-22 19:58:09.869053273 +0000 UTC m=+26.682438453" Apr 22 19:58:10.806023 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:10.805992 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-v8fph"] Apr 22 19:58:10.806536 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:10.806133 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v8fph" Apr 22 19:58:10.806536 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:10.806250 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v8fph" podUID="d7bee1d4-9229-4b17-8ec5-e19b53d61c9d" Apr 22 19:58:10.809175 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:10.809146 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ncvmn"] Apr 22 19:58:10.809293 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:10.809250 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncvmn" Apr 22 19:58:10.809379 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:10.809350 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ncvmn" podUID="f55cb5a5-4e42-4775-bf9f-5f92344b63ff" Apr 22 19:58:10.809718 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:10.809698 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-cvg2s"] Apr 22 19:58:10.809810 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:10.809797 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cvg2s" Apr 22 19:58:10.809913 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:10.809887 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cvg2s" podUID="cba227e9-dcf5-4cf7-9c4b-83013a0b20fb" Apr 22 19:58:11.848285 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:11.848251 2575 generic.go:358] "Generic (PLEG): container finished" podID="153f74bd-9a2c-4a02-88c9-243b60b35439" containerID="989ae71fa8352a77a458cc2ec945b8cd1abd010f5f1f2fca68769de549fc1b46" exitCode=0 Apr 22 19:58:11.848683 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:11.848347 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cfjtn" event={"ID":"153f74bd-9a2c-4a02-88c9-243b60b35439","Type":"ContainerDied","Data":"989ae71fa8352a77a458cc2ec945b8cd1abd010f5f1f2fca68769de549fc1b46"} Apr 22 19:58:12.726812 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:12.726732 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v8fph" Apr 22 19:58:12.726812 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:12.726753 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncvmn" Apr 22 19:58:12.726963 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:12.726737 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cvg2s" Apr 22 19:58:12.726963 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:12.726844 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-v8fph" podUID="d7bee1d4-9229-4b17-8ec5-e19b53d61c9d" Apr 22 19:58:12.726963 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:12.726892 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cvg2s" podUID="cba227e9-dcf5-4cf7-9c4b-83013a0b20fb" Apr 22 19:58:12.726963 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:12.726954 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ncvmn" podUID="f55cb5a5-4e42-4775-bf9f-5f92344b63ff" Apr 22 19:58:12.852608 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:12.852573 2575 generic.go:358] "Generic (PLEG): container finished" podID="153f74bd-9a2c-4a02-88c9-243b60b35439" containerID="5726eb301fe73e812cb383d0d2a100a1f907d3f50a1fa02591901ed5925079dc" exitCode=0 Apr 22 19:58:12.853011 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:12.852641 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cfjtn" event={"ID":"153f74bd-9a2c-4a02-88c9-243b60b35439","Type":"ContainerDied","Data":"5726eb301fe73e812cb383d0d2a100a1f907d3f50a1fa02591901ed5925079dc"} Apr 22 19:58:14.726825 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:14.726631 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncvmn" Apr 22 19:58:14.727452 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:14.726631 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cvg2s" Apr 22 19:58:14.727452 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:14.726907 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ncvmn" podUID="f55cb5a5-4e42-4775-bf9f-5f92344b63ff" Apr 22 19:58:14.727452 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:14.726631 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v8fph" Apr 22 19:58:14.727452 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:14.726977 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-cvg2s" podUID="cba227e9-dcf5-4cf7-9c4b-83013a0b20fb" Apr 22 19:58:14.727452 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:14.727047 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-v8fph" podUID="d7bee1d4-9229-4b17-8ec5-e19b53d61c9d" Apr 22 19:58:16.478403 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.478373 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-60.ec2.internal" event="NodeReady" Apr 22 19:58:16.478908 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.478553 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 19:58:16.514564 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.514530 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-64b9f68d58-mbcdm"] Apr 22 19:58:16.548018 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.547176 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-f7qdl"] Apr 22 19:58:16.548018 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.547376 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:58:16.551753 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.551724 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 19:58:16.555328 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.555290 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 19:58:16.555471 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.555447 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-wxhns\"" Apr 22 19:58:16.555592 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.555469 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 19:58:16.560760 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.560742 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 19:58:16.563343 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.563308 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dqgqv"] Apr 22 19:58:16.563453 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.563430 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-f7qdl" Apr 22 19:58:16.567128 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.567108 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 22 19:58:16.567238 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.567126 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-hhjcv\"" Apr 22 19:58:16.568944 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.568924 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 22 19:58:16.587187 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.587161 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-b8884"] Apr 22 19:58:16.587380 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.587358 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dqgqv" Apr 22 19:58:16.590869 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.590843 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 22 19:58:16.591265 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.591101 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-vz7kz\"" Apr 22 19:58:16.591265 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.591101 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:58:16.606708 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.606676 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vrf5f"] Apr 22 19:58:16.606814 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.606729 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-b8884" Apr 22 19:58:16.618365 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.618333 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 19:58:16.618669 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.618620 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 22 19:58:16.618669 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.618643 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 19:58:16.620024 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.620001 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-57qct\"" Apr 22 19:58:16.621099 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.621074 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 22 19:58:16.621238 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.621209 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-vfhht"] Apr 22 19:58:16.621459 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.621442 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vrf5f" Apr 22 19:58:16.626012 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.625992 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 22 19:58:16.626493 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.626407 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:58:16.626592 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.626570 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 22 19:58:16.626692 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.626675 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-fkjtp\"" Apr 22 19:58:16.637152 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.637126 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-f7qdl"] Apr 22 19:58:16.637277 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.637162 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zxkxh"] Apr 22 19:58:16.637348 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.637298 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-vfhht" Apr 22 19:58:16.646889 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.646862 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 22 19:58:16.647026 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.646944 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 22 19:58:16.647076 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.647017 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-vj7ts\"" Apr 22 19:58:16.647199 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.647176 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:58:16.647299 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.647185 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 22 19:58:16.652469 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.652448 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-46lb7"] Apr 22 19:58:16.652614 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.652599 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zxkxh" Apr 22 19:58:16.652684 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.652668 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 22 19:58:16.657193 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.657175 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 22 19:58:16.657367 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.657186 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:58:16.657463 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.657382 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 22 19:58:16.657463 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.657388 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 22 19:58:16.657463 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.657350 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-2qqkz\"" Apr 22 19:58:16.670994 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.670974 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rs9hj"] Apr 22 19:58:16.671141 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.671126 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-46lb7" Apr 22 19:58:16.678404 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.678001 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 22 19:58:16.678404 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.678252 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 19:58:16.678774 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.678757 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 22 19:58:16.678874 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.678786 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 19:58:16.678874 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.678786 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-8l9nh\"" Apr 22 19:58:16.680911 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.680891 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5ff55498bb-hwmkq"] Apr 22 19:58:16.681053 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.681037 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rs9hj" Apr 22 19:58:16.689625 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.689428 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 22 19:58:16.689625 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.689593 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 22 19:58:16.689792 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.689696 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 22 19:58:16.690086 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.690067 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:58:16.690333 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.690299 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-krjmb\"" Apr 22 19:58:16.691375 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.691353 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-4wqkr"] Apr 22 19:58:16.692568 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.692548 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 22 19:58:16.694591 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.694567 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ec41f5a3-a7a6-4607-b14e-49402afefbe2-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-f7qdl\" (UID: \"ec41f5a3-a7a6-4607-b14e-49402afefbe2\") " 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-f7qdl" Apr 22 19:58:16.694674 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.694614 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c1f10812-b9aa-426e-ac37-99d23b87a05a-registry-certificates\") pod \"image-registry-64b9f68d58-mbcdm\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:58:16.694674 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.694643 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c1f10812-b9aa-426e-ac37-99d23b87a05a-trusted-ca\") pod \"image-registry-64b9f68d58-mbcdm\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:58:16.694775 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.694687 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh7fj\" (UniqueName: \"kubernetes.io/projected/d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4-kube-api-access-dh7fj\") pod \"cluster-samples-operator-6dc5bdb6b4-vrf5f\" (UID: \"d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vrf5f" Apr 22 19:58:16.694775 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.694715 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ab3165ee-b810-41ae-b9dc-8e3198db6bc1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-b8884\" (UID: \"ab3165ee-b810-41ae-b9dc-8e3198db6bc1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-b8884" Apr 22 19:58:16.694864 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.694830 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-vrf5f\" (UID: \"d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vrf5f" Apr 22 19:58:16.694864 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.694854 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ab3165ee-b810-41ae-b9dc-8e3198db6bc1-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-b8884\" (UID: \"ab3165ee-b810-41ae-b9dc-8e3198db6bc1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-b8884" Apr 22 19:58:16.694946 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.694899 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd8ns\" (UniqueName: \"kubernetes.io/projected/ab3165ee-b810-41ae-b9dc-8e3198db6bc1-kube-api-access-fd8ns\") pod \"cluster-monitoring-operator-75587bd455-b8884\" (UID: \"ab3165ee-b810-41ae-b9dc-8e3198db6bc1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-b8884" Apr 22 19:58:16.694946 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.694923 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ec41f5a3-a7a6-4607-b14e-49402afefbe2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-f7qdl\" (UID: \"ec41f5a3-a7a6-4607-b14e-49402afefbe2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-f7qdl" Apr 22 19:58:16.694946 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.694939 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-bound-sa-token\") pod \"image-registry-64b9f68d58-mbcdm\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:58:16.695069 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.694957 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gg2l\" (UniqueName: \"kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-kube-api-access-4gg2l\") pod \"image-registry-64b9f68d58-mbcdm\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:58:16.695069 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.694982 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgwfc\" (UniqueName: \"kubernetes.io/projected/ccc07e82-60c3-4ce8-9255-15e80fff83d9-kube-api-access-bgwfc\") pod \"volume-data-source-validator-7c6cbb6c87-dqgqv\" (UID: \"ccc07e82-60c3-4ce8-9255-15e80fff83d9\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dqgqv" Apr 22 19:58:16.695069 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.694999 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c1f10812-b9aa-426e-ac37-99d23b87a05a-image-registry-private-configuration\") pod \"image-registry-64b9f68d58-mbcdm\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:58:16.695069 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.695014 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-registry-tls\") pod \"image-registry-64b9f68d58-mbcdm\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:58:16.695069 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.695029 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c1f10812-b9aa-426e-ac37-99d23b87a05a-installation-pull-secrets\") pod \"image-registry-64b9f68d58-mbcdm\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:58:16.695276 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.695079 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c1f10812-b9aa-426e-ac37-99d23b87a05a-ca-trust-extracted\") pod \"image-registry-64b9f68d58-mbcdm\" (UID: 
\"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:58:16.703145 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.703081 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86b985fbd5-gx4rj"] Apr 22 19:58:16.703264 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.703149 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5ff55498bb-hwmkq" Apr 22 19:58:16.703264 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.703191 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4wqkr" Apr 22 19:58:16.706237 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.706216 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 22 19:58:16.706385 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.706261 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 22 19:58:16.706790 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.706771 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-2zz2n\"" Apr 22 19:58:16.706890 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.706775 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 22 19:58:16.707851 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.707833 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-7t6fr\"" Apr 22 19:58:16.707942 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.707854 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 22 19:58:16.707942 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.707897 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 22 19:58:16.707942 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.707903 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 19:58:16.708031 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.707854 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 22 19:58:16.708138 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.708123 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 19:58:16.710754 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.710730 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-bb6558494-8xmvt"] Apr 22 19:58:16.710857 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.710844 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86b985fbd5-gx4rj" Apr 22 19:58:16.714603 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.714576 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 19:58:16.714684 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.714618 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-64b9f68d58-mbcdm"] Apr 22 19:58:16.714684 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.714645 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2"] Apr 22 19:58:16.714795 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.714770 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bb6558494-8xmvt" Apr 22 19:58:16.714869 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.714848 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 22 19:58:16.714946 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.714873 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 19:58:16.715017 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.714578 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 19:58:16.715075 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.715031 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-88rrq\"" Apr 22 19:58:16.721178 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.718691 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 22 19:58:16.722118 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.722093 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-ggr2l"] Apr 22 19:58:16.722289 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.722270 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2" Apr 22 19:58:16.724693 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.724646 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 22 19:58:16.724693 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.724661 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 22 19:58:16.725492 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.725472 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 22 19:58:16.726869 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.726807 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 22 19:58:16.728015 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.727998 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-b8884"] Apr 22 19:58:16.728105 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.728037 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-vfhht"] Apr 22 19:58:16.728105 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.728049 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-46lb7"] Apr 22 19:58:16.728202 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.728126 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zxkxh"] Apr 22 19:58:16.728202 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.728134 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncvmn" Apr 22 19:58:16.728202 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.728150 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rs9hj"] Apr 22 19:58:16.728202 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.728173 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dqgqv"] Apr 22 19:58:16.728202 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.728186 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-cvg2s" Apr 22 19:58:16.728202 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.728188 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2"] Apr 22 19:58:16.728528 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.728221 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-bb6558494-8xmvt"] Apr 22 19:58:16.728528 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.728237 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-4wqkr"] Apr 22 19:58:16.728528 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.728244 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v8fph" Apr 22 19:58:16.728528 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.728249 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vrf5f"] Apr 22 19:58:16.728528 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.728281 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ggr2l"] Apr 22 19:58:16.728528 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.728290 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5ff55498bb-hwmkq"] Apr 22 19:58:16.728528 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.728302 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86b985fbd5-gx4rj"] Apr 22 19:58:16.728528 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.728361 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-fpzrf"] Apr 22 19:58:16.728528 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.728456 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ggr2l" Apr 22 19:58:16.730704 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.730685 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-2xv8f\"" Apr 22 19:58:16.730704 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.730701 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 19:58:16.731064 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.731047 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gl2tf\"" Apr 22 19:58:16.731179 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.731149 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-pxnlc\"" Apr 22 19:58:16.731531 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.731513 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 19:58:16.731647 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.731628 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 19:58:16.731872 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.731829 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 19:58:16.731955 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.731944 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 19:58:16.732195 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.732176 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fpzrf"] Apr 22 19:58:16.732306 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.732283 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-fpzrf" Apr 22 19:58:16.734261 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.734244 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 19:58:16.734349 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.734295 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 19:58:16.734834 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.734815 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wrr4x\"" Apr 22 19:58:16.795499 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.795476 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fd8ns\" (UniqueName: \"kubernetes.io/projected/ab3165ee-b810-41ae-b9dc-8e3198db6bc1-kube-api-access-fd8ns\") pod \"cluster-monitoring-operator-75587bd455-b8884\" (UID: \"ab3165ee-b810-41ae-b9dc-8e3198db6bc1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-b8884" Apr 22 19:58:16.795610 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.795521 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3484f3e2-bbf8-464e-a5f4-8d66ff2230f3-config\") pod \"service-ca-operator-d6fc45fc5-rs9hj\" (UID: \"3484f3e2-bbf8-464e-a5f4-8d66ff2230f3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rs9hj" Apr 22 19:58:16.795672 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.795648 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dh7fj\" (UniqueName: \"kubernetes.io/projected/d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4-kube-api-access-dh7fj\") pod \"cluster-samples-operator-6dc5bdb6b4-vrf5f\" (UID: \"d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vrf5f" Apr 22 19:58:16.795725 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.795676 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ab3165ee-b810-41ae-b9dc-8e3198db6bc1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-b8884\" (UID: \"ab3165ee-b810-41ae-b9dc-8e3198db6bc1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-b8884" Apr 22 19:58:16.795725 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.795711 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-stats-auth\") pod \"router-default-5ff55498bb-hwmkq\" (UID: \"9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9\") " pod="openshift-ingress/router-default-5ff55498bb-hwmkq" Apr 22 19:58:16.795827 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.795753 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c1f10812-b9aa-426e-ac37-99d23b87a05a-trusted-ca\") pod \"image-registry-64b9f68d58-mbcdm\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:58:16.795827 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.795780 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-h5bfm\" (UniqueName: \"kubernetes.io/projected/240239de-ebef-48d8-bfd9-2171161b364f-kube-api-access-h5bfm\") pod \"network-check-source-8894fc9bd-4wqkr\" (UID: \"240239de-ebef-48d8-bfd9-2171161b364f\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4wqkr" Apr 22 19:58:16.795827 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.795807 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh28f\" (UniqueName: \"kubernetes.io/projected/245ccfea-9935-4d23-820f-9267b29a9257-kube-api-access-wh28f\") pod \"klusterlet-addon-workmgr-bb6558494-8xmvt\" (UID: \"245ccfea-9935-4d23-820f-9267b29a9257\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bb6558494-8xmvt" Apr 22 19:58:16.795827 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:16.795819 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:58:16.796003 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.795833 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8tvw\" (UniqueName: \"kubernetes.io/projected/5c826ea9-4c12-4695-924d-37f9fda8d480-kube-api-access-q8tvw\") pod \"managed-serviceaccount-addon-agent-86b985fbd5-gx4rj\" (UID: \"5c826ea9-4c12-4695-924d-37f9fda8d480\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86b985fbd5-gx4rj" Apr 22 19:58:16.796003 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.795874 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-vrf5f\" (UID: \"d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vrf5f" Apr 22 19:58:16.796003 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:16.795885 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab3165ee-b810-41ae-b9dc-8e3198db6bc1-cluster-monitoring-operator-tls podName:ab3165ee-b810-41ae-b9dc-8e3198db6bc1 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:17.295867063 +0000 UTC m=+34.109252225 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ab3165ee-b810-41ae-b9dc-8e3198db6bc1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-b8884" (UID: "ab3165ee-b810-41ae-b9dc-8e3198db6bc1") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:58:16.796003 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.795947 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzf9t\" (UniqueName: \"kubernetes.io/projected/7fa0c2d8-cebb-4563-88fb-c974a238cc8d-kube-api-access-dzf9t\") pod \"insights-operator-585dfdc468-46lb7\" (UID: \"7fa0c2d8-cebb-4563-88fb-c974a238cc8d\") " pod="openshift-insights/insights-operator-585dfdc468-46lb7" Apr 22 19:58:16.796003 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:16.795959 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:58:16.796003 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.795982 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-default-certificate\") pod \"router-default-5ff55498bb-hwmkq\" (UID: \"9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9\") " pod="openshift-ingress/router-default-5ff55498bb-hwmkq" Apr 22 19:58:16.796003 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:16.796005 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4-samples-operator-tls podName:d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:17.295993749 +0000 UTC m=+34.109378908 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-vrf5f" (UID: "d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4") : secret "samples-operator-tls" not found Apr 22 19:58:16.796374 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.796044 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/338d333b-8626-46b3-b450-fbdd521183d8-serving-cert\") pod \"console-operator-9d4b6777b-vfhht\" (UID: \"338d333b-8626-46b3-b450-fbdd521183d8\") " pod="openshift-console-operator/console-operator-9d4b6777b-vfhht" Apr 22 19:58:16.796374 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.796075 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ec41f5a3-a7a6-4607-b14e-49402afefbe2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-f7qdl\" (UID: \"ec41f5a3-a7a6-4607-b14e-49402afefbe2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-f7qdl" Apr 22 19:58:16.796374 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.796103 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gg2l\" (UniqueName: \"kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-kube-api-access-4gg2l\") pod \"image-registry-64b9f68d58-mbcdm\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:58:16.796374 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.796127 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41ae5fd4-d9cf-4314-b727-acf1d473957e-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-zxkxh\" (UID: \"41ae5fd4-d9cf-4314-b727-acf1d473957e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zxkxh" Apr 22 19:58:16.796374 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.796143 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6c75cc995-6jqp2\" (UID: \"5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2" Apr 22 19:58:16.796374 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.796167 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fa0c2d8-cebb-4563-88fb-c974a238cc8d-service-ca-bundle\") pod \"insights-operator-585dfdc468-46lb7\" (UID: \"7fa0c2d8-cebb-4563-88fb-c974a238cc8d\") " pod="openshift-insights/insights-operator-585dfdc468-46lb7" Apr 22 19:58:16.796374 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:16.796172 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 19:58:16.796374 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.796224 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bgwfc\" (UniqueName: 
\"kubernetes.io/projected/ccc07e82-60c3-4ce8-9255-15e80fff83d9-kube-api-access-bgwfc\") pod \"volume-data-source-validator-7c6cbb6c87-dqgqv\" (UID: \"ccc07e82-60c3-4ce8-9255-15e80fff83d9\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dqgqv" Apr 22 19:58:16.796374 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:16.796241 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec41f5a3-a7a6-4607-b14e-49402afefbe2-networking-console-plugin-cert podName:ec41f5a3-a7a6-4607-b14e-49402afefbe2 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:17.296224459 +0000 UTC m=+34.109609619 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ec41f5a3-a7a6-4607-b14e-49402afefbe2-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-f7qdl" (UID: "ec41f5a3-a7a6-4607-b14e-49402afefbe2") : secret "networking-console-plugin-cert" not found Apr 22 19:58:16.796374 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.796355 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c1f10812-b9aa-426e-ac37-99d23b87a05a-image-registry-private-configuration\") pod \"image-registry-64b9f68d58-mbcdm\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:58:16.796755 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.796395 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk9ll\" (UniqueName: \"kubernetes.io/projected/41ae5fd4-d9cf-4314-b727-acf1d473957e-kube-api-access-vk9ll\") pod \"kube-storage-version-migrator-operator-6769c5d45-zxkxh\" (UID: \"41ae5fd4-d9cf-4314-b727-acf1d473957e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zxkxh" Apr 22 19:58:16.796755 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.796423 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7jvg\" (UniqueName: \"kubernetes.io/projected/5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32-kube-api-access-g7jvg\") pod \"cluster-proxy-proxy-agent-6c75cc995-6jqp2\" (UID: \"5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2" Apr 22 19:58:16.796755 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.796462 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c1f10812-b9aa-426e-ac37-99d23b87a05a-ca-trust-extracted\") pod \"image-registry-64b9f68d58-mbcdm\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:58:16.796755 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.796489 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/7fa0c2d8-cebb-4563-88fb-c974a238cc8d-snapshots\") pod \"insights-operator-585dfdc468-46lb7\" (UID: \"7fa0c2d8-cebb-4563-88fb-c974a238cc8d\") " pod="openshift-insights/insights-operator-585dfdc468-46lb7" Apr 22 19:58:16.796755 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.796514 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6c75cc995-6jqp2\" (UID: \"5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2" Apr 22 19:58:16.796755 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.796540 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7fa0c2d8-cebb-4563-88fb-c974a238cc8d-tmp\") pod \"insights-operator-585dfdc468-46lb7\" (UID: \"7fa0c2d8-cebb-4563-88fb-c974a238cc8d\") " pod="openshift-insights/insights-operator-585dfdc468-46lb7" Apr 22 19:58:16.796755 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.796564 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3484f3e2-bbf8-464e-a5f4-8d66ff2230f3-serving-cert\") pod \"service-ca-operator-d6fc45fc5-rs9hj\" (UID: \"3484f3e2-bbf8-464e-a5f4-8d66ff2230f3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rs9hj" Apr 22 19:58:16.796755 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.796582 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ab3165ee-b810-41ae-b9dc-8e3198db6bc1-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-b8884\" (UID: \"ab3165ee-b810-41ae-b9dc-8e3198db6bc1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-b8884" Apr 22 19:58:16.796755 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.796601 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ec41f5a3-a7a6-4607-b14e-49402afefbe2-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-f7qdl\" (UID: \"ec41f5a3-a7a6-4607-b14e-49402afefbe2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-f7qdl" Apr 22 19:58:16.796755 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.796620 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c1f10812-b9aa-426e-ac37-99d23b87a05a-registry-certificates\") pod \"image-registry-64b9f68d58-mbcdm\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:58:16.796755 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.796646 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/338d333b-8626-46b3-b450-fbdd521183d8-trusted-ca\") pod \"console-operator-9d4b6777b-vfhht\" (UID: \"338d333b-8626-46b3-b450-fbdd521183d8\") " pod="openshift-console-operator/console-operator-9d4b6777b-vfhht" Apr 22 19:58:16.796755 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.796675 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/245ccfea-9935-4d23-820f-9267b29a9257-klusterlet-config\") pod \"klusterlet-addon-workmgr-bb6558494-8xmvt\" (UID: \"245ccfea-9935-4d23-820f-9267b29a9257\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bb6558494-8xmvt" Apr 22 19:58:16.796755 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.796702 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j6c9\" (UniqueName: \"kubernetes.io/projected/338d333b-8626-46b3-b450-fbdd521183d8-kube-api-access-9j6c9\") pod \"console-operator-9d4b6777b-vfhht\" (UID: \"338d333b-8626-46b3-b450-fbdd521183d8\") " pod="openshift-console-operator/console-operator-9d4b6777b-vfhht" Apr 22 19:58:16.796755 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.796749 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6c75cc995-6jqp2\" (UID: \"5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2" Apr 22 19:58:16.797262 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.796780 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32-ca\") pod \"cluster-proxy-proxy-agent-6c75cc995-6jqp2\" (UID: \"5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2" Apr 22 19:58:16.797262 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.796812 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c1f10812-b9aa-426e-ac37-99d23b87a05a-ca-trust-extracted\") pod \"image-registry-64b9f68d58-mbcdm\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:58:16.797262 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.796820 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/338d333b-8626-46b3-b450-fbdd521183d8-config\") pod \"console-operator-9d4b6777b-vfhht\" (UID: \"338d333b-8626-46b3-b450-fbdd521183d8\") " pod="openshift-console-operator/console-operator-9d4b6777b-vfhht" Apr 22 19:58:16.797262 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.796859 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fa0c2d8-cebb-4563-88fb-c974a238cc8d-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-46lb7\" (UID: \"7fa0c2d8-cebb-4563-88fb-c974a238cc8d\") " pod="openshift-insights/insights-operator-585dfdc468-46lb7" Apr 22 19:58:16.797262 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.796885 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dwbp\" (UniqueName: \"kubernetes.io/projected/3484f3e2-bbf8-464e-a5f4-8d66ff2230f3-kube-api-access-8dwbp\") pod \"service-ca-operator-d6fc45fc5-rs9hj\" (UID: \"3484f3e2-bbf8-464e-a5f4-8d66ff2230f3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rs9hj" Apr 22 19:58:16.797262 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.796932 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41ae5fd4-d9cf-4314-b727-acf1d473957e-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-zxkxh\" (UID: \"41ae5fd4-d9cf-4314-b727-acf1d473957e\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zxkxh" Apr 22 19:58:16.797262 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.796949 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c1f10812-b9aa-426e-ac37-99d23b87a05a-trusted-ca\") pod \"image-registry-64b9f68d58-mbcdm\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:58:16.797262 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.796963 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwntb\" (UniqueName: \"kubernetes.io/projected/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-kube-api-access-dwntb\") pod \"router-default-5ff55498bb-hwmkq\" (UID: \"9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9\") " pod="openshift-ingress/router-default-5ff55498bb-hwmkq" Apr 22 19:58:16.797262 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.797006 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-bound-sa-token\") pod \"image-registry-64b9f68d58-mbcdm\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:58:16.797262 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.797032 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5c826ea9-4c12-4695-924d-37f9fda8d480-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-86b985fbd5-gx4rj\" (UID: \"5c826ea9-4c12-4695-924d-37f9fda8d480\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86b985fbd5-gx4rj" Apr 22 19:58:16.797262 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.797072 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-metrics-certs\") pod \"router-default-5ff55498bb-hwmkq\" (UID: \"9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9\") " pod="openshift-ingress/router-default-5ff55498bb-hwmkq" Apr 22 19:58:16.797262 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.797107 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32-hub\") pod \"cluster-proxy-proxy-agent-6c75cc995-6jqp2\" (UID: \"5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2" Apr 22 19:58:16.797262 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.797157 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-registry-tls\") pod \"image-registry-64b9f68d58-mbcdm\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:58:16.797262 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.797195 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c1f10812-b9aa-426e-ac37-99d23b87a05a-installation-pull-secrets\") pod \"image-registry-64b9f68d58-mbcdm\" (UID: 
\"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:58:16.797262 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.797227 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-service-ca-bundle\") pod \"router-default-5ff55498bb-hwmkq\" (UID: \"9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9\") " pod="openshift-ingress/router-default-5ff55498bb-hwmkq" Apr 22 19:58:16.797951 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.797272 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/245ccfea-9935-4d23-820f-9267b29a9257-tmp\") pod \"klusterlet-addon-workmgr-bb6558494-8xmvt\" (UID: \"245ccfea-9935-4d23-820f-9267b29a9257\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bb6558494-8xmvt" Apr 22 19:58:16.797951 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.797327 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c1f10812-b9aa-426e-ac37-99d23b87a05a-registry-certificates\") pod \"image-registry-64b9f68d58-mbcdm\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:58:16.797951 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:16.797372 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:58:16.797951 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:16.797386 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64b9f68d58-mbcdm: secret "image-registry-tls" not found Apr 22 19:58:16.797951 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.797427 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fa0c2d8-cebb-4563-88fb-c974a238cc8d-serving-cert\") pod \"insights-operator-585dfdc468-46lb7\" (UID: \"7fa0c2d8-cebb-4563-88fb-c974a238cc8d\") " pod="openshift-insights/insights-operator-585dfdc468-46lb7" Apr 22 19:58:16.797951 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.797454 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ec41f5a3-a7a6-4607-b14e-49402afefbe2-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-f7qdl\" (UID: \"ec41f5a3-a7a6-4607-b14e-49402afefbe2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-f7qdl" Apr 22 19:58:16.797951 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.797494 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ab3165ee-b810-41ae-b9dc-8e3198db6bc1-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-b8884\" (UID: \"ab3165ee-b810-41ae-b9dc-8e3198db6bc1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-b8884" Apr 22 19:58:16.797951 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:16.797527 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-registry-tls podName:c1f10812-b9aa-426e-ac37-99d23b87a05a nodeName:}" failed. 
No retries permitted until 2026-04-22 19:58:17.297511477 +0000 UTC m=+34.110896647 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-registry-tls") pod "image-registry-64b9f68d58-mbcdm" (UID: "c1f10812-b9aa-426e-ac37-99d23b87a05a") : secret "image-registry-tls" not found Apr 22 19:58:16.801178 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.801155 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c1f10812-b9aa-426e-ac37-99d23b87a05a-installation-pull-secrets\") pod \"image-registry-64b9f68d58-mbcdm\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:58:16.801178 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.801170 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c1f10812-b9aa-426e-ac37-99d23b87a05a-image-registry-private-configuration\") pod \"image-registry-64b9f68d58-mbcdm\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:58:16.804652 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.804595 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd8ns\" (UniqueName: \"kubernetes.io/projected/ab3165ee-b810-41ae-b9dc-8e3198db6bc1-kube-api-access-fd8ns\") pod \"cluster-monitoring-operator-75587bd455-b8884\" (UID: \"ab3165ee-b810-41ae-b9dc-8e3198db6bc1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-b8884" Apr 22 19:58:16.805400 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.805294 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgwfc\" (UniqueName: \"kubernetes.io/projected/ccc07e82-60c3-4ce8-9255-15e80fff83d9-kube-api-access-bgwfc\") pod \"volume-data-source-validator-7c6cbb6c87-dqgqv\" (UID: \"ccc07e82-60c3-4ce8-9255-15e80fff83d9\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dqgqv" Apr 22 19:58:16.806042 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.806014 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh7fj\" (UniqueName: \"kubernetes.io/projected/d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4-kube-api-access-dh7fj\") pod \"cluster-samples-operator-6dc5bdb6b4-vrf5f\" (UID: \"d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vrf5f" Apr 22 19:58:16.807249 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.807189 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gg2l\" (UniqueName: \"kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-kube-api-access-4gg2l\") pod \"image-registry-64b9f68d58-mbcdm\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:58:16.811691 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.811671 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-bound-sa-token\") pod \"image-registry-64b9f68d58-mbcdm\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:58:16.897745 
ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.897715 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8tvw\" (UniqueName: \"kubernetes.io/projected/5c826ea9-4c12-4695-924d-37f9fda8d480-kube-api-access-q8tvw\") pod \"managed-serviceaccount-addon-agent-86b985fbd5-gx4rj\" (UID: \"5c826ea9-4c12-4695-924d-37f9fda8d480\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86b985fbd5-gx4rj" Apr 22 19:58:16.897954 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.897778 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dzf9t\" (UniqueName: \"kubernetes.io/projected/7fa0c2d8-cebb-4563-88fb-c974a238cc8d-kube-api-access-dzf9t\") pod \"insights-operator-585dfdc468-46lb7\" (UID: \"7fa0c2d8-cebb-4563-88fb-c974a238cc8d\") " pod="openshift-insights/insights-operator-585dfdc468-46lb7" Apr 22 19:58:16.897954 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.897803 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-default-certificate\") pod \"router-default-5ff55498bb-hwmkq\" (UID: \"9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9\") " pod="openshift-ingress/router-default-5ff55498bb-hwmkq" Apr 22 19:58:16.897954 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.897827 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/338d333b-8626-46b3-b450-fbdd521183d8-serving-cert\") pod \"console-operator-9d4b6777b-vfhht\" (UID: \"338d333b-8626-46b3-b450-fbdd521183d8\") " pod="openshift-console-operator/console-operator-9d4b6777b-vfhht" Apr 22 19:58:16.897954 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.897842 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dqgqv" Apr 22 19:58:16.897954 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.897876 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41ae5fd4-d9cf-4314-b727-acf1d473957e-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-zxkxh\" (UID: \"41ae5fd4-d9cf-4314-b727-acf1d473957e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zxkxh" Apr 22 19:58:16.897954 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.897901 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6c75cc995-6jqp2\" (UID: \"5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2" Apr 22 19:58:16.897954 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.897930 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fa0c2d8-cebb-4563-88fb-c974a238cc8d-service-ca-bundle\") pod \"insights-operator-585dfdc468-46lb7\" (UID: \"7fa0c2d8-cebb-4563-88fb-c974a238cc8d\") " pod="openshift-insights/insights-operator-585dfdc468-46lb7" Apr 22 19:58:16.897954 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.897955 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vk9ll\" (UniqueName: \"kubernetes.io/projected/41ae5fd4-d9cf-4314-b727-acf1d473957e-kube-api-access-vk9ll\") pod \"kube-storage-version-migrator-operator-6769c5d45-zxkxh\" (UID: \"41ae5fd4-d9cf-4314-b727-acf1d473957e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zxkxh" Apr 22 19:58:16.898448 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.897981 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g7jvg\" (UniqueName: \"kubernetes.io/projected/5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32-kube-api-access-g7jvg\") pod \"cluster-proxy-proxy-agent-6c75cc995-6jqp2\" (UID: \"5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2" Apr 22 19:58:16.898448 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.898029 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/7fa0c2d8-cebb-4563-88fb-c974a238cc8d-snapshots\") pod \"insights-operator-585dfdc468-46lb7\" (UID: \"7fa0c2d8-cebb-4563-88fb-c974a238cc8d\") " pod="openshift-insights/insights-operator-585dfdc468-46lb7" Apr 22 19:58:16.898448 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.898056 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6c75cc995-6jqp2\" (UID: \"5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2" Apr 22 19:58:16.898448 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.898082 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/7fa0c2d8-cebb-4563-88fb-c974a238cc8d-tmp\") pod \"insights-operator-585dfdc468-46lb7\" (UID: \"7fa0c2d8-cebb-4563-88fb-c974a238cc8d\") " pod="openshift-insights/insights-operator-585dfdc468-46lb7" Apr 22 19:58:16.898448 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.898106 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3484f3e2-bbf8-464e-a5f4-8d66ff2230f3-serving-cert\") pod \"service-ca-operator-d6fc45fc5-rs9hj\" (UID: \"3484f3e2-bbf8-464e-a5f4-8d66ff2230f3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rs9hj" Apr 22 19:58:16.898448 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.898140 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/338d333b-8626-46b3-b450-fbdd521183d8-trusted-ca\") pod \"console-operator-9d4b6777b-vfhht\" (UID: \"338d333b-8626-46b3-b450-fbdd521183d8\") " pod="openshift-console-operator/console-operator-9d4b6777b-vfhht" Apr 22 19:58:16.898448 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.898172 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/245ccfea-9935-4d23-820f-9267b29a9257-klusterlet-config\") pod \"klusterlet-addon-workmgr-bb6558494-8xmvt\" (UID: \"245ccfea-9935-4d23-820f-9267b29a9257\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bb6558494-8xmvt" Apr 22 19:58:16.898448 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.898197 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9j6c9\" (UniqueName: \"kubernetes.io/projected/338d333b-8626-46b3-b450-fbdd521183d8-kube-api-access-9j6c9\") pod \"console-operator-9d4b6777b-vfhht\" (UID: \"338d333b-8626-46b3-b450-fbdd521183d8\") " pod="openshift-console-operator/console-operator-9d4b6777b-vfhht" Apr 22 19:58:16.898448 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.898254 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7h2n\" (UniqueName: \"kubernetes.io/projected/efcdc84a-302d-4ab3-b70a-269b931a9634-kube-api-access-x7h2n\") pod \"ingress-canary-ggr2l\" (UID: \"efcdc84a-302d-4ab3-b70a-269b931a9634\") " pod="openshift-ingress-canary/ingress-canary-ggr2l" Apr 22 19:58:16.898448 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.898298 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6c75cc995-6jqp2\" (UID: \"5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2" Apr 22 19:58:16.898448 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.898344 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32-ca\") pod \"cluster-proxy-proxy-agent-6c75cc995-6jqp2\" (UID: \"5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2" Apr 22 19:58:16.898448 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.898381 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/338d333b-8626-46b3-b450-fbdd521183d8-config\") pod \"console-operator-9d4b6777b-vfhht\" (UID: \"338d333b-8626-46b3-b450-fbdd521183d8\") " pod="openshift-console-operator/console-operator-9d4b6777b-vfhht" Apr 22 19:58:16.898448 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.898411 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fa0c2d8-cebb-4563-88fb-c974a238cc8d-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-46lb7\" (UID: \"7fa0c2d8-cebb-4563-88fb-c974a238cc8d\") " pod="openshift-insights/insights-operator-585dfdc468-46lb7" Apr 22 19:58:16.898448 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.898439 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8dwbp\" (UniqueName: \"kubernetes.io/projected/3484f3e2-bbf8-464e-a5f4-8d66ff2230f3-kube-api-access-8dwbp\") pod \"service-ca-operator-d6fc45fc5-rs9hj\" (UID: \"3484f3e2-bbf8-464e-a5f4-8d66ff2230f3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rs9hj" Apr 22 19:58:16.899111 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.898463 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7561fdeb-e97b-4652-ba5c-1e555b68f4aa-tmp-dir\") pod \"dns-default-fpzrf\" (UID: \"7561fdeb-e97b-4652-ba5c-1e555b68f4aa\") " pod="openshift-dns/dns-default-fpzrf" Apr 22 19:58:16.899111 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.898498 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41ae5fd4-d9cf-4314-b727-acf1d473957e-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-zxkxh\" (UID: \"41ae5fd4-d9cf-4314-b727-acf1d473957e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zxkxh" Apr 22 19:58:16.899111 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.898530 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfvtf\" (UniqueName: \"kubernetes.io/projected/7561fdeb-e97b-4652-ba5c-1e555b68f4aa-kube-api-access-gfvtf\") pod \"dns-default-fpzrf\" (UID: \"7561fdeb-e97b-4652-ba5c-1e555b68f4aa\") " pod="openshift-dns/dns-default-fpzrf" Apr 22 19:58:16.899111 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.898560 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dwntb\" (UniqueName: \"kubernetes.io/projected/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-kube-api-access-dwntb\") pod \"router-default-5ff55498bb-hwmkq\" (UID: \"9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9\") " pod="openshift-ingress/router-default-5ff55498bb-hwmkq" Apr 22 19:58:16.899111 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.898592 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5c826ea9-4c12-4695-924d-37f9fda8d480-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-86b985fbd5-gx4rj\" (UID: \"5c826ea9-4c12-4695-924d-37f9fda8d480\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86b985fbd5-gx4rj" Apr 22 19:58:16.899111 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.898620 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-metrics-certs\") pod \"router-default-5ff55498bb-hwmkq\" (UID: \"9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9\") " pod="openshift-ingress/router-default-5ff55498bb-hwmkq" Apr 22 19:58:16.899111 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.898644 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32-hub\") pod \"cluster-proxy-proxy-agent-6c75cc995-6jqp2\" (UID: \"5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2" Apr 22 19:58:16.899111 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.898684 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efcdc84a-302d-4ab3-b70a-269b931a9634-cert\") pod \"ingress-canary-ggr2l\" (UID: \"efcdc84a-302d-4ab3-b70a-269b931a9634\") " pod="openshift-ingress-canary/ingress-canary-ggr2l" Apr 22 19:58:16.899111 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.898728 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-service-ca-bundle\") pod \"router-default-5ff55498bb-hwmkq\" (UID: \"9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9\") " pod="openshift-ingress/router-default-5ff55498bb-hwmkq" Apr 22 19:58:16.899111 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.898753 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/245ccfea-9935-4d23-820f-9267b29a9257-tmp\") pod \"klusterlet-addon-workmgr-bb6558494-8xmvt\" (UID: \"245ccfea-9935-4d23-820f-9267b29a9257\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bb6558494-8xmvt" Apr 22 19:58:16.899111 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.898780 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7561fdeb-e97b-4652-ba5c-1e555b68f4aa-config-volume\") pod \"dns-default-fpzrf\" (UID: \"7561fdeb-e97b-4652-ba5c-1e555b68f4aa\") " pod="openshift-dns/dns-default-fpzrf" Apr 22 19:58:16.899111 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.898804 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7561fdeb-e97b-4652-ba5c-1e555b68f4aa-metrics-tls\") pod \"dns-default-fpzrf\" (UID: \"7561fdeb-e97b-4652-ba5c-1e555b68f4aa\") " pod="openshift-dns/dns-default-fpzrf" Apr 22 19:58:16.899111 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.898848 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fa0c2d8-cebb-4563-88fb-c974a238cc8d-serving-cert\") pod \"insights-operator-585dfdc468-46lb7\" (UID: \"7fa0c2d8-cebb-4563-88fb-c974a238cc8d\") " pod="openshift-insights/insights-operator-585dfdc468-46lb7" Apr 22 19:58:16.899111 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.898882 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3484f3e2-bbf8-464e-a5f4-8d66ff2230f3-config\") pod \"service-ca-operator-d6fc45fc5-rs9hj\" (UID: \"3484f3e2-bbf8-464e-a5f4-8d66ff2230f3\") " 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rs9hj" Apr 22 19:58:16.899111 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.898936 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-stats-auth\") pod \"router-default-5ff55498bb-hwmkq\" (UID: \"9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9\") " pod="openshift-ingress/router-default-5ff55498bb-hwmkq" Apr 22 19:58:16.899111 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.898972 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5bfm\" (UniqueName: \"kubernetes.io/projected/240239de-ebef-48d8-bfd9-2171161b364f-kube-api-access-h5bfm\") pod \"network-check-source-8894fc9bd-4wqkr\" (UID: \"240239de-ebef-48d8-bfd9-2171161b364f\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4wqkr" Apr 22 19:58:16.900074 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.899005 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wh28f\" (UniqueName: \"kubernetes.io/projected/245ccfea-9935-4d23-820f-9267b29a9257-kube-api-access-wh28f\") pod \"klusterlet-addon-workmgr-bb6558494-8xmvt\" (UID: \"245ccfea-9935-4d23-820f-9267b29a9257\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bb6558494-8xmvt" Apr 22 19:58:16.900074 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.899576 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/7fa0c2d8-cebb-4563-88fb-c974a238cc8d-snapshots\") pod \"insights-operator-585dfdc468-46lb7\" (UID: \"7fa0c2d8-cebb-4563-88fb-c974a238cc8d\") " pod="openshift-insights/insights-operator-585dfdc468-46lb7" Apr 22 19:58:16.901543 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.900427 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6c75cc995-6jqp2\" (UID: \"5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2" Apr 22 19:58:16.901543 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.900502 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fa0c2d8-cebb-4563-88fb-c974a238cc8d-service-ca-bundle\") pod \"insights-operator-585dfdc468-46lb7\" (UID: \"7fa0c2d8-cebb-4563-88fb-c974a238cc8d\") " pod="openshift-insights/insights-operator-585dfdc468-46lb7" Apr 22 19:58:16.901543 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.901109 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-default-certificate\") pod \"router-default-5ff55498bb-hwmkq\" (UID: \"9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9\") " pod="openshift-ingress/router-default-5ff55498bb-hwmkq" Apr 22 19:58:16.901543 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.901142 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/338d333b-8626-46b3-b450-fbdd521183d8-trusted-ca\") pod \"console-operator-9d4b6777b-vfhht\" (UID: \"338d333b-8626-46b3-b450-fbdd521183d8\") " pod="openshift-console-operator/console-operator-9d4b6777b-vfhht" Apr 22 19:58:16.901543 
ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.901398 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7fa0c2d8-cebb-4563-88fb-c974a238cc8d-tmp\") pod \"insights-operator-585dfdc468-46lb7\" (UID: \"7fa0c2d8-cebb-4563-88fb-c974a238cc8d\") " pod="openshift-insights/insights-operator-585dfdc468-46lb7" Apr 22 19:58:16.903832 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:16.901973 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-service-ca-bundle podName:9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:17.40195193 +0000 UTC m=+34.215337111 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-service-ca-bundle") pod "router-default-5ff55498bb-hwmkq" (UID: "9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9") : configmap references non-existent config key: service-ca.crt Apr 22 19:58:16.903832 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.902185 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/338d333b-8626-46b3-b450-fbdd521183d8-serving-cert\") pod \"console-operator-9d4b6777b-vfhht\" (UID: \"338d333b-8626-46b3-b450-fbdd521183d8\") " pod="openshift-console-operator/console-operator-9d4b6777b-vfhht" Apr 22 19:58:16.903832 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.902478 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/245ccfea-9935-4d23-820f-9267b29a9257-tmp\") pod \"klusterlet-addon-workmgr-bb6558494-8xmvt\" (UID: \"245ccfea-9935-4d23-820f-9267b29a9257\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bb6558494-8xmvt" Apr 22 19:58:16.903832 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.902744 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3484f3e2-bbf8-464e-a5f4-8d66ff2230f3-config\") pod \"service-ca-operator-d6fc45fc5-rs9hj\" (UID: \"3484f3e2-bbf8-464e-a5f4-8d66ff2230f3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rs9hj" Apr 22 19:58:16.903832 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.903033 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41ae5fd4-d9cf-4314-b727-acf1d473957e-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-zxkxh\" (UID: \"41ae5fd4-d9cf-4314-b727-acf1d473957e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zxkxh" Apr 22 19:58:16.903832 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.903583 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/338d333b-8626-46b3-b450-fbdd521183d8-config\") pod \"console-operator-9d4b6777b-vfhht\" (UID: \"338d333b-8626-46b3-b450-fbdd521183d8\") " pod="openshift-console-operator/console-operator-9d4b6777b-vfhht" Apr 22 19:58:16.903832 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.903662 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32-service-proxy-server-cert\") pod 
\"cluster-proxy-proxy-agent-6c75cc995-6jqp2\" (UID: \"5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2" Apr 22 19:58:16.904283 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:16.903910 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:58:16.904437 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.904400 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3484f3e2-bbf8-464e-a5f4-8d66ff2230f3-serving-cert\") pod \"service-ca-operator-d6fc45fc5-rs9hj\" (UID: \"3484f3e2-bbf8-464e-a5f4-8d66ff2230f3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rs9hj" Apr 22 19:58:16.904548 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:16.904511 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-metrics-certs podName:9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:17.404490779 +0000 UTC m=+34.217875952 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-metrics-certs") pod "router-default-5ff55498bb-hwmkq" (UID: "9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9") : secret "router-metrics-certs-default" not found Apr 22 19:58:16.904795 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.904775 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fa0c2d8-cebb-4563-88fb-c974a238cc8d-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-46lb7\" (UID: \"7fa0c2d8-cebb-4563-88fb-c974a238cc8d\") " pod="openshift-insights/insights-operator-585dfdc468-46lb7" Apr 22 19:58:16.905276 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.905143 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6c75cc995-6jqp2\" (UID: \"5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2" Apr 22 19:58:16.905902 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.905855 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32-ca\") pod \"cluster-proxy-proxy-agent-6c75cc995-6jqp2\" (UID: \"5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2" Apr 22 19:58:16.906387 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.906349 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32-hub\") pod \"cluster-proxy-proxy-agent-6c75cc995-6jqp2\" (UID: \"5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2" Apr 22 19:58:16.906749 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.906725 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-stats-auth\") pod \"router-default-5ff55498bb-hwmkq\" (UID: 
\"9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9\") " pod="openshift-ingress/router-default-5ff55498bb-hwmkq" Apr 22 19:58:16.906983 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.906946 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5c826ea9-4c12-4695-924d-37f9fda8d480-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-86b985fbd5-gx4rj\" (UID: \"5c826ea9-4c12-4695-924d-37f9fda8d480\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86b985fbd5-gx4rj" Apr 22 19:58:16.907553 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.907533 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7fa0c2d8-cebb-4563-88fb-c974a238cc8d-serving-cert\") pod \"insights-operator-585dfdc468-46lb7\" (UID: \"7fa0c2d8-cebb-4563-88fb-c974a238cc8d\") " pod="openshift-insights/insights-operator-585dfdc468-46lb7" Apr 22 19:58:16.907655 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.907552 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/41ae5fd4-d9cf-4314-b727-acf1d473957e-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-zxkxh\" (UID: \"41ae5fd4-d9cf-4314-b727-acf1d473957e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zxkxh" Apr 22 19:58:16.907655 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.907586 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/245ccfea-9935-4d23-820f-9267b29a9257-klusterlet-config\") pod \"klusterlet-addon-workmgr-bb6558494-8xmvt\" (UID: \"245ccfea-9935-4d23-820f-9267b29a9257\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bb6558494-8xmvt" Apr 22 19:58:16.917297 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.917278 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzf9t\" (UniqueName: \"kubernetes.io/projected/7fa0c2d8-cebb-4563-88fb-c974a238cc8d-kube-api-access-dzf9t\") pod \"insights-operator-585dfdc468-46lb7\" (UID: \"7fa0c2d8-cebb-4563-88fb-c974a238cc8d\") " pod="openshift-insights/insights-operator-585dfdc468-46lb7" Apr 22 19:58:16.917614 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.917598 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh28f\" (UniqueName: \"kubernetes.io/projected/245ccfea-9935-4d23-820f-9267b29a9257-kube-api-access-wh28f\") pod \"klusterlet-addon-workmgr-bb6558494-8xmvt\" (UID: \"245ccfea-9935-4d23-820f-9267b29a9257\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bb6558494-8xmvt" Apr 22 19:58:16.920016 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.919965 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk9ll\" (UniqueName: \"kubernetes.io/projected/41ae5fd4-d9cf-4314-b727-acf1d473957e-kube-api-access-vk9ll\") pod \"kube-storage-version-migrator-operator-6769c5d45-zxkxh\" (UID: \"41ae5fd4-d9cf-4314-b727-acf1d473957e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zxkxh" Apr 22 19:58:16.920867 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.920817 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8tvw\" (UniqueName: 
\"kubernetes.io/projected/5c826ea9-4c12-4695-924d-37f9fda8d480-kube-api-access-q8tvw\") pod \"managed-serviceaccount-addon-agent-86b985fbd5-gx4rj\" (UID: \"5c826ea9-4c12-4695-924d-37f9fda8d480\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86b985fbd5-gx4rj" Apr 22 19:58:16.928205 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.927940 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5bfm\" (UniqueName: \"kubernetes.io/projected/240239de-ebef-48d8-bfd9-2171161b364f-kube-api-access-h5bfm\") pod \"network-check-source-8894fc9bd-4wqkr\" (UID: \"240239de-ebef-48d8-bfd9-2171161b364f\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4wqkr" Apr 22 19:58:16.929162 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.929136 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7jvg\" (UniqueName: \"kubernetes.io/projected/5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32-kube-api-access-g7jvg\") pod \"cluster-proxy-proxy-agent-6c75cc995-6jqp2\" (UID: \"5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2" Apr 22 19:58:16.929403 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.929356 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwntb\" (UniqueName: \"kubernetes.io/projected/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-kube-api-access-dwntb\") pod \"router-default-5ff55498bb-hwmkq\" (UID: \"9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9\") " pod="openshift-ingress/router-default-5ff55498bb-hwmkq" Apr 22 19:58:16.934885 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.934860 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dwbp\" (UniqueName: \"kubernetes.io/projected/3484f3e2-bbf8-464e-a5f4-8d66ff2230f3-kube-api-access-8dwbp\") pod \"service-ca-operator-d6fc45fc5-rs9hj\" (UID: \"3484f3e2-bbf8-464e-a5f4-8d66ff2230f3\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rs9hj" Apr 22 19:58:16.935590 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.935571 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j6c9\" (UniqueName: \"kubernetes.io/projected/338d333b-8626-46b3-b450-fbdd521183d8-kube-api-access-9j6c9\") pod \"console-operator-9d4b6777b-vfhht\" (UID: \"338d333b-8626-46b3-b450-fbdd521183d8\") " pod="openshift-console-operator/console-operator-9d4b6777b-vfhht" Apr 22 19:58:16.950838 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.950814 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-vfhht" Apr 22 19:58:16.962152 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.962095 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zxkxh" Apr 22 19:58:16.981242 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.981210 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-46lb7" Apr 22 19:58:16.991433 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:16.991408 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rs9hj" Apr 22 19:58:17.001077 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:17.000529 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7h2n\" (UniqueName: \"kubernetes.io/projected/efcdc84a-302d-4ab3-b70a-269b931a9634-kube-api-access-x7h2n\") pod \"ingress-canary-ggr2l\" (UID: \"efcdc84a-302d-4ab3-b70a-269b931a9634\") " pod="openshift-ingress-canary/ingress-canary-ggr2l" Apr 22 19:58:17.001077 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:17.000617 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7561fdeb-e97b-4652-ba5c-1e555b68f4aa-tmp-dir\") pod \"dns-default-fpzrf\" (UID: \"7561fdeb-e97b-4652-ba5c-1e555b68f4aa\") " pod="openshift-dns/dns-default-fpzrf" Apr 22 19:58:17.001077 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:17.000651 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gfvtf\" (UniqueName: \"kubernetes.io/projected/7561fdeb-e97b-4652-ba5c-1e555b68f4aa-kube-api-access-gfvtf\") pod \"dns-default-fpzrf\" (UID: \"7561fdeb-e97b-4652-ba5c-1e555b68f4aa\") " pod="openshift-dns/dns-default-fpzrf" Apr 22 19:58:17.001077 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:17.000690 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efcdc84a-302d-4ab3-b70a-269b931a9634-cert\") pod \"ingress-canary-ggr2l\" (UID: \"efcdc84a-302d-4ab3-b70a-269b931a9634\") " pod="openshift-ingress-canary/ingress-canary-ggr2l" Apr 22 19:58:17.001077 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:17.000733 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7561fdeb-e97b-4652-ba5c-1e555b68f4aa-config-volume\") pod \"dns-default-fpzrf\" (UID: \"7561fdeb-e97b-4652-ba5c-1e555b68f4aa\") " pod="openshift-dns/dns-default-fpzrf" Apr 22 19:58:17.001077 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:17.000834 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:58:17.001077 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:17.000881 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efcdc84a-302d-4ab3-b70a-269b931a9634-cert podName:efcdc84a-302d-4ab3-b70a-269b931a9634 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:17.500864377 +0000 UTC m=+34.314249537 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/efcdc84a-302d-4ab3-b70a-269b931a9634-cert") pod "ingress-canary-ggr2l" (UID: "efcdc84a-302d-4ab3-b70a-269b931a9634") : secret "canary-serving-cert" not found Apr 22 19:58:17.001077 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:17.000903 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7561fdeb-e97b-4652-ba5c-1e555b68f4aa-metrics-tls\") pod \"dns-default-fpzrf\" (UID: \"7561fdeb-e97b-4652-ba5c-1e555b68f4aa\") " pod="openshift-dns/dns-default-fpzrf" Apr 22 19:58:17.001077 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:17.000945 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:58:17.001077 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:17.001010 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7561fdeb-e97b-4652-ba5c-1e555b68f4aa-tmp-dir\") pod \"dns-default-fpzrf\" (UID: \"7561fdeb-e97b-4652-ba5c-1e555b68f4aa\") " pod="openshift-dns/dns-default-fpzrf" Apr 22 19:58:17.001077 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:17.001023 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7561fdeb-e97b-4652-ba5c-1e555b68f4aa-metrics-tls podName:7561fdeb-e97b-4652-ba5c-1e555b68f4aa nodeName:}" failed. No retries permitted until 2026-04-22 19:58:17.501005145 +0000 UTC m=+34.314390316 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7561fdeb-e97b-4652-ba5c-1e555b68f4aa-metrics-tls") pod "dns-default-fpzrf" (UID: "7561fdeb-e97b-4652-ba5c-1e555b68f4aa") : secret "dns-default-metrics-tls" not found Apr 22 19:58:17.001670 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:17.001270 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7561fdeb-e97b-4652-ba5c-1e555b68f4aa-config-volume\") pod \"dns-default-fpzrf\" (UID: \"7561fdeb-e97b-4652-ba5c-1e555b68f4aa\") " pod="openshift-dns/dns-default-fpzrf" Apr 22 19:58:17.010289 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:17.010264 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfvtf\" (UniqueName: \"kubernetes.io/projected/7561fdeb-e97b-4652-ba5c-1e555b68f4aa-kube-api-access-gfvtf\") pod \"dns-default-fpzrf\" (UID: \"7561fdeb-e97b-4652-ba5c-1e555b68f4aa\") " pod="openshift-dns/dns-default-fpzrf" Apr 22 19:58:17.010289 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:17.010279 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7h2n\" (UniqueName: \"kubernetes.io/projected/efcdc84a-302d-4ab3-b70a-269b931a9634-kube-api-access-x7h2n\") pod \"ingress-canary-ggr2l\" (UID: \"efcdc84a-302d-4ab3-b70a-269b931a9634\") " pod="openshift-ingress-canary/ingress-canary-ggr2l" Apr 22 19:58:17.015063 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:17.015045 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4wqkr" Apr 22 19:58:17.040329 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:17.040284 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86b985fbd5-gx4rj" Apr 22 19:58:17.050099 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:17.050076 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bb6558494-8xmvt" Apr 22 19:58:17.056785 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:17.056763 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2" Apr 22 19:58:17.304037 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:17.304000 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ab3165ee-b810-41ae-b9dc-8e3198db6bc1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-b8884\" (UID: \"ab3165ee-b810-41ae-b9dc-8e3198db6bc1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-b8884" Apr 22 19:58:17.304295 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:17.304063 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-vrf5f\" (UID: \"d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vrf5f" Apr 22 19:58:17.304295 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:17.304100 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ec41f5a3-a7a6-4607-b14e-49402afefbe2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-f7qdl\" (UID: \"ec41f5a3-a7a6-4607-b14e-49402afefbe2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-f7qdl" Apr 22 19:58:17.304295 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:17.304167 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:58:17.304295 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:17.304224 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:58:17.304295 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:17.304242 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab3165ee-b810-41ae-b9dc-8e3198db6bc1-cluster-monitoring-operator-tls podName:ab3165ee-b810-41ae-b9dc-8e3198db6bc1 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:18.304220847 +0000 UTC m=+35.117606011 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ab3165ee-b810-41ae-b9dc-8e3198db6bc1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-b8884" (UID: "ab3165ee-b810-41ae-b9dc-8e3198db6bc1") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:58:17.304295 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:17.304268 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-registry-tls\") pod \"image-registry-64b9f68d58-mbcdm\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:58:17.304295 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:17.304288 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 19:58:17.304295 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:17.304295 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4-samples-operator-tls podName:d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:18.304282021 +0000 UTC m=+35.117667184 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-vrf5f" (UID: "d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4") : secret "samples-operator-tls" not found Apr 22 19:58:17.304747 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:17.304353 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:58:17.304747 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:17.304364 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64b9f68d58-mbcdm: secret "image-registry-tls" not found Apr 22 19:58:17.304747 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:17.304364 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec41f5a3-a7a6-4607-b14e-49402afefbe2-networking-console-plugin-cert podName:ec41f5a3-a7a6-4607-b14e-49402afefbe2 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:18.304351333 +0000 UTC m=+35.117736493 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ec41f5a3-a7a6-4607-b14e-49402afefbe2-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-f7qdl" (UID: "ec41f5a3-a7a6-4607-b14e-49402afefbe2") : secret "networking-console-plugin-cert" not found Apr 22 19:58:17.304747 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:17.304441 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-registry-tls podName:c1f10812-b9aa-426e-ac37-99d23b87a05a nodeName:}" failed. No retries permitted until 2026-04-22 19:58:18.304429872 +0000 UTC m=+35.117815034 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-registry-tls") pod "image-registry-64b9f68d58-mbcdm" (UID: "c1f10812-b9aa-426e-ac37-99d23b87a05a") : secret "image-registry-tls" not found Apr 22 19:58:17.405768 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:17.405730 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-metrics-certs\") pod \"router-default-5ff55498bb-hwmkq\" (UID: \"9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9\") " pod="openshift-ingress/router-default-5ff55498bb-hwmkq" Apr 22 19:58:17.405938 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:17.405791 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-service-ca-bundle\") pod \"router-default-5ff55498bb-hwmkq\" (UID: \"9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9\") " pod="openshift-ingress/router-default-5ff55498bb-hwmkq" Apr 22 19:58:17.405938 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:17.405854 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7bee1d4-9229-4b17-8ec5-e19b53d61c9d-metrics-certs\") pod \"network-metrics-daemon-v8fph\" (UID: \"d7bee1d4-9229-4b17-8ec5-e19b53d61c9d\") " pod="openshift-multus/network-metrics-daemon-v8fph" Apr 22 19:58:17.405938 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:17.405877 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:58:17.405938 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:17.405938 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-metrics-certs podName:9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:18.405919347 +0000 UTC m=+35.219304512 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-metrics-certs") pod "router-default-5ff55498bb-hwmkq" (UID: "9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9") : secret "router-metrics-certs-default" not found Apr 22 19:58:17.406117 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:17.405945 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 19:58:17.406117 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:17.405985 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-service-ca-bundle podName:9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:18.405962297 +0000 UTC m=+35.219347479 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-service-ca-bundle") pod "router-default-5ff55498bb-hwmkq" (UID: "9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9") : configmap references non-existent config key: service-ca.crt Apr 22 19:58:17.406117 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:17.406014 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7bee1d4-9229-4b17-8ec5-e19b53d61c9d-metrics-certs podName:d7bee1d4-9229-4b17-8ec5-e19b53d61c9d nodeName:}" failed. No retries permitted until 2026-04-22 19:58:49.406003446 +0000 UTC m=+66.219388608 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d7bee1d4-9229-4b17-8ec5-e19b53d61c9d-metrics-certs") pod "network-metrics-daemon-v8fph" (UID: "d7bee1d4-9229-4b17-8ec5-e19b53d61c9d") : secret "metrics-daemon-secret" not found Apr 22 19:58:17.507180 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:17.507146 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cba227e9-dcf5-4cf7-9c4b-83013a0b20fb-original-pull-secret\") pod \"global-pull-secret-syncer-cvg2s\" (UID: \"cba227e9-dcf5-4cf7-9c4b-83013a0b20fb\") " pod="kube-system/global-pull-secret-syncer-cvg2s" Apr 22 19:58:17.507945 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:17.507244 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efcdc84a-302d-4ab3-b70a-269b931a9634-cert\") pod \"ingress-canary-ggr2l\" (UID: \"efcdc84a-302d-4ab3-b70a-269b931a9634\") " pod="openshift-ingress-canary/ingress-canary-ggr2l" Apr 22 19:58:17.507945 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:17.507287 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7561fdeb-e97b-4652-ba5c-1e555b68f4aa-metrics-tls\") pod \"dns-default-fpzrf\" (UID: \"7561fdeb-e97b-4652-ba5c-1e555b68f4aa\") " pod="openshift-dns/dns-default-fpzrf" Apr 22 19:58:17.507945 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:17.507387 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:58:17.507945 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:17.507438 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:58:17.507945 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:17.507462 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efcdc84a-302d-4ab3-b70a-269b931a9634-cert podName:efcdc84a-302d-4ab3-b70a-269b931a9634 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:18.507440275 +0000 UTC m=+35.320825452 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/efcdc84a-302d-4ab3-b70a-269b931a9634-cert") pod "ingress-canary-ggr2l" (UID: "efcdc84a-302d-4ab3-b70a-269b931a9634") : secret "canary-serving-cert" not found Apr 22 19:58:17.507945 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:17.507486 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7561fdeb-e97b-4652-ba5c-1e555b68f4aa-metrics-tls podName:7561fdeb-e97b-4652-ba5c-1e555b68f4aa nodeName:}" failed. 
No retries permitted until 2026-04-22 19:58:18.50747527 +0000 UTC m=+35.320860429 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7561fdeb-e97b-4652-ba5c-1e555b68f4aa-metrics-tls") pod "dns-default-fpzrf" (UID: "7561fdeb-e97b-4652-ba5c-1e555b68f4aa") : secret "dns-default-metrics-tls" not found Apr 22 19:58:17.509657 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:17.509635 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/cba227e9-dcf5-4cf7-9c4b-83013a0b20fb-original-pull-secret\") pod \"global-pull-secret-syncer-cvg2s\" (UID: \"cba227e9-dcf5-4cf7-9c4b-83013a0b20fb\") " pod="kube-system/global-pull-secret-syncer-cvg2s" Apr 22 19:58:17.608004 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:17.607916 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76f4k\" (UniqueName: \"kubernetes.io/projected/f55cb5a5-4e42-4775-bf9f-5f92344b63ff-kube-api-access-76f4k\") pod \"network-check-target-ncvmn\" (UID: \"f55cb5a5-4e42-4775-bf9f-5f92344b63ff\") " pod="openshift-network-diagnostics/network-check-target-ncvmn" Apr 22 19:58:17.610512 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:17.610484 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-76f4k\" (UniqueName: \"kubernetes.io/projected/f55cb5a5-4e42-4775-bf9f-5f92344b63ff-kube-api-access-76f4k\") pod \"network-check-target-ncvmn\" (UID: \"f55cb5a5-4e42-4775-bf9f-5f92344b63ff\") " pod="openshift-network-diagnostics/network-check-target-ncvmn" Apr 22 19:58:17.671490 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:17.671447 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cvg2s" Apr 22 19:58:17.691595 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:17.691562 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncvmn" Apr 22 19:58:18.313998 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:18.313958 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ec41f5a3-a7a6-4607-b14e-49402afefbe2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-f7qdl\" (UID: \"ec41f5a3-a7a6-4607-b14e-49402afefbe2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-f7qdl" Apr 22 19:58:18.314284 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:18.314081 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-registry-tls\") pod \"image-registry-64b9f68d58-mbcdm\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:58:18.314284 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:18.314126 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 19:58:18.314284 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:18.314147 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ab3165ee-b810-41ae-b9dc-8e3198db6bc1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-b8884\" (UID: \"ab3165ee-b810-41ae-b9dc-8e3198db6bc1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-b8884" Apr 22 19:58:18.314284 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:18.314201 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec41f5a3-a7a6-4607-b14e-49402afefbe2-networking-console-plugin-cert podName:ec41f5a3-a7a6-4607-b14e-49402afefbe2 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:20.314180603 +0000 UTC m=+37.127565780 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ec41f5a3-a7a6-4607-b14e-49402afefbe2-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-f7qdl" (UID: "ec41f5a3-a7a6-4607-b14e-49402afefbe2") : secret "networking-console-plugin-cert" not found Apr 22 19:58:18.314284 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:18.314214 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:58:18.314284 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:18.314229 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64b9f68d58-mbcdm: secret "image-registry-tls" not found Apr 22 19:58:18.314284 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:18.314229 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-vrf5f\" (UID: \"d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vrf5f" Apr 22 19:58:18.314284 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:18.314264 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-registry-tls podName:c1f10812-b9aa-426e-ac37-99d23b87a05a nodeName:}" failed. No retries permitted until 2026-04-22 19:58:20.314253606 +0000 UTC m=+37.127638772 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-registry-tls") pod "image-registry-64b9f68d58-mbcdm" (UID: "c1f10812-b9aa-426e-ac37-99d23b87a05a") : secret "image-registry-tls" not found Apr 22 19:58:18.314284 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:18.314265 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:58:18.314706 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:18.314331 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab3165ee-b810-41ae-b9dc-8e3198db6bc1-cluster-monitoring-operator-tls podName:ab3165ee-b810-41ae-b9dc-8e3198db6bc1 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:20.314300448 +0000 UTC m=+37.127685623 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ab3165ee-b810-41ae-b9dc-8e3198db6bc1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-b8884" (UID: "ab3165ee-b810-41ae-b9dc-8e3198db6bc1") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:58:18.314706 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:18.314341 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:58:18.314706 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:18.314388 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4-samples-operator-tls podName:d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:20.314378129 +0000 UTC m=+37.127763288 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-vrf5f" (UID: "d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4") : secret "samples-operator-tls" not found Apr 22 19:58:18.414643 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:18.414604 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-metrics-certs\") pod \"router-default-5ff55498bb-hwmkq\" (UID: \"9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9\") " pod="openshift-ingress/router-default-5ff55498bb-hwmkq" Apr 22 19:58:18.414802 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:18.414661 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-service-ca-bundle\") pod \"router-default-5ff55498bb-hwmkq\" (UID: \"9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9\") " pod="openshift-ingress/router-default-5ff55498bb-hwmkq" Apr 22 19:58:18.414802 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:18.414754 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:58:18.414884 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:18.414758 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-service-ca-bundle podName:9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:20.41474342 +0000 UTC m=+37.228128578 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-service-ca-bundle") pod "router-default-5ff55498bb-hwmkq" (UID: "9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9") : configmap references non-existent config key: service-ca.crt Apr 22 19:58:18.414944 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:18.414911 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-metrics-certs podName:9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:20.414893684 +0000 UTC m=+37.228278852 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-metrics-certs") pod "router-default-5ff55498bb-hwmkq" (UID: "9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9") : secret "router-metrics-certs-default" not found Apr 22 19:58:18.518638 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:18.517726 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efcdc84a-302d-4ab3-b70a-269b931a9634-cert\") pod \"ingress-canary-ggr2l\" (UID: \"efcdc84a-302d-4ab3-b70a-269b931a9634\") " pod="openshift-ingress-canary/ingress-canary-ggr2l" Apr 22 19:58:18.518638 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:18.517808 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7561fdeb-e97b-4652-ba5c-1e555b68f4aa-metrics-tls\") pod \"dns-default-fpzrf\" (UID: \"7561fdeb-e97b-4652-ba5c-1e555b68f4aa\") " pod="openshift-dns/dns-default-fpzrf" Apr 22 19:58:18.518638 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:18.518082 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:58:18.518638 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:18.518154 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7561fdeb-e97b-4652-ba5c-1e555b68f4aa-metrics-tls podName:7561fdeb-e97b-4652-ba5c-1e555b68f4aa nodeName:}" failed. No retries permitted until 2026-04-22 19:58:20.51813527 +0000 UTC m=+37.331520449 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7561fdeb-e97b-4652-ba5c-1e555b68f4aa-metrics-tls") pod "dns-default-fpzrf" (UID: "7561fdeb-e97b-4652-ba5c-1e555b68f4aa") : secret "dns-default-metrics-tls" not found Apr 22 19:58:18.518638 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:18.518636 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:58:18.519429 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:18.518692 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efcdc84a-302d-4ab3-b70a-269b931a9634-cert podName:efcdc84a-302d-4ab3-b70a-269b931a9634 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:20.518676676 +0000 UTC m=+37.332061838 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/efcdc84a-302d-4ab3-b70a-269b931a9634-cert") pod "ingress-canary-ggr2l" (UID: "efcdc84a-302d-4ab3-b70a-269b931a9634") : secret "canary-serving-cert" not found Apr 22 19:58:18.780582 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:18.780302 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-4wqkr"] Apr 22 19:58:18.785211 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:18.785096 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-cvg2s"] Apr 22 19:58:18.785211 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:18.785149 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-46lb7"] Apr 22 19:58:18.804430 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:58:18.803332 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fa0c2d8_cebb_4563_88fb_c974a238cc8d.slice/crio-6cb11d2f4e32721be2e2837e29c4d6cfaa79d15394ed9f4fb6f91b967ecc646b WatchSource:0}: Error finding container 6cb11d2f4e32721be2e2837e29c4d6cfaa79d15394ed9f4fb6f91b967ecc646b: Status 404 returned error can't find the container with id 6cb11d2f4e32721be2e2837e29c4d6cfaa79d15394ed9f4fb6f91b967ecc646b Apr 22 19:58:18.823861 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:18.823812 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-bb6558494-8xmvt"] Apr 22 19:58:18.828013 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:18.827979 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2"] Apr 22 19:58:18.832783 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:18.832601 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86b985fbd5-gx4rj"] Apr 22 19:58:18.835424 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:18.835403 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-vfhht"] Apr 22 19:58:18.839248 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:58:18.839209 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod338d333b_8626_46b3_b450_fbdd521183d8.slice/crio-d10535db6ee6083cd88040ad9b50911244c5c9eb60693ac4f0c558c409aeefca WatchSource:0}: Error finding container d10535db6ee6083cd88040ad9b50911244c5c9eb60693ac4f0c558c409aeefca: Status 404 returned error can't find the container with id d10535db6ee6083cd88040ad9b50911244c5c9eb60693ac4f0c558c409aeefca Apr 22 19:58:18.844139 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:18.844112 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ncvmn"] Apr 22 19:58:18.847925 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:18.847770 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dqgqv"] Apr 22 19:58:18.848040 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:58:18.847949 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf55cb5a5_4e42_4775_bf9f_5f92344b63ff.slice/crio-a327c81df0df57fcdb5716c539c4155a3e29d82bed32567a275cac16fce21f97 WatchSource:0}: Error finding container a327c81df0df57fcdb5716c539c4155a3e29d82bed32567a275cac16fce21f97: Status 404 returned error can't find the container with id a327c81df0df57fcdb5716c539c4155a3e29d82bed32567a275cac16fce21f97 Apr 22 19:58:18.850022 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:18.849997 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zxkxh"] Apr 22 19:58:18.850925 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:18.850903 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rs9hj"] Apr 22 19:58:18.851589 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:58:18.851565 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3484f3e2_bbf8_464e_a5f4_8d66ff2230f3.slice/crio-83a666aa312b198139a904b8551425f1f9a040e07527c4c1f961ae9e68c8e37c WatchSource:0}: Error finding container 83a666aa312b198139a904b8551425f1f9a040e07527c4c1f961ae9e68c8e37c: Status 404 returned error can't find the container with id 83a666aa312b198139a904b8551425f1f9a040e07527c4c1f961ae9e68c8e37c Apr 22 19:58:18.852248 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:58:18.852167 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccc07e82_60c3_4ce8_9255_15e80fff83d9.slice/crio-0e79ff9ce4534f20a1ced28920cc85c0d611855b87cf1af071da618be3ecdc41 WatchSource:0}: Error finding container 0e79ff9ce4534f20a1ced28920cc85c0d611855b87cf1af071da618be3ecdc41: Status 404 returned error can't find the container with id 0e79ff9ce4534f20a1ced28920cc85c0d611855b87cf1af071da618be3ecdc41 Apr 22 19:58:18.866780 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:58:18.866750 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41ae5fd4_d9cf_4314_b727_acf1d473957e.slice/crio-9aca0f7b7cef3924165c840f55476323905635e00a68cf661e2134a1f10bbe09 WatchSource:0}: Error finding container 9aca0f7b7cef3924165c840f55476323905635e00a68cf661e2134a1f10bbe09: Status 404 returned error can't find the container with id 9aca0f7b7cef3924165c840f55476323905635e00a68cf661e2134a1f10bbe09 Apr 22 19:58:18.867623 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:18.867589 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bb6558494-8xmvt" event={"ID":"245ccfea-9935-4d23-820f-9267b29a9257","Type":"ContainerStarted","Data":"c77381c7a020f0a0913f34e9ef478630fd702b8921fa9789900feb948acee3c4"} Apr 22 19:58:18.868524 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:18.868478 2575 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:kube-storage-version-migrator-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f76a74458025ce3949072ad4a42dff7b49b25dcf13de81204e47f697e9cb8523,Command:[cluster-kube-storage-version-migrator-operator 
start],Args:[--config=/var/run/configmaps/config/config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:14318c2321b81771440ed4021e547da23e03f4b3d00957941d2293ccaac35e47,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f76a74458025ce3949072ad4a42dff7b49b25dcf13de81204e47f697e9cb8523,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.20.19,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:4.20.19,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vk9ll,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-storage-version-migrator-operator-6769c5d45-zxkxh_openshift-kube-storage-version-migrator-operator(41ae5fd4-d9cf-4314-b727-acf1d473957e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Apr 22 19:58:18.869026 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:18.869006 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-vfhht" event={"ID":"338d333b-8626-46b3-b450-fbdd521183d8","Type":"ContainerStarted","Data":"d10535db6ee6083cd88040ad9b50911244c5c9eb60693ac4f0c558c409aeefca"} Apr 22 19:58:18.869570 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:18.869550 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-storage-version-migrator-operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zxkxh" podUID="41ae5fd4-d9cf-4314-b727-acf1d473957e" Apr 22 19:58:18.869875 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:18.869858 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86b985fbd5-gx4rj" event={"ID":"5c826ea9-4c12-4695-924d-37f9fda8d480","Type":"ContainerStarted","Data":"e625b8c488f796d30e0dfc94bfea954e31ccb4e8729987109dd870d6150ac9fc"} Apr 22 19:58:18.870687 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:18.870670 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4wqkr" 
event={"ID":"240239de-ebef-48d8-bfd9-2171161b364f","Type":"ContainerStarted","Data":"9666c1357708db957a5afda5c2f6a695eaf6f51b1a9d871997982a259cd03d64"} Apr 22 19:58:18.871559 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:18.871542 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-46lb7" event={"ID":"7fa0c2d8-cebb-4563-88fb-c974a238cc8d","Type":"ContainerStarted","Data":"6cb11d2f4e32721be2e2837e29c4d6cfaa79d15394ed9f4fb6f91b967ecc646b"} Apr 22 19:58:18.872441 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:18.872413 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2" event={"ID":"5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32","Type":"ContainerStarted","Data":"ecc923565464ca5ea3f5b77e25a03554f3e0d8ebc43e1cb79ba1d57ccd08ed3a"} Apr 22 19:58:18.873344 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:18.873305 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ncvmn" event={"ID":"f55cb5a5-4e42-4775-bf9f-5f92344b63ff","Type":"ContainerStarted","Data":"a327c81df0df57fcdb5716c539c4155a3e29d82bed32567a275cac16fce21f97"} Apr 22 19:58:18.874106 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:18.874090 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rs9hj" event={"ID":"3484f3e2-bbf8-464e-a5f4-8d66ff2230f3","Type":"ContainerStarted","Data":"83a666aa312b198139a904b8551425f1f9a040e07527c4c1f961ae9e68c8e37c"} Apr 22 19:58:18.875035 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:18.875012 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dqgqv" event={"ID":"ccc07e82-60c3-4ce8-9255-15e80fff83d9","Type":"ContainerStarted","Data":"0e79ff9ce4534f20a1ced28920cc85c0d611855b87cf1af071da618be3ecdc41"} Apr 22 19:58:18.875899 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:18.875878 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-cvg2s" event={"ID":"cba227e9-dcf5-4cf7-9c4b-83013a0b20fb","Type":"ContainerStarted","Data":"94f09727a9dfb064377972f4174028d5065a36240aa2467d38f5e19b7ec50a16"} Apr 22 19:58:19.883913 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:19.882951 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zxkxh" event={"ID":"41ae5fd4-d9cf-4314-b727-acf1d473957e","Type":"ContainerStarted","Data":"9aca0f7b7cef3924165c840f55476323905635e00a68cf661e2134a1f10bbe09"} Apr 22 19:58:19.885621 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:19.885477 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-storage-version-migrator-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f76a74458025ce3949072ad4a42dff7b49b25dcf13de81204e47f697e9cb8523\\\": ErrImagePull: pull QPS exceeded\"" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zxkxh" podUID="41ae5fd4-d9cf-4314-b727-acf1d473957e" Apr 22 19:58:19.903763 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:19.903730 2575 generic.go:358] "Generic (PLEG): container finished" podID="153f74bd-9a2c-4a02-88c9-243b60b35439" containerID="d4042b37d7439ec755f79b998b481d33e918fb0dc11dd8fabfc4c1ec589248b3" exitCode=0 
Apr 22 19:58:19.903915 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:19.903784 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cfjtn" event={"ID":"153f74bd-9a2c-4a02-88c9-243b60b35439","Type":"ContainerDied","Data":"d4042b37d7439ec755f79b998b481d33e918fb0dc11dd8fabfc4c1ec589248b3"} Apr 22 19:58:20.339128 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:20.338061 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ec41f5a3-a7a6-4607-b14e-49402afefbe2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-f7qdl\" (UID: \"ec41f5a3-a7a6-4607-b14e-49402afefbe2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-f7qdl" Apr 22 19:58:20.339128 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:20.338192 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-registry-tls\") pod \"image-registry-64b9f68d58-mbcdm\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:58:20.339128 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:20.338257 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ab3165ee-b810-41ae-b9dc-8e3198db6bc1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-b8884\" (UID: \"ab3165ee-b810-41ae-b9dc-8e3198db6bc1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-b8884" Apr 22 19:58:20.339128 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:20.338308 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-vrf5f\" (UID: \"d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vrf5f" Apr 22 19:58:20.339128 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:20.338475 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:58:20.339128 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:20.338535 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4-samples-operator-tls podName:d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:24.338516718 +0000 UTC m=+41.151901882 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-vrf5f" (UID: "d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4") : secret "samples-operator-tls" not found Apr 22 19:58:20.339128 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:20.338594 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 19:58:20.339128 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:20.338630 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec41f5a3-a7a6-4607-b14e-49402afefbe2-networking-console-plugin-cert podName:ec41f5a3-a7a6-4607-b14e-49402afefbe2 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:24.33861893 +0000 UTC m=+41.152004095 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ec41f5a3-a7a6-4607-b14e-49402afefbe2-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-f7qdl" (UID: "ec41f5a3-a7a6-4607-b14e-49402afefbe2") : secret "networking-console-plugin-cert" not found Apr 22 19:58:20.339128 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:20.338688 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:58:20.339128 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:20.338699 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64b9f68d58-mbcdm: secret "image-registry-tls" not found Apr 22 19:58:20.339128 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:20.338728 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-registry-tls podName:c1f10812-b9aa-426e-ac37-99d23b87a05a nodeName:}" failed. No retries permitted until 2026-04-22 19:58:24.338718089 +0000 UTC m=+41.152103253 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-registry-tls") pod "image-registry-64b9f68d58-mbcdm" (UID: "c1f10812-b9aa-426e-ac37-99d23b87a05a") : secret "image-registry-tls" not found Apr 22 19:58:20.339128 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:20.338778 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:58:20.339128 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:20.338807 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab3165ee-b810-41ae-b9dc-8e3198db6bc1-cluster-monitoring-operator-tls podName:ab3165ee-b810-41ae-b9dc-8e3198db6bc1 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:24.338797425 +0000 UTC m=+41.152182588 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ab3165ee-b810-41ae-b9dc-8e3198db6bc1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-b8884" (UID: "ab3165ee-b810-41ae-b9dc-8e3198db6bc1") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:58:20.440949 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:20.439897 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-metrics-certs\") pod \"router-default-5ff55498bb-hwmkq\" (UID: \"9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9\") " pod="openshift-ingress/router-default-5ff55498bb-hwmkq" Apr 22 19:58:20.440949 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:20.439971 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-service-ca-bundle\") pod \"router-default-5ff55498bb-hwmkq\" (UID: \"9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9\") " pod="openshift-ingress/router-default-5ff55498bb-hwmkq" Apr 22 19:58:20.440949 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:20.440337 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-service-ca-bundle podName:9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:24.440296031 +0000 UTC m=+41.253681210 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-service-ca-bundle") pod "router-default-5ff55498bb-hwmkq" (UID: "9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9") : configmap references non-existent config key: service-ca.crt Apr 22 19:58:20.440949 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:20.440847 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:58:20.440949 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:20.440902 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-metrics-certs podName:9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:24.440885876 +0000 UTC m=+41.254271037 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-metrics-certs") pod "router-default-5ff55498bb-hwmkq" (UID: "9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9") : secret "router-metrics-certs-default" not found Apr 22 19:58:20.542593 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:20.541570 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efcdc84a-302d-4ab3-b70a-269b931a9634-cert\") pod \"ingress-canary-ggr2l\" (UID: \"efcdc84a-302d-4ab3-b70a-269b931a9634\") " pod="openshift-ingress-canary/ingress-canary-ggr2l" Apr 22 19:58:20.542593 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:20.541641 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7561fdeb-e97b-4652-ba5c-1e555b68f4aa-metrics-tls\") pod \"dns-default-fpzrf\" (UID: \"7561fdeb-e97b-4652-ba5c-1e555b68f4aa\") " pod="openshift-dns/dns-default-fpzrf" Apr 22 19:58:20.542593 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:20.541968 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:58:20.542593 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:20.542036 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7561fdeb-e97b-4652-ba5c-1e555b68f4aa-metrics-tls podName:7561fdeb-e97b-4652-ba5c-1e555b68f4aa nodeName:}" failed. No retries permitted until 2026-04-22 19:58:24.542016139 +0000 UTC m=+41.355401302 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7561fdeb-e97b-4652-ba5c-1e555b68f4aa-metrics-tls") pod "dns-default-fpzrf" (UID: "7561fdeb-e97b-4652-ba5c-1e555b68f4aa") : secret "dns-default-metrics-tls" not found Apr 22 19:58:20.542593 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:20.542501 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:58:20.542593 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:20.542552 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efcdc84a-302d-4ab3-b70a-269b931a9634-cert podName:efcdc84a-302d-4ab3-b70a-269b931a9634 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:24.54253718 +0000 UTC m=+41.355922345 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/efcdc84a-302d-4ab3-b70a-269b931a9634-cert") pod "ingress-canary-ggr2l" (UID: "efcdc84a-302d-4ab3-b70a-269b931a9634") : secret "canary-serving-cert" not found Apr 22 19:58:20.953999 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:20.952625 2575 generic.go:358] "Generic (PLEG): container finished" podID="153f74bd-9a2c-4a02-88c9-243b60b35439" containerID="7f3b1a78109ab3553c8cccc2f3a8a246b367abb65a49e49b044ca49f8760d547" exitCode=0 Apr 22 19:58:20.953999 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:20.953738 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cfjtn" event={"ID":"153f74bd-9a2c-4a02-88c9-243b60b35439","Type":"ContainerDied","Data":"7f3b1a78109ab3553c8cccc2f3a8a246b367abb65a49e49b044ca49f8760d547"} Apr 22 19:58:20.963209 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:20.962350 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-storage-version-migrator-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f76a74458025ce3949072ad4a42dff7b49b25dcf13de81204e47f697e9cb8523\\\": ErrImagePull: pull QPS exceeded\"" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zxkxh" podUID="41ae5fd4-d9cf-4314-b727-acf1d473957e" Apr 22 19:58:22.035903 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:22.034940 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cfjtn" event={"ID":"153f74bd-9a2c-4a02-88c9-243b60b35439","Type":"ContainerStarted","Data":"8beb655d111504fe29aacfadf52e4b6aa21fc5410f04087c6bac7a05fb82fe0e"} Apr 22 19:58:22.074520 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:22.073192 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-cfjtn" podStartSLOduration=5.828074578 podStartE2EDuration="38.073176646s" podCreationTimestamp="2026-04-22 19:57:44 +0000 UTC" firstStartedPulling="2026-04-22 19:57:46.577547006 +0000 UTC m=+3.390932171" lastFinishedPulling="2026-04-22 19:58:18.822649066 +0000 UTC m=+35.636034239" observedRunningTime="2026-04-22 19:58:22.073048693 +0000 UTC m=+38.886433887" watchObservedRunningTime="2026-04-22 19:58:22.073176646 +0000 UTC m=+38.886561830" Apr 22 19:58:24.388665 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:24.388608 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ec41f5a3-a7a6-4607-b14e-49402afefbe2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-f7qdl\" (UID: \"ec41f5a3-a7a6-4607-b14e-49402afefbe2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-f7qdl" Apr 22 19:58:24.389113 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:24.388751 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-registry-tls\") pod \"image-registry-64b9f68d58-mbcdm\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:58:24.389113 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:24.388775 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret 
"networking-console-plugin-cert" not found Apr 22 19:58:24.389113 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:24.388814 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ab3165ee-b810-41ae-b9dc-8e3198db6bc1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-b8884\" (UID: \"ab3165ee-b810-41ae-b9dc-8e3198db6bc1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-b8884" Apr 22 19:58:24.389113 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:24.388853 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec41f5a3-a7a6-4607-b14e-49402afefbe2-networking-console-plugin-cert podName:ec41f5a3-a7a6-4607-b14e-49402afefbe2 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:32.388831166 +0000 UTC m=+49.202216329 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ec41f5a3-a7a6-4607-b14e-49402afefbe2-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-f7qdl" (UID: "ec41f5a3-a7a6-4607-b14e-49402afefbe2") : secret "networking-console-plugin-cert" not found Apr 22 19:58:24.389113 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:24.388907 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:58:24.389113 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:24.388912 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-vrf5f\" (UID: \"d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vrf5f" Apr 22 19:58:24.389113 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:24.388926 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:58:24.389113 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:24.388945 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64b9f68d58-mbcdm: secret "image-registry-tls" not found Apr 22 19:58:24.389113 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:24.388967 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab3165ee-b810-41ae-b9dc-8e3198db6bc1-cluster-monitoring-operator-tls podName:ab3165ee-b810-41ae-b9dc-8e3198db6bc1 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:32.388955022 +0000 UTC m=+49.202340182 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ab3165ee-b810-41ae-b9dc-8e3198db6bc1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-b8884" (UID: "ab3165ee-b810-41ae-b9dc-8e3198db6bc1") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:58:24.389113 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:24.389003 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-registry-tls podName:c1f10812-b9aa-426e-ac37-99d23b87a05a nodeName:}" failed. 
No retries permitted until 2026-04-22 19:58:32.388988141 +0000 UTC m=+49.202373301 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-registry-tls") pod "image-registry-64b9f68d58-mbcdm" (UID: "c1f10812-b9aa-426e-ac37-99d23b87a05a") : secret "image-registry-tls" not found Apr 22 19:58:24.389113 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:24.389014 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:58:24.389113 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:24.389052 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4-samples-operator-tls podName:d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:32.389040162 +0000 UTC m=+49.202425324 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-vrf5f" (UID: "d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4") : secret "samples-operator-tls" not found Apr 22 19:58:24.489647 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:24.489611 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-metrics-certs\") pod \"router-default-5ff55498bb-hwmkq\" (UID: \"9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9\") " pod="openshift-ingress/router-default-5ff55498bb-hwmkq" Apr 22 19:58:24.489840 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:24.489678 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-service-ca-bundle\") pod \"router-default-5ff55498bb-hwmkq\" (UID: \"9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9\") " pod="openshift-ingress/router-default-5ff55498bb-hwmkq" Apr 22 19:58:24.489840 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:24.489793 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-service-ca-bundle podName:9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:32.489774912 +0000 UTC m=+49.303160083 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-service-ca-bundle") pod "router-default-5ff55498bb-hwmkq" (UID: "9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9") : configmap references non-existent config key: service-ca.crt Apr 22 19:58:24.489840 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:24.489794 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:58:24.490069 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:24.489866 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-metrics-certs podName:9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:32.489852574 +0000 UTC m=+49.303237733 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-metrics-certs") pod "router-default-5ff55498bb-hwmkq" (UID: "9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9") : secret "router-metrics-certs-default" not found Apr 22 19:58:24.591152 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:24.591114 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efcdc84a-302d-4ab3-b70a-269b931a9634-cert\") pod \"ingress-canary-ggr2l\" (UID: \"efcdc84a-302d-4ab3-b70a-269b931a9634\") " pod="openshift-ingress-canary/ingress-canary-ggr2l" Apr 22 19:58:24.591374 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:24.591181 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7561fdeb-e97b-4652-ba5c-1e555b68f4aa-metrics-tls\") pod \"dns-default-fpzrf\" (UID: \"7561fdeb-e97b-4652-ba5c-1e555b68f4aa\") " pod="openshift-dns/dns-default-fpzrf" Apr 22 19:58:24.591374 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:24.591283 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:58:24.591489 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:24.591367 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:58:24.591489 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:24.591377 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efcdc84a-302d-4ab3-b70a-269b931a9634-cert podName:efcdc84a-302d-4ab3-b70a-269b931a9634 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:32.591361731 +0000 UTC m=+49.404746902 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/efcdc84a-302d-4ab3-b70a-269b931a9634-cert") pod "ingress-canary-ggr2l" (UID: "efcdc84a-302d-4ab3-b70a-269b931a9634") : secret "canary-serving-cert" not found Apr 22 19:58:24.591489 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:24.591452 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7561fdeb-e97b-4652-ba5c-1e555b68f4aa-metrics-tls podName:7561fdeb-e97b-4652-ba5c-1e555b68f4aa nodeName:}" failed. No retries permitted until 2026-04-22 19:58:32.591435029 +0000 UTC m=+49.404820200 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7561fdeb-e97b-4652-ba5c-1e555b68f4aa-metrics-tls") pod "dns-default-fpzrf" (UID: "7561fdeb-e97b-4652-ba5c-1e555b68f4aa") : secret "dns-default-metrics-tls" not found Apr 22 19:58:32.454753 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:32.454668 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ab3165ee-b810-41ae-b9dc-8e3198db6bc1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-b8884\" (UID: \"ab3165ee-b810-41ae-b9dc-8e3198db6bc1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-b8884" Apr 22 19:58:32.454753 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:32.454724 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-vrf5f\" (UID: \"d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vrf5f" Apr 22 19:58:32.454753 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:32.454750 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ec41f5a3-a7a6-4607-b14e-49402afefbe2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-f7qdl\" (UID: \"ec41f5a3-a7a6-4607-b14e-49402afefbe2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-f7qdl" Apr 22 19:58:32.455210 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:32.454825 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:58:32.455210 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:32.454861 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 19:58:32.455210 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:32.454868 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 19:58:32.455210 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:32.454885 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab3165ee-b810-41ae-b9dc-8e3198db6bc1-cluster-monitoring-operator-tls podName:ab3165ee-b810-41ae-b9dc-8e3198db6bc1 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:48.454870676 +0000 UTC m=+65.268255835 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ab3165ee-b810-41ae-b9dc-8e3198db6bc1-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-b8884" (UID: "ab3165ee-b810-41ae-b9dc-8e3198db6bc1") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:58:32.455210 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:32.454903 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec41f5a3-a7a6-4607-b14e-49402afefbe2-networking-console-plugin-cert podName:ec41f5a3-a7a6-4607-b14e-49402afefbe2 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:48.45489142 +0000 UTC m=+65.268276582 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ec41f5a3-a7a6-4607-b14e-49402afefbe2-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-f7qdl" (UID: "ec41f5a3-a7a6-4607-b14e-49402afefbe2") : secret "networking-console-plugin-cert" not found Apr 22 19:58:32.455210 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:32.454924 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-registry-tls\") pod \"image-registry-64b9f68d58-mbcdm\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:58:32.455210 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:32.454983 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4-samples-operator-tls podName:d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:48.454970587 +0000 UTC m=+65.268355749 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-vrf5f" (UID: "d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4") : secret "samples-operator-tls" not found Apr 22 19:58:32.455210 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:32.454989 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:58:32.455210 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:32.454997 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-64b9f68d58-mbcdm: secret "image-registry-tls" not found Apr 22 19:58:32.455210 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:32.455026 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-registry-tls podName:c1f10812-b9aa-426e-ac37-99d23b87a05a nodeName:}" failed. No retries permitted until 2026-04-22 19:58:48.455018947 +0000 UTC m=+65.268404106 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-registry-tls") pod "image-registry-64b9f68d58-mbcdm" (UID: "c1f10812-b9aa-426e-ac37-99d23b87a05a") : secret "image-registry-tls" not found Apr 22 19:58:32.555388 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:32.555359 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-metrics-certs\") pod \"router-default-5ff55498bb-hwmkq\" (UID: \"9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9\") " pod="openshift-ingress/router-default-5ff55498bb-hwmkq" Apr 22 19:58:32.555499 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:32.555423 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-service-ca-bundle\") pod \"router-default-5ff55498bb-hwmkq\" (UID: \"9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9\") " pod="openshift-ingress/router-default-5ff55498bb-hwmkq" Apr 22 19:58:32.555558 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:32.555506 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 19:58:32.555613 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:32.555574 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-metrics-certs podName:9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:48.555555087 +0000 UTC m=+65.368940247 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-metrics-certs") pod "router-default-5ff55498bb-hwmkq" (UID: "9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9") : secret "router-metrics-certs-default" not found Apr 22 19:58:32.555662 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:32.555607 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-service-ca-bundle podName:9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:48.555587779 +0000 UTC m=+65.368972959 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-service-ca-bundle") pod "router-default-5ff55498bb-hwmkq" (UID: "9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9") : configmap references non-existent config key: service-ca.crt Apr 22 19:58:32.656244 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:32.656209 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7561fdeb-e97b-4652-ba5c-1e555b68f4aa-metrics-tls\") pod \"dns-default-fpzrf\" (UID: \"7561fdeb-e97b-4652-ba5c-1e555b68f4aa\") " pod="openshift-dns/dns-default-fpzrf" Apr 22 19:58:32.656508 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:32.656487 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efcdc84a-302d-4ab3-b70a-269b931a9634-cert\") pod \"ingress-canary-ggr2l\" (UID: \"efcdc84a-302d-4ab3-b70a-269b931a9634\") " pod="openshift-ingress-canary/ingress-canary-ggr2l" Apr 22 19:58:32.656631 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:32.656613 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:58:32.656813 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:32.656680 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efcdc84a-302d-4ab3-b70a-269b931a9634-cert podName:efcdc84a-302d-4ab3-b70a-269b931a9634 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:48.656660283 +0000 UTC m=+65.470045443 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/efcdc84a-302d-4ab3-b70a-269b931a9634-cert") pod "ingress-canary-ggr2l" (UID: "efcdc84a-302d-4ab3-b70a-269b931a9634") : secret "canary-serving-cert" not found Apr 22 19:58:32.657111 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:32.657091 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:58:32.657189 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:32.657145 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7561fdeb-e97b-4652-ba5c-1e555b68f4aa-metrics-tls podName:7561fdeb-e97b-4652-ba5c-1e555b68f4aa nodeName:}" failed. No retries permitted until 2026-04-22 19:58:48.657130518 +0000 UTC m=+65.470515680 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7561fdeb-e97b-4652-ba5c-1e555b68f4aa-metrics-tls") pod "dns-default-fpzrf" (UID: "7561fdeb-e97b-4652-ba5c-1e555b68f4aa") : secret "dns-default-metrics-tls" not found Apr 22 19:58:33.066614 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:33.066580 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bb6558494-8xmvt" event={"ID":"245ccfea-9935-4d23-820f-9267b29a9257","Type":"ContainerStarted","Data":"6636376fba8c2f6275b069aee8ccafcb23f131d5521172614b5ab479c30dca88"} Apr 22 19:58:33.067047 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:33.067011 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bb6558494-8xmvt" Apr 22 19:58:33.068789 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:33.068767 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bb6558494-8xmvt" Apr 22 19:58:33.068789 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:33.068777 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vfhht_338d333b-8626-46b3-b450-fbdd521183d8/console-operator/0.log" Apr 22 19:58:33.068960 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:33.068805 2575 generic.go:358] "Generic (PLEG): container finished" podID="338d333b-8626-46b3-b450-fbdd521183d8" containerID="09524fe9899039cbc3329e29615122e565ec0f19c90a8f925889ddf83e83227b" exitCode=255 Apr 22 19:58:33.068960 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:33.068872 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-vfhht" event={"ID":"338d333b-8626-46b3-b450-fbdd521183d8","Type":"ContainerDied","Data":"09524fe9899039cbc3329e29615122e565ec0f19c90a8f925889ddf83e83227b"} Apr 22 19:58:33.069078 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:33.069067 2575 scope.go:117] "RemoveContainer" containerID="09524fe9899039cbc3329e29615122e565ec0f19c90a8f925889ddf83e83227b" Apr 22 19:58:33.070937 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:33.070908 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86b985fbd5-gx4rj" event={"ID":"5c826ea9-4c12-4695-924d-37f9fda8d480","Type":"ContainerStarted","Data":"ab03971e0f5c729e749b62f3047a744a681430ce00265d623308f44f9f34d596"} Apr 22 19:58:33.072634 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:33.072604 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4wqkr" event={"ID":"240239de-ebef-48d8-bfd9-2171161b364f","Type":"ContainerStarted","Data":"8aa55f1e383d37260a8cb2a61b4b273cc73a25d64b1bcd7e1e9da108b3e2b77d"} Apr 22 19:58:33.073964 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:33.073929 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-46lb7" event={"ID":"7fa0c2d8-cebb-4563-88fb-c974a238cc8d","Type":"ContainerStarted","Data":"43db704e029d91777790639d18d7b0b19ee00715f4f438c553ea7df37c21d341"} Apr 22 19:58:33.075498 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:33.075480 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2" 
event={"ID":"5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32","Type":"ContainerStarted","Data":"23fc7defce5c18e691d75d7d22823534bb7774230c3b584850fd353c70ea2587"} Apr 22 19:58:33.076883 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:33.076864 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ncvmn" event={"ID":"f55cb5a5-4e42-4775-bf9f-5f92344b63ff","Type":"ContainerStarted","Data":"8038f7610e761b064009191bccecb2e3f42a7754cbc3face3322dd118c05856f"} Apr 22 19:58:33.076980 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:33.076915 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-ncvmn" Apr 22 19:58:33.078465 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:33.078425 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rs9hj" event={"ID":"3484f3e2-bbf8-464e-a5f4-8d66ff2230f3","Type":"ContainerStarted","Data":"75c014e42610f9d0392f71992a97b1118f5f91de73347744aaf22e7220f37a5e"} Apr 22 19:58:33.079891 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:33.079855 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dqgqv" event={"ID":"ccc07e82-60c3-4ce8-9255-15e80fff83d9","Type":"ContainerStarted","Data":"9864778e9b015186412aa53b8df619e73777f23892fd87f1135b7548711b9573"} Apr 22 19:58:33.082011 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:33.081643 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-cvg2s" event={"ID":"cba227e9-dcf5-4cf7-9c4b-83013a0b20fb","Type":"ContainerStarted","Data":"594a20179d6d833520e386fb452ab09f7426edeeab1aeda304452839ae8f05ac"} Apr 22 19:58:33.101376 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:33.101333 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-bb6558494-8xmvt" podStartSLOduration=20.403588414 podStartE2EDuration="34.101304631s" podCreationTimestamp="2026-04-22 19:57:59 +0000 UTC" firstStartedPulling="2026-04-22 19:58:18.828619139 +0000 UTC m=+35.642004302" lastFinishedPulling="2026-04-22 19:58:32.526335354 +0000 UTC m=+49.339720519" observedRunningTime="2026-04-22 19:58:33.100486018 +0000 UTC m=+49.913871196" watchObservedRunningTime="2026-04-22 19:58:33.101304631 +0000 UTC m=+49.914689812" Apr 22 19:58:33.132253 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:33.132205 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-46lb7" podStartSLOduration=11.439519115 podStartE2EDuration="25.132191096s" podCreationTimestamp="2026-04-22 19:58:08 +0000 UTC" firstStartedPulling="2026-04-22 19:58:18.810628351 +0000 UTC m=+35.624013514" lastFinishedPulling="2026-04-22 19:58:32.503300329 +0000 UTC m=+49.316685495" observedRunningTime="2026-04-22 19:58:33.131033531 +0000 UTC m=+49.944418712" watchObservedRunningTime="2026-04-22 19:58:33.132191096 +0000 UTC m=+49.945576274" Apr 22 19:58:33.148810 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:33.148765 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rs9hj" podStartSLOduration=11.494981648 podStartE2EDuration="25.148748747s" podCreationTimestamp="2026-04-22 19:58:08 +0000 UTC" firstStartedPulling="2026-04-22 19:58:18.853379503 +0000 UTC m=+35.666764680" 
lastFinishedPulling="2026-04-22 19:58:32.507146613 +0000 UTC m=+49.320531779" observedRunningTime="2026-04-22 19:58:33.147732238 +0000 UTC m=+49.961117420" watchObservedRunningTime="2026-04-22 19:58:33.148748747 +0000 UTC m=+49.962133927" Apr 22 19:58:33.168129 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:33.168080 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-cvg2s" podStartSLOduration=34.468039345 podStartE2EDuration="48.168063278s" podCreationTimestamp="2026-04-22 19:57:45 +0000 UTC" firstStartedPulling="2026-04-22 19:58:18.805641991 +0000 UTC m=+35.619027165" lastFinishedPulling="2026-04-22 19:58:32.505665934 +0000 UTC m=+49.319051098" observedRunningTime="2026-04-22 19:58:33.167626736 +0000 UTC m=+49.981011920" watchObservedRunningTime="2026-04-22 19:58:33.168063278 +0000 UTC m=+49.981448464" Apr 22 19:58:33.185721 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:33.185674 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-ncvmn" podStartSLOduration=35.530351361 podStartE2EDuration="49.185659812s" podCreationTimestamp="2026-04-22 19:57:44 +0000 UTC" firstStartedPulling="2026-04-22 19:58:18.849953547 +0000 UTC m=+35.663338710" lastFinishedPulling="2026-04-22 19:58:32.505261999 +0000 UTC m=+49.318647161" observedRunningTime="2026-04-22 19:58:33.184259623 +0000 UTC m=+49.997644807" watchObservedRunningTime="2026-04-22 19:58:33.185659812 +0000 UTC m=+49.999044993" Apr 22 19:58:33.217054 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:33.216926 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dqgqv" podStartSLOduration=12.038900875 podStartE2EDuration="25.216908789s" podCreationTimestamp="2026-04-22 19:58:08 +0000 UTC" firstStartedPulling="2026-04-22 19:58:18.854293148 +0000 UTC m=+35.667678311" lastFinishedPulling="2026-04-22 19:58:32.032301053 +0000 UTC m=+48.845686225" observedRunningTime="2026-04-22 19:58:33.19998553 +0000 UTC m=+50.013370713" watchObservedRunningTime="2026-04-22 19:58:33.216908789 +0000 UTC m=+50.030293952" Apr 22 19:58:33.218086 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:33.218047 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-86b985fbd5-gx4rj" podStartSLOduration=20.553817573 podStartE2EDuration="34.218037172s" podCreationTimestamp="2026-04-22 19:57:59 +0000 UTC" firstStartedPulling="2026-04-22 19:58:18.839872607 +0000 UTC m=+35.653257781" lastFinishedPulling="2026-04-22 19:58:32.504092215 +0000 UTC m=+49.317477380" observedRunningTime="2026-04-22 19:58:33.216548306 +0000 UTC m=+50.029933492" watchObservedRunningTime="2026-04-22 19:58:33.218037172 +0000 UTC m=+50.031422354" Apr 22 19:58:33.232262 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:33.232221 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4wqkr" podStartSLOduration=11.524888602 podStartE2EDuration="25.23220759s" podCreationTimestamp="2026-04-22 19:58:08 +0000 UTC" firstStartedPulling="2026-04-22 19:58:18.797935954 +0000 UTC m=+35.611321116" lastFinishedPulling="2026-04-22 19:58:32.505254936 +0000 UTC m=+49.318640104" observedRunningTime="2026-04-22 19:58:33.231523235 +0000 UTC m=+50.044908418" watchObservedRunningTime="2026-04-22 19:58:33.23220759 +0000 UTC 
m=+50.045592770" Apr 22 19:58:34.086775 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:34.086688 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vfhht_338d333b-8626-46b3-b450-fbdd521183d8/console-operator/1.log" Apr 22 19:58:34.087265 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:34.087148 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vfhht_338d333b-8626-46b3-b450-fbdd521183d8/console-operator/0.log" Apr 22 19:58:34.087265 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:34.087183 2575 generic.go:358] "Generic (PLEG): container finished" podID="338d333b-8626-46b3-b450-fbdd521183d8" containerID="c33c69eb16ca15c9391e89299cd076ff3056b5cfe6a0bddb7e428d4ddd1a20e9" exitCode=255 Apr 22 19:58:34.088005 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:34.087977 2575 scope.go:117] "RemoveContainer" containerID="c33c69eb16ca15c9391e89299cd076ff3056b5cfe6a0bddb7e428d4ddd1a20e9" Apr 22 19:58:34.088780 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:34.088222 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-vfhht" event={"ID":"338d333b-8626-46b3-b450-fbdd521183d8","Type":"ContainerDied","Data":"c33c69eb16ca15c9391e89299cd076ff3056b5cfe6a0bddb7e428d4ddd1a20e9"} Apr 22 19:58:34.088780 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:34.088274 2575 scope.go:117] "RemoveContainer" containerID="09524fe9899039cbc3329e29615122e565ec0f19c90a8f925889ddf83e83227b" Apr 22 19:58:34.089040 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:34.089017 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-vfhht_openshift-console-operator(338d333b-8626-46b3-b450-fbdd521183d8)\"" pod="openshift-console-operator/console-operator-9d4b6777b-vfhht" podUID="338d333b-8626-46b3-b450-fbdd521183d8" Apr 22 19:58:35.091540 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:35.091511 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vfhht_338d333b-8626-46b3-b450-fbdd521183d8/console-operator/1.log" Apr 22 19:58:35.091945 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:35.091919 2575 scope.go:117] "RemoveContainer" containerID="c33c69eb16ca15c9391e89299cd076ff3056b5cfe6a0bddb7e428d4ddd1a20e9" Apr 22 19:58:35.092164 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:35.092144 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-vfhht_openshift-console-operator(338d333b-8626-46b3-b450-fbdd521183d8)\"" pod="openshift-console-operator/console-operator-9d4b6777b-vfhht" podUID="338d333b-8626-46b3-b450-fbdd521183d8" Apr 22 19:58:35.093408 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:35.093383 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2" event={"ID":"5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32","Type":"ContainerStarted","Data":"52a82cc55b65d53ccb37042580b1693a64c1581adbc9a591dc5a055ba293c7a2"} Apr 22 19:58:35.093408 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:35.093411 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2" event={"ID":"5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32","Type":"ContainerStarted","Data":"086f858f82ebec67469eed7ead25afbaa39dc41df62a9e30c467616916dc2fdf"} Apr 22 19:58:35.131907 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:35.131856 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2" podStartSLOduration=20.045710527 podStartE2EDuration="36.131842452s" podCreationTimestamp="2026-04-22 19:57:59 +0000 UTC" firstStartedPulling="2026-04-22 19:58:18.835370889 +0000 UTC m=+35.648756057" lastFinishedPulling="2026-04-22 19:58:34.921502807 +0000 UTC m=+51.734887982" observedRunningTime="2026-04-22 19:58:35.13111324 +0000 UTC m=+51.944498413" watchObservedRunningTime="2026-04-22 19:58:35.131842452 +0000 UTC m=+51.945227633" Apr 22 19:58:35.439441 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:35.439414 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-jlns8_0a9f8642-7eca-4b0a-bf48-893987e02188/dns-node-resolver/0.log" Apr 22 19:58:36.440362 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:36.440334 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-lnz7b_c63ee2c0-b298-42a6-bcd1-b05ffc7971f2/node-ca/0.log" Apr 22 19:58:36.951094 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:36.951058 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-vfhht" Apr 22 19:58:36.951094 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:36.951096 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-vfhht" Apr 22 19:58:36.951524 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:36.951507 2575 scope.go:117] "RemoveContainer" containerID="c33c69eb16ca15c9391e89299cd076ff3056b5cfe6a0bddb7e428d4ddd1a20e9" Apr 22 19:58:36.951703 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:36.951687 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-vfhht_openshift-console-operator(338d333b-8626-46b3-b450-fbdd521183d8)\"" pod="openshift-console-operator/console-operator-9d4b6777b-vfhht" podUID="338d333b-8626-46b3-b450-fbdd521183d8" Apr 22 19:58:38.101869 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:38.101828 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zxkxh" event={"ID":"41ae5fd4-d9cf-4314-b727-acf1d473957e","Type":"ContainerStarted","Data":"00485b29876b003451d69435b15995b5a6db3b388a7e752267d503e1bd1e068b"} Apr 22 19:58:38.122511 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:38.122466 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zxkxh" podStartSLOduration=11.470280238 podStartE2EDuration="30.122453196s" podCreationTimestamp="2026-04-22 19:58:08 +0000 UTC" firstStartedPulling="2026-04-22 19:58:18.86835537 +0000 UTC m=+35.681740528" lastFinishedPulling="2026-04-22 19:58:37.520528326 +0000 UTC m=+54.333913486" observedRunningTime="2026-04-22 19:58:38.120830969 +0000 UTC m=+54.934216171" 
watchObservedRunningTime="2026-04-22 19:58:38.122453196 +0000 UTC m=+54.935838377" Apr 22 19:58:41.859118 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:41.859085 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6jf5c" Apr 22 19:58:48.503005 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:48.502965 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ec41f5a3-a7a6-4607-b14e-49402afefbe2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-f7qdl\" (UID: \"ec41f5a3-a7a6-4607-b14e-49402afefbe2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-f7qdl" Apr 22 19:58:48.503464 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:48.503122 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 19:58:48.503464 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:48.503157 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-registry-tls\") pod \"image-registry-64b9f68d58-mbcdm\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:58:48.503464 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:58:48.503195 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec41f5a3-a7a6-4607-b14e-49402afefbe2-networking-console-plugin-cert podName:ec41f5a3-a7a6-4607-b14e-49402afefbe2 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:20.503179348 +0000 UTC m=+97.316564506 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ec41f5a3-a7a6-4607-b14e-49402afefbe2-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-f7qdl" (UID: "ec41f5a3-a7a6-4607-b14e-49402afefbe2") : secret "networking-console-plugin-cert" not found Apr 22 19:58:48.503464 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:48.503254 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ab3165ee-b810-41ae-b9dc-8e3198db6bc1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-b8884\" (UID: \"ab3165ee-b810-41ae-b9dc-8e3198db6bc1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-b8884" Apr 22 19:58:48.503464 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:48.503291 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-vrf5f\" (UID: \"d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vrf5f" Apr 22 19:58:48.507152 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:48.507121 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-registry-tls\") pod \"image-registry-64b9f68d58-mbcdm\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:58:48.507152 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:48.507147 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-vrf5f\" (UID: \"d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vrf5f" Apr 22 19:58:48.507824 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:48.507805 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ab3165ee-b810-41ae-b9dc-8e3198db6bc1-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-b8884\" (UID: \"ab3165ee-b810-41ae-b9dc-8e3198db6bc1\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-b8884" Apr 22 19:58:48.604435 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:48.604395 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-metrics-certs\") pod \"router-default-5ff55498bb-hwmkq\" (UID: \"9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9\") " pod="openshift-ingress/router-default-5ff55498bb-hwmkq" Apr 22 19:58:48.604621 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:48.604443 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-service-ca-bundle\") pod \"router-default-5ff55498bb-hwmkq\" (UID: \"9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9\") " pod="openshift-ingress/router-default-5ff55498bb-hwmkq" Apr 22 19:58:48.605088 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:48.605068 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-service-ca-bundle\") pod \"router-default-5ff55498bb-hwmkq\" (UID: \"9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9\") " pod="openshift-ingress/router-default-5ff55498bb-hwmkq" Apr 22 19:58:48.606771 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:48.606748 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9-metrics-certs\") pod \"router-default-5ff55498bb-hwmkq\" (UID: \"9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9\") " pod="openshift-ingress/router-default-5ff55498bb-hwmkq" Apr 22 19:58:48.663824 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:48.663788 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-wxhns\"" Apr 22 19:58:48.672680 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:48.672659 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:58:48.704858 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:48.704831 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efcdc84a-302d-4ab3-b70a-269b931a9634-cert\") pod \"ingress-canary-ggr2l\" (UID: \"efcdc84a-302d-4ab3-b70a-269b931a9634\") " pod="openshift-ingress-canary/ingress-canary-ggr2l" Apr 22 19:58:48.705009 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:48.704863 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7561fdeb-e97b-4652-ba5c-1e555b68f4aa-metrics-tls\") pod \"dns-default-fpzrf\" (UID: \"7561fdeb-e97b-4652-ba5c-1e555b68f4aa\") " pod="openshift-dns/dns-default-fpzrf" Apr 22 19:58:48.707203 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:48.707176 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7561fdeb-e97b-4652-ba5c-1e555b68f4aa-metrics-tls\") pod \"dns-default-fpzrf\" (UID: \"7561fdeb-e97b-4652-ba5c-1e555b68f4aa\") " pod="openshift-dns/dns-default-fpzrf" Apr 22 19:58:48.707339 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:48.707303 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/efcdc84a-302d-4ab3-b70a-269b931a9634-cert\") pod \"ingress-canary-ggr2l\" (UID: \"efcdc84a-302d-4ab3-b70a-269b931a9634\") " pod="openshift-ingress-canary/ingress-canary-ggr2l" Apr 22 19:58:48.718501 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:48.718477 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-57qct\"" Apr 22 19:58:48.726624 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:48.726597 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-b8884" Apr 22 19:58:48.740611 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:48.740585 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-fkjtp\"" Apr 22 19:58:48.749382 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:48.749349 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vrf5f" Apr 22 19:58:48.806284 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:48.806222 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-64b9f68d58-mbcdm"] Apr 22 19:58:48.810533 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:58:48.810507 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1f10812_b9aa_426e_ac37_99d23b87a05a.slice/crio-970db0698203143d8a23ed1048d1421339df90fd34a2a553d8834232eb3b98cb WatchSource:0}: Error finding container 970db0698203143d8a23ed1048d1421339df90fd34a2a553d8834232eb3b98cb: Status 404 returned error can't find the container with id 970db0698203143d8a23ed1048d1421339df90fd34a2a553d8834232eb3b98cb Apr 22 19:58:48.826857 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:48.826829 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-7t6fr\"" Apr 22 19:58:48.834813 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:48.834711 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5ff55498bb-hwmkq" Apr 22 19:58:48.871756 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:48.871519 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-b8884"] Apr 22 19:58:48.888526 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:48.888213 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-2xv8f\"" Apr 22 19:58:48.888526 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:48.888442 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vrf5f"] Apr 22 19:58:48.895858 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:48.895831 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ggr2l" Apr 22 19:58:48.901337 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:48.901103 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wrr4x\"" Apr 22 19:58:48.909748 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:48.909726 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-fpzrf" Apr 22 19:58:49.016206 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:49.016153 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5ff55498bb-hwmkq"] Apr 22 19:58:49.021748 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:58:49.021701 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e041c4f_74bf_46d4_bd4a_0c9e8952bdb9.slice/crio-77351967e0e9d130b22381946fe64bdb80e56de23a98226c8acf655fa058f5e1 WatchSource:0}: Error finding container 77351967e0e9d130b22381946fe64bdb80e56de23a98226c8acf655fa058f5e1: Status 404 returned error can't find the container with id 77351967e0e9d130b22381946fe64bdb80e56de23a98226c8acf655fa058f5e1 Apr 22 19:58:49.061058 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:49.061026 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ggr2l"] Apr 22 19:58:49.065040 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:58:49.065015 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefcdc84a_302d_4ab3_b70a_269b931a9634.slice/crio-24c124a19debcec029ab3d7c631ba8d50fb04a6c19b897b6908e24bf04cb14b4 WatchSource:0}: Error finding container 24c124a19debcec029ab3d7c631ba8d50fb04a6c19b897b6908e24bf04cb14b4: Status 404 returned error can't find the container with id 24c124a19debcec029ab3d7c631ba8d50fb04a6c19b897b6908e24bf04cb14b4 Apr 22 19:58:49.081331 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:49.081281 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fpzrf"] Apr 22 19:58:49.085686 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:58:49.085656 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7561fdeb_e97b_4652_ba5c_1e555b68f4aa.slice/crio-1ca247cce1ecee78c8c4208f3492a5d72ccdbc9b0fa0f522449acd52b0dd9bcd WatchSource:0}: Error finding container 1ca247cce1ecee78c8c4208f3492a5d72ccdbc9b0fa0f522449acd52b0dd9bcd: Status 404 returned error can't find the container with id 1ca247cce1ecee78c8c4208f3492a5d72ccdbc9b0fa0f522449acd52b0dd9bcd Apr 22 19:58:49.131600 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:49.131565 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ggr2l" event={"ID":"efcdc84a-302d-4ab3-b70a-269b931a9634","Type":"ContainerStarted","Data":"24c124a19debcec029ab3d7c631ba8d50fb04a6c19b897b6908e24bf04cb14b4"} Apr 22 19:58:49.132767 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:49.132737 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fpzrf" event={"ID":"7561fdeb-e97b-4652-ba5c-1e555b68f4aa","Type":"ContainerStarted","Data":"1ca247cce1ecee78c8c4208f3492a5d72ccdbc9b0fa0f522449acd52b0dd9bcd"} Apr 22 19:58:49.134538 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:49.134508 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5ff55498bb-hwmkq" event={"ID":"9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9","Type":"ContainerStarted","Data":"26a2542d4f570e0be07e72c2f86dbf0e746b23ca0c94f3276995d4943ad21b84"} Apr 22 19:58:49.134666 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:49.134542 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5ff55498bb-hwmkq" 
event={"ID":"9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9","Type":"ContainerStarted","Data":"77351967e0e9d130b22381946fe64bdb80e56de23a98226c8acf655fa058f5e1"} Apr 22 19:58:49.135726 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:49.135701 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vrf5f" event={"ID":"d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4","Type":"ContainerStarted","Data":"c3cd818ecb3a62e74f08c3299830b6fe43fb2a10731ce2520f7f3eb78a4775ff"} Apr 22 19:58:49.136822 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:49.136791 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-b8884" event={"ID":"ab3165ee-b810-41ae-b9dc-8e3198db6bc1","Type":"ContainerStarted","Data":"d8cd633a8d86f536a8311fbc86efd5e8c400ab959091ede2d3daf68ae1d09868"} Apr 22 19:58:49.138305 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:49.138284 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" event={"ID":"c1f10812-b9aa-426e-ac37-99d23b87a05a","Type":"ContainerStarted","Data":"bb72c38f32da84d39874a717d6e6e111c82ac5a98bef16a0a457c9ecea16e5ad"} Apr 22 19:58:49.138305 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:49.138310 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" event={"ID":"c1f10812-b9aa-426e-ac37-99d23b87a05a","Type":"ContainerStarted","Data":"970db0698203143d8a23ed1048d1421339df90fd34a2a553d8834232eb3b98cb"} Apr 22 19:58:49.138458 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:49.138425 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:58:49.159673 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:49.159627 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5ff55498bb-hwmkq" podStartSLOduration=41.15961229 podStartE2EDuration="41.15961229s" podCreationTimestamp="2026-04-22 19:58:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:58:49.159048298 +0000 UTC m=+65.972433478" watchObservedRunningTime="2026-04-22 19:58:49.15961229 +0000 UTC m=+65.972997470" Apr 22 19:58:49.195128 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:49.195077 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" podStartSLOduration=65.195059087 podStartE2EDuration="1m5.195059087s" podCreationTimestamp="2026-04-22 19:57:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:58:49.193466677 +0000 UTC m=+66.006851857" watchObservedRunningTime="2026-04-22 19:58:49.195059087 +0000 UTC m=+66.008444268" Apr 22 19:58:49.419338 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:49.419234 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7bee1d4-9229-4b17-8ec5-e19b53d61c9d-metrics-certs\") pod \"network-metrics-daemon-v8fph\" (UID: \"d7bee1d4-9229-4b17-8ec5-e19b53d61c9d\") " pod="openshift-multus/network-metrics-daemon-v8fph" Apr 22 19:58:49.421457 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:49.421437 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d7bee1d4-9229-4b17-8ec5-e19b53d61c9d-metrics-certs\") pod \"network-metrics-daemon-v8fph\" (UID: \"d7bee1d4-9229-4b17-8ec5-e19b53d61c9d\") " pod="openshift-multus/network-metrics-daemon-v8fph" Apr 22 19:58:49.481695 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:49.481660 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gl2tf\"" Apr 22 19:58:49.490542 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:49.490518 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-v8fph" Apr 22 19:58:49.652795 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:49.652733 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-v8fph"] Apr 22 19:58:49.668978 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:58:49.668222 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7bee1d4_9229_4b17_8ec5_e19b53d61c9d.slice/crio-3a6d74828df47e2ba8a72475fdd5ac871e9598e090eee807d0fee86495ca74e7 WatchSource:0}: Error finding container 3a6d74828df47e2ba8a72475fdd5ac871e9598e090eee807d0fee86495ca74e7: Status 404 returned error can't find the container with id 3a6d74828df47e2ba8a72475fdd5ac871e9598e090eee807d0fee86495ca74e7 Apr 22 19:58:49.835264 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:49.835193 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5ff55498bb-hwmkq" Apr 22 19:58:49.838602 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:49.838575 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5ff55498bb-hwmkq" Apr 22 19:58:50.146368 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:50.146291 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-v8fph" event={"ID":"d7bee1d4-9229-4b17-8ec5-e19b53d61c9d","Type":"ContainerStarted","Data":"3a6d74828df47e2ba8a72475fdd5ac871e9598e090eee807d0fee86495ca74e7"} Apr 22 19:58:50.147072 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:50.146898 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-5ff55498bb-hwmkq" Apr 22 19:58:50.148086 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:50.148068 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5ff55498bb-hwmkq" Apr 22 19:58:51.727035 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:51.727005 2575 scope.go:117] "RemoveContainer" containerID="c33c69eb16ca15c9391e89299cd076ff3056b5cfe6a0bddb7e428d4ddd1a20e9" Apr 22 19:58:53.166557 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:53.166533 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vfhht_338d333b-8626-46b3-b450-fbdd521183d8/console-operator/1.log" Apr 22 19:58:53.166879 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:53.166659 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-vfhht" event={"ID":"338d333b-8626-46b3-b450-fbdd521183d8","Type":"ContainerStarted","Data":"33a7bde54aa547861636676beb397998023586de680373b19c33f06124ff1747"} Apr 22 19:58:53.167456 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:53.167415 2575 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-vfhht" Apr 22 19:58:53.177598 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:53.176147 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vrf5f" event={"ID":"d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4","Type":"ContainerStarted","Data":"1d29d25572d35ded07b24160d73ea3fa3d17a45da6e2493cf71c3b3eed442a1d"} Apr 22 19:58:53.190037 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:53.189281 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-vfhht" podStartSLOduration=31.868180893 podStartE2EDuration="45.189261349s" podCreationTimestamp="2026-04-22 19:58:08 +0000 UTC" firstStartedPulling="2026-04-22 19:58:18.840874587 +0000 UTC m=+35.654259751" lastFinishedPulling="2026-04-22 19:58:32.161955048 +0000 UTC m=+48.975340207" observedRunningTime="2026-04-22 19:58:53.189059469 +0000 UTC m=+70.002444650" watchObservedRunningTime="2026-04-22 19:58:53.189261349 +0000 UTC m=+70.002646534" Apr 22 19:58:53.642717 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:53.642685 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-vfhht" Apr 22 19:58:54.179739 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:54.179704 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-b8884" event={"ID":"ab3165ee-b810-41ae-b9dc-8e3198db6bc1","Type":"ContainerStarted","Data":"c8b5a13a85f3b9cb73a5cec6ee47b41081702f82c6f1b4fcad887284ad28041d"} Apr 22 19:58:54.180966 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:54.180942 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ggr2l" event={"ID":"efcdc84a-302d-4ab3-b70a-269b931a9634","Type":"ContainerStarted","Data":"767aef7e3792a208b1a72b7793471970d91c513b9642aa55164a25b7ec6f602e"} Apr 22 19:58:54.182404 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:54.182385 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vrf5f" event={"ID":"d9263ef4-55cf-4de0-8dd9-d6d7cf8390e4","Type":"ContainerStarted","Data":"44b7695dba9c33e8aeb157abad574d8bd59bcb3792ef5f781011f1f0ba220a81"} Apr 22 19:58:54.183968 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:54.183945 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fpzrf" event={"ID":"7561fdeb-e97b-4652-ba5c-1e555b68f4aa","Type":"ContainerStarted","Data":"8beffa70bb0f7401013389ab866c6c51a1b8024d929a8d5f3be9557df85d5f3c"} Apr 22 19:58:54.184072 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:54.183977 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fpzrf" event={"ID":"7561fdeb-e97b-4652-ba5c-1e555b68f4aa","Type":"ContainerStarted","Data":"e8f43741ec079bd44113206c3420c4ce83ef4f5bad2eeaf0b68a157c2d2782c1"} Apr 22 19:58:54.184133 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:54.184091 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-fpzrf" Apr 22 19:58:54.185412 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:54.185393 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-v8fph" 
event={"ID":"d7bee1d4-9229-4b17-8ec5-e19b53d61c9d","Type":"ContainerStarted","Data":"54a5e3886178fa884e30fccbcdc61ae69d58064481d8737e6eb0c987dd28b159"} Apr 22 19:58:54.185510 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:54.185415 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-v8fph" event={"ID":"d7bee1d4-9229-4b17-8ec5-e19b53d61c9d","Type":"ContainerStarted","Data":"b51c6a4290d52aeb7abed71ad98b74dc0495bb73d8f3363968d43443b6bf97ed"} Apr 22 19:58:54.198537 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:54.198493 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-b8884" podStartSLOduration=42.115648748 podStartE2EDuration="46.198478529s" podCreationTimestamp="2026-04-22 19:58:08 +0000 UTC" firstStartedPulling="2026-04-22 19:58:48.87596911 +0000 UTC m=+65.689354286" lastFinishedPulling="2026-04-22 19:58:52.958798893 +0000 UTC m=+69.772184067" observedRunningTime="2026-04-22 19:58:54.197770373 +0000 UTC m=+71.011155557" watchObservedRunningTime="2026-04-22 19:58:54.198478529 +0000 UTC m=+71.011863711" Apr 22 19:58:54.215336 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:54.215273 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-fpzrf" podStartSLOduration=34.348463524 podStartE2EDuration="38.215261688s" podCreationTimestamp="2026-04-22 19:58:16 +0000 UTC" firstStartedPulling="2026-04-22 19:58:49.09058526 +0000 UTC m=+65.903970419" lastFinishedPulling="2026-04-22 19:58:52.95738342 +0000 UTC m=+69.770768583" observedRunningTime="2026-04-22 19:58:54.214753144 +0000 UTC m=+71.028138326" watchObservedRunningTime="2026-04-22 19:58:54.215261688 +0000 UTC m=+71.028646857" Apr 22 19:58:54.233911 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:54.233868 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-ggr2l" podStartSLOduration=34.343648929 podStartE2EDuration="38.233857079s" podCreationTimestamp="2026-04-22 19:58:16 +0000 UTC" firstStartedPulling="2026-04-22 19:58:49.067541212 +0000 UTC m=+65.880926371" lastFinishedPulling="2026-04-22 19:58:52.957749363 +0000 UTC m=+69.771134521" observedRunningTime="2026-04-22 19:58:54.232029592 +0000 UTC m=+71.045414776" watchObservedRunningTime="2026-04-22 19:58:54.233857079 +0000 UTC m=+71.047242260" Apr 22 19:58:54.247471 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:54.247433 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-v8fph" podStartSLOduration=67.958978172 podStartE2EDuration="1m11.247421644s" podCreationTimestamp="2026-04-22 19:57:43 +0000 UTC" firstStartedPulling="2026-04-22 19:58:49.670786303 +0000 UTC m=+66.484171464" lastFinishedPulling="2026-04-22 19:58:52.959229776 +0000 UTC m=+69.772614936" observedRunningTime="2026-04-22 19:58:54.246461934 +0000 UTC m=+71.059847127" watchObservedRunningTime="2026-04-22 19:58:54.247421644 +0000 UTC m=+71.060806825" Apr 22 19:58:54.264418 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:54.264369 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-vrf5f" podStartSLOduration=42.272881543 podStartE2EDuration="46.26435304s" podCreationTimestamp="2026-04-22 19:58:08 +0000 UTC" firstStartedPulling="2026-04-22 19:58:48.966245064 +0000 UTC m=+65.779630222" lastFinishedPulling="2026-04-22 
19:58:52.957716546 +0000 UTC m=+69.771101719" observedRunningTime="2026-04-22 19:58:54.263194365 +0000 UTC m=+71.076579557" watchObservedRunningTime="2026-04-22 19:58:54.26435304 +0000 UTC m=+71.077738220" Apr 22 19:58:56.991191 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:56.991155 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-bcrgz"] Apr 22 19:58:57.036184 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:57.036155 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-bcrgz"] Apr 22 19:58:57.036365 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:57.036278 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-bcrgz" Apr 22 19:58:57.039662 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:57.039635 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-t9dg2\"" Apr 22 19:58:57.039801 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:57.039641 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 19:58:57.039801 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:57.039717 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 19:58:57.192163 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:57.192130 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5efb288f-2e5e-41f2-b49e-ac95f1491b32-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bcrgz\" (UID: \"5efb288f-2e5e-41f2-b49e-ac95f1491b32\") " pod="openshift-insights/insights-runtime-extractor-bcrgz" Apr 22 19:58:57.192163 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:57.192169 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5efb288f-2e5e-41f2-b49e-ac95f1491b32-data-volume\") pod \"insights-runtime-extractor-bcrgz\" (UID: \"5efb288f-2e5e-41f2-b49e-ac95f1491b32\") " pod="openshift-insights/insights-runtime-extractor-bcrgz" Apr 22 19:58:57.192439 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:57.192188 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5efb288f-2e5e-41f2-b49e-ac95f1491b32-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bcrgz\" (UID: \"5efb288f-2e5e-41f2-b49e-ac95f1491b32\") " pod="openshift-insights/insights-runtime-extractor-bcrgz" Apr 22 19:58:57.192439 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:57.192270 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kwsq\" (UniqueName: \"kubernetes.io/projected/5efb288f-2e5e-41f2-b49e-ac95f1491b32-kube-api-access-7kwsq\") pod \"insights-runtime-extractor-bcrgz\" (UID: \"5efb288f-2e5e-41f2-b49e-ac95f1491b32\") " pod="openshift-insights/insights-runtime-extractor-bcrgz" Apr 22 19:58:57.192439 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:57.192355 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/5efb288f-2e5e-41f2-b49e-ac95f1491b32-crio-socket\") pod \"insights-runtime-extractor-bcrgz\" (UID: \"5efb288f-2e5e-41f2-b49e-ac95f1491b32\") " pod="openshift-insights/insights-runtime-extractor-bcrgz" Apr 22 19:58:57.293175 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:57.293136 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5efb288f-2e5e-41f2-b49e-ac95f1491b32-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bcrgz\" (UID: \"5efb288f-2e5e-41f2-b49e-ac95f1491b32\") " pod="openshift-insights/insights-runtime-extractor-bcrgz" Apr 22 19:58:57.293405 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:57.293186 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5efb288f-2e5e-41f2-b49e-ac95f1491b32-data-volume\") pod \"insights-runtime-extractor-bcrgz\" (UID: \"5efb288f-2e5e-41f2-b49e-ac95f1491b32\") " pod="openshift-insights/insights-runtime-extractor-bcrgz" Apr 22 19:58:57.293405 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:57.293309 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5efb288f-2e5e-41f2-b49e-ac95f1491b32-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bcrgz\" (UID: \"5efb288f-2e5e-41f2-b49e-ac95f1491b32\") " pod="openshift-insights/insights-runtime-extractor-bcrgz" Apr 22 19:58:57.293405 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:57.293400 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7kwsq\" (UniqueName: \"kubernetes.io/projected/5efb288f-2e5e-41f2-b49e-ac95f1491b32-kube-api-access-7kwsq\") pod \"insights-runtime-extractor-bcrgz\" (UID: \"5efb288f-2e5e-41f2-b49e-ac95f1491b32\") " pod="openshift-insights/insights-runtime-extractor-bcrgz" Apr 22 19:58:57.293630 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:57.293467 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5efb288f-2e5e-41f2-b49e-ac95f1491b32-crio-socket\") pod \"insights-runtime-extractor-bcrgz\" (UID: \"5efb288f-2e5e-41f2-b49e-ac95f1491b32\") " pod="openshift-insights/insights-runtime-extractor-bcrgz" Apr 22 19:58:57.293630 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:57.293558 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5efb288f-2e5e-41f2-b49e-ac95f1491b32-crio-socket\") pod \"insights-runtime-extractor-bcrgz\" (UID: \"5efb288f-2e5e-41f2-b49e-ac95f1491b32\") " pod="openshift-insights/insights-runtime-extractor-bcrgz" Apr 22 19:58:57.293630 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:57.293566 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5efb288f-2e5e-41f2-b49e-ac95f1491b32-data-volume\") pod \"insights-runtime-extractor-bcrgz\" (UID: \"5efb288f-2e5e-41f2-b49e-ac95f1491b32\") " pod="openshift-insights/insights-runtime-extractor-bcrgz" Apr 22 19:58:57.293860 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:57.293839 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5efb288f-2e5e-41f2-b49e-ac95f1491b32-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bcrgz\" (UID: 
\"5efb288f-2e5e-41f2-b49e-ac95f1491b32\") " pod="openshift-insights/insights-runtime-extractor-bcrgz" Apr 22 19:58:57.295570 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:57.295546 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5efb288f-2e5e-41f2-b49e-ac95f1491b32-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bcrgz\" (UID: \"5efb288f-2e5e-41f2-b49e-ac95f1491b32\") " pod="openshift-insights/insights-runtime-extractor-bcrgz" Apr 22 19:58:57.309978 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:57.309954 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kwsq\" (UniqueName: \"kubernetes.io/projected/5efb288f-2e5e-41f2-b49e-ac95f1491b32-kube-api-access-7kwsq\") pod \"insights-runtime-extractor-bcrgz\" (UID: \"5efb288f-2e5e-41f2-b49e-ac95f1491b32\") " pod="openshift-insights/insights-runtime-extractor-bcrgz" Apr 22 19:58:57.348211 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:57.348178 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-bcrgz" Apr 22 19:58:57.475044 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:57.475015 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-bcrgz"] Apr 22 19:58:57.477427 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:58:57.477399 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5efb288f_2e5e_41f2_b49e_ac95f1491b32.slice/crio-4dd1a6ac72f8db1e1b4a01c5309ba48a3b3094aa3231fe6fa1c38c06ad748bd3 WatchSource:0}: Error finding container 4dd1a6ac72f8db1e1b4a01c5309ba48a3b3094aa3231fe6fa1c38c06ad748bd3: Status 404 returned error can't find the container with id 4dd1a6ac72f8db1e1b4a01c5309ba48a3b3094aa3231fe6fa1c38c06ad748bd3 Apr 22 19:58:58.197532 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:58.197497 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bcrgz" event={"ID":"5efb288f-2e5e-41f2-b49e-ac95f1491b32","Type":"ContainerStarted","Data":"f226230ebdec4669040e7d4975057d0174b81593348679aba3f137e53f189687"} Apr 22 19:58:58.197532 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:58.197530 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bcrgz" event={"ID":"5efb288f-2e5e-41f2-b49e-ac95f1491b32","Type":"ContainerStarted","Data":"4dd1a6ac72f8db1e1b4a01c5309ba48a3b3094aa3231fe6fa1c38c06ad748bd3"} Apr 22 19:58:59.203690 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:58:59.203643 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bcrgz" event={"ID":"5efb288f-2e5e-41f2-b49e-ac95f1491b32","Type":"ContainerStarted","Data":"117f290680fa8047a5a946b676e0039f02bc9a2aa4177959e8efe4a2e5ad50d6"} Apr 22 19:59:01.211961 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:01.211926 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bcrgz" event={"ID":"5efb288f-2e5e-41f2-b49e-ac95f1491b32","Type":"ContainerStarted","Data":"60d58cc8ce2db08db0ffc59fe66b23ff69e059c67648c0d658ee3ca690c1a205"} Apr 22 19:59:01.228606 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:01.228525 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-bcrgz" 
podStartSLOduration=2.361703578 podStartE2EDuration="5.228510516s" podCreationTimestamp="2026-04-22 19:58:56 +0000 UTC" firstStartedPulling="2026-04-22 19:58:57.597263971 +0000 UTC m=+74.410649129" lastFinishedPulling="2026-04-22 19:59:00.464070894 +0000 UTC m=+77.277456067" observedRunningTime="2026-04-22 19:59:01.2273783 +0000 UTC m=+78.040763488" watchObservedRunningTime="2026-04-22 19:59:01.228510516 +0000 UTC m=+78.041895697" Apr 22 19:59:04.091263 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:04.091145 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-ncvmn" Apr 22 19:59:04.190041 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:04.190011 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-fpzrf" Apr 22 19:59:05.203377 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:05.203345 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-mmt2k"] Apr 22 19:59:05.237125 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:05.237097 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-mmt2k" Apr 22 19:59:05.239694 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:05.239664 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 19:59:05.240308 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:05.240292 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 19:59:05.240490 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:05.240476 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 19:59:05.240622 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:05.240600 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 19:59:05.244368 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:05.244347 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-q4d8q\"" Apr 22 19:59:05.364584 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:05.364550 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt4k7\" (UniqueName: \"kubernetes.io/projected/b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e-kube-api-access-nt4k7\") pod \"node-exporter-mmt2k\" (UID: \"b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e\") " pod="openshift-monitoring/node-exporter-mmt2k" Apr 22 19:59:05.364746 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:05.364595 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e-node-exporter-wtmp\") pod \"node-exporter-mmt2k\" (UID: \"b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e\") " pod="openshift-monitoring/node-exporter-mmt2k" Apr 22 19:59:05.364746 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:05.364659 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e-node-exporter-textfile\") pod \"node-exporter-mmt2k\" (UID: 
\"b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e\") " pod="openshift-monitoring/node-exporter-mmt2k" Apr 22 19:59:05.364746 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:05.364707 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e-node-exporter-accelerators-collector-config\") pod \"node-exporter-mmt2k\" (UID: \"b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e\") " pod="openshift-monitoring/node-exporter-mmt2k" Apr 22 19:59:05.364746 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:05.364729 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e-root\") pod \"node-exporter-mmt2k\" (UID: \"b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e\") " pod="openshift-monitoring/node-exporter-mmt2k" Apr 22 19:59:05.364746 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:05.364745 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e-node-exporter-tls\") pod \"node-exporter-mmt2k\" (UID: \"b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e\") " pod="openshift-monitoring/node-exporter-mmt2k" Apr 22 19:59:05.364928 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:05.364792 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mmt2k\" (UID: \"b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e\") " pod="openshift-monitoring/node-exporter-mmt2k" Apr 22 19:59:05.364928 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:05.364846 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e-sys\") pod \"node-exporter-mmt2k\" (UID: \"b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e\") " pod="openshift-monitoring/node-exporter-mmt2k" Apr 22 19:59:05.364928 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:05.364894 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e-metrics-client-ca\") pod \"node-exporter-mmt2k\" (UID: \"b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e\") " pod="openshift-monitoring/node-exporter-mmt2k" Apr 22 19:59:05.466019 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:05.465940 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e-node-exporter-accelerators-collector-config\") pod \"node-exporter-mmt2k\" (UID: \"b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e\") " pod="openshift-monitoring/node-exporter-mmt2k" Apr 22 19:59:05.466019 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:05.465976 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e-root\") pod \"node-exporter-mmt2k\" (UID: \"b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e\") " pod="openshift-monitoring/node-exporter-mmt2k" Apr 22 19:59:05.466019 ip-10-0-133-60 
kubenswrapper[2575]: I0422 19:59:05.465997 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e-node-exporter-tls\") pod \"node-exporter-mmt2k\" (UID: \"b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e\") " pod="openshift-monitoring/node-exporter-mmt2k" Apr 22 19:59:05.466019 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:05.466017 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mmt2k\" (UID: \"b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e\") " pod="openshift-monitoring/node-exporter-mmt2k" Apr 22 19:59:05.466355 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:05.466035 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e-sys\") pod \"node-exporter-mmt2k\" (UID: \"b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e\") " pod="openshift-monitoring/node-exporter-mmt2k" Apr 22 19:59:05.466355 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:05.466065 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e-metrics-client-ca\") pod \"node-exporter-mmt2k\" (UID: \"b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e\") " pod="openshift-monitoring/node-exporter-mmt2k" Apr 22 19:59:05.466355 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:05.466072 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e-root\") pod \"node-exporter-mmt2k\" (UID: \"b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e\") " pod="openshift-monitoring/node-exporter-mmt2k" Apr 22 19:59:05.466355 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:05.466094 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nt4k7\" (UniqueName: \"kubernetes.io/projected/b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e-kube-api-access-nt4k7\") pod \"node-exporter-mmt2k\" (UID: \"b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e\") " pod="openshift-monitoring/node-exporter-mmt2k" Apr 22 19:59:05.466355 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:05.466122 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e-sys\") pod \"node-exporter-mmt2k\" (UID: \"b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e\") " pod="openshift-monitoring/node-exporter-mmt2k" Apr 22 19:59:05.466355 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:59:05.466129 2575 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 19:59:05.466355 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:05.466159 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e-node-exporter-wtmp\") pod \"node-exporter-mmt2k\" (UID: \"b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e\") " pod="openshift-monitoring/node-exporter-mmt2k" Apr 22 19:59:05.466355 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:59:05.466212 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e-node-exporter-tls podName:b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e nodeName:}" failed. No retries permitted until 2026-04-22 19:59:05.966192837 +0000 UTC m=+82.779578003 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e-node-exporter-tls") pod "node-exporter-mmt2k" (UID: "b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e") : secret "node-exporter-tls" not found Apr 22 19:59:05.466355 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:05.466249 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e-node-exporter-textfile\") pod \"node-exporter-mmt2k\" (UID: \"b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e\") " pod="openshift-monitoring/node-exporter-mmt2k" Apr 22 19:59:05.466355 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:05.466299 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e-node-exporter-wtmp\") pod \"node-exporter-mmt2k\" (UID: \"b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e\") " pod="openshift-monitoring/node-exporter-mmt2k" Apr 22 19:59:05.466792 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:05.466592 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e-node-exporter-textfile\") pod \"node-exporter-mmt2k\" (UID: \"b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e\") " pod="openshift-monitoring/node-exporter-mmt2k" Apr 22 19:59:05.466792 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:05.466602 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e-metrics-client-ca\") pod \"node-exporter-mmt2k\" (UID: \"b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e\") " pod="openshift-monitoring/node-exporter-mmt2k" Apr 22 19:59:05.466792 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:05.466592 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e-node-exporter-accelerators-collector-config\") pod \"node-exporter-mmt2k\" (UID: \"b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e\") " pod="openshift-monitoring/node-exporter-mmt2k" Apr 22 19:59:05.468511 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:05.468482 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-mmt2k\" (UID: \"b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e\") " pod="openshift-monitoring/node-exporter-mmt2k" Apr 22 19:59:05.476377 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:05.476355 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt4k7\" (UniqueName: \"kubernetes.io/projected/b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e-kube-api-access-nt4k7\") pod \"node-exporter-mmt2k\" (UID: \"b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e\") " pod="openshift-monitoring/node-exporter-mmt2k" Apr 22 19:59:05.970080 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:05.970039 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e-node-exporter-tls\") pod \"node-exporter-mmt2k\" (UID: \"b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e\") " pod="openshift-monitoring/node-exporter-mmt2k" Apr 22 19:59:05.970282 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:59:05.970228 2575 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 19:59:05.970394 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:59:05.970306 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e-node-exporter-tls podName:b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e nodeName:}" failed. No retries permitted until 2026-04-22 19:59:06.97028514 +0000 UTC m=+83.783670319 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e-node-exporter-tls") pod "node-exporter-mmt2k" (UID: "b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e") : secret "node-exporter-tls" not found Apr 22 19:59:06.978839 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:06.978800 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e-node-exporter-tls\") pod \"node-exporter-mmt2k\" (UID: \"b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e\") " pod="openshift-monitoring/node-exporter-mmt2k" Apr 22 19:59:06.981053 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:06.981026 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e-node-exporter-tls\") pod \"node-exporter-mmt2k\" (UID: \"b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e\") " pod="openshift-monitoring/node-exporter-mmt2k" Apr 22 19:59:07.047042 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:07.047010 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-mmt2k" Apr 22 19:59:07.057642 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:59:07.057604 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5e55ed1_358f_40a9_b0e5_1ce0a7ed253e.slice/crio-22204390224fd819d4f0a3672a1066bb3ce9adbbcb7fce8a9d84cd202219000e WatchSource:0}: Error finding container 22204390224fd819d4f0a3672a1066bb3ce9adbbcb7fce8a9d84cd202219000e: Status 404 returned error can't find the container with id 22204390224fd819d4f0a3672a1066bb3ce9adbbcb7fce8a9d84cd202219000e Apr 22 19:59:07.235058 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:07.234971 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mmt2k" event={"ID":"b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e","Type":"ContainerStarted","Data":"22204390224fd819d4f0a3672a1066bb3ce9adbbcb7fce8a9d84cd202219000e"} Apr 22 19:59:08.677053 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:08.677017 2575 patch_prober.go:28] interesting pod/image-registry-64b9f68d58-mbcdm container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 19:59:08.677428 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:08.677074 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" podUID="c1f10812-b9aa-426e-ac37-99d23b87a05a" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:59:09.244391 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:09.244351 2575 generic.go:358] "Generic (PLEG): container finished" podID="b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e" containerID="458bde7e0f8cba33a010cc7ec6e1f3d449125a83c9a6a832ab5b0866dcd2fb6c" exitCode=0 Apr 22 19:59:09.244574 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:09.244427 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mmt2k" event={"ID":"b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e","Type":"ContainerDied","Data":"458bde7e0f8cba33a010cc7ec6e1f3d449125a83c9a6a832ab5b0866dcd2fb6c"} Apr 22 19:59:10.151246 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:10.151219 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:59:10.249299 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:10.249263 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mmt2k" event={"ID":"b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e","Type":"ContainerStarted","Data":"68e3d3c6ce021e6e3e976ad80e1f8379860dc47f253e9e45581035a8e649e329"} Apr 22 19:59:10.249299 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:10.249298 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-mmt2k" event={"ID":"b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e","Type":"ContainerStarted","Data":"c0abe24cf7d5471e9aa9273fd2f9afe4d524df3fc4e01271b6f9b1ff8ed5a704"} Apr 22 19:59:10.269217 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:10.269173 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-mmt2k" podStartSLOduration=4.195076438 podStartE2EDuration="5.269160244s" podCreationTimestamp="2026-04-22 19:59:05 +0000 UTC" 
firstStartedPulling="2026-04-22 19:59:07.059386665 +0000 UTC m=+83.872771828" lastFinishedPulling="2026-04-22 19:59:08.133470472 +0000 UTC m=+84.946855634" observedRunningTime="2026-04-22 19:59:10.268992479 +0000 UTC m=+87.082377672" watchObservedRunningTime="2026-04-22 19:59:10.269160244 +0000 UTC m=+87.082545424" Apr 22 19:59:18.794747 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:18.794707 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-64b9f68d58-mbcdm"] Apr 22 19:59:20.591548 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:20.591513 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ec41f5a3-a7a6-4607-b14e-49402afefbe2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-f7qdl\" (UID: \"ec41f5a3-a7a6-4607-b14e-49402afefbe2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-f7qdl" Apr 22 19:59:20.594005 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:20.593981 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ec41f5a3-a7a6-4607-b14e-49402afefbe2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-f7qdl\" (UID: \"ec41f5a3-a7a6-4607-b14e-49402afefbe2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-f7qdl" Apr 22 19:59:20.778372 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:20.778340 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-hhjcv\"" Apr 22 19:59:20.786090 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:20.786066 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-f7qdl" Apr 22 19:59:20.906623 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:20.906595 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-f7qdl"] Apr 22 19:59:20.909816 ip-10-0-133-60 kubenswrapper[2575]: W0422 19:59:20.909784 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec41f5a3_a7a6_4607_b14e_49402afefbe2.slice/crio-42854de9bbb6b2a7a70ed724cc12cb35dffd8ff57cdf28ea768d03eaf249a2b6 WatchSource:0}: Error finding container 42854de9bbb6b2a7a70ed724cc12cb35dffd8ff57cdf28ea768d03eaf249a2b6: Status 404 returned error can't find the container with id 42854de9bbb6b2a7a70ed724cc12cb35dffd8ff57cdf28ea768d03eaf249a2b6 Apr 22 19:59:21.281948 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:21.281905 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-f7qdl" event={"ID":"ec41f5a3-a7a6-4607-b14e-49402afefbe2","Type":"ContainerStarted","Data":"42854de9bbb6b2a7a70ed724cc12cb35dffd8ff57cdf28ea768d03eaf249a2b6"} Apr 22 19:59:22.286834 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:22.286796 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-f7qdl" event={"ID":"ec41f5a3-a7a6-4607-b14e-49402afefbe2","Type":"ContainerStarted","Data":"fbbcae445028573b09bb20dd76f794f8fde538abacfbd4cbb358180146b65236"} Apr 22 19:59:22.303103 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:22.303057 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-f7qdl" podStartSLOduration=73.492371934 podStartE2EDuration="1m14.3030449s" podCreationTimestamp="2026-04-22 19:58:08 +0000 UTC" firstStartedPulling="2026-04-22 19:59:20.912781925 +0000 UTC m=+97.726167087" lastFinishedPulling="2026-04-22 19:59:21.72345489 +0000 UTC m=+98.536840053" observedRunningTime="2026-04-22 19:59:22.301904655 +0000 UTC m=+99.115289849" watchObservedRunningTime="2026-04-22 19:59:22.3030449 +0000 UTC m=+99.116430080" Apr 22 19:59:43.814264 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:43.814199 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" podUID="c1f10812-b9aa-426e-ac37-99d23b87a05a" containerName="registry" containerID="cri-o://bb72c38f32da84d39874a717d6e6e111c82ac5a98bef16a0a457c9ecea16e5ad" gracePeriod=30 Apr 22 19:59:44.058524 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.058499 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:59:44.181506 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.181420 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c1f10812-b9aa-426e-ac37-99d23b87a05a-registry-certificates\") pod \"c1f10812-b9aa-426e-ac37-99d23b87a05a\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " Apr 22 19:59:44.181506 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.181480 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c1f10812-b9aa-426e-ac37-99d23b87a05a-image-registry-private-configuration\") pod \"c1f10812-b9aa-426e-ac37-99d23b87a05a\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " Apr 22 19:59:44.181506 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.181502 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gg2l\" (UniqueName: \"kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-kube-api-access-4gg2l\") pod \"c1f10812-b9aa-426e-ac37-99d23b87a05a\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " Apr 22 19:59:44.181782 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.181546 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c1f10812-b9aa-426e-ac37-99d23b87a05a-ca-trust-extracted\") pod \"c1f10812-b9aa-426e-ac37-99d23b87a05a\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " Apr 22 19:59:44.181782 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.181569 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-bound-sa-token\") pod \"c1f10812-b9aa-426e-ac37-99d23b87a05a\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " Apr 22 19:59:44.181782 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.181591 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c1f10812-b9aa-426e-ac37-99d23b87a05a-installation-pull-secrets\") pod \"c1f10812-b9aa-426e-ac37-99d23b87a05a\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " Apr 22 19:59:44.181782 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.181621 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-registry-tls\") pod \"c1f10812-b9aa-426e-ac37-99d23b87a05a\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " Apr 22 19:59:44.181782 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.181656 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c1f10812-b9aa-426e-ac37-99d23b87a05a-trusted-ca\") pod \"c1f10812-b9aa-426e-ac37-99d23b87a05a\" (UID: \"c1f10812-b9aa-426e-ac37-99d23b87a05a\") " Apr 22 19:59:44.182289 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.181880 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1f10812-b9aa-426e-ac37-99d23b87a05a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "c1f10812-b9aa-426e-ac37-99d23b87a05a" (UID: "c1f10812-b9aa-426e-ac37-99d23b87a05a"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:59:44.182428 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.182293 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1f10812-b9aa-426e-ac37-99d23b87a05a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "c1f10812-b9aa-426e-ac37-99d23b87a05a" (UID: "c1f10812-b9aa-426e-ac37-99d23b87a05a"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:59:44.184248 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.184217 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-kube-api-access-4gg2l" (OuterVolumeSpecName: "kube-api-access-4gg2l") pod "c1f10812-b9aa-426e-ac37-99d23b87a05a" (UID: "c1f10812-b9aa-426e-ac37-99d23b87a05a"). InnerVolumeSpecName "kube-api-access-4gg2l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:59:44.184400 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.184303 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "c1f10812-b9aa-426e-ac37-99d23b87a05a" (UID: "c1f10812-b9aa-426e-ac37-99d23b87a05a"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:59:44.184400 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.184372 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "c1f10812-b9aa-426e-ac37-99d23b87a05a" (UID: "c1f10812-b9aa-426e-ac37-99d23b87a05a"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:59:44.184515 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.184450 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1f10812-b9aa-426e-ac37-99d23b87a05a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "c1f10812-b9aa-426e-ac37-99d23b87a05a" (UID: "c1f10812-b9aa-426e-ac37-99d23b87a05a"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:59:44.184549 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.184515 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1f10812-b9aa-426e-ac37-99d23b87a05a-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "c1f10812-b9aa-426e-ac37-99d23b87a05a" (UID: "c1f10812-b9aa-426e-ac37-99d23b87a05a"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:59:44.192619 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.192591 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1f10812-b9aa-426e-ac37-99d23b87a05a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "c1f10812-b9aa-426e-ac37-99d23b87a05a" (UID: "c1f10812-b9aa-426e-ac37-99d23b87a05a"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:59:44.282954 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.282914 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c1f10812-b9aa-426e-ac37-99d23b87a05a-image-registry-private-configuration\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 19:59:44.282954 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.282948 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4gg2l\" (UniqueName: \"kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-kube-api-access-4gg2l\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 19:59:44.282954 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.282959 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c1f10812-b9aa-426e-ac37-99d23b87a05a-ca-trust-extracted\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 19:59:44.283172 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.282968 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-bound-sa-token\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 19:59:44.283172 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.282979 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c1f10812-b9aa-426e-ac37-99d23b87a05a-installation-pull-secrets\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 19:59:44.283172 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.282989 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c1f10812-b9aa-426e-ac37-99d23b87a05a-registry-tls\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 19:59:44.283172 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.282997 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c1f10812-b9aa-426e-ac37-99d23b87a05a-trusted-ca\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 19:59:44.283172 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.283005 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c1f10812-b9aa-426e-ac37-99d23b87a05a-registry-certificates\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 19:59:44.354247 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.354211 2575 generic.go:358] "Generic (PLEG): container finished" podID="3484f3e2-bbf8-464e-a5f4-8d66ff2230f3" containerID="75c014e42610f9d0392f71992a97b1118f5f91de73347744aaf22e7220f37a5e" exitCode=0 Apr 22 19:59:44.354446 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.354291 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rs9hj" event={"ID":"3484f3e2-bbf8-464e-a5f4-8d66ff2230f3","Type":"ContainerDied","Data":"75c014e42610f9d0392f71992a97b1118f5f91de73347744aaf22e7220f37a5e"} Apr 22 19:59:44.354692 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.354678 2575 scope.go:117] "RemoveContainer" containerID="75c014e42610f9d0392f71992a97b1118f5f91de73347744aaf22e7220f37a5e" Apr 22 19:59:44.355479 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.355456 2575 generic.go:358] 
"Generic (PLEG): container finished" podID="c1f10812-b9aa-426e-ac37-99d23b87a05a" containerID="bb72c38f32da84d39874a717d6e6e111c82ac5a98bef16a0a457c9ecea16e5ad" exitCode=0 Apr 22 19:59:44.355585 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.355477 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" event={"ID":"c1f10812-b9aa-426e-ac37-99d23b87a05a","Type":"ContainerDied","Data":"bb72c38f32da84d39874a717d6e6e111c82ac5a98bef16a0a457c9ecea16e5ad"} Apr 22 19:59:44.355585 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.355508 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" Apr 22 19:59:44.355585 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.355517 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-64b9f68d58-mbcdm" event={"ID":"c1f10812-b9aa-426e-ac37-99d23b87a05a","Type":"ContainerDied","Data":"970db0698203143d8a23ed1048d1421339df90fd34a2a553d8834232eb3b98cb"} Apr 22 19:59:44.355585 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.355533 2575 scope.go:117] "RemoveContainer" containerID="bb72c38f32da84d39874a717d6e6e111c82ac5a98bef16a0a457c9ecea16e5ad" Apr 22 19:59:44.364054 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.364034 2575 scope.go:117] "RemoveContainer" containerID="bb72c38f32da84d39874a717d6e6e111c82ac5a98bef16a0a457c9ecea16e5ad" Apr 22 19:59:44.364346 ip-10-0-133-60 kubenswrapper[2575]: E0422 19:59:44.364303 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb72c38f32da84d39874a717d6e6e111c82ac5a98bef16a0a457c9ecea16e5ad\": container with ID starting with bb72c38f32da84d39874a717d6e6e111c82ac5a98bef16a0a457c9ecea16e5ad not found: ID does not exist" containerID="bb72c38f32da84d39874a717d6e6e111c82ac5a98bef16a0a457c9ecea16e5ad" Apr 22 19:59:44.364418 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.364353 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb72c38f32da84d39874a717d6e6e111c82ac5a98bef16a0a457c9ecea16e5ad"} err="failed to get container status \"bb72c38f32da84d39874a717d6e6e111c82ac5a98bef16a0a457c9ecea16e5ad\": rpc error: code = NotFound desc = could not find container \"bb72c38f32da84d39874a717d6e6e111c82ac5a98bef16a0a457c9ecea16e5ad\": container with ID starting with bb72c38f32da84d39874a717d6e6e111c82ac5a98bef16a0a457c9ecea16e5ad not found: ID does not exist" Apr 22 19:59:44.384616 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.384583 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-64b9f68d58-mbcdm"] Apr 22 19:59:44.388411 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:44.388383 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-64b9f68d58-mbcdm"] Apr 22 19:59:45.361056 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:45.361023 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rs9hj" event={"ID":"3484f3e2-bbf8-464e-a5f4-8d66ff2230f3","Type":"ContainerStarted","Data":"40275aca36e3a434d631c5af7fd3c1584efe0c8c277b3977dd6dfe849d5ccc64"} Apr 22 19:59:45.730977 ip-10-0-133-60 kubenswrapper[2575]: I0422 19:59:45.730900 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1f10812-b9aa-426e-ac37-99d23b87a05a" 
path="/var/lib/kubelet/pods/c1f10812-b9aa-426e-ac37-99d23b87a05a/volumes" Apr 22 20:00:03.413250 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:00:03.413217 2575 generic.go:358] "Generic (PLEG): container finished" podID="7fa0c2d8-cebb-4563-88fb-c974a238cc8d" containerID="43db704e029d91777790639d18d7b0b19ee00715f4f438c553ea7df37c21d341" exitCode=0 Apr 22 20:00:03.413693 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:00:03.413292 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-46lb7" event={"ID":"7fa0c2d8-cebb-4563-88fb-c974a238cc8d","Type":"ContainerDied","Data":"43db704e029d91777790639d18d7b0b19ee00715f4f438c553ea7df37c21d341"} Apr 22 20:00:03.413693 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:00:03.413655 2575 scope.go:117] "RemoveContainer" containerID="43db704e029d91777790639d18d7b0b19ee00715f4f438c553ea7df37c21d341" Apr 22 20:00:03.414660 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:00:03.414638 2575 generic.go:358] "Generic (PLEG): container finished" podID="41ae5fd4-d9cf-4314-b727-acf1d473957e" containerID="00485b29876b003451d69435b15995b5a6db3b388a7e752267d503e1bd1e068b" exitCode=0 Apr 22 20:00:03.414749 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:00:03.414662 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zxkxh" event={"ID":"41ae5fd4-d9cf-4314-b727-acf1d473957e","Type":"ContainerDied","Data":"00485b29876b003451d69435b15995b5a6db3b388a7e752267d503e1bd1e068b"} Apr 22 20:00:03.414946 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:00:03.414930 2575 scope.go:117] "RemoveContainer" containerID="00485b29876b003451d69435b15995b5a6db3b388a7e752267d503e1bd1e068b" Apr 22 20:00:04.419062 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:00:04.419024 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-zxkxh" event={"ID":"41ae5fd4-d9cf-4314-b727-acf1d473957e","Type":"ContainerStarted","Data":"a68426c5d88cad0bed095903d933eecaf953173a439f66e48852af5259176428"} Apr 22 20:00:04.420731 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:00:04.420704 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-46lb7" event={"ID":"7fa0c2d8-cebb-4563-88fb-c974a238cc8d","Type":"ContainerStarted","Data":"6eeefae223046506f9d4584b86c90931cbea58e9ab14f4d0bdd4994d4708e083"} Apr 22 20:00:07.058434 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:00:07.058397 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2" podUID="5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 20:00:17.058217 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:00:17.058152 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2" podUID="5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 20:00:27.057900 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:00:27.057857 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2" podUID="5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32" containerName="service-proxy" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Apr 22 20:00:27.058267 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:00:27.057928 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2" Apr 22 20:00:27.058429 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:00:27.058411 2575 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"52a82cc55b65d53ccb37042580b1693a64c1581adbc9a591dc5a055ba293c7a2"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 22 20:00:27.058472 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:00:27.058449 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2" podUID="5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32" containerName="service-proxy" containerID="cri-o://52a82cc55b65d53ccb37042580b1693a64c1581adbc9a591dc5a055ba293c7a2" gracePeriod=30 Apr 22 20:00:27.487695 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:00:27.487617 2575 generic.go:358] "Generic (PLEG): container finished" podID="5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32" containerID="52a82cc55b65d53ccb37042580b1693a64c1581adbc9a591dc5a055ba293c7a2" exitCode=2 Apr 22 20:00:27.487695 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:00:27.487685 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2" event={"ID":"5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32","Type":"ContainerDied","Data":"52a82cc55b65d53ccb37042580b1693a64c1581adbc9a591dc5a055ba293c7a2"} Apr 22 20:00:27.487868 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:00:27.487719 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6c75cc995-6jqp2" event={"ID":"5a9e9d0c-6539-4aea-b151-3ee3dd1a9b32","Type":"ContainerStarted","Data":"9195d16d2e07a3c7436d4827dd4af0ce83a2275a3231ddff3b0f230f5346489d"} Apr 22 20:02:43.644815 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:02:43.644779 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vfhht_338d333b-8626-46b3-b450-fbdd521183d8/console-operator/1.log" Apr 22 20:02:43.645378 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:02:43.644940 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vfhht_338d333b-8626-46b3-b450-fbdd521183d8/console-operator/1.log" Apr 22 20:02:43.655894 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:02:43.655874 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 20:03:54.243555 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:03:54.243514 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5846f88986-sb9lm"] Apr 22 20:03:54.244141 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:03:54.243899 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1f10812-b9aa-426e-ac37-99d23b87a05a" containerName="registry" Apr 22 20:03:54.244141 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:03:54.243916 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1f10812-b9aa-426e-ac37-99d23b87a05a" containerName="registry" Apr 22 20:03:54.244141 ip-10-0-133-60 kubenswrapper[2575]: I0422 
20:03:54.244012 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c1f10812-b9aa-426e-ac37-99d23b87a05a" containerName="registry" Apr 22 20:03:54.246970 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:03:54.246950 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5846f88986-sb9lm" Apr 22 20:03:54.251099 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:03:54.251062 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-f4nfg\"" Apr 22 20:03:54.252089 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:03:54.252059 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 22 20:03:54.252249 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:03:54.252164 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 22 20:03:54.252249 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:03:54.252218 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 22 20:03:54.252409 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:03:54.252165 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 22 20:03:54.252409 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:03:54.252115 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 22 20:03:54.271772 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:03:54.271749 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5846f88986-sb9lm"] Apr 22 20:03:54.309618 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:03:54.309593 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q6k4\" (UniqueName: \"kubernetes.io/projected/c5b8fee2-7d89-406d-8086-b029e86eaa75-kube-api-access-5q6k4\") pod \"lws-controller-manager-5846f88986-sb9lm\" (UID: \"c5b8fee2-7d89-406d-8086-b029e86eaa75\") " pod="openshift-lws-operator/lws-controller-manager-5846f88986-sb9lm" Apr 22 20:03:54.309752 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:03:54.309630 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5b8fee2-7d89-406d-8086-b029e86eaa75-metrics-cert\") pod \"lws-controller-manager-5846f88986-sb9lm\" (UID: \"c5b8fee2-7d89-406d-8086-b029e86eaa75\") " pod="openshift-lws-operator/lws-controller-manager-5846f88986-sb9lm" Apr 22 20:03:54.309752 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:03:54.309662 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c5b8fee2-7d89-406d-8086-b029e86eaa75-cert\") pod \"lws-controller-manager-5846f88986-sb9lm\" (UID: \"c5b8fee2-7d89-406d-8086-b029e86eaa75\") " pod="openshift-lws-operator/lws-controller-manager-5846f88986-sb9lm" Apr 22 20:03:54.309752 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:03:54.309697 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/c5b8fee2-7d89-406d-8086-b029e86eaa75-manager-config\") pod 
\"lws-controller-manager-5846f88986-sb9lm\" (UID: \"c5b8fee2-7d89-406d-8086-b029e86eaa75\") " pod="openshift-lws-operator/lws-controller-manager-5846f88986-sb9lm" Apr 22 20:03:54.410106 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:03:54.410073 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c5b8fee2-7d89-406d-8086-b029e86eaa75-cert\") pod \"lws-controller-manager-5846f88986-sb9lm\" (UID: \"c5b8fee2-7d89-406d-8086-b029e86eaa75\") " pod="openshift-lws-operator/lws-controller-manager-5846f88986-sb9lm" Apr 22 20:03:54.410267 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:03:54.410116 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/c5b8fee2-7d89-406d-8086-b029e86eaa75-manager-config\") pod \"lws-controller-manager-5846f88986-sb9lm\" (UID: \"c5b8fee2-7d89-406d-8086-b029e86eaa75\") " pod="openshift-lws-operator/lws-controller-manager-5846f88986-sb9lm" Apr 22 20:03:54.410267 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:03:54.410150 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5q6k4\" (UniqueName: \"kubernetes.io/projected/c5b8fee2-7d89-406d-8086-b029e86eaa75-kube-api-access-5q6k4\") pod \"lws-controller-manager-5846f88986-sb9lm\" (UID: \"c5b8fee2-7d89-406d-8086-b029e86eaa75\") " pod="openshift-lws-operator/lws-controller-manager-5846f88986-sb9lm" Apr 22 20:03:54.410267 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:03:54.410173 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5b8fee2-7d89-406d-8086-b029e86eaa75-metrics-cert\") pod \"lws-controller-manager-5846f88986-sb9lm\" (UID: \"c5b8fee2-7d89-406d-8086-b029e86eaa75\") " pod="openshift-lws-operator/lws-controller-manager-5846f88986-sb9lm" Apr 22 20:03:54.410848 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:03:54.410827 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/c5b8fee2-7d89-406d-8086-b029e86eaa75-manager-config\") pod \"lws-controller-manager-5846f88986-sb9lm\" (UID: \"c5b8fee2-7d89-406d-8086-b029e86eaa75\") " pod="openshift-lws-operator/lws-controller-manager-5846f88986-sb9lm" Apr 22 20:03:54.412492 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:03:54.412469 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5b8fee2-7d89-406d-8086-b029e86eaa75-metrics-cert\") pod \"lws-controller-manager-5846f88986-sb9lm\" (UID: \"c5b8fee2-7d89-406d-8086-b029e86eaa75\") " pod="openshift-lws-operator/lws-controller-manager-5846f88986-sb9lm" Apr 22 20:03:54.412592 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:03:54.412574 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c5b8fee2-7d89-406d-8086-b029e86eaa75-cert\") pod \"lws-controller-manager-5846f88986-sb9lm\" (UID: \"c5b8fee2-7d89-406d-8086-b029e86eaa75\") " pod="openshift-lws-operator/lws-controller-manager-5846f88986-sb9lm" Apr 22 20:03:54.422123 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:03:54.422101 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q6k4\" (UniqueName: \"kubernetes.io/projected/c5b8fee2-7d89-406d-8086-b029e86eaa75-kube-api-access-5q6k4\") pod \"lws-controller-manager-5846f88986-sb9lm\" (UID: 
\"c5b8fee2-7d89-406d-8086-b029e86eaa75\") " pod="openshift-lws-operator/lws-controller-manager-5846f88986-sb9lm" Apr 22 20:03:54.556972 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:03:54.556933 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5846f88986-sb9lm" Apr 22 20:03:54.687464 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:03:54.687433 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5846f88986-sb9lm"] Apr 22 20:03:54.690617 ip-10-0-133-60 kubenswrapper[2575]: W0422 20:03:54.690592 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5b8fee2_7d89_406d_8086_b029e86eaa75.slice/crio-b44b8f8c37d130e6104c5fd331a5c16c89122911a1c569f263ee248dfb00d63b WatchSource:0}: Error finding container b44b8f8c37d130e6104c5fd331a5c16c89122911a1c569f263ee248dfb00d63b: Status 404 returned error can't find the container with id b44b8f8c37d130e6104c5fd331a5c16c89122911a1c569f263ee248dfb00d63b Apr 22 20:03:54.692145 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:03:54.692127 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:03:55.044270 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:03:55.044239 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5846f88986-sb9lm" event={"ID":"c5b8fee2-7d89-406d-8086-b029e86eaa75","Type":"ContainerStarted","Data":"b44b8f8c37d130e6104c5fd331a5c16c89122911a1c569f263ee248dfb00d63b"} Apr 22 20:03:58.056936 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:03:58.056906 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5846f88986-sb9lm" event={"ID":"c5b8fee2-7d89-406d-8086-b029e86eaa75","Type":"ContainerStarted","Data":"4b39e108393c630f6d8a02bbadf7aa731fefa6a1de25e87a0666004de007160a"} Apr 22 20:03:58.057304 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:03:58.056960 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5846f88986-sb9lm" Apr 22 20:03:58.079068 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:03:58.079018 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-5846f88986-sb9lm" podStartSLOduration=1.39598182 podStartE2EDuration="4.079005771s" podCreationTimestamp="2026-04-22 20:03:54 +0000 UTC" firstStartedPulling="2026-04-22 20:03:54.692276958 +0000 UTC m=+371.505662117" lastFinishedPulling="2026-04-22 20:03:57.375300894 +0000 UTC m=+374.188686068" observedRunningTime="2026-04-22 20:03:58.078595226 +0000 UTC m=+374.891980408" watchObservedRunningTime="2026-04-22 20:03:58.079005771 +0000 UTC m=+374.892390950" Apr 22 20:04:09.061924 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:04:09.061889 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5846f88986-sb9lm" Apr 22 20:04:38.139788 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:04:38.139751 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-k62gr"] Apr 22 20:04:38.142637 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:04:38.142614 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-k62gr" Apr 22 20:04:38.145786 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:04:38.145764 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 22 20:04:38.145917 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:04:38.145796 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 22 20:04:38.145917 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:04:38.145796 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-ghvlf\"" Apr 22 20:04:38.145917 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:04:38.145798 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 22 20:04:38.146465 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:04:38.146449 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 22 20:04:38.151838 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:04:38.151815 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-k62gr"] Apr 22 20:04:38.246841 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:04:38.246804 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa225954-d141-4b64-9c9d-08f387ce34e0-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-k62gr\" (UID: \"aa225954-d141-4b64-9c9d-08f387ce34e0\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-k62gr" Apr 22 20:04:38.247014 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:04:38.246850 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/aa225954-d141-4b64-9c9d-08f387ce34e0-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-k62gr\" (UID: \"aa225954-d141-4b64-9c9d-08f387ce34e0\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-k62gr" Apr 22 20:04:38.247014 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:04:38.246920 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gklb\" (UniqueName: \"kubernetes.io/projected/aa225954-d141-4b64-9c9d-08f387ce34e0-kube-api-access-8gklb\") pod \"kuadrant-console-plugin-6c886788f8-k62gr\" (UID: \"aa225954-d141-4b64-9c9d-08f387ce34e0\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-k62gr" Apr 22 20:04:38.348248 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:04:38.348203 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa225954-d141-4b64-9c9d-08f387ce34e0-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-k62gr\" (UID: \"aa225954-d141-4b64-9c9d-08f387ce34e0\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-k62gr" Apr 22 20:04:38.348248 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:04:38.348254 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/aa225954-d141-4b64-9c9d-08f387ce34e0-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-k62gr\" (UID: \"aa225954-d141-4b64-9c9d-08f387ce34e0\") " 
pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-k62gr" Apr 22 20:04:38.348456 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:04:38.348282 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8gklb\" (UniqueName: \"kubernetes.io/projected/aa225954-d141-4b64-9c9d-08f387ce34e0-kube-api-access-8gklb\") pod \"kuadrant-console-plugin-6c886788f8-k62gr\" (UID: \"aa225954-d141-4b64-9c9d-08f387ce34e0\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-k62gr" Apr 22 20:04:38.348456 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:04:38.348361 2575 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 22 20:04:38.348456 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:04:38.348436 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa225954-d141-4b64-9c9d-08f387ce34e0-plugin-serving-cert podName:aa225954-d141-4b64-9c9d-08f387ce34e0 nodeName:}" failed. No retries permitted until 2026-04-22 20:04:38.848415263 +0000 UTC m=+415.661800422 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/aa225954-d141-4b64-9c9d-08f387ce34e0-plugin-serving-cert") pod "kuadrant-console-plugin-6c886788f8-k62gr" (UID: "aa225954-d141-4b64-9c9d-08f387ce34e0") : secret "plugin-serving-cert" not found Apr 22 20:04:38.348933 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:04:38.348914 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/aa225954-d141-4b64-9c9d-08f387ce34e0-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-k62gr\" (UID: \"aa225954-d141-4b64-9c9d-08f387ce34e0\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-k62gr" Apr 22 20:04:38.356945 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:04:38.356916 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gklb\" (UniqueName: \"kubernetes.io/projected/aa225954-d141-4b64-9c9d-08f387ce34e0-kube-api-access-8gklb\") pod \"kuadrant-console-plugin-6c886788f8-k62gr\" (UID: \"aa225954-d141-4b64-9c9d-08f387ce34e0\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-k62gr" Apr 22 20:04:38.852610 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:04:38.852574 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa225954-d141-4b64-9c9d-08f387ce34e0-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-k62gr\" (UID: \"aa225954-d141-4b64-9c9d-08f387ce34e0\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-k62gr" Apr 22 20:04:38.854906 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:04:38.854888 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa225954-d141-4b64-9c9d-08f387ce34e0-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-k62gr\" (UID: \"aa225954-d141-4b64-9c9d-08f387ce34e0\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-k62gr" Apr 22 20:04:39.053068 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:04:39.053032 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-k62gr" Apr 22 20:04:39.173691 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:04:39.173656 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-k62gr"] Apr 22 20:04:40.173939 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:04:40.173897 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-k62gr" event={"ID":"aa225954-d141-4b64-9c9d-08f387ce34e0","Type":"ContainerStarted","Data":"7528afc5168ca0afebda1fef2f2da64b906c4fe9e37a29ad3a5009c4aecbeafd"} Apr 22 20:04:45.191377 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:04:45.191332 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-k62gr" event={"ID":"aa225954-d141-4b64-9c9d-08f387ce34e0","Type":"ContainerStarted","Data":"9c90b5e3af587519774aba2ebb21bb76de8eb0fcf451ba6fa52976e37b34b0b6"} Apr 22 20:04:45.207053 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:04:45.206993 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-k62gr" podStartSLOduration=2.089686509 podStartE2EDuration="7.206977948s" podCreationTimestamp="2026-04-22 20:04:38 +0000 UTC" firstStartedPulling="2026-04-22 20:04:39.178916095 +0000 UTC m=+415.992301255" lastFinishedPulling="2026-04-22 20:04:44.296207535 +0000 UTC m=+421.109592694" observedRunningTime="2026-04-22 20:04:45.205993746 +0000 UTC m=+422.019378926" watchObservedRunningTime="2026-04-22 20:04:45.206977948 +0000 UTC m=+422.020363168" Apr 22 20:05:23.885098 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:23.885063 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-674b59b84c-dkll2"] Apr 22 20:05:23.909836 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:23.909806 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-dkll2"] Apr 22 20:05:23.910016 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:23.909921 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-dkll2" Apr 22 20:05:23.912259 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:23.912236 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-kl4hw\"" Apr 22 20:05:24.017219 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:24.017187 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f9w2\" (UniqueName: \"kubernetes.io/projected/b9632493-41c6-483b-aa3d-8e8a93a6a24b-kube-api-access-4f9w2\") pod \"authorino-674b59b84c-dkll2\" (UID: \"b9632493-41c6-483b-aa3d-8e8a93a6a24b\") " pod="kuadrant-system/authorino-674b59b84c-dkll2" Apr 22 20:05:24.045020 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:24.044989 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79cbc94b89-fvftr"] Apr 22 20:05:24.048175 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:24.048159 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-fvftr" Apr 22 20:05:24.054228 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:24.054203 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-fvftr"] Apr 22 20:05:24.118416 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:24.118375 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4f9w2\" (UniqueName: \"kubernetes.io/projected/b9632493-41c6-483b-aa3d-8e8a93a6a24b-kube-api-access-4f9w2\") pod \"authorino-674b59b84c-dkll2\" (UID: \"b9632493-41c6-483b-aa3d-8e8a93a6a24b\") " pod="kuadrant-system/authorino-674b59b84c-dkll2" Apr 22 20:05:24.118601 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:24.118479 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2lmp\" (UniqueName: \"kubernetes.io/projected/63e10ef4-0cd7-490d-9e1d-2ae5c19737a7-kube-api-access-h2lmp\") pod \"authorino-79cbc94b89-fvftr\" (UID: \"63e10ef4-0cd7-490d-9e1d-2ae5c19737a7\") " pod="kuadrant-system/authorino-79cbc94b89-fvftr" Apr 22 20:05:24.126080 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:24.126052 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f9w2\" (UniqueName: \"kubernetes.io/projected/b9632493-41c6-483b-aa3d-8e8a93a6a24b-kube-api-access-4f9w2\") pod \"authorino-674b59b84c-dkll2\" (UID: \"b9632493-41c6-483b-aa3d-8e8a93a6a24b\") " pod="kuadrant-system/authorino-674b59b84c-dkll2" Apr 22 20:05:24.218884 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:24.218808 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-dkll2" Apr 22 20:05:24.219035 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:24.219013 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2lmp\" (UniqueName: \"kubernetes.io/projected/63e10ef4-0cd7-490d-9e1d-2ae5c19737a7-kube-api-access-h2lmp\") pod \"authorino-79cbc94b89-fvftr\" (UID: \"63e10ef4-0cd7-490d-9e1d-2ae5c19737a7\") " pod="kuadrant-system/authorino-79cbc94b89-fvftr" Apr 22 20:05:24.227604 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:24.227581 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2lmp\" (UniqueName: \"kubernetes.io/projected/63e10ef4-0cd7-490d-9e1d-2ae5c19737a7-kube-api-access-h2lmp\") pod \"authorino-79cbc94b89-fvftr\" (UID: \"63e10ef4-0cd7-490d-9e1d-2ae5c19737a7\") " pod="kuadrant-system/authorino-79cbc94b89-fvftr" Apr 22 20:05:24.343900 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:24.343863 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-dkll2"] Apr 22 20:05:24.346770 ip-10-0-133-60 kubenswrapper[2575]: W0422 20:05:24.346741 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9632493_41c6_483b_aa3d_8e8a93a6a24b.slice/crio-365057dd3e844007963a766162004b5c5c417aa35c2031f23f49d4872179b03b WatchSource:0}: Error finding container 365057dd3e844007963a766162004b5c5c417aa35c2031f23f49d4872179b03b: Status 404 returned error can't find the container with id 365057dd3e844007963a766162004b5c5c417aa35c2031f23f49d4872179b03b Apr 22 20:05:24.357557 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:24.357531 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-fvftr" Apr 22 20:05:24.474897 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:24.474868 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-fvftr"] Apr 22 20:05:24.477226 ip-10-0-133-60 kubenswrapper[2575]: W0422 20:05:24.477198 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63e10ef4_0cd7_490d_9e1d_2ae5c19737a7.slice/crio-4257e2cc8707f7a150a24e7ce00aec93e91ad4aa9bde0cdb0a7051736f28b024 WatchSource:0}: Error finding container 4257e2cc8707f7a150a24e7ce00aec93e91ad4aa9bde0cdb0a7051736f28b024: Status 404 returned error can't find the container with id 4257e2cc8707f7a150a24e7ce00aec93e91ad4aa9bde0cdb0a7051736f28b024 Apr 22 20:05:25.309566 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:25.309477 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-dkll2" event={"ID":"b9632493-41c6-483b-aa3d-8e8a93a6a24b","Type":"ContainerStarted","Data":"365057dd3e844007963a766162004b5c5c417aa35c2031f23f49d4872179b03b"} Apr 22 20:05:25.312574 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:25.312542 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-fvftr" event={"ID":"63e10ef4-0cd7-490d-9e1d-2ae5c19737a7","Type":"ContainerStarted","Data":"4257e2cc8707f7a150a24e7ce00aec93e91ad4aa9bde0cdb0a7051736f28b024"} Apr 22 20:05:27.319591 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:27.319548 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-dkll2" event={"ID":"b9632493-41c6-483b-aa3d-8e8a93a6a24b","Type":"ContainerStarted","Data":"230fc4860c41d1654b17f1bb3b746b75c012f7642e77600fb29cfe23dd850537"} Apr 22 20:05:27.321279 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:27.321251 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-fvftr" event={"ID":"63e10ef4-0cd7-490d-9e1d-2ae5c19737a7","Type":"ContainerStarted","Data":"0ffabc5fee4f9fc20bc2264393f2325e1f2129e8a4a9fe1f15c262f11f4cfcf0"} Apr 22 20:05:27.352057 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:27.351992 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-674b59b84c-dkll2" podStartSLOduration=1.5429141990000002 podStartE2EDuration="4.351974189s" podCreationTimestamp="2026-04-22 20:05:23 +0000 UTC" firstStartedPulling="2026-04-22 20:05:24.348207969 +0000 UTC m=+461.161593128" lastFinishedPulling="2026-04-22 20:05:27.15726796 +0000 UTC m=+463.970653118" observedRunningTime="2026-04-22 20:05:27.333542271 +0000 UTC m=+464.146927459" watchObservedRunningTime="2026-04-22 20:05:27.351974189 +0000 UTC m=+464.165359371" Apr 22 20:05:27.352994 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:27.352937 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79cbc94b89-fvftr" podStartSLOduration=0.664331576 podStartE2EDuration="3.352923539s" podCreationTimestamp="2026-04-22 20:05:24 +0000 UTC" firstStartedPulling="2026-04-22 20:05:24.478558465 +0000 UTC m=+461.291943624" lastFinishedPulling="2026-04-22 20:05:27.167150425 +0000 UTC m=+463.980535587" observedRunningTime="2026-04-22 20:05:27.35172098 +0000 UTC m=+464.165106162" watchObservedRunningTime="2026-04-22 20:05:27.352923539 +0000 UTC m=+464.166308720" Apr 22 20:05:27.373627 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:27.373591 2575 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-dkll2"] Apr 22 20:05:29.327867 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:29.327827 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-674b59b84c-dkll2" podUID="b9632493-41c6-483b-aa3d-8e8a93a6a24b" containerName="authorino" containerID="cri-o://230fc4860c41d1654b17f1bb3b746b75c012f7642e77600fb29cfe23dd850537" gracePeriod=30 Apr 22 20:05:29.568895 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:29.568870 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-dkll2" Apr 22 20:05:29.668418 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:29.668298 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f9w2\" (UniqueName: \"kubernetes.io/projected/b9632493-41c6-483b-aa3d-8e8a93a6a24b-kube-api-access-4f9w2\") pod \"b9632493-41c6-483b-aa3d-8e8a93a6a24b\" (UID: \"b9632493-41c6-483b-aa3d-8e8a93a6a24b\") " Apr 22 20:05:29.670441 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:29.670417 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9632493-41c6-483b-aa3d-8e8a93a6a24b-kube-api-access-4f9w2" (OuterVolumeSpecName: "kube-api-access-4f9w2") pod "b9632493-41c6-483b-aa3d-8e8a93a6a24b" (UID: "b9632493-41c6-483b-aa3d-8e8a93a6a24b"). InnerVolumeSpecName "kube-api-access-4f9w2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:05:29.769292 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:29.769259 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4f9w2\" (UniqueName: \"kubernetes.io/projected/b9632493-41c6-483b-aa3d-8e8a93a6a24b-kube-api-access-4f9w2\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:05:30.331838 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:30.331800 2575 generic.go:358] "Generic (PLEG): container finished" podID="b9632493-41c6-483b-aa3d-8e8a93a6a24b" containerID="230fc4860c41d1654b17f1bb3b746b75c012f7642e77600fb29cfe23dd850537" exitCode=0 Apr 22 20:05:30.331838 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:30.331840 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-dkll2" event={"ID":"b9632493-41c6-483b-aa3d-8e8a93a6a24b","Type":"ContainerDied","Data":"230fc4860c41d1654b17f1bb3b746b75c012f7642e77600fb29cfe23dd850537"} Apr 22 20:05:30.332356 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:30.331864 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-dkll2" event={"ID":"b9632493-41c6-483b-aa3d-8e8a93a6a24b","Type":"ContainerDied","Data":"365057dd3e844007963a766162004b5c5c417aa35c2031f23f49d4872179b03b"} Apr 22 20:05:30.332356 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:30.331863 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-dkll2" Apr 22 20:05:30.332356 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:30.331885 2575 scope.go:117] "RemoveContainer" containerID="230fc4860c41d1654b17f1bb3b746b75c012f7642e77600fb29cfe23dd850537" Apr 22 20:05:30.341341 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:30.341299 2575 scope.go:117] "RemoveContainer" containerID="230fc4860c41d1654b17f1bb3b746b75c012f7642e77600fb29cfe23dd850537" Apr 22 20:05:30.341654 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:05:30.341633 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"230fc4860c41d1654b17f1bb3b746b75c012f7642e77600fb29cfe23dd850537\": container with ID starting with 230fc4860c41d1654b17f1bb3b746b75c012f7642e77600fb29cfe23dd850537 not found: ID does not exist" containerID="230fc4860c41d1654b17f1bb3b746b75c012f7642e77600fb29cfe23dd850537" Apr 22 20:05:30.341712 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:30.341663 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"230fc4860c41d1654b17f1bb3b746b75c012f7642e77600fb29cfe23dd850537"} err="failed to get container status \"230fc4860c41d1654b17f1bb3b746b75c012f7642e77600fb29cfe23dd850537\": rpc error: code = NotFound desc = could not find container \"230fc4860c41d1654b17f1bb3b746b75c012f7642e77600fb29cfe23dd850537\": container with ID starting with 230fc4860c41d1654b17f1bb3b746b75c012f7642e77600fb29cfe23dd850537 not found: ID does not exist" Apr 22 20:05:30.352099 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:30.352077 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-dkll2"] Apr 22 20:05:30.356329 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:30.356292 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-674b59b84c-dkll2"] Apr 22 20:05:31.731026 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:31.730984 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9632493-41c6-483b-aa3d-8e8a93a6a24b" path="/var/lib/kubelet/pods/b9632493-41c6-483b-aa3d-8e8a93a6a24b/volumes" Apr 22 20:05:47.840111 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:47.840073 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-68bd676465-8xzwg"] Apr 22 20:05:47.840557 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:47.840426 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9632493-41c6-483b-aa3d-8e8a93a6a24b" containerName="authorino" Apr 22 20:05:47.840557 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:47.840439 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9632493-41c6-483b-aa3d-8e8a93a6a24b" containerName="authorino" Apr 22 20:05:47.840557 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:47.840511 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9632493-41c6-483b-aa3d-8e8a93a6a24b" containerName="authorino" Apr 22 20:05:47.844342 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:47.844305 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68bd676465-8xzwg" Apr 22 20:05:47.846457 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:47.846437 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 22 20:05:47.849044 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:47.849022 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-8xzwg"] Apr 22 20:05:47.927485 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:47.927453 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2cfs\" (UniqueName: \"kubernetes.io/projected/041e72fc-4d39-42df-a5e4-ebae82c88495-kube-api-access-f2cfs\") pod \"authorino-68bd676465-8xzwg\" (UID: \"041e72fc-4d39-42df-a5e4-ebae82c88495\") " pod="kuadrant-system/authorino-68bd676465-8xzwg" Apr 22 20:05:47.927652 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:47.927564 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/041e72fc-4d39-42df-a5e4-ebae82c88495-tls-cert\") pod \"authorino-68bd676465-8xzwg\" (UID: \"041e72fc-4d39-42df-a5e4-ebae82c88495\") " pod="kuadrant-system/authorino-68bd676465-8xzwg" Apr 22 20:05:48.028651 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:48.028612 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f2cfs\" (UniqueName: \"kubernetes.io/projected/041e72fc-4d39-42df-a5e4-ebae82c88495-kube-api-access-f2cfs\") pod \"authorino-68bd676465-8xzwg\" (UID: \"041e72fc-4d39-42df-a5e4-ebae82c88495\") " pod="kuadrant-system/authorino-68bd676465-8xzwg" Apr 22 20:05:48.028835 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:48.028715 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/041e72fc-4d39-42df-a5e4-ebae82c88495-tls-cert\") pod \"authorino-68bd676465-8xzwg\" (UID: \"041e72fc-4d39-42df-a5e4-ebae82c88495\") " pod="kuadrant-system/authorino-68bd676465-8xzwg" Apr 22 20:05:48.031095 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:48.031071 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/041e72fc-4d39-42df-a5e4-ebae82c88495-tls-cert\") pod \"authorino-68bd676465-8xzwg\" (UID: \"041e72fc-4d39-42df-a5e4-ebae82c88495\") " pod="kuadrant-system/authorino-68bd676465-8xzwg" Apr 22 20:05:48.036286 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:48.036269 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2cfs\" (UniqueName: \"kubernetes.io/projected/041e72fc-4d39-42df-a5e4-ebae82c88495-kube-api-access-f2cfs\") pod \"authorino-68bd676465-8xzwg\" (UID: \"041e72fc-4d39-42df-a5e4-ebae82c88495\") " pod="kuadrant-system/authorino-68bd676465-8xzwg" Apr 22 20:05:48.154827 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:48.154758 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68bd676465-8xzwg" Apr 22 20:05:48.275260 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:48.275236 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-8xzwg"] Apr 22 20:05:48.277521 ip-10-0-133-60 kubenswrapper[2575]: W0422 20:05:48.277491 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod041e72fc_4d39_42df_a5e4_ebae82c88495.slice/crio-1bab16904d33a774b83954ecf115c98daa43659de46478f324bbfb98ba071c7d WatchSource:0}: Error finding container 1bab16904d33a774b83954ecf115c98daa43659de46478f324bbfb98ba071c7d: Status 404 returned error can't find the container with id 1bab16904d33a774b83954ecf115c98daa43659de46478f324bbfb98ba071c7d Apr 22 20:05:48.384682 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:48.384652 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-8xzwg" event={"ID":"041e72fc-4d39-42df-a5e4-ebae82c88495","Type":"ContainerStarted","Data":"1bab16904d33a774b83954ecf115c98daa43659de46478f324bbfb98ba071c7d"} Apr 22 20:05:49.388306 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:49.388264 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-8xzwg" event={"ID":"041e72fc-4d39-42df-a5e4-ebae82c88495","Type":"ContainerStarted","Data":"0cd4f94d26c0f5ad47440817e623a538329bbaef7418619744c9f5d748d9b5b8"} Apr 22 20:05:49.403769 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:49.403723 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-68bd676465-8xzwg" podStartSLOduration=2.023325646 podStartE2EDuration="2.403709395s" podCreationTimestamp="2026-04-22 20:05:47 +0000 UTC" firstStartedPulling="2026-04-22 20:05:48.278789064 +0000 UTC m=+485.092174223" lastFinishedPulling="2026-04-22 20:05:48.659172813 +0000 UTC m=+485.472557972" observedRunningTime="2026-04-22 20:05:49.402477285 +0000 UTC m=+486.215862491" watchObservedRunningTime="2026-04-22 20:05:49.403709395 +0000 UTC m=+486.217094576" Apr 22 20:05:49.429206 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:49.429175 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-fvftr"] Apr 22 20:05:49.429410 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:49.429388 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-79cbc94b89-fvftr" podUID="63e10ef4-0cd7-490d-9e1d-2ae5c19737a7" containerName="authorino" containerID="cri-o://0ffabc5fee4f9fc20bc2264393f2325e1f2129e8a4a9fe1f15c262f11f4cfcf0" gracePeriod=30 Apr 22 20:05:49.663284 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:49.663263 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-fvftr" Apr 22 20:05:49.743724 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:49.743694 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2lmp\" (UniqueName: \"kubernetes.io/projected/63e10ef4-0cd7-490d-9e1d-2ae5c19737a7-kube-api-access-h2lmp\") pod \"63e10ef4-0cd7-490d-9e1d-2ae5c19737a7\" (UID: \"63e10ef4-0cd7-490d-9e1d-2ae5c19737a7\") " Apr 22 20:05:49.745775 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:49.745752 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63e10ef4-0cd7-490d-9e1d-2ae5c19737a7-kube-api-access-h2lmp" (OuterVolumeSpecName: "kube-api-access-h2lmp") pod "63e10ef4-0cd7-490d-9e1d-2ae5c19737a7" (UID: "63e10ef4-0cd7-490d-9e1d-2ae5c19737a7"). InnerVolumeSpecName "kube-api-access-h2lmp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:05:49.844991 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:49.844940 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h2lmp\" (UniqueName: \"kubernetes.io/projected/63e10ef4-0cd7-490d-9e1d-2ae5c19737a7-kube-api-access-h2lmp\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:05:50.391645 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:50.391605 2575 generic.go:358] "Generic (PLEG): container finished" podID="63e10ef4-0cd7-490d-9e1d-2ae5c19737a7" containerID="0ffabc5fee4f9fc20bc2264393f2325e1f2129e8a4a9fe1f15c262f11f4cfcf0" exitCode=0 Apr 22 20:05:50.392107 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:50.391653 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-fvftr" Apr 22 20:05:50.392107 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:50.391690 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-fvftr" event={"ID":"63e10ef4-0cd7-490d-9e1d-2ae5c19737a7","Type":"ContainerDied","Data":"0ffabc5fee4f9fc20bc2264393f2325e1f2129e8a4a9fe1f15c262f11f4cfcf0"} Apr 22 20:05:50.392107 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:50.391727 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-fvftr" event={"ID":"63e10ef4-0cd7-490d-9e1d-2ae5c19737a7","Type":"ContainerDied","Data":"4257e2cc8707f7a150a24e7ce00aec93e91ad4aa9bde0cdb0a7051736f28b024"} Apr 22 20:05:50.392107 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:50.391743 2575 scope.go:117] "RemoveContainer" containerID="0ffabc5fee4f9fc20bc2264393f2325e1f2129e8a4a9fe1f15c262f11f4cfcf0" Apr 22 20:05:50.399429 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:50.399409 2575 scope.go:117] "RemoveContainer" containerID="0ffabc5fee4f9fc20bc2264393f2325e1f2129e8a4a9fe1f15c262f11f4cfcf0" Apr 22 20:05:50.399635 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:05:50.399618 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ffabc5fee4f9fc20bc2264393f2325e1f2129e8a4a9fe1f15c262f11f4cfcf0\": container with ID starting with 0ffabc5fee4f9fc20bc2264393f2325e1f2129e8a4a9fe1f15c262f11f4cfcf0 not found: ID does not exist" containerID="0ffabc5fee4f9fc20bc2264393f2325e1f2129e8a4a9fe1f15c262f11f4cfcf0" Apr 22 20:05:50.399690 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:50.399640 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ffabc5fee4f9fc20bc2264393f2325e1f2129e8a4a9fe1f15c262f11f4cfcf0"} 
err="failed to get container status \"0ffabc5fee4f9fc20bc2264393f2325e1f2129e8a4a9fe1f15c262f11f4cfcf0\": rpc error: code = NotFound desc = could not find container \"0ffabc5fee4f9fc20bc2264393f2325e1f2129e8a4a9fe1f15c262f11f4cfcf0\": container with ID starting with 0ffabc5fee4f9fc20bc2264393f2325e1f2129e8a4a9fe1f15c262f11f4cfcf0 not found: ID does not exist" Apr 22 20:05:50.410780 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:50.410751 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-fvftr"] Apr 22 20:05:50.416301 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:50.416280 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-fvftr"] Apr 22 20:05:51.730546 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:05:51.730512 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63e10ef4-0cd7-490d-9e1d-2ae5c19737a7" path="/var/lib/kubelet/pods/63e10ef4-0cd7-490d-9e1d-2ae5c19737a7/volumes" Apr 22 20:06:08.130181 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:06:08.130148 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-55f79dccc6-fvrwp"] Apr 22 20:06:08.130736 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:06:08.130590 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="63e10ef4-0cd7-490d-9e1d-2ae5c19737a7" containerName="authorino" Apr 22 20:06:08.130736 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:06:08.130610 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="63e10ef4-0cd7-490d-9e1d-2ae5c19737a7" containerName="authorino" Apr 22 20:06:08.130736 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:06:08.130699 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="63e10ef4-0cd7-490d-9e1d-2ae5c19737a7" containerName="authorino" Apr 22 20:06:08.133523 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:06:08.133504 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-55f79dccc6-fvrwp" Apr 22 20:06:08.136072 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:06:08.136052 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 22 20:06:08.136161 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:06:08.136079 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-6dkhn\"" Apr 22 20:06:08.136943 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:06:08.136924 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 20:06:08.137030 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:06:08.136944 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 20:06:08.143781 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:06:08.143759 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-55f79dccc6-fvrwp"] Apr 22 20:06:08.185228 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:06:08.185202 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f2e04ae-561b-4e42-abd9-76587f41da06-cert\") pod \"llmisvc-controller-manager-55f79dccc6-fvrwp\" (UID: \"0f2e04ae-561b-4e42-abd9-76587f41da06\") " pod="kserve/llmisvc-controller-manager-55f79dccc6-fvrwp" Apr 22 20:06:08.185383 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:06:08.185264 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bczgx\" (UniqueName: \"kubernetes.io/projected/0f2e04ae-561b-4e42-abd9-76587f41da06-kube-api-access-bczgx\") pod \"llmisvc-controller-manager-55f79dccc6-fvrwp\" (UID: \"0f2e04ae-561b-4e42-abd9-76587f41da06\") " pod="kserve/llmisvc-controller-manager-55f79dccc6-fvrwp" Apr 22 20:06:08.286567 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:06:08.286536 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bczgx\" (UniqueName: \"kubernetes.io/projected/0f2e04ae-561b-4e42-abd9-76587f41da06-kube-api-access-bczgx\") pod \"llmisvc-controller-manager-55f79dccc6-fvrwp\" (UID: \"0f2e04ae-561b-4e42-abd9-76587f41da06\") " pod="kserve/llmisvc-controller-manager-55f79dccc6-fvrwp" Apr 22 20:06:08.286711 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:06:08.286583 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f2e04ae-561b-4e42-abd9-76587f41da06-cert\") pod \"llmisvc-controller-manager-55f79dccc6-fvrwp\" (UID: \"0f2e04ae-561b-4e42-abd9-76587f41da06\") " pod="kserve/llmisvc-controller-manager-55f79dccc6-fvrwp" Apr 22 20:06:08.286711 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:06:08.286680 2575 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 22 20:06:08.286779 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:06:08.286738 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f2e04ae-561b-4e42-abd9-76587f41da06-cert podName:0f2e04ae-561b-4e42-abd9-76587f41da06 nodeName:}" failed. No retries permitted until 2026-04-22 20:06:08.786724018 +0000 UTC m=+505.600109177 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0f2e04ae-561b-4e42-abd9-76587f41da06-cert") pod "llmisvc-controller-manager-55f79dccc6-fvrwp" (UID: "0f2e04ae-561b-4e42-abd9-76587f41da06") : secret "llmisvc-webhook-server-cert" not found Apr 22 20:06:08.297864 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:06:08.297835 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bczgx\" (UniqueName: \"kubernetes.io/projected/0f2e04ae-561b-4e42-abd9-76587f41da06-kube-api-access-bczgx\") pod \"llmisvc-controller-manager-55f79dccc6-fvrwp\" (UID: \"0f2e04ae-561b-4e42-abd9-76587f41da06\") " pod="kserve/llmisvc-controller-manager-55f79dccc6-fvrwp" Apr 22 20:06:08.790834 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:06:08.790798 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f2e04ae-561b-4e42-abd9-76587f41da06-cert\") pod \"llmisvc-controller-manager-55f79dccc6-fvrwp\" (UID: \"0f2e04ae-561b-4e42-abd9-76587f41da06\") " pod="kserve/llmisvc-controller-manager-55f79dccc6-fvrwp" Apr 22 20:06:08.793264 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:06:08.793235 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f2e04ae-561b-4e42-abd9-76587f41da06-cert\") pod \"llmisvc-controller-manager-55f79dccc6-fvrwp\" (UID: \"0f2e04ae-561b-4e42-abd9-76587f41da06\") " pod="kserve/llmisvc-controller-manager-55f79dccc6-fvrwp" Apr 22 20:06:09.043047 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:06:09.042965 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-55f79dccc6-fvrwp" Apr 22 20:06:09.164678 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:06:09.164657 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-55f79dccc6-fvrwp"] Apr 22 20:06:09.166926 ip-10-0-133-60 kubenswrapper[2575]: W0422 20:06:09.166898 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0f2e04ae_561b_4e42_abd9_76587f41da06.slice/crio-5c92b59c87d227121312a2376770e0aefd9279784f01fb95307341269d059983 WatchSource:0}: Error finding container 5c92b59c87d227121312a2376770e0aefd9279784f01fb95307341269d059983: Status 404 returned error can't find the container with id 5c92b59c87d227121312a2376770e0aefd9279784f01fb95307341269d059983 Apr 22 20:06:09.452486 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:06:09.452406 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-55f79dccc6-fvrwp" event={"ID":"0f2e04ae-561b-4e42-abd9-76587f41da06","Type":"ContainerStarted","Data":"5c92b59c87d227121312a2376770e0aefd9279784f01fb95307341269d059983"} Apr 22 20:06:13.465128 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:06:13.465089 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-55f79dccc6-fvrwp" event={"ID":"0f2e04ae-561b-4e42-abd9-76587f41da06","Type":"ContainerStarted","Data":"fe93a825496f44c15310b75a40b3556a51afc3f5623b9569fa6228717c78ba92"} Apr 22 20:06:13.465586 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:06:13.465227 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-55f79dccc6-fvrwp" Apr 22 20:06:13.480868 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:06:13.480814 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve/llmisvc-controller-manager-55f79dccc6-fvrwp" podStartSLOduration=1.885303613 podStartE2EDuration="5.480798156s" podCreationTimestamp="2026-04-22 20:06:08 +0000 UTC" firstStartedPulling="2026-04-22 20:06:09.168727109 +0000 UTC m=+505.982112267" lastFinishedPulling="2026-04-22 20:06:12.76422165 +0000 UTC m=+509.577606810" observedRunningTime="2026-04-22 20:06:13.479066972 +0000 UTC m=+510.292452153" watchObservedRunningTime="2026-04-22 20:06:13.480798156 +0000 UTC m=+510.294183338" Apr 22 20:06:44.471274 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:06:44.471243 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-55f79dccc6-fvrwp" Apr 22 20:07:43.667239 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:07:43.667212 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vfhht_338d333b-8626-46b3-b450-fbdd521183d8/console-operator/1.log" Apr 22 20:07:43.668036 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:07:43.668017 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vfhht_338d333b-8626-46b3-b450-fbdd521183d8/console-operator/1.log" Apr 22 20:08:14.496966 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:14.496893 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6"] Apr 22 20:08:14.500414 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:14.500396 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6" Apr 22 20:08:14.503927 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:14.503905 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 20:08:14.504052 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:14.503934 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 22 20:08:14.504052 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:14.503939 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tlcdw\"" Apr 22 20:08:14.504170 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:14.504061 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 20:08:14.511448 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:14.511424 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6"] Apr 22 20:08:14.604252 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:14.604215 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f351df6-25b6-4271-adc5-8c82b1828666-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6\" (UID: \"4f351df6-25b6-4271-adc5-8c82b1828666\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6" Apr 22 20:08:14.604430 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:14.604267 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck7c9\" (UniqueName: 
\"kubernetes.io/projected/4f351df6-25b6-4271-adc5-8c82b1828666-kube-api-access-ck7c9\") pod \"scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6\" (UID: \"4f351df6-25b6-4271-adc5-8c82b1828666\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6" Apr 22 20:08:14.604430 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:14.604292 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4f351df6-25b6-4271-adc5-8c82b1828666-home\") pod \"scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6\" (UID: \"4f351df6-25b6-4271-adc5-8c82b1828666\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6" Apr 22 20:08:14.604430 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:14.604336 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4f351df6-25b6-4271-adc5-8c82b1828666-model-cache\") pod \"scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6\" (UID: \"4f351df6-25b6-4271-adc5-8c82b1828666\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6" Apr 22 20:08:14.604430 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:14.604419 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4f351df6-25b6-4271-adc5-8c82b1828666-dshm\") pod \"scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6\" (UID: \"4f351df6-25b6-4271-adc5-8c82b1828666\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6" Apr 22 20:08:14.604565 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:14.604452 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4f351df6-25b6-4271-adc5-8c82b1828666-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6\" (UID: \"4f351df6-25b6-4271-adc5-8c82b1828666\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6" Apr 22 20:08:14.705594 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:14.705565 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4f351df6-25b6-4271-adc5-8c82b1828666-home\") pod \"scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6\" (UID: \"4f351df6-25b6-4271-adc5-8c82b1828666\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6" Apr 22 20:08:14.705594 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:14.705596 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4f351df6-25b6-4271-adc5-8c82b1828666-model-cache\") pod \"scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6\" (UID: \"4f351df6-25b6-4271-adc5-8c82b1828666\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6" Apr 22 20:08:14.705894 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:14.705635 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4f351df6-25b6-4271-adc5-8c82b1828666-dshm\") pod \"scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6\" (UID: \"4f351df6-25b6-4271-adc5-8c82b1828666\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6" Apr 22 20:08:14.705894 ip-10-0-133-60 kubenswrapper[2575]: 
I0422 20:08:14.705666 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4f351df6-25b6-4271-adc5-8c82b1828666-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6\" (UID: \"4f351df6-25b6-4271-adc5-8c82b1828666\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6" Apr 22 20:08:14.705894 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:14.705710 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f351df6-25b6-4271-adc5-8c82b1828666-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6\" (UID: \"4f351df6-25b6-4271-adc5-8c82b1828666\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6" Apr 22 20:08:14.705894 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:14.705762 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ck7c9\" (UniqueName: \"kubernetes.io/projected/4f351df6-25b6-4271-adc5-8c82b1828666-kube-api-access-ck7c9\") pod \"scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6\" (UID: \"4f351df6-25b6-4271-adc5-8c82b1828666\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6" Apr 22 20:08:14.706157 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:14.705979 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4f351df6-25b6-4271-adc5-8c82b1828666-home\") pod \"scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6\" (UID: \"4f351df6-25b6-4271-adc5-8c82b1828666\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6" Apr 22 20:08:14.706157 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:14.706021 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4f351df6-25b6-4271-adc5-8c82b1828666-model-cache\") pod \"scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6\" (UID: \"4f351df6-25b6-4271-adc5-8c82b1828666\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6" Apr 22 20:08:14.706157 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:14.706090 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f351df6-25b6-4271-adc5-8c82b1828666-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6\" (UID: \"4f351df6-25b6-4271-adc5-8c82b1828666\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6" Apr 22 20:08:14.707966 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:14.707939 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4f351df6-25b6-4271-adc5-8c82b1828666-dshm\") pod \"scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6\" (UID: \"4f351df6-25b6-4271-adc5-8c82b1828666\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6" Apr 22 20:08:14.708165 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:14.708149 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4f351df6-25b6-4271-adc5-8c82b1828666-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6\" (UID: \"4f351df6-25b6-4271-adc5-8c82b1828666\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6" Apr 22 20:08:14.715181 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:14.715153 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck7c9\" (UniqueName: \"kubernetes.io/projected/4f351df6-25b6-4271-adc5-8c82b1828666-kube-api-access-ck7c9\") pod \"scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6\" (UID: \"4f351df6-25b6-4271-adc5-8c82b1828666\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6" Apr 22 20:08:14.810146 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:14.810043 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6" Apr 22 20:08:14.934240 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:14.934207 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6"] Apr 22 20:08:14.938406 ip-10-0-133-60 kubenswrapper[2575]: W0422 20:08:14.938374 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f351df6_25b6_4271_adc5_8c82b1828666.slice/crio-37b87c3eece084662e93ce85b8dcaa430067b35f623d81c3e04a8a9b7ac07f44 WatchSource:0}: Error finding container 37b87c3eece084662e93ce85b8dcaa430067b35f623d81c3e04a8a9b7ac07f44: Status 404 returned error can't find the container with id 37b87c3eece084662e93ce85b8dcaa430067b35f623d81c3e04a8a9b7ac07f44 Apr 22 20:08:15.818209 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:15.818168 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6" event={"ID":"4f351df6-25b6-4271-adc5-8c82b1828666","Type":"ContainerStarted","Data":"37b87c3eece084662e93ce85b8dcaa430067b35f623d81c3e04a8a9b7ac07f44"} Apr 22 20:08:18.828124 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:18.828080 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6" event={"ID":"4f351df6-25b6-4271-adc5-8c82b1828666","Type":"ContainerStarted","Data":"2f294d683816fe5603343b2b9c53f20e70ba66b6a3d6c41c770f20cd2dc86aaf"} Apr 22 20:08:22.843925 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:22.843884 2575 generic.go:358] "Generic (PLEG): container finished" podID="4f351df6-25b6-4271-adc5-8c82b1828666" containerID="2f294d683816fe5603343b2b9c53f20e70ba66b6a3d6c41c770f20cd2dc86aaf" exitCode=0 Apr 22 20:08:22.844460 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:22.843918 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6" event={"ID":"4f351df6-25b6-4271-adc5-8c82b1828666","Type":"ContainerDied","Data":"2f294d683816fe5603343b2b9c53f20e70ba66b6a3d6c41c770f20cd2dc86aaf"} Apr 22 20:08:24.852603 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:24.852569 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6" event={"ID":"4f351df6-25b6-4271-adc5-8c82b1828666","Type":"ContainerStarted","Data":"a501be0a3ac954efbfeae203a72547a560c73347e65e9b1a4f6dc88b270b8b5f"} Apr 22 20:08:24.868744 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:24.868666 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6" 
podStartSLOduration=1.8226924869999999 podStartE2EDuration="10.868652611s" podCreationTimestamp="2026-04-22 20:08:14 +0000 UTC" firstStartedPulling="2026-04-22 20:08:14.940224221 +0000 UTC m=+631.753609384" lastFinishedPulling="2026-04-22 20:08:23.986184345 +0000 UTC m=+640.799569508" observedRunningTime="2026-04-22 20:08:24.86839255 +0000 UTC m=+641.681777741" watchObservedRunningTime="2026-04-22 20:08:24.868652611 +0000 UTC m=+641.682037792" Apr 22 20:08:34.810930 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:34.810896 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6" Apr 22 20:08:34.810930 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:34.810938 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6" Apr 22 20:08:34.823776 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:34.823754 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6" Apr 22 20:08:34.892831 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:08:34.892803 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6" Apr 22 20:09:11.007226 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.007193 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh"] Apr 22 20:09:11.010717 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.010699 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh" Apr 22 20:09:11.012985 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.012965 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 22 20:09:11.018167 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.018149 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh"] Apr 22 20:09:11.085555 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.085516 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh\" (UID: \"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh" Apr 22 20:09:11.085555 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.085571 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-model-cache\") pod \"scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh\" (UID: \"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh" Apr 22 20:09:11.085779 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.085621 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpg8l\" (UniqueName: \"kubernetes.io/projected/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-kube-api-access-dpg8l\") pod 
\"scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh\" (UID: \"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh" Apr 22 20:09:11.085779 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.085708 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-dshm\") pod \"scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh\" (UID: \"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh" Apr 22 20:09:11.085779 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.085744 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-home\") pod \"scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh\" (UID: \"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh" Apr 22 20:09:11.085779 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.085770 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh\" (UID: \"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh" Apr 22 20:09:11.186667 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.186631 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-dshm\") pod \"scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh\" (UID: \"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh" Apr 22 20:09:11.186667 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.186672 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-home\") pod \"scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh\" (UID: \"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh" Apr 22 20:09:11.186868 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.186695 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh\" (UID: \"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh" Apr 22 20:09:11.186868 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.186745 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh\" (UID: \"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh" Apr 22 20:09:11.186868 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.186778 2575 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-model-cache\") pod \"scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh\" (UID: \"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh" Apr 22 20:09:11.186998 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.186924 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dpg8l\" (UniqueName: \"kubernetes.io/projected/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-kube-api-access-dpg8l\") pod \"scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh\" (UID: \"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh" Apr 22 20:09:11.187177 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.187145 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh\" (UID: \"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh" Apr 22 20:09:11.187177 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.187164 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-home\") pod \"scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh\" (UID: \"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh" Apr 22 20:09:11.187391 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.187210 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-model-cache\") pod \"scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh\" (UID: \"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh" Apr 22 20:09:11.188957 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.188935 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-dshm\") pod \"scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh\" (UID: \"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh" Apr 22 20:09:11.189197 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.189181 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh\" (UID: \"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh" Apr 22 20:09:11.195594 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.195573 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpg8l\" (UniqueName: \"kubernetes.io/projected/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-kube-api-access-dpg8l\") pod \"scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh\" (UID: \"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh" Apr 22 20:09:11.281372 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.281287 
2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn"] Apr 22 20:09:11.285061 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.285042 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" Apr 22 20:09:11.287259 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.287229 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-25ljw\"" Apr 22 20:09:11.294767 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.294739 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn"] Apr 22 20:09:11.321409 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.321379 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh" Apr 22 20:09:11.388606 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.388576 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn\" (UID: \"88198d8c-94ba-43a6-a1a5-d5d633fe31a2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" Apr 22 20:09:11.388750 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.388610 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn\" (UID: \"88198d8c-94ba-43a6-a1a5-d5d633fe31a2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" Apr 22 20:09:11.388750 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.388634 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn\" (UID: \"88198d8c-94ba-43a6-a1a5-d5d633fe31a2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" Apr 22 20:09:11.388750 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.388677 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6qhd\" (UniqueName: \"kubernetes.io/projected/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-kube-api-access-m6qhd\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn\" (UID: \"88198d8c-94ba-43a6-a1a5-d5d633fe31a2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" Apr 22 20:09:11.388750 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.388697 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn\" (UID: \"88198d8c-94ba-43a6-a1a5-d5d633fe31a2\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" Apr 22 20:09:11.388750 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.388711 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn\" (UID: \"88198d8c-94ba-43a6-a1a5-d5d633fe31a2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" Apr 22 20:09:11.442851 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.442814 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh"] Apr 22 20:09:11.447380 ip-10-0-133-60 kubenswrapper[2575]: W0422 20:09:11.447346 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ebcea6c_965f_48dc_ab7b_2e65a0f62c98.slice/crio-0fb9eb924ce1d184973afdfe8c624ab8f97e4351507e22db172a032a3b4fce5a WatchSource:0}: Error finding container 0fb9eb924ce1d184973afdfe8c624ab8f97e4351507e22db172a032a3b4fce5a: Status 404 returned error can't find the container with id 0fb9eb924ce1d184973afdfe8c624ab8f97e4351507e22db172a032a3b4fce5a Apr 22 20:09:11.449112 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.449096 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:09:11.489232 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.489208 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6qhd\" (UniqueName: \"kubernetes.io/projected/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-kube-api-access-m6qhd\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn\" (UID: \"88198d8c-94ba-43a6-a1a5-d5d633fe31a2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" Apr 22 20:09:11.489332 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.489244 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn\" (UID: \"88198d8c-94ba-43a6-a1a5-d5d633fe31a2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" Apr 22 20:09:11.489332 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.489271 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn\" (UID: \"88198d8c-94ba-43a6-a1a5-d5d633fe31a2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" Apr 22 20:09:11.489419 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.489353 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn\" (UID: \"88198d8c-94ba-43a6-a1a5-d5d633fe31a2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" Apr 22 20:09:11.489419 
ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.489379 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn\" (UID: \"88198d8c-94ba-43a6-a1a5-d5d633fe31a2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" Apr 22 20:09:11.489419 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.489409 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn\" (UID: \"88198d8c-94ba-43a6-a1a5-d5d633fe31a2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" Apr 22 20:09:11.489736 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.489714 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn\" (UID: \"88198d8c-94ba-43a6-a1a5-d5d633fe31a2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" Apr 22 20:09:11.489795 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.489774 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn\" (UID: \"88198d8c-94ba-43a6-a1a5-d5d633fe31a2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" Apr 22 20:09:11.489837 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.489789 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn\" (UID: \"88198d8c-94ba-43a6-a1a5-d5d633fe31a2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" Apr 22 20:09:11.489951 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.489936 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn\" (UID: \"88198d8c-94ba-43a6-a1a5-d5d633fe31a2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" Apr 22 20:09:11.491716 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.491691 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn\" (UID: \"88198d8c-94ba-43a6-a1a5-d5d633fe31a2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" Apr 22 20:09:11.496724 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.496699 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6qhd\" (UniqueName: 
\"kubernetes.io/projected/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-kube-api-access-m6qhd\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn\" (UID: \"88198d8c-94ba-43a6-a1a5-d5d633fe31a2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" Apr 22 20:09:11.595279 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.595198 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" Apr 22 20:09:11.740778 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.740751 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn"] Apr 22 20:09:11.741171 ip-10-0-133-60 kubenswrapper[2575]: W0422 20:09:11.741145 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88198d8c_94ba_43a6_a1a5_d5d633fe31a2.slice/crio-191f5548034260b447d55349d1ee9865990ea3a0159b4ad39e79d3f055cfbcd7 WatchSource:0}: Error finding container 191f5548034260b447d55349d1ee9865990ea3a0159b4ad39e79d3f055cfbcd7: Status 404 returned error can't find the container with id 191f5548034260b447d55349d1ee9865990ea3a0159b4ad39e79d3f055cfbcd7 Apr 22 20:09:11.989838 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.989739 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" event={"ID":"88198d8c-94ba-43a6-a1a5-d5d633fe31a2","Type":"ContainerStarted","Data":"222174c2ebec22a25f884d5e75dc7ffb41b0c61749e85876aa48fc769bca489f"} Apr 22 20:09:11.989838 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.989789 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" event={"ID":"88198d8c-94ba-43a6-a1a5-d5d633fe31a2","Type":"ContainerStarted","Data":"191f5548034260b447d55349d1ee9865990ea3a0159b4ad39e79d3f055cfbcd7"} Apr 22 20:09:11.991202 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.991172 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh" event={"ID":"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98","Type":"ContainerStarted","Data":"bbaaa7d2fbad9afd33005967a9b46e24de1ba547d2306d675a1c38d3cc8c882f"} Apr 22 20:09:11.991347 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:11.991208 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh" event={"ID":"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98","Type":"ContainerStarted","Data":"0fb9eb924ce1d184973afdfe8c624ab8f97e4351507e22db172a032a3b4fce5a"} Apr 22 20:09:12.631387 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:12.631350 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6"] Apr 22 20:09:12.631816 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:12.631642 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6" podUID="4f351df6-25b6-4271-adc5-8c82b1828666" containerName="main" containerID="cri-o://a501be0a3ac954efbfeae203a72547a560c73347e65e9b1a4f6dc88b270b8b5f" gracePeriod=30 Apr 22 20:09:12.901426 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:12.901398 2575 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6" Apr 22 20:09:12.999799 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:12.999773 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ck7c9\" (UniqueName: \"kubernetes.io/projected/4f351df6-25b6-4271-adc5-8c82b1828666-kube-api-access-ck7c9\") pod \"4f351df6-25b6-4271-adc5-8c82b1828666\" (UID: \"4f351df6-25b6-4271-adc5-8c82b1828666\") " Apr 22 20:09:12.999945 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:12.999813 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4f351df6-25b6-4271-adc5-8c82b1828666-tls-certs\") pod \"4f351df6-25b6-4271-adc5-8c82b1828666\" (UID: \"4f351df6-25b6-4271-adc5-8c82b1828666\") " Apr 22 20:09:12.999945 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:12.999864 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4f351df6-25b6-4271-adc5-8c82b1828666-model-cache\") pod \"4f351df6-25b6-4271-adc5-8c82b1828666\" (UID: \"4f351df6-25b6-4271-adc5-8c82b1828666\") " Apr 22 20:09:12.999945 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:12.999920 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4f351df6-25b6-4271-adc5-8c82b1828666-home\") pod \"4f351df6-25b6-4271-adc5-8c82b1828666\" (UID: \"4f351df6-25b6-4271-adc5-8c82b1828666\") " Apr 22 20:09:13.000118 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:12.999968 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f351df6-25b6-4271-adc5-8c82b1828666-kserve-provision-location\") pod \"4f351df6-25b6-4271-adc5-8c82b1828666\" (UID: \"4f351df6-25b6-4271-adc5-8c82b1828666\") " Apr 22 20:09:13.000118 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:13.000007 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4f351df6-25b6-4271-adc5-8c82b1828666-dshm\") pod \"4f351df6-25b6-4271-adc5-8c82b1828666\" (UID: \"4f351df6-25b6-4271-adc5-8c82b1828666\") " Apr 22 20:09:13.000118 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:13.000059 2575 generic.go:358] "Generic (PLEG): container finished" podID="88198d8c-94ba-43a6-a1a5-d5d633fe31a2" containerID="222174c2ebec22a25f884d5e75dc7ffb41b0c61749e85876aa48fc769bca489f" exitCode=0 Apr 22 20:09:13.000261 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:13.000126 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f351df6-25b6-4271-adc5-8c82b1828666-model-cache" (OuterVolumeSpecName: "model-cache") pod "4f351df6-25b6-4271-adc5-8c82b1828666" (UID: "4f351df6-25b6-4271-adc5-8c82b1828666"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:09:13.000261 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:13.000168 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" event={"ID":"88198d8c-94ba-43a6-a1a5-d5d633fe31a2","Type":"ContainerDied","Data":"222174c2ebec22a25f884d5e75dc7ffb41b0c61749e85876aa48fc769bca489f"} Apr 22 20:09:13.000399 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:13.000282 2575 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4f351df6-25b6-4271-adc5-8c82b1828666-model-cache\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:09:13.000496 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:13.000476 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f351df6-25b6-4271-adc5-8c82b1828666-home" (OuterVolumeSpecName: "home") pod "4f351df6-25b6-4271-adc5-8c82b1828666" (UID: "4f351df6-25b6-4271-adc5-8c82b1828666"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:09:13.002809 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:13.002779 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f351df6-25b6-4271-adc5-8c82b1828666-kube-api-access-ck7c9" (OuterVolumeSpecName: "kube-api-access-ck7c9") pod "4f351df6-25b6-4271-adc5-8c82b1828666" (UID: "4f351df6-25b6-4271-adc5-8c82b1828666"). InnerVolumeSpecName "kube-api-access-ck7c9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:09:13.002911 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:13.002850 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f351df6-25b6-4271-adc5-8c82b1828666-dshm" (OuterVolumeSpecName: "dshm") pod "4f351df6-25b6-4271-adc5-8c82b1828666" (UID: "4f351df6-25b6-4271-adc5-8c82b1828666"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:09:13.003141 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:13.003111 2575 generic.go:358] "Generic (PLEG): container finished" podID="4f351df6-25b6-4271-adc5-8c82b1828666" containerID="a501be0a3ac954efbfeae203a72547a560c73347e65e9b1a4f6dc88b270b8b5f" exitCode=0 Apr 22 20:09:13.003225 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:13.003141 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f351df6-25b6-4271-adc5-8c82b1828666-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "4f351df6-25b6-4271-adc5-8c82b1828666" (UID: "4f351df6-25b6-4271-adc5-8c82b1828666"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:09:13.003283 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:13.003231 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6" Apr 22 20:09:13.003283 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:13.003262 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6" event={"ID":"4f351df6-25b6-4271-adc5-8c82b1828666","Type":"ContainerDied","Data":"a501be0a3ac954efbfeae203a72547a560c73347e65e9b1a4f6dc88b270b8b5f"} Apr 22 20:09:13.003374 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:13.003286 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6" event={"ID":"4f351df6-25b6-4271-adc5-8c82b1828666","Type":"ContainerDied","Data":"37b87c3eece084662e93ce85b8dcaa430067b35f623d81c3e04a8a9b7ac07f44"} Apr 22 20:09:13.003374 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:13.003306 2575 scope.go:117] "RemoveContainer" containerID="a501be0a3ac954efbfeae203a72547a560c73347e65e9b1a4f6dc88b270b8b5f" Apr 22 20:09:13.037576 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:13.037522 2575 scope.go:117] "RemoveContainer" containerID="2f294d683816fe5603343b2b9c53f20e70ba66b6a3d6c41c770f20cd2dc86aaf" Apr 22 20:09:13.069513 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:13.069472 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f351df6-25b6-4271-adc5-8c82b1828666-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4f351df6-25b6-4271-adc5-8c82b1828666" (UID: "4f351df6-25b6-4271-adc5-8c82b1828666"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:09:13.101480 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:13.101442 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ck7c9\" (UniqueName: \"kubernetes.io/projected/4f351df6-25b6-4271-adc5-8c82b1828666-kube-api-access-ck7c9\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:09:13.101480 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:13.101479 2575 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4f351df6-25b6-4271-adc5-8c82b1828666-tls-certs\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:09:13.101480 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:13.101494 2575 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4f351df6-25b6-4271-adc5-8c82b1828666-home\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:09:13.101767 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:13.101507 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f351df6-25b6-4271-adc5-8c82b1828666-kserve-provision-location\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:09:13.101767 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:13.101517 2575 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4f351df6-25b6-4271-adc5-8c82b1828666-dshm\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:09:13.103515 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:13.103493 2575 scope.go:117] "RemoveContainer" containerID="a501be0a3ac954efbfeae203a72547a560c73347e65e9b1a4f6dc88b270b8b5f" Apr 22 20:09:13.103876 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:09:13.103846 2575 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a501be0a3ac954efbfeae203a72547a560c73347e65e9b1a4f6dc88b270b8b5f\": container with ID starting with a501be0a3ac954efbfeae203a72547a560c73347e65e9b1a4f6dc88b270b8b5f not found: ID does not exist" containerID="a501be0a3ac954efbfeae203a72547a560c73347e65e9b1a4f6dc88b270b8b5f" Apr 22 20:09:13.103977 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:13.103882 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a501be0a3ac954efbfeae203a72547a560c73347e65e9b1a4f6dc88b270b8b5f"} err="failed to get container status \"a501be0a3ac954efbfeae203a72547a560c73347e65e9b1a4f6dc88b270b8b5f\": rpc error: code = NotFound desc = could not find container \"a501be0a3ac954efbfeae203a72547a560c73347e65e9b1a4f6dc88b270b8b5f\": container with ID starting with a501be0a3ac954efbfeae203a72547a560c73347e65e9b1a4f6dc88b270b8b5f not found: ID does not exist" Apr 22 20:09:13.103977 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:13.103902 2575 scope.go:117] "RemoveContainer" containerID="2f294d683816fe5603343b2b9c53f20e70ba66b6a3d6c41c770f20cd2dc86aaf" Apr 22 20:09:13.104226 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:09:13.104205 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f294d683816fe5603343b2b9c53f20e70ba66b6a3d6c41c770f20cd2dc86aaf\": container with ID starting with 2f294d683816fe5603343b2b9c53f20e70ba66b6a3d6c41c770f20cd2dc86aaf not found: ID does not exist" containerID="2f294d683816fe5603343b2b9c53f20e70ba66b6a3d6c41c770f20cd2dc86aaf" Apr 22 20:09:13.104277 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:13.104231 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f294d683816fe5603343b2b9c53f20e70ba66b6a3d6c41c770f20cd2dc86aaf"} err="failed to get container status \"2f294d683816fe5603343b2b9c53f20e70ba66b6a3d6c41c770f20cd2dc86aaf\": rpc error: code = NotFound desc = could not find container \"2f294d683816fe5603343b2b9c53f20e70ba66b6a3d6c41c770f20cd2dc86aaf\": container with ID starting with 2f294d683816fe5603343b2b9c53f20e70ba66b6a3d6c41c770f20cd2dc86aaf not found: ID does not exist" Apr 22 20:09:13.329732 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:13.329699 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6"] Apr 22 20:09:13.334594 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:13.334566 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-5d9df7d95c-vc4h6"] Apr 22 20:09:13.732866 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:13.732830 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f351df6-25b6-4271-adc5-8c82b1828666" path="/var/lib/kubelet/pods/4f351df6-25b6-4271-adc5-8c82b1828666/volumes" Apr 22 20:09:15.021405 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:15.021343 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" event={"ID":"88198d8c-94ba-43a6-a1a5-d5d633fe31a2","Type":"ContainerStarted","Data":"049c03c8ca1c6f586fd226e233bb29d46d63d84e26eeac8b072fce2a09fff400"} Apr 22 20:09:17.031798 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:17.031766 2575 generic.go:358] "Generic (PLEG): container finished" podID="1ebcea6c-965f-48dc-ab7b-2e65a0f62c98" 
containerID="bbaaa7d2fbad9afd33005967a9b46e24de1ba547d2306d675a1c38d3cc8c882f" exitCode=0 Apr 22 20:09:17.032286 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:17.031816 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh" event={"ID":"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98","Type":"ContainerDied","Data":"bbaaa7d2fbad9afd33005967a9b46e24de1ba547d2306d675a1c38d3cc8c882f"} Apr 22 20:09:18.038076 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:18.038040 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh" event={"ID":"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98","Type":"ContainerStarted","Data":"bba392102b2e90e2196a5e9cb9ec5ea83ff3a8b8c6cef83aa0f9ff0968043357"} Apr 22 20:09:18.054886 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:18.054741 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh" podStartSLOduration=8.054720781 podStartE2EDuration="8.054720781s" podCreationTimestamp="2026-04-22 20:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:09:18.053658909 +0000 UTC m=+694.867044100" watchObservedRunningTime="2026-04-22 20:09:18.054720781 +0000 UTC m=+694.868105963" Apr 22 20:09:21.322364 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:21.322309 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh" Apr 22 20:09:21.322364 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:21.322371 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh" Apr 22 20:09:21.338184 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:21.338162 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh" Apr 22 20:09:22.069684 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:22.069651 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh" Apr 22 20:09:44.145280 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:44.145237 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" event={"ID":"88198d8c-94ba-43a6-a1a5-d5d633fe31a2","Type":"ContainerStarted","Data":"3030904126b9f4e1079194b3b6551e5f832ed6b520c912600d45f2c7ad689c97"} Apr 22 20:09:44.145693 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:44.145563 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" Apr 22 20:09:44.148455 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:44.148430 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" Apr 22 20:09:44.166617 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:44.166573 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" podStartSLOduration=2.661354256 podStartE2EDuration="33.166556004s" 
podCreationTimestamp="2026-04-22 20:09:11 +0000 UTC" firstStartedPulling="2026-04-22 20:09:13.001861911 +0000 UTC m=+689.815247069" lastFinishedPulling="2026-04-22 20:09:43.507063659 +0000 UTC m=+720.320448817" observedRunningTime="2026-04-22 20:09:44.164626972 +0000 UTC m=+720.978012152" watchObservedRunningTime="2026-04-22 20:09:44.166556004 +0000 UTC m=+720.979941184" Apr 22 20:09:51.595689 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:51.595651 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" Apr 22 20:09:51.595689 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:51.595700 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" Apr 22 20:09:51.597372 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:51.597345 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" Apr 22 20:09:52.169129 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:52.169103 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" Apr 22 20:09:57.784191 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:57.784157 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn"] Apr 22 20:09:57.784796 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:57.784559 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" podUID="88198d8c-94ba-43a6-a1a5-d5d633fe31a2" containerName="main" containerID="cri-o://049c03c8ca1c6f586fd226e233bb29d46d63d84e26eeac8b072fce2a09fff400" gracePeriod=30 Apr 22 20:09:57.784796 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:57.784617 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" podUID="88198d8c-94ba-43a6-a1a5-d5d633fe31a2" containerName="tokenizer" containerID="cri-o://3030904126b9f4e1079194b3b6551e5f832ed6b520c912600d45f2c7ad689c97" gracePeriod=30 Apr 22 20:09:57.786513 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:57.786485 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh"] Apr 22 20:09:57.786873 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:57.786826 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh" podUID="1ebcea6c-965f-48dc-ab7b-2e65a0f62c98" containerName="main" containerID="cri-o://bba392102b2e90e2196a5e9cb9ec5ea83ff3a8b8c6cef83aa0f9ff0968043357" gracePeriod=30 Apr 22 20:09:58.083228 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:58.083197 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh" Apr 22 20:09:58.109254 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:58.109218 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpg8l\" (UniqueName: \"kubernetes.io/projected/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-kube-api-access-dpg8l\") pod \"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98\" (UID: \"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98\") " Apr 22 20:09:58.109422 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:58.109293 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-tls-certs\") pod \"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98\" (UID: \"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98\") " Apr 22 20:09:58.109422 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:58.109357 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-home\") pod \"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98\" (UID: \"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98\") " Apr 22 20:09:58.109422 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:58.109400 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-kserve-provision-location\") pod \"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98\" (UID: \"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98\") " Apr 22 20:09:58.109595 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:58.109427 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-dshm\") pod \"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98\" (UID: \"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98\") " Apr 22 20:09:58.109595 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:58.109477 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-model-cache\") pod \"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98\" (UID: \"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98\") " Apr 22 20:09:58.109695 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:58.109638 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-home" (OuterVolumeSpecName: "home") pod "1ebcea6c-965f-48dc-ab7b-2e65a0f62c98" (UID: "1ebcea6c-965f-48dc-ab7b-2e65a0f62c98"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:09:58.109794 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:58.109759 2575 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-home\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:09:58.109901 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:58.109877 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-model-cache" (OuterVolumeSpecName: "model-cache") pod "1ebcea6c-965f-48dc-ab7b-2e65a0f62c98" (UID: "1ebcea6c-965f-48dc-ab7b-2e65a0f62c98"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:09:58.111787 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:58.111753 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-dshm" (OuterVolumeSpecName: "dshm") pod "1ebcea6c-965f-48dc-ab7b-2e65a0f62c98" (UID: "1ebcea6c-965f-48dc-ab7b-2e65a0f62c98"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:09:58.111904 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:58.111804 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-kube-api-access-dpg8l" (OuterVolumeSpecName: "kube-api-access-dpg8l") pod "1ebcea6c-965f-48dc-ab7b-2e65a0f62c98" (UID: "1ebcea6c-965f-48dc-ab7b-2e65a0f62c98"). InnerVolumeSpecName "kube-api-access-dpg8l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:09:58.112446 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:58.112415 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "1ebcea6c-965f-48dc-ab7b-2e65a0f62c98" (UID: "1ebcea6c-965f-48dc-ab7b-2e65a0f62c98"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:09:58.169861 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:58.169815 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1ebcea6c-965f-48dc-ab7b-2e65a0f62c98" (UID: "1ebcea6c-965f-48dc-ab7b-2e65a0f62c98"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:09:58.187833 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:58.187783 2575 generic.go:358] "Generic (PLEG): container finished" podID="88198d8c-94ba-43a6-a1a5-d5d633fe31a2" containerID="049c03c8ca1c6f586fd226e233bb29d46d63d84e26eeac8b072fce2a09fff400" exitCode=0 Apr 22 20:09:58.188025 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:58.187861 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" event={"ID":"88198d8c-94ba-43a6-a1a5-d5d633fe31a2","Type":"ContainerDied","Data":"049c03c8ca1c6f586fd226e233bb29d46d63d84e26eeac8b072fce2a09fff400"} Apr 22 20:09:58.189486 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:58.189462 2575 generic.go:358] "Generic (PLEG): container finished" podID="1ebcea6c-965f-48dc-ab7b-2e65a0f62c98" containerID="bba392102b2e90e2196a5e9cb9ec5ea83ff3a8b8c6cef83aa0f9ff0968043357" exitCode=0 Apr 22 20:09:58.189625 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:58.189495 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh" event={"ID":"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98","Type":"ContainerDied","Data":"bba392102b2e90e2196a5e9cb9ec5ea83ff3a8b8c6cef83aa0f9ff0968043357"} Apr 22 20:09:58.189625 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:58.189516 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh" event={"ID":"1ebcea6c-965f-48dc-ab7b-2e65a0f62c98","Type":"ContainerDied","Data":"0fb9eb924ce1d184973afdfe8c624ab8f97e4351507e22db172a032a3b4fce5a"} Apr 22 20:09:58.189625 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:58.189535 2575 scope.go:117] "RemoveContainer" containerID="bba392102b2e90e2196a5e9cb9ec5ea83ff3a8b8c6cef83aa0f9ff0968043357" Apr 22 20:09:58.189625 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:58.189550 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh" Apr 22 20:09:58.197745 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:58.197722 2575 scope.go:117] "RemoveContainer" containerID="bbaaa7d2fbad9afd33005967a9b46e24de1ba547d2306d675a1c38d3cc8c882f" Apr 22 20:09:58.210258 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:58.210232 2575 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-model-cache\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:09:58.210258 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:58.210256 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dpg8l\" (UniqueName: \"kubernetes.io/projected/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-kube-api-access-dpg8l\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:09:58.210440 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:58.210267 2575 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-tls-certs\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:09:58.210440 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:58.210279 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-kserve-provision-location\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:09:58.210440 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:58.210287 2575 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98-dshm\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:09:58.212736 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:58.212710 2575 scope.go:117] "RemoveContainer" containerID="bba392102b2e90e2196a5e9cb9ec5ea83ff3a8b8c6cef83aa0f9ff0968043357" Apr 22 20:09:58.213098 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:09:58.213077 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bba392102b2e90e2196a5e9cb9ec5ea83ff3a8b8c6cef83aa0f9ff0968043357\": container with ID starting with bba392102b2e90e2196a5e9cb9ec5ea83ff3a8b8c6cef83aa0f9ff0968043357 not found: ID does not exist" containerID="bba392102b2e90e2196a5e9cb9ec5ea83ff3a8b8c6cef83aa0f9ff0968043357" Apr 22 20:09:58.213173 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:58.213109 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bba392102b2e90e2196a5e9cb9ec5ea83ff3a8b8c6cef83aa0f9ff0968043357"} err="failed to get container status \"bba392102b2e90e2196a5e9cb9ec5ea83ff3a8b8c6cef83aa0f9ff0968043357\": rpc error: code = NotFound desc = could not find container \"bba392102b2e90e2196a5e9cb9ec5ea83ff3a8b8c6cef83aa0f9ff0968043357\": container with ID starting with bba392102b2e90e2196a5e9cb9ec5ea83ff3a8b8c6cef83aa0f9ff0968043357 not found: ID does not exist" Apr 22 20:09:58.213173 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:58.213127 2575 scope.go:117] "RemoveContainer" containerID="bbaaa7d2fbad9afd33005967a9b46e24de1ba547d2306d675a1c38d3cc8c882f" Apr 22 20:09:58.213459 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:09:58.213436 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"bbaaa7d2fbad9afd33005967a9b46e24de1ba547d2306d675a1c38d3cc8c882f\": container with ID starting with bbaaa7d2fbad9afd33005967a9b46e24de1ba547d2306d675a1c38d3cc8c882f not found: ID does not exist" containerID="bbaaa7d2fbad9afd33005967a9b46e24de1ba547d2306d675a1c38d3cc8c882f" Apr 22 20:09:58.213522 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:58.213465 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbaaa7d2fbad9afd33005967a9b46e24de1ba547d2306d675a1c38d3cc8c882f"} err="failed to get container status \"bbaaa7d2fbad9afd33005967a9b46e24de1ba547d2306d675a1c38d3cc8c882f\": rpc error: code = NotFound desc = could not find container \"bbaaa7d2fbad9afd33005967a9b46e24de1ba547d2306d675a1c38d3cc8c882f\": container with ID starting with bbaaa7d2fbad9afd33005967a9b46e24de1ba547d2306d675a1c38d3cc8c882f not found: ID does not exist" Apr 22 20:09:58.214309 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:58.214290 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh"] Apr 22 20:09:58.217521 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:58.217498 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6c5b68d549-fd5qh"] Apr 22 20:09:59.148306 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.148248 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" Apr 22 20:09:59.195522 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.195494 2575 generic.go:358] "Generic (PLEG): container finished" podID="88198d8c-94ba-43a6-a1a5-d5d633fe31a2" containerID="3030904126b9f4e1079194b3b6551e5f832ed6b520c912600d45f2c7ad689c97" exitCode=0 Apr 22 20:09:59.195673 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.195550 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" event={"ID":"88198d8c-94ba-43a6-a1a5-d5d633fe31a2","Type":"ContainerDied","Data":"3030904126b9f4e1079194b3b6551e5f832ed6b520c912600d45f2c7ad689c97"} Apr 22 20:09:59.195673 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.195575 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" event={"ID":"88198d8c-94ba-43a6-a1a5-d5d633fe31a2","Type":"ContainerDied","Data":"191f5548034260b447d55349d1ee9865990ea3a0159b4ad39e79d3f055cfbcd7"} Apr 22 20:09:59.195673 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.195590 2575 scope.go:117] "RemoveContainer" containerID="3030904126b9f4e1079194b3b6551e5f832ed6b520c912600d45f2c7ad689c97" Apr 22 20:09:59.195673 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.195589 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn" Apr 22 20:09:59.203137 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.203120 2575 scope.go:117] "RemoveContainer" containerID="049c03c8ca1c6f586fd226e233bb29d46d63d84e26eeac8b072fce2a09fff400" Apr 22 20:09:59.210117 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.210097 2575 scope.go:117] "RemoveContainer" containerID="222174c2ebec22a25f884d5e75dc7ffb41b0c61749e85876aa48fc769bca489f" Apr 22 20:09:59.217010 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.216995 2575 scope.go:117] "RemoveContainer" containerID="3030904126b9f4e1079194b3b6551e5f832ed6b520c912600d45f2c7ad689c97" Apr 22 20:09:59.217219 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:09:59.217203 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3030904126b9f4e1079194b3b6551e5f832ed6b520c912600d45f2c7ad689c97\": container with ID starting with 3030904126b9f4e1079194b3b6551e5f832ed6b520c912600d45f2c7ad689c97 not found: ID does not exist" containerID="3030904126b9f4e1079194b3b6551e5f832ed6b520c912600d45f2c7ad689c97" Apr 22 20:09:59.217269 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.217227 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3030904126b9f4e1079194b3b6551e5f832ed6b520c912600d45f2c7ad689c97"} err="failed to get container status \"3030904126b9f4e1079194b3b6551e5f832ed6b520c912600d45f2c7ad689c97\": rpc error: code = NotFound desc = could not find container \"3030904126b9f4e1079194b3b6551e5f832ed6b520c912600d45f2c7ad689c97\": container with ID starting with 3030904126b9f4e1079194b3b6551e5f832ed6b520c912600d45f2c7ad689c97 not found: ID does not exist" Apr 22 20:09:59.217269 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.217253 2575 scope.go:117] "RemoveContainer" containerID="049c03c8ca1c6f586fd226e233bb29d46d63d84e26eeac8b072fce2a09fff400" Apr 22 20:09:59.217526 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:09:59.217501 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"049c03c8ca1c6f586fd226e233bb29d46d63d84e26eeac8b072fce2a09fff400\": container with ID starting with 049c03c8ca1c6f586fd226e233bb29d46d63d84e26eeac8b072fce2a09fff400 not found: ID does not exist" containerID="049c03c8ca1c6f586fd226e233bb29d46d63d84e26eeac8b072fce2a09fff400" Apr 22 20:09:59.217619 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.217529 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"049c03c8ca1c6f586fd226e233bb29d46d63d84e26eeac8b072fce2a09fff400"} err="failed to get container status \"049c03c8ca1c6f586fd226e233bb29d46d63d84e26eeac8b072fce2a09fff400\": rpc error: code = NotFound desc = could not find container \"049c03c8ca1c6f586fd226e233bb29d46d63d84e26eeac8b072fce2a09fff400\": container with ID starting with 049c03c8ca1c6f586fd226e233bb29d46d63d84e26eeac8b072fce2a09fff400 not found: ID does not exist" Apr 22 20:09:59.217619 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.217547 2575 scope.go:117] "RemoveContainer" containerID="222174c2ebec22a25f884d5e75dc7ffb41b0c61749e85876aa48fc769bca489f" Apr 22 20:09:59.217773 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:09:59.217757 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"222174c2ebec22a25f884d5e75dc7ffb41b0c61749e85876aa48fc769bca489f\": container with ID starting with 222174c2ebec22a25f884d5e75dc7ffb41b0c61749e85876aa48fc769bca489f not found: ID does not exist" containerID="222174c2ebec22a25f884d5e75dc7ffb41b0c61749e85876aa48fc769bca489f" Apr 22 20:09:59.217816 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.217777 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"222174c2ebec22a25f884d5e75dc7ffb41b0c61749e85876aa48fc769bca489f"} err="failed to get container status \"222174c2ebec22a25f884d5e75dc7ffb41b0c61749e85876aa48fc769bca489f\": rpc error: code = NotFound desc = could not find container \"222174c2ebec22a25f884d5e75dc7ffb41b0c61749e85876aa48fc769bca489f\": container with ID starting with 222174c2ebec22a25f884d5e75dc7ffb41b0c61749e85876aa48fc769bca489f not found: ID does not exist" Apr 22 20:09:59.218998 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.218983 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-tokenizer-uds\") pod \"88198d8c-94ba-43a6-a1a5-d5d633fe31a2\" (UID: \"88198d8c-94ba-43a6-a1a5-d5d633fe31a2\") " Apr 22 20:09:59.219052 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.219022 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-tokenizer-cache\") pod \"88198d8c-94ba-43a6-a1a5-d5d633fe31a2\" (UID: \"88198d8c-94ba-43a6-a1a5-d5d633fe31a2\") " Apr 22 20:09:59.219052 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.219044 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-tokenizer-tmp\") pod \"88198d8c-94ba-43a6-a1a5-d5d633fe31a2\" (UID: \"88198d8c-94ba-43a6-a1a5-d5d633fe31a2\") " Apr 22 20:09:59.219116 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.219084 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-tls-certs\") pod \"88198d8c-94ba-43a6-a1a5-d5d633fe31a2\" (UID: \"88198d8c-94ba-43a6-a1a5-d5d633fe31a2\") " Apr 22 20:09:59.219207 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.219115 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-kserve-provision-location\") pod \"88198d8c-94ba-43a6-a1a5-d5d633fe31a2\" (UID: \"88198d8c-94ba-43a6-a1a5-d5d633fe31a2\") " Apr 22 20:09:59.219207 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.219153 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6qhd\" (UniqueName: \"kubernetes.io/projected/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-kube-api-access-m6qhd\") pod \"88198d8c-94ba-43a6-a1a5-d5d633fe31a2\" (UID: \"88198d8c-94ba-43a6-a1a5-d5d633fe31a2\") " Apr 22 20:09:59.219304 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.219253 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "88198d8c-94ba-43a6-a1a5-d5d633fe31a2" (UID: "88198d8c-94ba-43a6-a1a5-d5d633fe31a2"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:09:59.219304 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.219267 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "88198d8c-94ba-43a6-a1a5-d5d633fe31a2" (UID: "88198d8c-94ba-43a6-a1a5-d5d633fe31a2"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:09:59.219444 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.219407 2575 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-tokenizer-uds\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:09:59.219444 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.219423 2575 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-tokenizer-cache\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:09:59.219549 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.219435 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "88198d8c-94ba-43a6-a1a5-d5d633fe31a2" (UID: "88198d8c-94ba-43a6-a1a5-d5d633fe31a2"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:09:59.219842 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.219817 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "88198d8c-94ba-43a6-a1a5-d5d633fe31a2" (UID: "88198d8c-94ba-43a6-a1a5-d5d633fe31a2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:09:59.221185 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.221164 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-kube-api-access-m6qhd" (OuterVolumeSpecName: "kube-api-access-m6qhd") pod "88198d8c-94ba-43a6-a1a5-d5d633fe31a2" (UID: "88198d8c-94ba-43a6-a1a5-d5d633fe31a2"). InnerVolumeSpecName "kube-api-access-m6qhd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:09:59.221252 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.221220 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "88198d8c-94ba-43a6-a1a5-d5d633fe31a2" (UID: "88198d8c-94ba-43a6-a1a5-d5d633fe31a2"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:09:59.320501 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.320468 2575 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-tls-certs\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:09:59.320501 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.320495 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-kserve-provision-location\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:09:59.320501 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.320505 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m6qhd\" (UniqueName: \"kubernetes.io/projected/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-kube-api-access-m6qhd\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:09:59.320719 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.320513 2575 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/88198d8c-94ba-43a6-a1a5-d5d633fe31a2-tokenizer-tmp\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:09:59.517676 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.517642 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn"] Apr 22 20:09:59.522588 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.522562 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7c5b756jtcsn"] Apr 22 20:09:59.731840 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.731810 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ebcea6c-965f-48dc-ab7b-2e65a0f62c98" path="/var/lib/kubelet/pods/1ebcea6c-965f-48dc-ab7b-2e65a0f62c98/volumes" Apr 22 20:09:59.732212 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:09:59.732199 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88198d8c-94ba-43a6-a1a5-d5d633fe31a2" path="/var/lib/kubelet/pods/88198d8c-94ba-43a6-a1a5-d5d633fe31a2/volumes" Apr 22 20:10:08.499463 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.499429 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9"] Apr 22 20:10:08.499899 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.499721 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f351df6-25b6-4271-adc5-8c82b1828666" containerName="storage-initializer" Apr 22 20:10:08.499899 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.499732 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f351df6-25b6-4271-adc5-8c82b1828666" containerName="storage-initializer" Apr 22 20:10:08.499899 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.499740 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ebcea6c-965f-48dc-ab7b-2e65a0f62c98" containerName="main" Apr 22 20:10:08.499899 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.499746 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ebcea6c-965f-48dc-ab7b-2e65a0f62c98" containerName="main" Apr 22 20:10:08.499899 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.499757 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="4f351df6-25b6-4271-adc5-8c82b1828666" containerName="main" Apr 22 20:10:08.499899 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.499764 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f351df6-25b6-4271-adc5-8c82b1828666" containerName="main" Apr 22 20:10:08.499899 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.499777 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ebcea6c-965f-48dc-ab7b-2e65a0f62c98" containerName="storage-initializer" Apr 22 20:10:08.499899 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.499782 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ebcea6c-965f-48dc-ab7b-2e65a0f62c98" containerName="storage-initializer" Apr 22 20:10:08.499899 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.499789 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88198d8c-94ba-43a6-a1a5-d5d633fe31a2" containerName="storage-initializer" Apr 22 20:10:08.499899 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.499795 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="88198d8c-94ba-43a6-a1a5-d5d633fe31a2" containerName="storage-initializer" Apr 22 20:10:08.499899 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.499800 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88198d8c-94ba-43a6-a1a5-d5d633fe31a2" containerName="main" Apr 22 20:10:08.499899 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.499805 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="88198d8c-94ba-43a6-a1a5-d5d633fe31a2" containerName="main" Apr 22 20:10:08.499899 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.499811 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88198d8c-94ba-43a6-a1a5-d5d633fe31a2" containerName="tokenizer" Apr 22 20:10:08.499899 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.499815 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="88198d8c-94ba-43a6-a1a5-d5d633fe31a2" containerName="tokenizer" Apr 22 20:10:08.503914 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.500802 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f351df6-25b6-4271-adc5-8c82b1828666" containerName="main" Apr 22 20:10:08.503914 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.500848 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1ebcea6c-965f-48dc-ab7b-2e65a0f62c98" containerName="main" Apr 22 20:10:08.503914 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.500873 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="88198d8c-94ba-43a6-a1a5-d5d633fe31a2" containerName="tokenizer" Apr 22 20:10:08.503914 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.500885 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="88198d8c-94ba-43a6-a1a5-d5d633fe31a2" containerName="main" Apr 22 20:10:08.507448 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.507426 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9" Apr 22 20:10:08.510061 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.510036 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 20:10:08.510960 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.510939 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 20:10:08.511035 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.510939 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 22 20:10:08.511035 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.511003 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tlcdw\"" Apr 22 20:10:08.514878 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.514593 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9"] Apr 22 20:10:08.598875 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.598838 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ba05bd15-cbba-47d8-8941-bc53cc01f769-model-cache\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-lb4z9\" (UID: \"ba05bd15-cbba-47d8-8941-bc53cc01f769\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9" Apr 22 20:10:08.599058 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.598886 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4qv9\" (UniqueName: \"kubernetes.io/projected/ba05bd15-cbba-47d8-8941-bc53cc01f769-kube-api-access-d4qv9\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-lb4z9\" (UID: \"ba05bd15-cbba-47d8-8941-bc53cc01f769\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9" Apr 22 20:10:08.599058 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.598953 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ba05bd15-cbba-47d8-8941-bc53cc01f769-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-lb4z9\" (UID: \"ba05bd15-cbba-47d8-8941-bc53cc01f769\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9" Apr 22 20:10:08.599058 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.598978 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ba05bd15-cbba-47d8-8941-bc53cc01f769-home\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-lb4z9\" (UID: \"ba05bd15-cbba-47d8-8941-bc53cc01f769\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9" Apr 22 20:10:08.599058 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.599000 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ba05bd15-cbba-47d8-8941-bc53cc01f769-dshm\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-lb4z9\" (UID: \"ba05bd15-cbba-47d8-8941-bc53cc01f769\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9" Apr 22 
20:10:08.599058 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.599027 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ba05bd15-cbba-47d8-8941-bc53cc01f769-tls-certs\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-lb4z9\" (UID: \"ba05bd15-cbba-47d8-8941-bc53cc01f769\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9" Apr 22 20:10:08.699913 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.699883 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ba05bd15-cbba-47d8-8941-bc53cc01f769-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-lb4z9\" (UID: \"ba05bd15-cbba-47d8-8941-bc53cc01f769\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9" Apr 22 20:10:08.700098 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.699920 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ba05bd15-cbba-47d8-8941-bc53cc01f769-home\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-lb4z9\" (UID: \"ba05bd15-cbba-47d8-8941-bc53cc01f769\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9" Apr 22 20:10:08.700098 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.699949 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ba05bd15-cbba-47d8-8941-bc53cc01f769-dshm\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-lb4z9\" (UID: \"ba05bd15-cbba-47d8-8941-bc53cc01f769\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9" Apr 22 20:10:08.700098 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.699985 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ba05bd15-cbba-47d8-8941-bc53cc01f769-tls-certs\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-lb4z9\" (UID: \"ba05bd15-cbba-47d8-8941-bc53cc01f769\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9" Apr 22 20:10:08.700098 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.700017 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ba05bd15-cbba-47d8-8941-bc53cc01f769-model-cache\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-lb4z9\" (UID: \"ba05bd15-cbba-47d8-8941-bc53cc01f769\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9" Apr 22 20:10:08.700098 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.700060 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4qv9\" (UniqueName: \"kubernetes.io/projected/ba05bd15-cbba-47d8-8941-bc53cc01f769-kube-api-access-d4qv9\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-lb4z9\" (UID: \"ba05bd15-cbba-47d8-8941-bc53cc01f769\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9" Apr 22 20:10:08.700397 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.700370 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ba05bd15-cbba-47d8-8941-bc53cc01f769-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-lb4z9\" (UID: 
\"ba05bd15-cbba-47d8-8941-bc53cc01f769\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9" Apr 22 20:10:08.700453 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.700414 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ba05bd15-cbba-47d8-8941-bc53cc01f769-home\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-lb4z9\" (UID: \"ba05bd15-cbba-47d8-8941-bc53cc01f769\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9" Apr 22 20:10:08.700453 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.700425 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ba05bd15-cbba-47d8-8941-bc53cc01f769-model-cache\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-lb4z9\" (UID: \"ba05bd15-cbba-47d8-8941-bc53cc01f769\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9" Apr 22 20:10:08.702209 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.702188 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ba05bd15-cbba-47d8-8941-bc53cc01f769-dshm\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-lb4z9\" (UID: \"ba05bd15-cbba-47d8-8941-bc53cc01f769\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9" Apr 22 20:10:08.702528 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.702509 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ba05bd15-cbba-47d8-8941-bc53cc01f769-tls-certs\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-lb4z9\" (UID: \"ba05bd15-cbba-47d8-8941-bc53cc01f769\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9" Apr 22 20:10:08.712048 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.712023 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4qv9\" (UniqueName: \"kubernetes.io/projected/ba05bd15-cbba-47d8-8941-bc53cc01f769-kube-api-access-d4qv9\") pod \"precise-prefix-cache-test-kserve-77b55c8fc-lb4z9\" (UID: \"ba05bd15-cbba-47d8-8941-bc53cc01f769\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9" Apr 22 20:10:08.733359 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.733331 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw"] Apr 22 20:10:08.737854 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.737830 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" Apr 22 20:10:08.740527 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.740507 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-epp-sa-dockercfg-8skwx\"" Apr 22 20:10:08.749608 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.749556 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw"] Apr 22 20:10:08.800981 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.800951 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0616ada0-d9d3-4a2a-917a-ba2b2da10685-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw\" (UID: \"0616ada0-d9d3-4a2a-917a-ba2b2da10685\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" Apr 22 20:10:08.801165 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.800988 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0616ada0-d9d3-4a2a-917a-ba2b2da10685-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw\" (UID: \"0616ada0-d9d3-4a2a-917a-ba2b2da10685\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" Apr 22 20:10:08.801165 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.801005 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0616ada0-d9d3-4a2a-917a-ba2b2da10685-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw\" (UID: \"0616ada0-d9d3-4a2a-917a-ba2b2da10685\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" Apr 22 20:10:08.801165 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.801066 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0616ada0-d9d3-4a2a-917a-ba2b2da10685-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw\" (UID: \"0616ada0-d9d3-4a2a-917a-ba2b2da10685\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" Apr 22 20:10:08.801165 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.801101 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0616ada0-d9d3-4a2a-917a-ba2b2da10685-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw\" (UID: \"0616ada0-d9d3-4a2a-917a-ba2b2da10685\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" Apr 22 20:10:08.801165 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.801119 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkvq2\" (UniqueName: \"kubernetes.io/projected/0616ada0-d9d3-4a2a-917a-ba2b2da10685-kube-api-access-qkvq2\") pod \"precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw\" (UID: \"0616ada0-d9d3-4a2a-917a-ba2b2da10685\") " 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" Apr 22 20:10:08.818973 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.818942 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9" Apr 22 20:10:08.902109 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.902076 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0616ada0-d9d3-4a2a-917a-ba2b2da10685-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw\" (UID: \"0616ada0-d9d3-4a2a-917a-ba2b2da10685\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" Apr 22 20:10:08.902234 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.902127 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0616ada0-d9d3-4a2a-917a-ba2b2da10685-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw\" (UID: \"0616ada0-d9d3-4a2a-917a-ba2b2da10685\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" Apr 22 20:10:08.902234 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.902157 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0616ada0-d9d3-4a2a-917a-ba2b2da10685-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw\" (UID: \"0616ada0-d9d3-4a2a-917a-ba2b2da10685\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" Apr 22 20:10:08.902234 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.902201 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0616ada0-d9d3-4a2a-917a-ba2b2da10685-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw\" (UID: \"0616ada0-d9d3-4a2a-917a-ba2b2da10685\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" Apr 22 20:10:08.902431 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.902231 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0616ada0-d9d3-4a2a-917a-ba2b2da10685-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw\" (UID: \"0616ada0-d9d3-4a2a-917a-ba2b2da10685\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" Apr 22 20:10:08.902431 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.902260 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkvq2\" (UniqueName: \"kubernetes.io/projected/0616ada0-d9d3-4a2a-917a-ba2b2da10685-kube-api-access-qkvq2\") pod \"precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw\" (UID: \"0616ada0-d9d3-4a2a-917a-ba2b2da10685\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" Apr 22 20:10:08.902665 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.902516 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0616ada0-d9d3-4a2a-917a-ba2b2da10685-kserve-provision-location\") pod 
\"precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw\" (UID: \"0616ada0-d9d3-4a2a-917a-ba2b2da10685\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" Apr 22 20:10:08.902665 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.902546 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0616ada0-d9d3-4a2a-917a-ba2b2da10685-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw\" (UID: \"0616ada0-d9d3-4a2a-917a-ba2b2da10685\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" Apr 22 20:10:08.902665 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.902584 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0616ada0-d9d3-4a2a-917a-ba2b2da10685-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw\" (UID: \"0616ada0-d9d3-4a2a-917a-ba2b2da10685\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" Apr 22 20:10:08.902880 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.902860 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0616ada0-d9d3-4a2a-917a-ba2b2da10685-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw\" (UID: \"0616ada0-d9d3-4a2a-917a-ba2b2da10685\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" Apr 22 20:10:08.905102 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.905079 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0616ada0-d9d3-4a2a-917a-ba2b2da10685-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw\" (UID: \"0616ada0-d9d3-4a2a-917a-ba2b2da10685\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" Apr 22 20:10:08.912880 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.912855 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkvq2\" (UniqueName: \"kubernetes.io/projected/0616ada0-d9d3-4a2a-917a-ba2b2da10685-kube-api-access-qkvq2\") pod \"precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw\" (UID: \"0616ada0-d9d3-4a2a-917a-ba2b2da10685\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" Apr 22 20:10:08.941373 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:08.941343 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9"] Apr 22 20:10:08.944375 ip-10-0-133-60 kubenswrapper[2575]: W0422 20:10:08.944349 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba05bd15_cbba_47d8_8941_bc53cc01f769.slice/crio-fbb0651d9beb65ab0d21f56a3665435f2980afd8ec00bef6db7cf3a45299b130 WatchSource:0}: Error finding container fbb0651d9beb65ab0d21f56a3665435f2980afd8ec00bef6db7cf3a45299b130: Status 404 returned error can't find the container with id fbb0651d9beb65ab0d21f56a3665435f2980afd8ec00bef6db7cf3a45299b130 Apr 22 20:10:09.048819 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:09.048737 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" Apr 22 20:10:09.186600 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:09.186571 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw"] Apr 22 20:10:09.188304 ip-10-0-133-60 kubenswrapper[2575]: W0422 20:10:09.188277 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0616ada0_d9d3_4a2a_917a_ba2b2da10685.slice/crio-f9c5332214ebda333f034e95868d9619bd7ae9ae0c96b62297949dcd47e40bc6 WatchSource:0}: Error finding container f9c5332214ebda333f034e95868d9619bd7ae9ae0c96b62297949dcd47e40bc6: Status 404 returned error can't find the container with id f9c5332214ebda333f034e95868d9619bd7ae9ae0c96b62297949dcd47e40bc6 Apr 22 20:10:09.227392 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:09.227348 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" event={"ID":"0616ada0-d9d3-4a2a-917a-ba2b2da10685","Type":"ContainerStarted","Data":"f9c5332214ebda333f034e95868d9619bd7ae9ae0c96b62297949dcd47e40bc6"} Apr 22 20:10:09.229009 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:09.228983 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9" event={"ID":"ba05bd15-cbba-47d8-8941-bc53cc01f769","Type":"ContainerStarted","Data":"417e8d101239f4cb25df778d5c148142691c8312544ed60d8efd60b4fc166d15"} Apr 22 20:10:09.229126 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:09.229013 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9" event={"ID":"ba05bd15-cbba-47d8-8941-bc53cc01f769","Type":"ContainerStarted","Data":"fbb0651d9beb65ab0d21f56a3665435f2980afd8ec00bef6db7cf3a45299b130"} Apr 22 20:10:10.233907 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:10.233867 2575 generic.go:358] "Generic (PLEG): container finished" podID="0616ada0-d9d3-4a2a-917a-ba2b2da10685" containerID="225157338e23dade0a34cacb1753c5f59f2825e5cf0a9cd07920701f77231d36" exitCode=0 Apr 22 20:10:10.234330 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:10.233925 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" event={"ID":"0616ada0-d9d3-4a2a-917a-ba2b2da10685","Type":"ContainerDied","Data":"225157338e23dade0a34cacb1753c5f59f2825e5cf0a9cd07920701f77231d36"} Apr 22 20:10:11.238954 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:11.238914 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" event={"ID":"0616ada0-d9d3-4a2a-917a-ba2b2da10685","Type":"ContainerStarted","Data":"bbaaac297e392c48dcdc1b6caab5c81dc1fe87684fa0ea26925fda0d64a1a181"} Apr 22 20:10:11.239364 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:11.238959 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" event={"ID":"0616ada0-d9d3-4a2a-917a-ba2b2da10685","Type":"ContainerStarted","Data":"9959a38bc1faddfd99fe94dc654c842cceaca1efe4577c2d8117e3a9d5444fa5"} Apr 22 20:10:11.239364 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:11.238994 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" Apr 22 20:10:11.258612 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:11.258568 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" podStartSLOduration=3.258552141 podStartE2EDuration="3.258552141s" podCreationTimestamp="2026-04-22 20:10:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:10:11.256619886 +0000 UTC m=+748.070005067" watchObservedRunningTime="2026-04-22 20:10:11.258552141 +0000 UTC m=+748.071937322" Apr 22 20:10:17.260622 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:17.260584 2575 generic.go:358] "Generic (PLEG): container finished" podID="ba05bd15-cbba-47d8-8941-bc53cc01f769" containerID="417e8d101239f4cb25df778d5c148142691c8312544ed60d8efd60b4fc166d15" exitCode=0 Apr 22 20:10:17.260984 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:17.260652 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9" event={"ID":"ba05bd15-cbba-47d8-8941-bc53cc01f769","Type":"ContainerDied","Data":"417e8d101239f4cb25df778d5c148142691c8312544ed60d8efd60b4fc166d15"} Apr 22 20:10:18.265022 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:18.264984 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9" event={"ID":"ba05bd15-cbba-47d8-8941-bc53cc01f769","Type":"ContainerStarted","Data":"412f52886be90d73a2f1c46920954b54eb3aedbfe7e4bd567a39825ec34d3af9"} Apr 22 20:10:18.283394 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:18.283339 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9" podStartSLOduration=10.283301369 podStartE2EDuration="10.283301369s" podCreationTimestamp="2026-04-22 20:10:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:10:18.281775583 +0000 UTC m=+755.095160763" watchObservedRunningTime="2026-04-22 20:10:18.283301369 +0000 UTC m=+755.096686549" Apr 22 20:10:18.819744 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:18.819702 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9" Apr 22 20:10:18.819744 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:18.819748 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9" Apr 22 20:10:18.837145 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:18.837115 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9" Apr 22 20:10:19.049473 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:19.049440 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" Apr 22 20:10:19.049630 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:19.049596 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" Apr 22 20:10:19.050770 
ip-10-0-133-60 kubenswrapper[2575]: W0422 20:10:19.050749 2575 logging.go:55] [core] [Channel #26 SubChannel #27]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.33:9003", ServerName: "10.132.0.33:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.33:9003: connect: connection refused" Apr 22 20:10:19.052047 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:19.052027 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" Apr 22 20:10:19.269848 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:19.269821 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" Apr 22 20:10:19.279961 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:19.279939 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9" Apr 22 20:10:20.050235 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:20.050190 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" podUID="0616ada0-d9d3-4a2a-917a-ba2b2da10685" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.33:9003\" within 1s: context deadline exceeded" Apr 22 20:10:29.049459 ip-10-0-133-60 kubenswrapper[2575]: W0422 20:10:29.049429 2575 logging.go:55] [core] [Channel #28 SubChannel #29]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.33:9003", ServerName: "10.132.0.33:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.33:9003: connect: connection refused" Apr 22 20:10:30.049813 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:30.049764 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" podUID="0616ada0-d9d3-4a2a-917a-ba2b2da10685" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.33:9003\" within 1s: context deadline exceeded" Apr 22 20:10:30.050187 ip-10-0-133-60 kubenswrapper[2575]: W0422 20:10:30.049898 2575 logging.go:55] [core] [Channel #28 SubChannel #29]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.33:9003", ServerName: "10.132.0.33:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.33:9003: operation was canceled" Apr 22 20:10:41.277712 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:41.277679 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" Apr 22 20:10:42.424107 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:42.424075 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw"] Apr 22 20:10:42.424602 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:42.424375 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" podUID="0616ada0-d9d3-4a2a-917a-ba2b2da10685" containerName="main" containerID="cri-o://9959a38bc1faddfd99fe94dc654c842cceaca1efe4577c2d8117e3a9d5444fa5" gracePeriod=30 Apr 22 20:10:42.424602 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:42.424459 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" podUID="0616ada0-d9d3-4a2a-917a-ba2b2da10685" containerName="tokenizer" containerID="cri-o://bbaaac297e392c48dcdc1b6caab5c81dc1fe87684fa0ea26925fda0d64a1a181" gracePeriod=30 Apr 22 20:10:42.432665 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:42.432632 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9"] Apr 22 20:10:42.433041 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:42.432956 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9" podUID="ba05bd15-cbba-47d8-8941-bc53cc01f769" containerName="main" containerID="cri-o://412f52886be90d73a2f1c46920954b54eb3aedbfe7e4bd567a39825ec34d3af9" gracePeriod=30 Apr 22 20:10:42.685423 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:42.685365 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9" Apr 22 20:10:42.798290 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:42.798256 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4qv9\" (UniqueName: \"kubernetes.io/projected/ba05bd15-cbba-47d8-8941-bc53cc01f769-kube-api-access-d4qv9\") pod \"ba05bd15-cbba-47d8-8941-bc53cc01f769\" (UID: \"ba05bd15-cbba-47d8-8941-bc53cc01f769\") " Apr 22 20:10:42.798496 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:42.798342 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ba05bd15-cbba-47d8-8941-bc53cc01f769-home\") pod \"ba05bd15-cbba-47d8-8941-bc53cc01f769\" (UID: \"ba05bd15-cbba-47d8-8941-bc53cc01f769\") " Apr 22 20:10:42.798496 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:42.798377 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ba05bd15-cbba-47d8-8941-bc53cc01f769-model-cache\") pod \"ba05bd15-cbba-47d8-8941-bc53cc01f769\" (UID: \"ba05bd15-cbba-47d8-8941-bc53cc01f769\") " Apr 22 20:10:42.798496 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:42.798404 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ba05bd15-cbba-47d8-8941-bc53cc01f769-tls-certs\") pod \"ba05bd15-cbba-47d8-8941-bc53cc01f769\" (UID: \"ba05bd15-cbba-47d8-8941-bc53cc01f769\") " Apr 22 20:10:42.798496 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:42.798452 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ba05bd15-cbba-47d8-8941-bc53cc01f769-kserve-provision-location\") pod \"ba05bd15-cbba-47d8-8941-bc53cc01f769\" (UID: \"ba05bd15-cbba-47d8-8941-bc53cc01f769\") " Apr 22 20:10:42.798732 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:42.798504 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ba05bd15-cbba-47d8-8941-bc53cc01f769-dshm\") pod \"ba05bd15-cbba-47d8-8941-bc53cc01f769\" (UID: \"ba05bd15-cbba-47d8-8941-bc53cc01f769\") " Apr 22 20:10:42.798732 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:42.798608 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba05bd15-cbba-47d8-8941-bc53cc01f769-model-cache" (OuterVolumeSpecName: "model-cache") pod "ba05bd15-cbba-47d8-8941-bc53cc01f769" (UID: "ba05bd15-cbba-47d8-8941-bc53cc01f769"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:10:42.798732 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:42.798635 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba05bd15-cbba-47d8-8941-bc53cc01f769-home" (OuterVolumeSpecName: "home") pod "ba05bd15-cbba-47d8-8941-bc53cc01f769" (UID: "ba05bd15-cbba-47d8-8941-bc53cc01f769"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:10:42.798872 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:42.798784 2575 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ba05bd15-cbba-47d8-8941-bc53cc01f769-model-cache\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:10:42.798872 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:42.798802 2575 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ba05bd15-cbba-47d8-8941-bc53cc01f769-home\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:10:42.800570 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:42.800539 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba05bd15-cbba-47d8-8941-bc53cc01f769-kube-api-access-d4qv9" (OuterVolumeSpecName: "kube-api-access-d4qv9") pod "ba05bd15-cbba-47d8-8941-bc53cc01f769" (UID: "ba05bd15-cbba-47d8-8941-bc53cc01f769"). InnerVolumeSpecName "kube-api-access-d4qv9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:10:42.800670 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:42.800604 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba05bd15-cbba-47d8-8941-bc53cc01f769-dshm" (OuterVolumeSpecName: "dshm") pod "ba05bd15-cbba-47d8-8941-bc53cc01f769" (UID: "ba05bd15-cbba-47d8-8941-bc53cc01f769"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:10:42.800792 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:42.800774 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba05bd15-cbba-47d8-8941-bc53cc01f769-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ba05bd15-cbba-47d8-8941-bc53cc01f769" (UID: "ba05bd15-cbba-47d8-8941-bc53cc01f769"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:10:42.858291 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:42.858231 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba05bd15-cbba-47d8-8941-bc53cc01f769-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ba05bd15-cbba-47d8-8941-bc53cc01f769" (UID: "ba05bd15-cbba-47d8-8941-bc53cc01f769"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:10:42.899489 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:42.899445 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ba05bd15-cbba-47d8-8941-bc53cc01f769-kserve-provision-location\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:10:42.899489 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:42.899483 2575 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ba05bd15-cbba-47d8-8941-bc53cc01f769-dshm\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:10:42.899692 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:42.899498 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d4qv9\" (UniqueName: \"kubernetes.io/projected/ba05bd15-cbba-47d8-8941-bc53cc01f769-kube-api-access-d4qv9\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:10:42.899692 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:42.899511 2575 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ba05bd15-cbba-47d8-8941-bc53cc01f769-tls-certs\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:10:43.358388 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:43.358354 2575 generic.go:358] "Generic (PLEG): container finished" podID="ba05bd15-cbba-47d8-8941-bc53cc01f769" containerID="412f52886be90d73a2f1c46920954b54eb3aedbfe7e4bd567a39825ec34d3af9" exitCode=0 Apr 22 20:10:43.358596 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:43.358436 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9" Apr 22 20:10:43.358596 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:43.358443 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9" event={"ID":"ba05bd15-cbba-47d8-8941-bc53cc01f769","Type":"ContainerDied","Data":"412f52886be90d73a2f1c46920954b54eb3aedbfe7e4bd567a39825ec34d3af9"} Apr 22 20:10:43.358596 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:43.358483 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9" event={"ID":"ba05bd15-cbba-47d8-8941-bc53cc01f769","Type":"ContainerDied","Data":"fbb0651d9beb65ab0d21f56a3665435f2980afd8ec00bef6db7cf3a45299b130"} Apr 22 20:10:43.358596 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:43.358500 2575 scope.go:117] "RemoveContainer" containerID="412f52886be90d73a2f1c46920954b54eb3aedbfe7e4bd567a39825ec34d3af9" Apr 22 20:10:43.360552 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:43.360529 2575 generic.go:358] "Generic (PLEG): container finished" podID="0616ada0-d9d3-4a2a-917a-ba2b2da10685" containerID="9959a38bc1faddfd99fe94dc654c842cceaca1efe4577c2d8117e3a9d5444fa5" exitCode=0 Apr 22 20:10:43.360697 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:43.360568 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" event={"ID":"0616ada0-d9d3-4a2a-917a-ba2b2da10685","Type":"ContainerDied","Data":"9959a38bc1faddfd99fe94dc654c842cceaca1efe4577c2d8117e3a9d5444fa5"} Apr 22 20:10:43.369181 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:43.369163 2575 scope.go:117] "RemoveContainer" 
containerID="417e8d101239f4cb25df778d5c148142691c8312544ed60d8efd60b4fc166d15" Apr 22 20:10:43.379570 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:43.379545 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9"] Apr 22 20:10:43.382636 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:43.382606 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-77b55c8fc-lb4z9"] Apr 22 20:10:43.438139 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:43.437995 2575 scope.go:117] "RemoveContainer" containerID="412f52886be90d73a2f1c46920954b54eb3aedbfe7e4bd567a39825ec34d3af9" Apr 22 20:10:43.438518 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:10:43.438483 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"412f52886be90d73a2f1c46920954b54eb3aedbfe7e4bd567a39825ec34d3af9\": container with ID starting with 412f52886be90d73a2f1c46920954b54eb3aedbfe7e4bd567a39825ec34d3af9 not found: ID does not exist" containerID="412f52886be90d73a2f1c46920954b54eb3aedbfe7e4bd567a39825ec34d3af9" Apr 22 20:10:43.438600 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:43.438510 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"412f52886be90d73a2f1c46920954b54eb3aedbfe7e4bd567a39825ec34d3af9"} err="failed to get container status \"412f52886be90d73a2f1c46920954b54eb3aedbfe7e4bd567a39825ec34d3af9\": rpc error: code = NotFound desc = could not find container \"412f52886be90d73a2f1c46920954b54eb3aedbfe7e4bd567a39825ec34d3af9\": container with ID starting with 412f52886be90d73a2f1c46920954b54eb3aedbfe7e4bd567a39825ec34d3af9 not found: ID does not exist" Apr 22 20:10:43.438600 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:43.438530 2575 scope.go:117] "RemoveContainer" containerID="417e8d101239f4cb25df778d5c148142691c8312544ed60d8efd60b4fc166d15" Apr 22 20:10:43.438815 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:10:43.438795 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"417e8d101239f4cb25df778d5c148142691c8312544ed60d8efd60b4fc166d15\": container with ID starting with 417e8d101239f4cb25df778d5c148142691c8312544ed60d8efd60b4fc166d15 not found: ID does not exist" containerID="417e8d101239f4cb25df778d5c148142691c8312544ed60d8efd60b4fc166d15" Apr 22 20:10:43.438888 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:43.438823 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"417e8d101239f4cb25df778d5c148142691c8312544ed60d8efd60b4fc166d15"} err="failed to get container status \"417e8d101239f4cb25df778d5c148142691c8312544ed60d8efd60b4fc166d15\": rpc error: code = NotFound desc = could not find container \"417e8d101239f4cb25df778d5c148142691c8312544ed60d8efd60b4fc166d15\": container with ID starting with 417e8d101239f4cb25df778d5c148142691c8312544ed60d8efd60b4fc166d15 not found: ID does not exist" Apr 22 20:10:43.732485 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:43.732459 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba05bd15-cbba-47d8-8941-bc53cc01f769" path="/var/lib/kubelet/pods/ba05bd15-cbba-47d8-8941-bc53cc01f769/volumes" Apr 22 20:10:43.901309 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:43.901288 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" Apr 22 20:10:44.012633 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:44.012604 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0616ada0-d9d3-4a2a-917a-ba2b2da10685-tls-certs\") pod \"0616ada0-d9d3-4a2a-917a-ba2b2da10685\" (UID: \"0616ada0-d9d3-4a2a-917a-ba2b2da10685\") " Apr 22 20:10:44.012804 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:44.012651 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0616ada0-d9d3-4a2a-917a-ba2b2da10685-tokenizer-tmp\") pod \"0616ada0-d9d3-4a2a-917a-ba2b2da10685\" (UID: \"0616ada0-d9d3-4a2a-917a-ba2b2da10685\") " Apr 22 20:10:44.012804 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:44.012705 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0616ada0-d9d3-4a2a-917a-ba2b2da10685-kserve-provision-location\") pod \"0616ada0-d9d3-4a2a-917a-ba2b2da10685\" (UID: \"0616ada0-d9d3-4a2a-917a-ba2b2da10685\") " Apr 22 20:10:44.012804 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:44.012745 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkvq2\" (UniqueName: \"kubernetes.io/projected/0616ada0-d9d3-4a2a-917a-ba2b2da10685-kube-api-access-qkvq2\") pod \"0616ada0-d9d3-4a2a-917a-ba2b2da10685\" (UID: \"0616ada0-d9d3-4a2a-917a-ba2b2da10685\") " Apr 22 20:10:44.012804 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:44.012764 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0616ada0-d9d3-4a2a-917a-ba2b2da10685-tokenizer-cache\") pod \"0616ada0-d9d3-4a2a-917a-ba2b2da10685\" (UID: \"0616ada0-d9d3-4a2a-917a-ba2b2da10685\") " Apr 22 20:10:44.012804 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:44.012791 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0616ada0-d9d3-4a2a-917a-ba2b2da10685-tokenizer-uds\") pod \"0616ada0-d9d3-4a2a-917a-ba2b2da10685\" (UID: \"0616ada0-d9d3-4a2a-917a-ba2b2da10685\") " Apr 22 20:10:44.013143 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:44.013109 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0616ada0-d9d3-4a2a-917a-ba2b2da10685-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "0616ada0-d9d3-4a2a-917a-ba2b2da10685" (UID: "0616ada0-d9d3-4a2a-917a-ba2b2da10685"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:10:44.013143 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:44.013128 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0616ada0-d9d3-4a2a-917a-ba2b2da10685-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "0616ada0-d9d3-4a2a-917a-ba2b2da10685" (UID: "0616ada0-d9d3-4a2a-917a-ba2b2da10685"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:10:44.013348 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:44.013181 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0616ada0-d9d3-4a2a-917a-ba2b2da10685-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "0616ada0-d9d3-4a2a-917a-ba2b2da10685" (UID: "0616ada0-d9d3-4a2a-917a-ba2b2da10685"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:10:44.013556 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:44.013532 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0616ada0-d9d3-4a2a-917a-ba2b2da10685-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0616ada0-d9d3-4a2a-917a-ba2b2da10685" (UID: "0616ada0-d9d3-4a2a-917a-ba2b2da10685"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:10:44.014882 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:44.014860 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0616ada0-d9d3-4a2a-917a-ba2b2da10685-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "0616ada0-d9d3-4a2a-917a-ba2b2da10685" (UID: "0616ada0-d9d3-4a2a-917a-ba2b2da10685"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:10:44.014943 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:44.014894 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0616ada0-d9d3-4a2a-917a-ba2b2da10685-kube-api-access-qkvq2" (OuterVolumeSpecName: "kube-api-access-qkvq2") pod "0616ada0-d9d3-4a2a-917a-ba2b2da10685" (UID: "0616ada0-d9d3-4a2a-917a-ba2b2da10685"). InnerVolumeSpecName "kube-api-access-qkvq2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:10:44.113789 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:44.113756 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0616ada0-d9d3-4a2a-917a-ba2b2da10685-kserve-provision-location\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:10:44.113789 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:44.113785 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qkvq2\" (UniqueName: \"kubernetes.io/projected/0616ada0-d9d3-4a2a-917a-ba2b2da10685-kube-api-access-qkvq2\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:10:44.113789 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:44.113795 2575 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0616ada0-d9d3-4a2a-917a-ba2b2da10685-tokenizer-cache\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:10:44.114006 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:44.113805 2575 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0616ada0-d9d3-4a2a-917a-ba2b2da10685-tokenizer-uds\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:10:44.114006 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:44.113814 2575 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0616ada0-d9d3-4a2a-917a-ba2b2da10685-tls-certs\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:10:44.114006 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:44.113822 2575 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0616ada0-d9d3-4a2a-917a-ba2b2da10685-tokenizer-tmp\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:10:44.364940 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:44.364848 2575 generic.go:358] "Generic (PLEG): container finished" podID="0616ada0-d9d3-4a2a-917a-ba2b2da10685" containerID="bbaaac297e392c48dcdc1b6caab5c81dc1fe87684fa0ea26925fda0d64a1a181" exitCode=0 Apr 22 20:10:44.364940 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:44.364900 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" event={"ID":"0616ada0-d9d3-4a2a-917a-ba2b2da10685","Type":"ContainerDied","Data":"bbaaac297e392c48dcdc1b6caab5c81dc1fe87684fa0ea26925fda0d64a1a181"} Apr 22 20:10:44.365156 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:44.364949 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" Apr 22 20:10:44.365156 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:44.364953 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw" event={"ID":"0616ada0-d9d3-4a2a-917a-ba2b2da10685","Type":"ContainerDied","Data":"f9c5332214ebda333f034e95868d9619bd7ae9ae0c96b62297949dcd47e40bc6"} Apr 22 20:10:44.365156 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:44.364975 2575 scope.go:117] "RemoveContainer" containerID="bbaaac297e392c48dcdc1b6caab5c81dc1fe87684fa0ea26925fda0d64a1a181" Apr 22 20:10:44.372798 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:44.372783 2575 scope.go:117] "RemoveContainer" containerID="9959a38bc1faddfd99fe94dc654c842cceaca1efe4577c2d8117e3a9d5444fa5" Apr 22 20:10:44.379705 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:44.379685 2575 scope.go:117] "RemoveContainer" containerID="225157338e23dade0a34cacb1753c5f59f2825e5cf0a9cd07920701f77231d36" Apr 22 20:10:44.385348 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:44.385297 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw"] Apr 22 20:10:44.388049 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:44.388031 2575 scope.go:117] "RemoveContainer" containerID="bbaaac297e392c48dcdc1b6caab5c81dc1fe87684fa0ea26925fda0d64a1a181" Apr 22 20:10:44.388347 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:10:44.388306 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbaaac297e392c48dcdc1b6caab5c81dc1fe87684fa0ea26925fda0d64a1a181\": container with ID starting with bbaaac297e392c48dcdc1b6caab5c81dc1fe87684fa0ea26925fda0d64a1a181 not found: ID does not exist" containerID="bbaaac297e392c48dcdc1b6caab5c81dc1fe87684fa0ea26925fda0d64a1a181" Apr 22 20:10:44.388434 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:44.388357 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbaaac297e392c48dcdc1b6caab5c81dc1fe87684fa0ea26925fda0d64a1a181"} err="failed to get container status \"bbaaac297e392c48dcdc1b6caab5c81dc1fe87684fa0ea26925fda0d64a1a181\": rpc error: code = NotFound desc = could not find container \"bbaaac297e392c48dcdc1b6caab5c81dc1fe87684fa0ea26925fda0d64a1a181\": container with ID starting with bbaaac297e392c48dcdc1b6caab5c81dc1fe87684fa0ea26925fda0d64a1a181 not found: ID does not exist" Apr 22 20:10:44.388434 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:44.388384 2575 scope.go:117] "RemoveContainer" containerID="9959a38bc1faddfd99fe94dc654c842cceaca1efe4577c2d8117e3a9d5444fa5" Apr 22 20:10:44.388639 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:44.388621 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-78bd4f5fhf4xw"] Apr 22 20:10:44.388688 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:10:44.388643 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9959a38bc1faddfd99fe94dc654c842cceaca1efe4577c2d8117e3a9d5444fa5\": container with ID starting with 9959a38bc1faddfd99fe94dc654c842cceaca1efe4577c2d8117e3a9d5444fa5 not found: ID does not exist" containerID="9959a38bc1faddfd99fe94dc654c842cceaca1efe4577c2d8117e3a9d5444fa5" Apr 22 20:10:44.388688 ip-10-0-133-60 
kubenswrapper[2575]: I0422 20:10:44.388662 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9959a38bc1faddfd99fe94dc654c842cceaca1efe4577c2d8117e3a9d5444fa5"} err="failed to get container status \"9959a38bc1faddfd99fe94dc654c842cceaca1efe4577c2d8117e3a9d5444fa5\": rpc error: code = NotFound desc = could not find container \"9959a38bc1faddfd99fe94dc654c842cceaca1efe4577c2d8117e3a9d5444fa5\": container with ID starting with 9959a38bc1faddfd99fe94dc654c842cceaca1efe4577c2d8117e3a9d5444fa5 not found: ID does not exist" Apr 22 20:10:44.388688 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:44.388677 2575 scope.go:117] "RemoveContainer" containerID="225157338e23dade0a34cacb1753c5f59f2825e5cf0a9cd07920701f77231d36" Apr 22 20:10:44.388919 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:10:44.388902 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"225157338e23dade0a34cacb1753c5f59f2825e5cf0a9cd07920701f77231d36\": container with ID starting with 225157338e23dade0a34cacb1753c5f59f2825e5cf0a9cd07920701f77231d36 not found: ID does not exist" containerID="225157338e23dade0a34cacb1753c5f59f2825e5cf0a9cd07920701f77231d36" Apr 22 20:10:44.388955 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:44.388926 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"225157338e23dade0a34cacb1753c5f59f2825e5cf0a9cd07920701f77231d36"} err="failed to get container status \"225157338e23dade0a34cacb1753c5f59f2825e5cf0a9cd07920701f77231d36\": rpc error: code = NotFound desc = could not find container \"225157338e23dade0a34cacb1753c5f59f2825e5cf0a9cd07920701f77231d36\": container with ID starting with 225157338e23dade0a34cacb1753c5f59f2825e5cf0a9cd07920701f77231d36 not found: ID does not exist" Apr 22 20:10:45.730939 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:45.730904 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0616ada0-d9d3-4a2a-917a-ba2b2da10685" path="/var/lib/kubelet/pods/0616ada0-d9d3-4a2a-917a-ba2b2da10685/volumes" Apr 22 20:10:59.310217 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.310171 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-84cd5c895-6wj54"] Apr 22 20:10:59.310706 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.310611 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0616ada0-d9d3-4a2a-917a-ba2b2da10685" containerName="tokenizer" Apr 22 20:10:59.310706 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.310624 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0616ada0-d9d3-4a2a-917a-ba2b2da10685" containerName="tokenizer" Apr 22 20:10:59.310706 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.310645 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba05bd15-cbba-47d8-8941-bc53cc01f769" containerName="storage-initializer" Apr 22 20:10:59.310706 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.310651 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba05bd15-cbba-47d8-8941-bc53cc01f769" containerName="storage-initializer" Apr 22 20:10:59.310706 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.310658 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0616ada0-d9d3-4a2a-917a-ba2b2da10685" containerName="storage-initializer" Apr 22 20:10:59.310706 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.310664 2575 
state_mem.go:107] "Deleted CPUSet assignment" podUID="0616ada0-d9d3-4a2a-917a-ba2b2da10685" containerName="storage-initializer" Apr 22 20:10:59.310706 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.310671 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0616ada0-d9d3-4a2a-917a-ba2b2da10685" containerName="main" Apr 22 20:10:59.310706 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.310676 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0616ada0-d9d3-4a2a-917a-ba2b2da10685" containerName="main" Apr 22 20:10:59.310706 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.310688 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba05bd15-cbba-47d8-8941-bc53cc01f769" containerName="main" Apr 22 20:10:59.310706 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.310693 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba05bd15-cbba-47d8-8941-bc53cc01f769" containerName="main" Apr 22 20:10:59.311084 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.310737 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="0616ada0-d9d3-4a2a-917a-ba2b2da10685" containerName="tokenizer" Apr 22 20:10:59.311084 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.310744 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="ba05bd15-cbba-47d8-8941-bc53cc01f769" containerName="main" Apr 22 20:10:59.311084 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.310751 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="0616ada0-d9d3-4a2a-917a-ba2b2da10685" containerName="main" Apr 22 20:10:59.313260 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.313245 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-84cd5c895-6wj54" Apr 22 20:10:59.315745 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.315714 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 20:10:59.315871 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.315756 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tlcdw\"" Apr 22 20:10:59.315871 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.315773 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"conv-test-lora-crit-kserve-self-signed-certs\"" Apr 22 20:10:59.316484 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.316470 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 20:10:59.326348 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.326302 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-84cd5c895-6wj54"] Apr 22 20:10:59.332098 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.332071 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqlrv\" (UniqueName: \"kubernetes.io/projected/b5d6568e-e008-4edb-b3c5-d1ef33086741-kube-api-access-bqlrv\") pod \"conv-test-lora-crit-kserve-84cd5c895-6wj54\" (UID: \"b5d6568e-e008-4edb-b3c5-d1ef33086741\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-84cd5c895-6wj54" Apr 22 20:10:59.332224 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.332124 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b5d6568e-e008-4edb-b3c5-d1ef33086741-dshm\") pod \"conv-test-lora-crit-kserve-84cd5c895-6wj54\" (UID: \"b5d6568e-e008-4edb-b3c5-d1ef33086741\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-84cd5c895-6wj54" Apr 22 20:10:59.332224 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.332195 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b5d6568e-e008-4edb-b3c5-d1ef33086741-home\") pod \"conv-test-lora-crit-kserve-84cd5c895-6wj54\" (UID: \"b5d6568e-e008-4edb-b3c5-d1ef33086741\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-84cd5c895-6wj54" Apr 22 20:10:59.332329 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.332226 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b5d6568e-e008-4edb-b3c5-d1ef33086741-model-cache\") pod \"conv-test-lora-crit-kserve-84cd5c895-6wj54\" (UID: \"b5d6568e-e008-4edb-b3c5-d1ef33086741\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-84cd5c895-6wj54" Apr 22 20:10:59.332329 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.332263 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b5d6568e-e008-4edb-b3c5-d1ef33086741-tls-certs\") pod \"conv-test-lora-crit-kserve-84cd5c895-6wj54\" (UID: \"b5d6568e-e008-4edb-b3c5-d1ef33086741\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-84cd5c895-6wj54" Apr 22 20:10:59.332329 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.332287 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5d6568e-e008-4edb-b3c5-d1ef33086741-kserve-provision-location\") pod \"conv-test-lora-crit-kserve-84cd5c895-6wj54\" (UID: \"b5d6568e-e008-4edb-b3c5-d1ef33086741\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-84cd5c895-6wj54" Apr 22 20:10:59.432853 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.432816 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b5d6568e-e008-4edb-b3c5-d1ef33086741-tls-certs\") pod \"conv-test-lora-crit-kserve-84cd5c895-6wj54\" (UID: \"b5d6568e-e008-4edb-b3c5-d1ef33086741\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-84cd5c895-6wj54" Apr 22 20:10:59.432853 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.432855 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5d6568e-e008-4edb-b3c5-d1ef33086741-kserve-provision-location\") pod \"conv-test-lora-crit-kserve-84cd5c895-6wj54\" (UID: \"b5d6568e-e008-4edb-b3c5-d1ef33086741\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-84cd5c895-6wj54" Apr 22 20:10:59.433104 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.432884 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bqlrv\" (UniqueName: \"kubernetes.io/projected/b5d6568e-e008-4edb-b3c5-d1ef33086741-kube-api-access-bqlrv\") pod \"conv-test-lora-crit-kserve-84cd5c895-6wj54\" (UID: \"b5d6568e-e008-4edb-b3c5-d1ef33086741\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-84cd5c895-6wj54" Apr 22 20:10:59.433104 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.432924 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b5d6568e-e008-4edb-b3c5-d1ef33086741-dshm\") pod \"conv-test-lora-crit-kserve-84cd5c895-6wj54\" (UID: \"b5d6568e-e008-4edb-b3c5-d1ef33086741\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-84cd5c895-6wj54" Apr 22 20:10:59.433104 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.432951 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b5d6568e-e008-4edb-b3c5-d1ef33086741-home\") pod \"conv-test-lora-crit-kserve-84cd5c895-6wj54\" (UID: \"b5d6568e-e008-4edb-b3c5-d1ef33086741\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-84cd5c895-6wj54" Apr 22 20:10:59.433104 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.432971 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b5d6568e-e008-4edb-b3c5-d1ef33086741-model-cache\") pod \"conv-test-lora-crit-kserve-84cd5c895-6wj54\" (UID: \"b5d6568e-e008-4edb-b3c5-d1ef33086741\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-84cd5c895-6wj54" Apr 22 20:10:59.433301 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.433266 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5d6568e-e008-4edb-b3c5-d1ef33086741-kserve-provision-location\") pod \"conv-test-lora-crit-kserve-84cd5c895-6wj54\" (UID: \"b5d6568e-e008-4edb-b3c5-d1ef33086741\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-84cd5c895-6wj54" Apr 22 20:10:59.433399 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.433305 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b5d6568e-e008-4edb-b3c5-d1ef33086741-model-cache\") pod \"conv-test-lora-crit-kserve-84cd5c895-6wj54\" (UID: \"b5d6568e-e008-4edb-b3c5-d1ef33086741\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-84cd5c895-6wj54" Apr 22 20:10:59.433448 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.433392 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b5d6568e-e008-4edb-b3c5-d1ef33086741-home\") pod \"conv-test-lora-crit-kserve-84cd5c895-6wj54\" (UID: \"b5d6568e-e008-4edb-b3c5-d1ef33086741\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-84cd5c895-6wj54" Apr 22 20:10:59.435385 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.435360 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b5d6568e-e008-4edb-b3c5-d1ef33086741-dshm\") pod \"conv-test-lora-crit-kserve-84cd5c895-6wj54\" (UID: \"b5d6568e-e008-4edb-b3c5-d1ef33086741\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-84cd5c895-6wj54" Apr 22 20:10:59.435385 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.435377 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b5d6568e-e008-4edb-b3c5-d1ef33086741-tls-certs\") pod \"conv-test-lora-crit-kserve-84cd5c895-6wj54\" (UID: \"b5d6568e-e008-4edb-b3c5-d1ef33086741\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-84cd5c895-6wj54" Apr 22 20:10:59.440955 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.440928 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqlrv\" (UniqueName: 
\"kubernetes.io/projected/b5d6568e-e008-4edb-b3c5-d1ef33086741-kube-api-access-bqlrv\") pod \"conv-test-lora-crit-kserve-84cd5c895-6wj54\" (UID: \"b5d6568e-e008-4edb-b3c5-d1ef33086741\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-84cd5c895-6wj54" Apr 22 20:10:59.623558 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.623476 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-84cd5c895-6wj54" Apr 22 20:10:59.762605 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:10:59.762577 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-84cd5c895-6wj54"] Apr 22 20:10:59.765134 ip-10-0-133-60 kubenswrapper[2575]: W0422 20:10:59.765097 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5d6568e_e008_4edb_b3c5_d1ef33086741.slice/crio-d3eacce8642eeb0bd6267e5800648a921fa7f2ab9a2ef712667c76136165c51d WatchSource:0}: Error finding container d3eacce8642eeb0bd6267e5800648a921fa7f2ab9a2ef712667c76136165c51d: Status 404 returned error can't find the container with id d3eacce8642eeb0bd6267e5800648a921fa7f2ab9a2ef712667c76136165c51d Apr 22 20:11:00.416290 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:00.416266 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-84cd5c895-6wj54_b5d6568e-e008-4edb-b3c5-d1ef33086741/storage-initializer/0.log" Apr 22 20:11:00.416701 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:00.416303 2575 generic.go:358] "Generic (PLEG): container finished" podID="b5d6568e-e008-4edb-b3c5-d1ef33086741" containerID="20fe24762b8c98fc2822b2a22568321b1cc6161ec31535859f04c10a05fc4645" exitCode=1 Apr 22 20:11:00.416701 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:00.416355 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-84cd5c895-6wj54" event={"ID":"b5d6568e-e008-4edb-b3c5-d1ef33086741","Type":"ContainerDied","Data":"20fe24762b8c98fc2822b2a22568321b1cc6161ec31535859f04c10a05fc4645"} Apr 22 20:11:00.416701 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:00.416390 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-84cd5c895-6wj54" event={"ID":"b5d6568e-e008-4edb-b3c5-d1ef33086741","Type":"ContainerStarted","Data":"d3eacce8642eeb0bd6267e5800648a921fa7f2ab9a2ef712667c76136165c51d"} Apr 22 20:11:01.422597 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:01.422514 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-84cd5c895-6wj54_b5d6568e-e008-4edb-b3c5-d1ef33086741/storage-initializer/1.log" Apr 22 20:11:01.423012 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:01.422865 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-84cd5c895-6wj54_b5d6568e-e008-4edb-b3c5-d1ef33086741/storage-initializer/0.log" Apr 22 20:11:01.423012 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:01.422892 2575 generic.go:358] "Generic (PLEG): container finished" podID="b5d6568e-e008-4edb-b3c5-d1ef33086741" containerID="4df7664186981dd4d8a1ad77ea5cbfc442d817c2a70e4bee7b49f1a9e938a5bf" exitCode=1 Apr 22 20:11:01.423012 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:01.422982 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-84cd5c895-6wj54" 
event={"ID":"b5d6568e-e008-4edb-b3c5-d1ef33086741","Type":"ContainerDied","Data":"4df7664186981dd4d8a1ad77ea5cbfc442d817c2a70e4bee7b49f1a9e938a5bf"} Apr 22 20:11:01.423134 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:01.423027 2575 scope.go:117] "RemoveContainer" containerID="20fe24762b8c98fc2822b2a22568321b1cc6161ec31535859f04c10a05fc4645" Apr 22 20:11:01.423242 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:01.423227 2575 scope.go:117] "RemoveContainer" containerID="20fe24762b8c98fc2822b2a22568321b1cc6161ec31535859f04c10a05fc4645" Apr 22 20:11:01.432953 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:11:01.432915 2575 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_conv-test-lora-crit-kserve-84cd5c895-6wj54_kserve-ci-e2e-test_b5d6568e-e008-4edb-b3c5-d1ef33086741_0 in pod sandbox d3eacce8642eeb0bd6267e5800648a921fa7f2ab9a2ef712667c76136165c51d from index: no such id: '20fe24762b8c98fc2822b2a22568321b1cc6161ec31535859f04c10a05fc4645'" containerID="20fe24762b8c98fc2822b2a22568321b1cc6161ec31535859f04c10a05fc4645" Apr 22 20:11:01.433036 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:01.432965 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20fe24762b8c98fc2822b2a22568321b1cc6161ec31535859f04c10a05fc4645"} err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_conv-test-lora-crit-kserve-84cd5c895-6wj54_kserve-ci-e2e-test_b5d6568e-e008-4edb-b3c5-d1ef33086741_0 in pod sandbox d3eacce8642eeb0bd6267e5800648a921fa7f2ab9a2ef712667c76136165c51d from index: no such id: '20fe24762b8c98fc2822b2a22568321b1cc6161ec31535859f04c10a05fc4645'" Apr 22 20:11:01.433156 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:11:01.433139 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=conv-test-lora-crit-kserve-84cd5c895-6wj54_kserve-ci-e2e-test(b5d6568e-e008-4edb-b3c5-d1ef33086741)\"" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-84cd5c895-6wj54" podUID="b5d6568e-e008-4edb-b3c5-d1ef33086741" Apr 22 20:11:02.427331 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:02.427244 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-84cd5c895-6wj54_b5d6568e-e008-4edb-b3c5-d1ef33086741/storage-initializer/1.log" Apr 22 20:11:02.427923 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:11:02.427898 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=conv-test-lora-crit-kserve-84cd5c895-6wj54_kserve-ci-e2e-test(b5d6568e-e008-4edb-b3c5-d1ef33086741)\"" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-84cd5c895-6wj54" podUID="b5d6568e-e008-4edb-b3c5-d1ef33086741" Apr 22 20:11:10.404551 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:10.404512 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-84cd5c895-6wj54"] Apr 22 20:11:10.528540 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:10.528514 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-84cd5c895-6wj54_b5d6568e-e008-4edb-b3c5-d1ef33086741/storage-initializer/1.log" Apr 22 20:11:10.528653 ip-10-0-133-60 
kubenswrapper[2575]: I0422 20:11:10.528577 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-84cd5c895-6wj54" Apr 22 20:11:10.634928 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:10.634888 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b5d6568e-e008-4edb-b3c5-d1ef33086741-home\") pod \"b5d6568e-e008-4edb-b3c5-d1ef33086741\" (UID: \"b5d6568e-e008-4edb-b3c5-d1ef33086741\") " Apr 22 20:11:10.635112 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:10.634943 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqlrv\" (UniqueName: \"kubernetes.io/projected/b5d6568e-e008-4edb-b3c5-d1ef33086741-kube-api-access-bqlrv\") pod \"b5d6568e-e008-4edb-b3c5-d1ef33086741\" (UID: \"b5d6568e-e008-4edb-b3c5-d1ef33086741\") " Apr 22 20:11:10.635112 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:10.634980 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b5d6568e-e008-4edb-b3c5-d1ef33086741-model-cache\") pod \"b5d6568e-e008-4edb-b3c5-d1ef33086741\" (UID: \"b5d6568e-e008-4edb-b3c5-d1ef33086741\") " Apr 22 20:11:10.635112 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:10.635011 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b5d6568e-e008-4edb-b3c5-d1ef33086741-tls-certs\") pod \"b5d6568e-e008-4edb-b3c5-d1ef33086741\" (UID: \"b5d6568e-e008-4edb-b3c5-d1ef33086741\") " Apr 22 20:11:10.635269 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:10.635140 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b5d6568e-e008-4edb-b3c5-d1ef33086741-dshm\") pod \"b5d6568e-e008-4edb-b3c5-d1ef33086741\" (UID: \"b5d6568e-e008-4edb-b3c5-d1ef33086741\") " Apr 22 20:11:10.635269 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:10.635184 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5d6568e-e008-4edb-b3c5-d1ef33086741-kserve-provision-location\") pod \"b5d6568e-e008-4edb-b3c5-d1ef33086741\" (UID: \"b5d6568e-e008-4edb-b3c5-d1ef33086741\") " Apr 22 20:11:10.635402 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:10.635181 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5d6568e-e008-4edb-b3c5-d1ef33086741-home" (OuterVolumeSpecName: "home") pod "b5d6568e-e008-4edb-b3c5-d1ef33086741" (UID: "b5d6568e-e008-4edb-b3c5-d1ef33086741"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:11:10.635402 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:10.635279 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5d6568e-e008-4edb-b3c5-d1ef33086741-model-cache" (OuterVolumeSpecName: "model-cache") pod "b5d6568e-e008-4edb-b3c5-d1ef33086741" (UID: "b5d6568e-e008-4edb-b3c5-d1ef33086741"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:11:10.635561 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:10.635545 2575 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b5d6568e-e008-4edb-b3c5-d1ef33086741-home\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:11:10.635613 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:10.635567 2575 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b5d6568e-e008-4edb-b3c5-d1ef33086741-model-cache\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:11:10.635668 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:10.635644 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5d6568e-e008-4edb-b3c5-d1ef33086741-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b5d6568e-e008-4edb-b3c5-d1ef33086741" (UID: "b5d6568e-e008-4edb-b3c5-d1ef33086741"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:11:10.637167 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:10.637142 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5d6568e-e008-4edb-b3c5-d1ef33086741-kube-api-access-bqlrv" (OuterVolumeSpecName: "kube-api-access-bqlrv") pod "b5d6568e-e008-4edb-b3c5-d1ef33086741" (UID: "b5d6568e-e008-4edb-b3c5-d1ef33086741"). InnerVolumeSpecName "kube-api-access-bqlrv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:11:10.637291 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:10.637264 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5d6568e-e008-4edb-b3c5-d1ef33086741-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b5d6568e-e008-4edb-b3c5-d1ef33086741" (UID: "b5d6568e-e008-4edb-b3c5-d1ef33086741"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:11:10.637534 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:10.637508 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5d6568e-e008-4edb-b3c5-d1ef33086741-dshm" (OuterVolumeSpecName: "dshm") pod "b5d6568e-e008-4edb-b3c5-d1ef33086741" (UID: "b5d6568e-e008-4edb-b3c5-d1ef33086741"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:11:10.736187 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:10.736097 2575 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b5d6568e-e008-4edb-b3c5-d1ef33086741-tls-certs\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:11:10.736187 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:10.736129 2575 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b5d6568e-e008-4edb-b3c5-d1ef33086741-dshm\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:11:10.736187 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:10.736139 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5d6568e-e008-4edb-b3c5-d1ef33086741-kserve-provision-location\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:11:10.736187 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:10.736147 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bqlrv\" (UniqueName: \"kubernetes.io/projected/b5d6568e-e008-4edb-b3c5-d1ef33086741-kube-api-access-bqlrv\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:11:11.455038 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:11.455011 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-84cd5c895-6wj54_b5d6568e-e008-4edb-b3c5-d1ef33086741/storage-initializer/1.log" Apr 22 20:11:11.455446 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:11.455092 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-84cd5c895-6wj54" event={"ID":"b5d6568e-e008-4edb-b3c5-d1ef33086741","Type":"ContainerDied","Data":"d3eacce8642eeb0bd6267e5800648a921fa7f2ab9a2ef712667c76136165c51d"} Apr 22 20:11:11.455446 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:11.455130 2575 scope.go:117] "RemoveContainer" containerID="4df7664186981dd4d8a1ad77ea5cbfc442d817c2a70e4bee7b49f1a9e938a5bf" Apr 22 20:11:11.455446 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:11.455131 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-84cd5c895-6wj54" Apr 22 20:11:11.489790 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:11.489759 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-84cd5c895-6wj54"] Apr 22 20:11:11.493595 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:11.493566 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-84cd5c895-6wj54"] Apr 22 20:11:11.731561 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:11:11.731484 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5d6568e-e008-4edb-b3c5-d1ef33086741" path="/var/lib/kubelet/pods/b5d6568e-e008-4edb-b3c5-d1ef33086741/volumes" Apr 22 20:12:26.510634 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:26.510595 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr"] Apr 22 20:12:26.511084 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:26.510926 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5d6568e-e008-4edb-b3c5-d1ef33086741" containerName="storage-initializer" Apr 22 20:12:26.511084 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:26.510937 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5d6568e-e008-4edb-b3c5-d1ef33086741" containerName="storage-initializer" Apr 22 20:12:26.511084 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:26.510944 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5d6568e-e008-4edb-b3c5-d1ef33086741" containerName="storage-initializer" Apr 22 20:12:26.511084 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:26.510950 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5d6568e-e008-4edb-b3c5-d1ef33086741" containerName="storage-initializer" Apr 22 20:12:26.511084 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:26.510998 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5d6568e-e008-4edb-b3c5-d1ef33086741" containerName="storage-initializer" Apr 22 20:12:26.511084 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:26.511007 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5d6568e-e008-4edb-b3c5-d1ef33086741" containerName="storage-initializer" Apr 22 20:12:26.514003 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:26.513987 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" Apr 22 20:12:26.516491 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:26.516461 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 20:12:26.516491 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:26.516485 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tlcdw\"" Apr 22 20:12:26.517383 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:26.517364 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 20:12:26.517448 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:26.517404 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-epp-sa-dockercfg-nx4kj\"" Apr 22 20:12:26.517448 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:26.517415 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 22 20:12:26.524992 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:26.524957 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr"] Apr 22 20:12:26.673858 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:26.673816 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr\" (UID: \"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" Apr 22 20:12:26.674028 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:26.673864 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr\" (UID: \"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" Apr 22 20:12:26.674028 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:26.673896 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frwgj\" (UniqueName: \"kubernetes.io/projected/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-kube-api-access-frwgj\") pod \"custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr\" (UID: \"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" Apr 22 20:12:26.674028 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:26.673955 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr\" (UID: \"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" Apr 22 20:12:26.674028 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:26.673976 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr\" (UID: \"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" Apr 22 20:12:26.674028 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:26.673999 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr\" (UID: \"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" Apr 22 20:12:26.774841 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:26.774754 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr\" (UID: \"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" Apr 22 20:12:26.774841 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:26.774795 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr\" (UID: \"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" Apr 22 20:12:26.774841 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:26.774824 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr\" (UID: \"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" Apr 22 20:12:26.775078 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:26.774859 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr\" (UID: \"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" Apr 22 20:12:26.775078 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:26.774880 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr\" (UID: \"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" Apr 22 20:12:26.775078 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:26.774905 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-frwgj\" (UniqueName: \"kubernetes.io/projected/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-kube-api-access-frwgj\") pod \"custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr\" (UID: \"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" Apr 22 20:12:26.775257 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:26.775215 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr\" (UID: \"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" Apr 22 20:12:26.775417 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:26.775267 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr\" (UID: \"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" Apr 22 20:12:26.775417 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:26.775362 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr\" (UID: \"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" Apr 22 20:12:26.775417 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:26.775404 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr\" (UID: \"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" Apr 22 20:12:26.777339 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:26.777306 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr\" (UID: \"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" Apr 22 20:12:26.782570 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:26.782546 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frwgj\" (UniqueName: \"kubernetes.io/projected/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-kube-api-access-frwgj\") pod \"custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr\" (UID: \"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" Apr 22 20:12:26.846467 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:26.846424 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" Apr 22 20:12:26.975283 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:26.975253 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr"] Apr 22 20:12:26.977981 ip-10-0-133-60 kubenswrapper[2575]: W0422 20:12:26.977946 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddca1aa47_7f66_4b0a_897d_8d5a895e8ec0.slice/crio-9f2e8235e06978e92d201ea5f008260410c6ad15aced6b60fe4fb7be4e634e9c WatchSource:0}: Error finding container 9f2e8235e06978e92d201ea5f008260410c6ad15aced6b60fe4fb7be4e634e9c: Status 404 returned error can't find the container with id 9f2e8235e06978e92d201ea5f008260410c6ad15aced6b60fe4fb7be4e634e9c Apr 22 20:12:27.690479 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:27.690439 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" event={"ID":"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0","Type":"ContainerStarted","Data":"fb6b756e3a6a1c66407163d75cfda94d40e10edcbe54a1ce67e80123a6fface4"} Apr 22 20:12:27.690479 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:27.690480 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" event={"ID":"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0","Type":"ContainerStarted","Data":"9f2e8235e06978e92d201ea5f008260410c6ad15aced6b60fe4fb7be4e634e9c"} Apr 22 20:12:28.694629 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:28.694597 2575 generic.go:358] "Generic (PLEG): container finished" podID="dca1aa47-7f66-4b0a-897d-8d5a895e8ec0" containerID="fb6b756e3a6a1c66407163d75cfda94d40e10edcbe54a1ce67e80123a6fface4" exitCode=0 Apr 22 20:12:28.695028 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:28.694683 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" event={"ID":"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0","Type":"ContainerDied","Data":"fb6b756e3a6a1c66407163d75cfda94d40e10edcbe54a1ce67e80123a6fface4"} Apr 22 20:12:29.705546 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:29.705515 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" event={"ID":"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0","Type":"ContainerStarted","Data":"2b6316667e8973076e60d9bb7099096891ee1d02ed8618b3d67619f7ab6269ca"} Apr 22 20:12:29.705546 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:29.705551 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" event={"ID":"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0","Type":"ContainerStarted","Data":"5a7fb9c55f1e14c0b95d7dcec55c94387c0a5ec45c5a7d1acf9e8de7460a7a57"} Apr 22 20:12:29.705956 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:29.705682 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" Apr 22 20:12:29.727236 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:29.727190 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" 
podStartSLOduration=3.727177498 podStartE2EDuration="3.727177498s" podCreationTimestamp="2026-04-22 20:12:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:12:29.724902017 +0000 UTC m=+886.538287199" watchObservedRunningTime="2026-04-22 20:12:29.727177498 +0000 UTC m=+886.540562678" Apr 22 20:12:36.846729 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:36.846638 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" Apr 22 20:12:36.846729 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:36.846692 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" Apr 22 20:12:36.849656 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:36.849629 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" Apr 22 20:12:37.730750 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:37.730722 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" Apr 22 20:12:43.688869 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:43.688840 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vfhht_338d333b-8626-46b3-b450-fbdd521183d8/console-operator/1.log" Apr 22 20:12:43.689457 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:43.689434 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vfhht_338d333b-8626-46b3-b450-fbdd521183d8/console-operator/1.log" Apr 22 20:12:58.733983 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:12:58.733952 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" Apr 22 20:14:22.751135 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:22.751102 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr"] Apr 22 20:14:22.751653 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:22.751415 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" podUID="dca1aa47-7f66-4b0a-897d-8d5a895e8ec0" containerName="main" containerID="cri-o://5a7fb9c55f1e14c0b95d7dcec55c94387c0a5ec45c5a7d1acf9e8de7460a7a57" gracePeriod=30 Apr 22 20:14:22.751653 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:22.751487 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" podUID="dca1aa47-7f66-4b0a-897d-8d5a895e8ec0" containerName="tokenizer" containerID="cri-o://2b6316667e8973076e60d9bb7099096891ee1d02ed8618b3d67619f7ab6269ca" gracePeriod=30 Apr 22 20:14:23.051745 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:23.051665 2575 generic.go:358] "Generic (PLEG): container finished" podID="dca1aa47-7f66-4b0a-897d-8d5a895e8ec0" containerID="5a7fb9c55f1e14c0b95d7dcec55c94387c0a5ec45c5a7d1acf9e8de7460a7a57" exitCode=0 Apr 22 20:14:23.051887 ip-10-0-133-60 kubenswrapper[2575]: I0422 
20:14:23.051742 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" event={"ID":"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0","Type":"ContainerDied","Data":"5a7fb9c55f1e14c0b95d7dcec55c94387c0a5ec45c5a7d1acf9e8de7460a7a57"} Apr 22 20:14:24.002679 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:24.002658 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" Apr 22 20:14:24.056094 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:24.056059 2575 generic.go:358] "Generic (PLEG): container finished" podID="dca1aa47-7f66-4b0a-897d-8d5a895e8ec0" containerID="2b6316667e8973076e60d9bb7099096891ee1d02ed8618b3d67619f7ab6269ca" exitCode=0 Apr 22 20:14:24.056233 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:24.056130 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" Apr 22 20:14:24.056233 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:24.056125 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" event={"ID":"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0","Type":"ContainerDied","Data":"2b6316667e8973076e60d9bb7099096891ee1d02ed8618b3d67619f7ab6269ca"} Apr 22 20:14:24.056233 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:24.056226 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr" event={"ID":"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0","Type":"ContainerDied","Data":"9f2e8235e06978e92d201ea5f008260410c6ad15aced6b60fe4fb7be4e634e9c"} Apr 22 20:14:24.056385 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:24.056242 2575 scope.go:117] "RemoveContainer" containerID="2b6316667e8973076e60d9bb7099096891ee1d02ed8618b3d67619f7ab6269ca" Apr 22 20:14:24.064087 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:24.064062 2575 scope.go:117] "RemoveContainer" containerID="5a7fb9c55f1e14c0b95d7dcec55c94387c0a5ec45c5a7d1acf9e8de7460a7a57" Apr 22 20:14:24.070837 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:24.070822 2575 scope.go:117] "RemoveContainer" containerID="fb6b756e3a6a1c66407163d75cfda94d40e10edcbe54a1ce67e80123a6fface4" Apr 22 20:14:24.077929 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:24.077908 2575 scope.go:117] "RemoveContainer" containerID="2b6316667e8973076e60d9bb7099096891ee1d02ed8618b3d67619f7ab6269ca" Apr 22 20:14:24.078178 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:14:24.078158 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b6316667e8973076e60d9bb7099096891ee1d02ed8618b3d67619f7ab6269ca\": container with ID starting with 2b6316667e8973076e60d9bb7099096891ee1d02ed8618b3d67619f7ab6269ca not found: ID does not exist" containerID="2b6316667e8973076e60d9bb7099096891ee1d02ed8618b3d67619f7ab6269ca" Apr 22 20:14:24.078257 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:24.078192 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b6316667e8973076e60d9bb7099096891ee1d02ed8618b3d67619f7ab6269ca"} err="failed to get container status \"2b6316667e8973076e60d9bb7099096891ee1d02ed8618b3d67619f7ab6269ca\": rpc error: code = NotFound desc = could not find container 
\"2b6316667e8973076e60d9bb7099096891ee1d02ed8618b3d67619f7ab6269ca\": container with ID starting with 2b6316667e8973076e60d9bb7099096891ee1d02ed8618b3d67619f7ab6269ca not found: ID does not exist" Apr 22 20:14:24.078257 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:24.078219 2575 scope.go:117] "RemoveContainer" containerID="5a7fb9c55f1e14c0b95d7dcec55c94387c0a5ec45c5a7d1acf9e8de7460a7a57" Apr 22 20:14:24.078499 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:14:24.078481 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a7fb9c55f1e14c0b95d7dcec55c94387c0a5ec45c5a7d1acf9e8de7460a7a57\": container with ID starting with 5a7fb9c55f1e14c0b95d7dcec55c94387c0a5ec45c5a7d1acf9e8de7460a7a57 not found: ID does not exist" containerID="5a7fb9c55f1e14c0b95d7dcec55c94387c0a5ec45c5a7d1acf9e8de7460a7a57" Apr 22 20:14:24.078543 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:24.078506 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a7fb9c55f1e14c0b95d7dcec55c94387c0a5ec45c5a7d1acf9e8de7460a7a57"} err="failed to get container status \"5a7fb9c55f1e14c0b95d7dcec55c94387c0a5ec45c5a7d1acf9e8de7460a7a57\": rpc error: code = NotFound desc = could not find container \"5a7fb9c55f1e14c0b95d7dcec55c94387c0a5ec45c5a7d1acf9e8de7460a7a57\": container with ID starting with 5a7fb9c55f1e14c0b95d7dcec55c94387c0a5ec45c5a7d1acf9e8de7460a7a57 not found: ID does not exist" Apr 22 20:14:24.078543 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:24.078523 2575 scope.go:117] "RemoveContainer" containerID="fb6b756e3a6a1c66407163d75cfda94d40e10edcbe54a1ce67e80123a6fface4" Apr 22 20:14:24.078731 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:14:24.078713 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb6b756e3a6a1c66407163d75cfda94d40e10edcbe54a1ce67e80123a6fface4\": container with ID starting with fb6b756e3a6a1c66407163d75cfda94d40e10edcbe54a1ce67e80123a6fface4 not found: ID does not exist" containerID="fb6b756e3a6a1c66407163d75cfda94d40e10edcbe54a1ce67e80123a6fface4" Apr 22 20:14:24.078775 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:24.078737 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb6b756e3a6a1c66407163d75cfda94d40e10edcbe54a1ce67e80123a6fface4"} err="failed to get container status \"fb6b756e3a6a1c66407163d75cfda94d40e10edcbe54a1ce67e80123a6fface4\": rpc error: code = NotFound desc = could not find container \"fb6b756e3a6a1c66407163d75cfda94d40e10edcbe54a1ce67e80123a6fface4\": container with ID starting with fb6b756e3a6a1c66407163d75cfda94d40e10edcbe54a1ce67e80123a6fface4 not found: ID does not exist" Apr 22 20:14:24.158355 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:24.158302 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-tokenizer-cache\") pod \"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0\" (UID: \"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0\") " Apr 22 20:14:24.158530 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:24.158363 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-tls-certs\") pod \"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0\" (UID: \"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0\") " Apr 22 20:14:24.158530 ip-10-0-133-60 
kubenswrapper[2575]: I0422 20:14:24.158393 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-tokenizer-tmp\") pod \"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0\" (UID: \"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0\") " Apr 22 20:14:24.158530 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:24.158427 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frwgj\" (UniqueName: \"kubernetes.io/projected/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-kube-api-access-frwgj\") pod \"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0\" (UID: \"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0\") " Apr 22 20:14:24.158530 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:24.158461 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-tokenizer-uds\") pod \"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0\" (UID: \"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0\") " Apr 22 20:14:24.158530 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:24.158501 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-kserve-provision-location\") pod \"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0\" (UID: \"dca1aa47-7f66-4b0a-897d-8d5a895e8ec0\") " Apr 22 20:14:24.158780 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:24.158663 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "dca1aa47-7f66-4b0a-897d-8d5a895e8ec0" (UID: "dca1aa47-7f66-4b0a-897d-8d5a895e8ec0"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:14:24.158780 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:24.158737 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "dca1aa47-7f66-4b0a-897d-8d5a895e8ec0" (UID: "dca1aa47-7f66-4b0a-897d-8d5a895e8ec0"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:14:24.158866 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:24.158784 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "dca1aa47-7f66-4b0a-897d-8d5a895e8ec0" (UID: "dca1aa47-7f66-4b0a-897d-8d5a895e8ec0"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:14:24.158866 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:24.158794 2575 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-tokenizer-cache\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:14:24.159250 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:24.159230 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dca1aa47-7f66-4b0a-897d-8d5a895e8ec0" (UID: "dca1aa47-7f66-4b0a-897d-8d5a895e8ec0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:14:24.160623 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:24.160603 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "dca1aa47-7f66-4b0a-897d-8d5a895e8ec0" (UID: "dca1aa47-7f66-4b0a-897d-8d5a895e8ec0"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:14:24.160696 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:24.160668 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-kube-api-access-frwgj" (OuterVolumeSpecName: "kube-api-access-frwgj") pod "dca1aa47-7f66-4b0a-897d-8d5a895e8ec0" (UID: "dca1aa47-7f66-4b0a-897d-8d5a895e8ec0"). InnerVolumeSpecName "kube-api-access-frwgj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:14:24.259427 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:24.259387 2575 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-tls-certs\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:14:24.259427 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:24.259423 2575 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-tokenizer-tmp\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:14:24.259427 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:24.259433 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-frwgj\" (UniqueName: \"kubernetes.io/projected/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-kube-api-access-frwgj\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:14:24.259647 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:24.259442 2575 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-tokenizer-uds\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:14:24.259647 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:24.259451 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0-kserve-provision-location\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:14:24.378284 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:24.378248 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr"] Apr 22 20:14:24.381999 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:24.381974 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-dbdb7c5b6ddnr"] Apr 22 20:14:25.730582 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:25.730551 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dca1aa47-7f66-4b0a-897d-8d5a895e8ec0" path="/var/lib/kubelet/pods/dca1aa47-7f66-4b0a-897d-8d5a895e8ec0/volumes" Apr 22 20:14:45.878780 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:45.878746 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x"] Apr 22 20:14:45.879186 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:45.879042 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dca1aa47-7f66-4b0a-897d-8d5a895e8ec0" containerName="main" Apr 22 20:14:45.879186 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:45.879052 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="dca1aa47-7f66-4b0a-897d-8d5a895e8ec0" containerName="main" Apr 22 20:14:45.879186 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:45.879062 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dca1aa47-7f66-4b0a-897d-8d5a895e8ec0" containerName="tokenizer" Apr 22 20:14:45.879186 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:45.879068 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="dca1aa47-7f66-4b0a-897d-8d5a895e8ec0" containerName="tokenizer" Apr 22 20:14:45.879186 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:45.879078 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dca1aa47-7f66-4b0a-897d-8d5a895e8ec0" containerName="storage-initializer" Apr 22 20:14:45.879186 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:45.879083 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="dca1aa47-7f66-4b0a-897d-8d5a895e8ec0" containerName="storage-initializer" Apr 22 20:14:45.879186 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:45.879133 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="dca1aa47-7f66-4b0a-897d-8d5a895e8ec0" containerName="main" Apr 22 20:14:45.879186 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:45.879141 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="dca1aa47-7f66-4b0a-897d-8d5a895e8ec0" containerName="tokenizer" Apr 22 20:14:45.883188 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:45.883167 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" Apr 22 20:14:45.885740 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:45.885585 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 20:14:45.885740 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:45.885620 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-epp-sa-dockercfg-m8flb\"" Apr 22 20:14:45.885884 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:45.885794 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 20:14:45.886251 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:45.886232 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 22 20:14:45.886356 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:45.886285 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tlcdw\"" Apr 22 20:14:45.890335 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:45.890299 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x"] Apr 22 20:14:45.920853 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:45.920825 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsjl8\" (UniqueName: \"kubernetes.io/projected/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-kube-api-access-rsjl8\") pod \"router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x\" (UID: \"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" Apr 22 20:14:45.920993 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:45.920875 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x\" (UID: \"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" Apr 22 20:14:45.920993 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:45.920892 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x\" (UID: \"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" Apr 22 20:14:45.920993 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:45.920953 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x\" (UID: \"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" Apr 22 20:14:45.920993 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:45.920979 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x\" (UID: \"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" Apr 22 20:14:45.921124 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:45.921020 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x\" (UID: \"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" Apr 22 20:14:46.022295 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:46.022250 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x\" (UID: \"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" Apr 22 20:14:46.022295 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:46.022285 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rsjl8\" (UniqueName: \"kubernetes.io/projected/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-kube-api-access-rsjl8\") pod \"router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x\" (UID: \"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" Apr 22 20:14:46.022543 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:46.022346 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x\" (UID: \"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" Apr 22 20:14:46.022543 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:46.022365 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x\" (UID: \"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" Apr 22 20:14:46.022543 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:46.022391 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x\" (UID: \"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" Apr 22 20:14:46.022543 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:46.022414 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x\" (UID: \"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" Apr 22 20:14:46.022750 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:46.022725 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x\" (UID: \"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" Apr 22 20:14:46.022750 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:46.022740 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x\" (UID: \"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" Apr 22 20:14:46.022849 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:46.022781 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x\" (UID: \"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" Apr 22 20:14:46.022849 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:46.022842 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x\" (UID: \"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" Apr 22 20:14:46.024815 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:46.024794 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x\" (UID: \"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" Apr 22 20:14:46.029931 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:46.029913 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsjl8\" (UniqueName: \"kubernetes.io/projected/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-kube-api-access-rsjl8\") pod \"router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x\" (UID: \"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" Apr 22 20:14:46.194228 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:46.194137 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" Apr 22 20:14:46.316909 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:46.316881 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x"] Apr 22 20:14:46.319968 ip-10-0-133-60 kubenswrapper[2575]: W0422 20:14:46.319871 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a0fa4a2_8ba8_485e_a891_e6bffe5c7eb8.slice/crio-8770d79210bd1902e07440cbb50752da2571dc79b74569dcf13973e2066705d9 WatchSource:0}: Error finding container 8770d79210bd1902e07440cbb50752da2571dc79b74569dcf13973e2066705d9: Status 404 returned error can't find the container with id 8770d79210bd1902e07440cbb50752da2571dc79b74569dcf13973e2066705d9 Apr 22 20:14:46.322090 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:46.322073 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:14:47.123293 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:47.123254 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" event={"ID":"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8","Type":"ContainerStarted","Data":"b70ba7e1a8a3450cce9a91aa9925157300a78247d5c6161fc789ea29da8d6132"} Apr 22 20:14:47.123293 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:47.123295 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" event={"ID":"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8","Type":"ContainerStarted","Data":"8770d79210bd1902e07440cbb50752da2571dc79b74569dcf13973e2066705d9"} Apr 22 20:14:48.127241 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:48.127197 2575 generic.go:358] "Generic (PLEG): container finished" podID="8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8" containerID="b70ba7e1a8a3450cce9a91aa9925157300a78247d5c6161fc789ea29da8d6132" exitCode=0 Apr 22 20:14:48.127655 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:48.127287 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" event={"ID":"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8","Type":"ContainerDied","Data":"b70ba7e1a8a3450cce9a91aa9925157300a78247d5c6161fc789ea29da8d6132"} Apr 22 20:14:49.132386 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:49.132347 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" event={"ID":"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8","Type":"ContainerStarted","Data":"5743a0e1a2449abb431d367cb9b9c21c8b641260dfa1abeb8783515c3c68e618"} Apr 22 20:14:49.132808 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:49.132393 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" event={"ID":"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8","Type":"ContainerStarted","Data":"6345cef0c9479826aa1480eae21de6e2a32027f4c07add57b72c33e3d4bebd92"} Apr 22 20:14:49.132808 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:49.132484 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" Apr 22 20:14:49.153146 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:49.153097 
2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" podStartSLOduration=4.153081642 podStartE2EDuration="4.153081642s" podCreationTimestamp="2026-04-22 20:14:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:14:49.15162283 +0000 UTC m=+1025.965008023" watchObservedRunningTime="2026-04-22 20:14:49.153081642 +0000 UTC m=+1025.966466827" Apr 22 20:14:56.194573 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:56.194528 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" Apr 22 20:14:56.194573 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:56.194579 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" Apr 22 20:14:56.197278 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:56.197254 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" Apr 22 20:14:57.160928 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:14:57.160899 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" Apr 22 20:15:18.164505 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:15:18.164474 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" Apr 22 20:15:46.743667 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:15:46.743634 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-55f79dccc6-fvrwp"] Apr 22 20:15:46.744171 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:15:46.743923 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/llmisvc-controller-manager-55f79dccc6-fvrwp" podUID="0f2e04ae-561b-4e42-abd9-76587f41da06" containerName="manager" containerID="cri-o://fe93a825496f44c15310b75a40b3556a51afc3f5623b9569fa6228717c78ba92" gracePeriod=30 Apr 22 20:15:46.997643 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:15:46.997621 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-55f79dccc6-fvrwp" Apr 22 20:15:47.017733 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:15:47.017709 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f2e04ae-561b-4e42-abd9-76587f41da06-cert\") pod \"0f2e04ae-561b-4e42-abd9-76587f41da06\" (UID: \"0f2e04ae-561b-4e42-abd9-76587f41da06\") " Apr 22 20:15:47.017864 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:15:47.017768 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bczgx\" (UniqueName: \"kubernetes.io/projected/0f2e04ae-561b-4e42-abd9-76587f41da06-kube-api-access-bczgx\") pod \"0f2e04ae-561b-4e42-abd9-76587f41da06\" (UID: \"0f2e04ae-561b-4e42-abd9-76587f41da06\") " Apr 22 20:15:47.019991 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:15:47.019962 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f2e04ae-561b-4e42-abd9-76587f41da06-cert" (OuterVolumeSpecName: "cert") pod "0f2e04ae-561b-4e42-abd9-76587f41da06" (UID: "0f2e04ae-561b-4e42-abd9-76587f41da06"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:15:47.019991 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:15:47.019963 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f2e04ae-561b-4e42-abd9-76587f41da06-kube-api-access-bczgx" (OuterVolumeSpecName: "kube-api-access-bczgx") pod "0f2e04ae-561b-4e42-abd9-76587f41da06" (UID: "0f2e04ae-561b-4e42-abd9-76587f41da06"). InnerVolumeSpecName "kube-api-access-bczgx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:15:47.118286 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:15:47.118228 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bczgx\" (UniqueName: \"kubernetes.io/projected/0f2e04ae-561b-4e42-abd9-76587f41da06-kube-api-access-bczgx\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:15:47.118286 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:15:47.118280 2575 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f2e04ae-561b-4e42-abd9-76587f41da06-cert\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:15:47.307929 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:15:47.307843 2575 generic.go:358] "Generic (PLEG): container finished" podID="0f2e04ae-561b-4e42-abd9-76587f41da06" containerID="fe93a825496f44c15310b75a40b3556a51afc3f5623b9569fa6228717c78ba92" exitCode=0 Apr 22 20:15:47.307929 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:15:47.307912 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-55f79dccc6-fvrwp" Apr 22 20:15:47.308123 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:15:47.307910 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-55f79dccc6-fvrwp" event={"ID":"0f2e04ae-561b-4e42-abd9-76587f41da06","Type":"ContainerDied","Data":"fe93a825496f44c15310b75a40b3556a51afc3f5623b9569fa6228717c78ba92"} Apr 22 20:15:47.308123 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:15:47.308018 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-55f79dccc6-fvrwp" event={"ID":"0f2e04ae-561b-4e42-abd9-76587f41da06","Type":"ContainerDied","Data":"5c92b59c87d227121312a2376770e0aefd9279784f01fb95307341269d059983"} Apr 22 20:15:47.308123 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:15:47.308038 2575 scope.go:117] "RemoveContainer" containerID="fe93a825496f44c15310b75a40b3556a51afc3f5623b9569fa6228717c78ba92" Apr 22 20:15:47.319872 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:15:47.319846 2575 scope.go:117] "RemoveContainer" containerID="fe93a825496f44c15310b75a40b3556a51afc3f5623b9569fa6228717c78ba92" Apr 22 20:15:47.320163 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:15:47.320146 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe93a825496f44c15310b75a40b3556a51afc3f5623b9569fa6228717c78ba92\": container with ID starting with fe93a825496f44c15310b75a40b3556a51afc3f5623b9569fa6228717c78ba92 not found: ID does not exist" containerID="fe93a825496f44c15310b75a40b3556a51afc3f5623b9569fa6228717c78ba92" Apr 22 20:15:47.320240 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:15:47.320169 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe93a825496f44c15310b75a40b3556a51afc3f5623b9569fa6228717c78ba92"} err="failed to get container status \"fe93a825496f44c15310b75a40b3556a51afc3f5623b9569fa6228717c78ba92\": rpc error: code = NotFound desc = could not find container \"fe93a825496f44c15310b75a40b3556a51afc3f5623b9569fa6228717c78ba92\": container with ID starting with fe93a825496f44c15310b75a40b3556a51afc3f5623b9569fa6228717c78ba92 not found: ID does not exist" Apr 22 20:15:47.327762 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:15:47.327740 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-55f79dccc6-fvrwp"] Apr 22 20:15:47.333599 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:15:47.333576 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/llmisvc-controller-manager-55f79dccc6-fvrwp"] Apr 22 20:15:47.731364 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:15:47.731240 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f2e04ae-561b-4e42-abd9-76587f41da06" path="/var/lib/kubelet/pods/0f2e04ae-561b-4e42-abd9-76587f41da06/volumes" Apr 22 20:16:39.899992 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:39.899952 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x"] Apr 22 20:16:39.900538 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:39.900292 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" podUID="8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8" containerName="main" containerID="cri-o://6345cef0c9479826aa1480eae21de6e2a32027f4c07add57b72c33e3d4bebd92" gracePeriod=30 Apr 22 
20:16:39.901056 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:39.900425 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" podUID="8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8" containerName="tokenizer" containerID="cri-o://5743a0e1a2449abb431d367cb9b9c21c8b641260dfa1abeb8783515c3c68e618" gracePeriod=30 Apr 22 20:16:40.466248 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:40.466212 2575 generic.go:358] "Generic (PLEG): container finished" podID="8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8" containerID="6345cef0c9479826aa1480eae21de6e2a32027f4c07add57b72c33e3d4bebd92" exitCode=0 Apr 22 20:16:40.466457 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:40.466288 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" event={"ID":"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8","Type":"ContainerDied","Data":"6345cef0c9479826aa1480eae21de6e2a32027f4c07add57b72c33e3d4bebd92"} Apr 22 20:16:41.142737 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.142713 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" Apr 22 20:16:41.257296 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.257265 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-tokenizer-uds\") pod \"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8\" (UID: \"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8\") " Apr 22 20:16:41.257480 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.257383 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsjl8\" (UniqueName: \"kubernetes.io/projected/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-kube-api-access-rsjl8\") pod \"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8\" (UID: \"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8\") " Apr 22 20:16:41.257480 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.257411 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-tls-certs\") pod \"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8\" (UID: \"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8\") " Apr 22 20:16:41.257480 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.257433 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-kserve-provision-location\") pod \"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8\" (UID: \"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8\") " Apr 22 20:16:41.257480 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.257457 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-tokenizer-cache\") pod \"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8\" (UID: \"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8\") " Apr 22 20:16:41.257480 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.257474 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-tokenizer-tmp\") pod \"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8\" (UID: 
\"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8\") " Apr 22 20:16:41.257740 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.257563 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8" (UID: "8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:16:41.257740 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.257697 2575 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-tokenizer-uds\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:16:41.257844 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.257742 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8" (UID: "8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:16:41.257906 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.257881 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8" (UID: "8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:16:41.258211 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.258187 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8" (UID: "8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:16:41.259479 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.259458 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-kube-api-access-rsjl8" (OuterVolumeSpecName: "kube-api-access-rsjl8") pod "8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8" (UID: "8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8"). InnerVolumeSpecName "kube-api-access-rsjl8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:16:41.259736 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.259717 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8" (UID: "8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:16:41.358592 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.358556 2575 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-tokenizer-cache\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:16:41.358592 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.358587 2575 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-tokenizer-tmp\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:16:41.358592 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.358597 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rsjl8\" (UniqueName: \"kubernetes.io/projected/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-kube-api-access-rsjl8\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:16:41.358819 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.358608 2575 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-tls-certs\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:16:41.358819 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.358618 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8-kserve-provision-location\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:16:41.471883 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.471843 2575 generic.go:358] "Generic (PLEG): container finished" podID="8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8" containerID="5743a0e1a2449abb431d367cb9b9c21c8b641260dfa1abeb8783515c3c68e618" exitCode=0 Apr 22 20:16:41.472072 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.471931 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" event={"ID":"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8","Type":"ContainerDied","Data":"5743a0e1a2449abb431d367cb9b9c21c8b641260dfa1abeb8783515c3c68e618"} Apr 22 20:16:41.472072 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.471980 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" event={"ID":"8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8","Type":"ContainerDied","Data":"8770d79210bd1902e07440cbb50752da2571dc79b74569dcf13973e2066705d9"} Apr 22 20:16:41.472072 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.472001 2575 scope.go:117] "RemoveContainer" containerID="5743a0e1a2449abb431d367cb9b9c21c8b641260dfa1abeb8783515c3c68e618" Apr 22 20:16:41.472072 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.471948 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x" Apr 22 20:16:41.481154 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.481137 2575 scope.go:117] "RemoveContainer" containerID="6345cef0c9479826aa1480eae21de6e2a32027f4c07add57b72c33e3d4bebd92" Apr 22 20:16:41.488307 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.488289 2575 scope.go:117] "RemoveContainer" containerID="b70ba7e1a8a3450cce9a91aa9925157300a78247d5c6161fc789ea29da8d6132" Apr 22 20:16:41.493208 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.493184 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x"] Apr 22 20:16:41.496807 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.496787 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-75c866f69d-2xq9x"] Apr 22 20:16:41.496877 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.496819 2575 scope.go:117] "RemoveContainer" containerID="5743a0e1a2449abb431d367cb9b9c21c8b641260dfa1abeb8783515c3c68e618" Apr 22 20:16:41.497103 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:16:41.497083 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5743a0e1a2449abb431d367cb9b9c21c8b641260dfa1abeb8783515c3c68e618\": container with ID starting with 5743a0e1a2449abb431d367cb9b9c21c8b641260dfa1abeb8783515c3c68e618 not found: ID does not exist" containerID="5743a0e1a2449abb431d367cb9b9c21c8b641260dfa1abeb8783515c3c68e618" Apr 22 20:16:41.497184 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.497114 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5743a0e1a2449abb431d367cb9b9c21c8b641260dfa1abeb8783515c3c68e618"} err="failed to get container status \"5743a0e1a2449abb431d367cb9b9c21c8b641260dfa1abeb8783515c3c68e618\": rpc error: code = NotFound desc = could not find container \"5743a0e1a2449abb431d367cb9b9c21c8b641260dfa1abeb8783515c3c68e618\": container with ID starting with 5743a0e1a2449abb431d367cb9b9c21c8b641260dfa1abeb8783515c3c68e618 not found: ID does not exist" Apr 22 20:16:41.497184 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.497140 2575 scope.go:117] "RemoveContainer" containerID="6345cef0c9479826aa1480eae21de6e2a32027f4c07add57b72c33e3d4bebd92" Apr 22 20:16:41.497391 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:16:41.497371 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6345cef0c9479826aa1480eae21de6e2a32027f4c07add57b72c33e3d4bebd92\": container with ID starting with 6345cef0c9479826aa1480eae21de6e2a32027f4c07add57b72c33e3d4bebd92 not found: ID does not exist" containerID="6345cef0c9479826aa1480eae21de6e2a32027f4c07add57b72c33e3d4bebd92" Apr 22 20:16:41.497449 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.497397 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6345cef0c9479826aa1480eae21de6e2a32027f4c07add57b72c33e3d4bebd92"} err="failed to get container status \"6345cef0c9479826aa1480eae21de6e2a32027f4c07add57b72c33e3d4bebd92\": rpc error: code = NotFound desc = could not find container \"6345cef0c9479826aa1480eae21de6e2a32027f4c07add57b72c33e3d4bebd92\": container with ID starting with 6345cef0c9479826aa1480eae21de6e2a32027f4c07add57b72c33e3d4bebd92 not found: ID does not exist" Apr 22 
20:16:41.497449 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.497413 2575 scope.go:117] "RemoveContainer" containerID="b70ba7e1a8a3450cce9a91aa9925157300a78247d5c6161fc789ea29da8d6132" Apr 22 20:16:41.497640 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:16:41.497624 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b70ba7e1a8a3450cce9a91aa9925157300a78247d5c6161fc789ea29da8d6132\": container with ID starting with b70ba7e1a8a3450cce9a91aa9925157300a78247d5c6161fc789ea29da8d6132 not found: ID does not exist" containerID="b70ba7e1a8a3450cce9a91aa9925157300a78247d5c6161fc789ea29da8d6132" Apr 22 20:16:41.497684 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.497645 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b70ba7e1a8a3450cce9a91aa9925157300a78247d5c6161fc789ea29da8d6132"} err="failed to get container status \"b70ba7e1a8a3450cce9a91aa9925157300a78247d5c6161fc789ea29da8d6132\": rpc error: code = NotFound desc = could not find container \"b70ba7e1a8a3450cce9a91aa9925157300a78247d5c6161fc789ea29da8d6132\": container with ID starting with b70ba7e1a8a3450cce9a91aa9925157300a78247d5c6161fc789ea29da8d6132 not found: ID does not exist" Apr 22 20:16:41.731618 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:41.731541 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8" path="/var/lib/kubelet/pods/8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8/volumes" Apr 22 20:16:56.193198 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.193156 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7"] Apr 22 20:16:56.193806 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.193627 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8" containerName="storage-initializer" Apr 22 20:16:56.193806 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.193645 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8" containerName="storage-initializer" Apr 22 20:16:56.193806 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.193660 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8" containerName="main" Apr 22 20:16:56.193806 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.193669 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8" containerName="main" Apr 22 20:16:56.193806 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.193689 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8" containerName="tokenizer" Apr 22 20:16:56.193806 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.193698 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8" containerName="tokenizer" Apr 22 20:16:56.193806 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.193709 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f2e04ae-561b-4e42-abd9-76587f41da06" containerName="manager" Apr 22 20:16:56.193806 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.193717 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f2e04ae-561b-4e42-abd9-76587f41da06" containerName="manager" Apr 22 20:16:56.194213 
ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.193815 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="0f2e04ae-561b-4e42-abd9-76587f41da06" containerName="manager" Apr 22 20:16:56.194213 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.193828 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8" containerName="main" Apr 22 20:16:56.194213 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.193839 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="8a0fa4a2-8ba8-485e-a891-e6bffe5c7eb8" containerName="tokenizer" Apr 22 20:16:56.197272 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.197253 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" Apr 22 20:16:56.200394 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.200368 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 20:16:56.200516 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.200397 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 22 20:16:56.200516 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.200419 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-epp-sa-dockercfg-t9rkk\"" Apr 22 20:16:56.200516 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.200435 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tlcdw\"" Apr 22 20:16:56.200516 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.200371 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 20:16:56.207868 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.207839 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7"] Apr 22 20:16:56.276209 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.276172 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfppq\" (UniqueName: \"kubernetes.io/projected/3eecbc8a-7732-440d-a638-359abba456b9-kube-api-access-pfppq\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7\" (UID: \"3eecbc8a-7732-440d-a638-359abba456b9\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" Apr 22 20:16:56.276408 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.276222 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3eecbc8a-7732-440d-a638-359abba456b9-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7\" (UID: \"3eecbc8a-7732-440d-a638-359abba456b9\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" Apr 22 20:16:56.276408 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.276246 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3eecbc8a-7732-440d-a638-359abba456b9-tokenizer-tmp\") pod 
\"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7\" (UID: \"3eecbc8a-7732-440d-a638-359abba456b9\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" Apr 22 20:16:56.276408 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.276303 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3eecbc8a-7732-440d-a638-359abba456b9-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7\" (UID: \"3eecbc8a-7732-440d-a638-359abba456b9\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" Apr 22 20:16:56.276609 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.276456 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3eecbc8a-7732-440d-a638-359abba456b9-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7\" (UID: \"3eecbc8a-7732-440d-a638-359abba456b9\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" Apr 22 20:16:56.276609 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.276515 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3eecbc8a-7732-440d-a638-359abba456b9-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7\" (UID: \"3eecbc8a-7732-440d-a638-359abba456b9\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" Apr 22 20:16:56.377817 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.377763 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3eecbc8a-7732-440d-a638-359abba456b9-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7\" (UID: \"3eecbc8a-7732-440d-a638-359abba456b9\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" Apr 22 20:16:56.377817 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.377824 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3eecbc8a-7732-440d-a638-359abba456b9-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7\" (UID: \"3eecbc8a-7732-440d-a638-359abba456b9\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" Apr 22 20:16:56.378091 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.377850 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pfppq\" (UniqueName: \"kubernetes.io/projected/3eecbc8a-7732-440d-a638-359abba456b9-kube-api-access-pfppq\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7\" (UID: \"3eecbc8a-7732-440d-a638-359abba456b9\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" Apr 22 20:16:56.378091 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.377886 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3eecbc8a-7732-440d-a638-359abba456b9-tls-certs\") pod 
\"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7\" (UID: \"3eecbc8a-7732-440d-a638-359abba456b9\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" Apr 22 20:16:56.378091 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.377907 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3eecbc8a-7732-440d-a638-359abba456b9-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7\" (UID: \"3eecbc8a-7732-440d-a638-359abba456b9\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" Apr 22 20:16:56.378091 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.377925 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3eecbc8a-7732-440d-a638-359abba456b9-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7\" (UID: \"3eecbc8a-7732-440d-a638-359abba456b9\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" Apr 22 20:16:56.378297 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.378264 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3eecbc8a-7732-440d-a638-359abba456b9-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7\" (UID: \"3eecbc8a-7732-440d-a638-359abba456b9\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" Apr 22 20:16:56.378384 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.378274 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3eecbc8a-7732-440d-a638-359abba456b9-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7\" (UID: \"3eecbc8a-7732-440d-a638-359abba456b9\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" Apr 22 20:16:56.378384 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.378342 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3eecbc8a-7732-440d-a638-359abba456b9-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7\" (UID: \"3eecbc8a-7732-440d-a638-359abba456b9\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" Apr 22 20:16:56.378384 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.378365 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3eecbc8a-7732-440d-a638-359abba456b9-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7\" (UID: \"3eecbc8a-7732-440d-a638-359abba456b9\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" Apr 22 20:16:56.380410 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.380387 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3eecbc8a-7732-440d-a638-359abba456b9-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7\" (UID: \"3eecbc8a-7732-440d-a638-359abba456b9\") " 
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" Apr 22 20:16:56.385351 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.385325 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfppq\" (UniqueName: \"kubernetes.io/projected/3eecbc8a-7732-440d-a638-359abba456b9-kube-api-access-pfppq\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7\" (UID: \"3eecbc8a-7732-440d-a638-359abba456b9\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" Apr 22 20:16:56.507502 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.507472 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" Apr 22 20:16:56.632667 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:56.632643 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7"] Apr 22 20:16:56.635371 ip-10-0-133-60 kubenswrapper[2575]: W0422 20:16:56.635337 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3eecbc8a_7732_440d_a638_359abba456b9.slice/crio-9dc221aa2792dafb053049023ff7e36c4aeb9f2f65071e3700ecec468f31966d WatchSource:0}: Error finding container 9dc221aa2792dafb053049023ff7e36c4aeb9f2f65071e3700ecec468f31966d: Status 404 returned error can't find the container with id 9dc221aa2792dafb053049023ff7e36c4aeb9f2f65071e3700ecec468f31966d Apr 22 20:16:57.516590 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:57.516552 2575 generic.go:358] "Generic (PLEG): container finished" podID="3eecbc8a-7732-440d-a638-359abba456b9" containerID="d891329e0917e04602ca2083d4377893b7c41eaeb9cd71a73467944033b3386b" exitCode=0 Apr 22 20:16:57.516963 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:57.516634 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" event={"ID":"3eecbc8a-7732-440d-a638-359abba456b9","Type":"ContainerDied","Data":"d891329e0917e04602ca2083d4377893b7c41eaeb9cd71a73467944033b3386b"} Apr 22 20:16:57.516963 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:57.516669 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" event={"ID":"3eecbc8a-7732-440d-a638-359abba456b9","Type":"ContainerStarted","Data":"9dc221aa2792dafb053049023ff7e36c4aeb9f2f65071e3700ecec468f31966d"} Apr 22 20:16:58.522036 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:58.521999 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" event={"ID":"3eecbc8a-7732-440d-a638-359abba456b9","Type":"ContainerStarted","Data":"e05c07b1f9e56f43ae4aa20fead4ba45f5bc3f96af630a961b551395b1d4f25c"} Apr 22 20:16:58.522431 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:58.522042 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" event={"ID":"3eecbc8a-7732-440d-a638-359abba456b9","Type":"ContainerStarted","Data":"f03a2d0a4833f5a1ba02637d6888be5d235047639e5e500af0c042205c22cfa7"} Apr 22 20:16:58.522431 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:58.522112 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" Apr 22 20:16:58.545932 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:16:58.545862 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" podStartSLOduration=2.545841407 podStartE2EDuration="2.545841407s" podCreationTimestamp="2026-04-22 20:16:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:16:58.541245182 +0000 UTC m=+1155.354630383" watchObservedRunningTime="2026-04-22 20:16:58.545841407 +0000 UTC m=+1155.359226581" Apr 22 20:17:06.507899 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:17:06.507798 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" Apr 22 20:17:06.508348 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:17:06.507955 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" Apr 22 20:17:06.510604 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:17:06.510582 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" Apr 22 20:17:06.551124 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:17:06.551092 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" Apr 22 20:17:28.557472 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:17:28.557444 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" Apr 22 20:17:43.711162 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:17:43.711128 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vfhht_338d333b-8626-46b3-b450-fbdd521183d8/console-operator/1.log" Apr 22 20:17:43.711964 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:17:43.711945 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vfhht_338d333b-8626-46b3-b450-fbdd521183d8/console-operator/1.log" Apr 22 20:18:26.934026 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:18:26.933993 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 22 20:18:26.937240 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:18:26.937221 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 20:18:26.939656 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:18:26.939632 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 22 20:18:26.939656 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:18:26.939645 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-ssqtz\"" Apr 22 20:18:26.947866 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:18:26.947836 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 22 20:18:27.047501 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:18:27.047465 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/08565dad-48f1-4088-aed7-29e9df70f4d4-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"08565dad-48f1-4088-aed7-29e9df70f4d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 20:18:27.047501 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:18:27.047500 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/08565dad-48f1-4088-aed7-29e9df70f4d4-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"08565dad-48f1-4088-aed7-29e9df70f4d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 20:18:27.047709 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:18:27.047545 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08565dad-48f1-4088-aed7-29e9df70f4d4-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"08565dad-48f1-4088-aed7-29e9df70f4d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 20:18:27.047709 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:18:27.047577 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/08565dad-48f1-4088-aed7-29e9df70f4d4-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"08565dad-48f1-4088-aed7-29e9df70f4d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 20:18:27.047709 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:18:27.047592 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/08565dad-48f1-4088-aed7-29e9df70f4d4-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"08565dad-48f1-4088-aed7-29e9df70f4d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 20:18:27.047709 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:18:27.047648 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v56h2\" (UniqueName: \"kubernetes.io/projected/08565dad-48f1-4088-aed7-29e9df70f4d4-kube-api-access-v56h2\") pod 
\"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"08565dad-48f1-4088-aed7-29e9df70f4d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 20:18:27.148458 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:18:27.148408 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/08565dad-48f1-4088-aed7-29e9df70f4d4-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"08565dad-48f1-4088-aed7-29e9df70f4d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 20:18:27.148458 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:18:27.148460 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/08565dad-48f1-4088-aed7-29e9df70f4d4-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"08565dad-48f1-4088-aed7-29e9df70f4d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 20:18:27.148690 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:18:27.148505 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08565dad-48f1-4088-aed7-29e9df70f4d4-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"08565dad-48f1-4088-aed7-29e9df70f4d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 20:18:27.148690 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:18:27.148538 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/08565dad-48f1-4088-aed7-29e9df70f4d4-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"08565dad-48f1-4088-aed7-29e9df70f4d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 20:18:27.148690 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:18:27.148561 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/08565dad-48f1-4088-aed7-29e9df70f4d4-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"08565dad-48f1-4088-aed7-29e9df70f4d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 20:18:27.148690 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:18:27.148592 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v56h2\" (UniqueName: \"kubernetes.io/projected/08565dad-48f1-4088-aed7-29e9df70f4d4-kube-api-access-v56h2\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"08565dad-48f1-4088-aed7-29e9df70f4d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 20:18:27.148894 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:18:27.148868 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08565dad-48f1-4088-aed7-29e9df70f4d4-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"08565dad-48f1-4088-aed7-29e9df70f4d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 20:18:27.148938 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:18:27.148875 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/08565dad-48f1-4088-aed7-29e9df70f4d4-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"08565dad-48f1-4088-aed7-29e9df70f4d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 20:18:27.148938 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:18:27.148920 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/08565dad-48f1-4088-aed7-29e9df70f4d4-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"08565dad-48f1-4088-aed7-29e9df70f4d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 20:18:27.150819 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:18:27.150797 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/08565dad-48f1-4088-aed7-29e9df70f4d4-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"08565dad-48f1-4088-aed7-29e9df70f4d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 20:18:27.150916 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:18:27.150899 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/08565dad-48f1-4088-aed7-29e9df70f4d4-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"08565dad-48f1-4088-aed7-29e9df70f4d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 20:18:27.155727 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:18:27.155703 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v56h2\" (UniqueName: \"kubernetes.io/projected/08565dad-48f1-4088-aed7-29e9df70f4d4-kube-api-access-v56h2\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"08565dad-48f1-4088-aed7-29e9df70f4d4\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 20:18:27.248896 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:18:27.248867 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 20:18:27.401814 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:18:27.401765 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 22 20:18:27.788253 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:18:27.788214 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"08565dad-48f1-4088-aed7-29e9df70f4d4","Type":"ContainerStarted","Data":"19ef7162093cdeb1297e028108e8a74a4232b72b2cbb6c899d9cb0dc04a27379"} Apr 22 20:18:27.788253 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:18:27.788254 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"08565dad-48f1-4088-aed7-29e9df70f4d4","Type":"ContainerStarted","Data":"ee81d0e10246e3a89321fb72115327ffd01c7101e5b06598c872e48259bec021"} Apr 22 20:18:31.803823 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:18:31.803788 2575 generic.go:358] "Generic (PLEG): container finished" podID="08565dad-48f1-4088-aed7-29e9df70f4d4" containerID="19ef7162093cdeb1297e028108e8a74a4232b72b2cbb6c899d9cb0dc04a27379" exitCode=0 Apr 22 20:18:31.804182 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:18:31.803842 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"08565dad-48f1-4088-aed7-29e9df70f4d4","Type":"ContainerDied","Data":"19ef7162093cdeb1297e028108e8a74a4232b72b2cbb6c899d9cb0dc04a27379"} Apr 22 20:19:19.967341 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:19.967285 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"08565dad-48f1-4088-aed7-29e9df70f4d4","Type":"ContainerStarted","Data":"a0a000459796902f41d9f1bcb7cb4da6dbc40797ec507432ba95ebce5029a781"} Apr 22 20:19:19.990784 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:19.990734 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podStartSLOduration=6.623759326 podStartE2EDuration="53.990718742s" podCreationTimestamp="2026-04-22 20:18:26 +0000 UTC" firstStartedPulling="2026-04-22 20:18:31.804955569 +0000 UTC m=+1248.618340728" lastFinishedPulling="2026-04-22 20:19:19.171914984 +0000 UTC m=+1295.985300144" observedRunningTime="2026-04-22 20:19:19.988693906 +0000 UTC m=+1296.802079086" watchObservedRunningTime="2026-04-22 20:19:19.990718742 +0000 UTC m=+1296.804103923" Apr 22 20:19:54.258500 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:54.258464 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7"] Apr 22 20:19:54.259027 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:54.258786 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" podUID="3eecbc8a-7732-440d-a638-359abba456b9" containerName="main" containerID="cri-o://f03a2d0a4833f5a1ba02637d6888be5d235047639e5e500af0c042205c22cfa7" gracePeriod=30 Apr 22 20:19:54.259027 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:54.258851 2575 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" podUID="3eecbc8a-7732-440d-a638-359abba456b9" containerName="tokenizer" containerID="cri-o://e05c07b1f9e56f43ae4aa20fead4ba45f5bc3f96af630a961b551395b1d4f25c" gracePeriod=30 Apr 22 20:19:55.080169 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:55.080136 2575 generic.go:358] "Generic (PLEG): container finished" podID="3eecbc8a-7732-440d-a638-359abba456b9" containerID="f03a2d0a4833f5a1ba02637d6888be5d235047639e5e500af0c042205c22cfa7" exitCode=0 Apr 22 20:19:55.080370 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:55.080217 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" event={"ID":"3eecbc8a-7732-440d-a638-359abba456b9","Type":"ContainerDied","Data":"f03a2d0a4833f5a1ba02637d6888be5d235047639e5e500af0c042205c22cfa7"} Apr 22 20:19:55.611952 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:55.611923 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" Apr 22 20:19:55.746511 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:55.746484 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfppq\" (UniqueName: \"kubernetes.io/projected/3eecbc8a-7732-440d-a638-359abba456b9-kube-api-access-pfppq\") pod \"3eecbc8a-7732-440d-a638-359abba456b9\" (UID: \"3eecbc8a-7732-440d-a638-359abba456b9\") " Apr 22 20:19:55.746672 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:55.746527 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3eecbc8a-7732-440d-a638-359abba456b9-tls-certs\") pod \"3eecbc8a-7732-440d-a638-359abba456b9\" (UID: \"3eecbc8a-7732-440d-a638-359abba456b9\") " Apr 22 20:19:55.746672 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:55.746576 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3eecbc8a-7732-440d-a638-359abba456b9-tokenizer-uds\") pod \"3eecbc8a-7732-440d-a638-359abba456b9\" (UID: \"3eecbc8a-7732-440d-a638-359abba456b9\") " Apr 22 20:19:55.746672 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:55.746616 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3eecbc8a-7732-440d-a638-359abba456b9-tokenizer-cache\") pod \"3eecbc8a-7732-440d-a638-359abba456b9\" (UID: \"3eecbc8a-7732-440d-a638-359abba456b9\") " Apr 22 20:19:55.746672 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:55.746632 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3eecbc8a-7732-440d-a638-359abba456b9-kserve-provision-location\") pod \"3eecbc8a-7732-440d-a638-359abba456b9\" (UID: \"3eecbc8a-7732-440d-a638-359abba456b9\") " Apr 22 20:19:55.746883 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:55.746697 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3eecbc8a-7732-440d-a638-359abba456b9-tokenizer-tmp\") pod \"3eecbc8a-7732-440d-a638-359abba456b9\" (UID: \"3eecbc8a-7732-440d-a638-359abba456b9\") " Apr 22 20:19:55.746942 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:55.746919 2575 operation_generator.go:781] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/3eecbc8a-7732-440d-a638-359abba456b9-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "3eecbc8a-7732-440d-a638-359abba456b9" (UID: "3eecbc8a-7732-440d-a638-359abba456b9"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:19:55.746995 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:55.746963 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3eecbc8a-7732-440d-a638-359abba456b9-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "3eecbc8a-7732-440d-a638-359abba456b9" (UID: "3eecbc8a-7732-440d-a638-359abba456b9"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:19:55.747126 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:55.747104 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3eecbc8a-7732-440d-a638-359abba456b9-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "3eecbc8a-7732-440d-a638-359abba456b9" (UID: "3eecbc8a-7732-440d-a638-359abba456b9"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:19:55.747459 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:55.747437 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3eecbc8a-7732-440d-a638-359abba456b9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3eecbc8a-7732-440d-a638-359abba456b9" (UID: "3eecbc8a-7732-440d-a638-359abba456b9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:19:55.748715 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:55.748690 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eecbc8a-7732-440d-a638-359abba456b9-kube-api-access-pfppq" (OuterVolumeSpecName: "kube-api-access-pfppq") pod "3eecbc8a-7732-440d-a638-359abba456b9" (UID: "3eecbc8a-7732-440d-a638-359abba456b9"). InnerVolumeSpecName "kube-api-access-pfppq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:19:55.749116 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:55.749099 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eecbc8a-7732-440d-a638-359abba456b9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "3eecbc8a-7732-440d-a638-359abba456b9" (UID: "3eecbc8a-7732-440d-a638-359abba456b9"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:19:55.847506 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:55.847466 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pfppq\" (UniqueName: \"kubernetes.io/projected/3eecbc8a-7732-440d-a638-359abba456b9-kube-api-access-pfppq\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:19:55.847686 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:55.847528 2575 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3eecbc8a-7732-440d-a638-359abba456b9-tls-certs\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:19:55.847686 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:55.847543 2575 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/3eecbc8a-7732-440d-a638-359abba456b9-tokenizer-uds\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:19:55.847686 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:55.847556 2575 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/3eecbc8a-7732-440d-a638-359abba456b9-tokenizer-cache\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:19:55.847686 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:55.847568 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3eecbc8a-7732-440d-a638-359abba456b9-kserve-provision-location\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:19:55.847686 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:55.847579 2575 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/3eecbc8a-7732-440d-a638-359abba456b9-tokenizer-tmp\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:19:56.084565 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:56.084459 2575 generic.go:358] "Generic (PLEG): container finished" podID="3eecbc8a-7732-440d-a638-359abba456b9" containerID="e05c07b1f9e56f43ae4aa20fead4ba45f5bc3f96af630a961b551395b1d4f25c" exitCode=0 Apr 22 20:19:56.084565 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:56.084493 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" event={"ID":"3eecbc8a-7732-440d-a638-359abba456b9","Type":"ContainerDied","Data":"e05c07b1f9e56f43ae4aa20fead4ba45f5bc3f96af630a961b551395b1d4f25c"} Apr 22 20:19:56.084565 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:56.084536 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" event={"ID":"3eecbc8a-7732-440d-a638-359abba456b9","Type":"ContainerDied","Data":"9dc221aa2792dafb053049023ff7e36c4aeb9f2f65071e3700ecec468f31966d"} Apr 22 20:19:56.084565 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:56.084561 2575 scope.go:117] "RemoveContainer" containerID="e05c07b1f9e56f43ae4aa20fead4ba45f5bc3f96af630a961b551395b1d4f25c" Apr 22 20:19:56.084864 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:56.084568 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7" Apr 22 20:19:56.094408 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:56.094383 2575 scope.go:117] "RemoveContainer" containerID="f03a2d0a4833f5a1ba02637d6888be5d235047639e5e500af0c042205c22cfa7" Apr 22 20:19:56.103529 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:56.103505 2575 scope.go:117] "RemoveContainer" containerID="d891329e0917e04602ca2083d4377893b7c41eaeb9cd71a73467944033b3386b" Apr 22 20:19:56.111215 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:56.111169 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7"] Apr 22 20:19:56.112766 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:56.112737 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scherqwz7"] Apr 22 20:19:56.113090 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:56.113068 2575 scope.go:117] "RemoveContainer" containerID="e05c07b1f9e56f43ae4aa20fead4ba45f5bc3f96af630a961b551395b1d4f25c" Apr 22 20:19:56.113431 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:19:56.113400 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e05c07b1f9e56f43ae4aa20fead4ba45f5bc3f96af630a961b551395b1d4f25c\": container with ID starting with e05c07b1f9e56f43ae4aa20fead4ba45f5bc3f96af630a961b551395b1d4f25c not found: ID does not exist" containerID="e05c07b1f9e56f43ae4aa20fead4ba45f5bc3f96af630a961b551395b1d4f25c" Apr 22 20:19:56.113554 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:56.113436 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e05c07b1f9e56f43ae4aa20fead4ba45f5bc3f96af630a961b551395b1d4f25c"} err="failed to get container status \"e05c07b1f9e56f43ae4aa20fead4ba45f5bc3f96af630a961b551395b1d4f25c\": rpc error: code = NotFound desc = could not find container \"e05c07b1f9e56f43ae4aa20fead4ba45f5bc3f96af630a961b551395b1d4f25c\": container with ID starting with e05c07b1f9e56f43ae4aa20fead4ba45f5bc3f96af630a961b551395b1d4f25c not found: ID does not exist" Apr 22 20:19:56.113554 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:56.113455 2575 scope.go:117] "RemoveContainer" containerID="f03a2d0a4833f5a1ba02637d6888be5d235047639e5e500af0c042205c22cfa7" Apr 22 20:19:56.113733 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:19:56.113715 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f03a2d0a4833f5a1ba02637d6888be5d235047639e5e500af0c042205c22cfa7\": container with ID starting with f03a2d0a4833f5a1ba02637d6888be5d235047639e5e500af0c042205c22cfa7 not found: ID does not exist" containerID="f03a2d0a4833f5a1ba02637d6888be5d235047639e5e500af0c042205c22cfa7" Apr 22 20:19:56.113798 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:56.113736 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f03a2d0a4833f5a1ba02637d6888be5d235047639e5e500af0c042205c22cfa7"} err="failed to get container status \"f03a2d0a4833f5a1ba02637d6888be5d235047639e5e500af0c042205c22cfa7\": rpc error: code = NotFound desc = could not find container \"f03a2d0a4833f5a1ba02637d6888be5d235047639e5e500af0c042205c22cfa7\": container with ID starting with f03a2d0a4833f5a1ba02637d6888be5d235047639e5e500af0c042205c22cfa7 not found: ID does not exist" Apr 22 
20:19:56.113798 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:56.113749 2575 scope.go:117] "RemoveContainer" containerID="d891329e0917e04602ca2083d4377893b7c41eaeb9cd71a73467944033b3386b" Apr 22 20:19:56.114024 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:19:56.113999 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d891329e0917e04602ca2083d4377893b7c41eaeb9cd71a73467944033b3386b\": container with ID starting with d891329e0917e04602ca2083d4377893b7c41eaeb9cd71a73467944033b3386b not found: ID does not exist" containerID="d891329e0917e04602ca2083d4377893b7c41eaeb9cd71a73467944033b3386b" Apr 22 20:19:56.114093 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:56.114032 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d891329e0917e04602ca2083d4377893b7c41eaeb9cd71a73467944033b3386b"} err="failed to get container status \"d891329e0917e04602ca2083d4377893b7c41eaeb9cd71a73467944033b3386b\": rpc error: code = NotFound desc = could not find container \"d891329e0917e04602ca2083d4377893b7c41eaeb9cd71a73467944033b3386b\": container with ID starting with d891329e0917e04602ca2083d4377893b7c41eaeb9cd71a73467944033b3386b not found: ID does not exist" Apr 22 20:19:57.731158 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:19:57.731114 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3eecbc8a-7732-440d-a638-359abba456b9" path="/var/lib/kubelet/pods/3eecbc8a-7732-440d-a638-359abba456b9/volumes" Apr 22 20:20:08.613368 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.613266 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6"] Apr 22 20:20:08.613921 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.613563 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3eecbc8a-7732-440d-a638-359abba456b9" containerName="main" Apr 22 20:20:08.613921 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.613574 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eecbc8a-7732-440d-a638-359abba456b9" containerName="main" Apr 22 20:20:08.613921 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.613591 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3eecbc8a-7732-440d-a638-359abba456b9" containerName="storage-initializer" Apr 22 20:20:08.613921 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.613596 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eecbc8a-7732-440d-a638-359abba456b9" containerName="storage-initializer" Apr 22 20:20:08.613921 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.613606 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3eecbc8a-7732-440d-a638-359abba456b9" containerName="tokenizer" Apr 22 20:20:08.613921 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.613612 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eecbc8a-7732-440d-a638-359abba456b9" containerName="tokenizer" Apr 22 20:20:08.613921 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.613662 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="3eecbc8a-7732-440d-a638-359abba456b9" containerName="tokenizer" Apr 22 20:20:08.613921 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.613672 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="3eecbc8a-7732-440d-a638-359abba456b9" containerName="main" Apr 22 20:20:08.618194 ip-10-0-133-60 
kubenswrapper[2575]: I0422 20:20:08.618166 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw"] Apr 22 20:20:08.618357 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.618334 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" Apr 22 20:20:08.620689 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.620668 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 22 20:20:08.621087 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.621064 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-dockercfg-vxbgg\"" Apr 22 20:20:08.621248 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.621229 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" Apr 22 20:20:08.629530 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.629507 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6"] Apr 22 20:20:08.644283 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.644250 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw"] Apr 22 20:20:08.652884 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.652853 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/67bd64cc-880a-4f89-90ba-c8d803f5850a-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw\" (UID: \"67bd64cc-880a-4f89-90ba-c8d803f5850a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" Apr 22 20:20:08.653026 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.652890 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/67bd64cc-880a-4f89-90ba-c8d803f5850a-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw\" (UID: \"67bd64cc-880a-4f89-90ba-c8d803f5850a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" Apr 22 20:20:08.653026 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.652925 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9fmk\" (UniqueName: \"kubernetes.io/projected/b08cb87a-d147-4183-b35c-6cffb7b46ba4-kube-api-access-x9fmk\") pod \"custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6\" (UID: \"b08cb87a-d147-4183-b35c-6cffb7b46ba4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" Apr 22 20:20:08.653104 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.653046 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b08cb87a-d147-4183-b35c-6cffb7b46ba4-model-cache\") pod \"custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6\" (UID: \"b08cb87a-d147-4183-b35c-6cffb7b46ba4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" 
Apr 22 20:20:08.653104 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.653073 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/67bd64cc-880a-4f89-90ba-c8d803f5850a-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw\" (UID: \"67bd64cc-880a-4f89-90ba-c8d803f5850a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" Apr 22 20:20:08.653104 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.653089 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/67bd64cc-880a-4f89-90ba-c8d803f5850a-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw\" (UID: \"67bd64cc-880a-4f89-90ba-c8d803f5850a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" Apr 22 20:20:08.653197 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.653111 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zb9g\" (UniqueName: \"kubernetes.io/projected/67bd64cc-880a-4f89-90ba-c8d803f5850a-kube-api-access-7zb9g\") pod \"custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw\" (UID: \"67bd64cc-880a-4f89-90ba-c8d803f5850a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" Apr 22 20:20:08.653197 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.653156 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b08cb87a-d147-4183-b35c-6cffb7b46ba4-dshm\") pod \"custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6\" (UID: \"b08cb87a-d147-4183-b35c-6cffb7b46ba4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" Apr 22 20:20:08.653261 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.653193 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/67bd64cc-880a-4f89-90ba-c8d803f5850a-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw\" (UID: \"67bd64cc-880a-4f89-90ba-c8d803f5850a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" Apr 22 20:20:08.653261 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.653218 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b08cb87a-d147-4183-b35c-6cffb7b46ba4-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6\" (UID: \"b08cb87a-d147-4183-b35c-6cffb7b46ba4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" Apr 22 20:20:08.653261 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.653253 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b08cb87a-d147-4183-b35c-6cffb7b46ba4-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6\" (UID: \"b08cb87a-d147-4183-b35c-6cffb7b46ba4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" Apr 22 20:20:08.653385 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.653273 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b08cb87a-d147-4183-b35c-6cffb7b46ba4-home\") pod \"custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6\" (UID: \"b08cb87a-d147-4183-b35c-6cffb7b46ba4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" Apr 22 20:20:08.753619 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.753584 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b08cb87a-d147-4183-b35c-6cffb7b46ba4-model-cache\") pod \"custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6\" (UID: \"b08cb87a-d147-4183-b35c-6cffb7b46ba4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" Apr 22 20:20:08.753619 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.753620 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/67bd64cc-880a-4f89-90ba-c8d803f5850a-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw\" (UID: \"67bd64cc-880a-4f89-90ba-c8d803f5850a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" Apr 22 20:20:08.753836 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.753643 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/67bd64cc-880a-4f89-90ba-c8d803f5850a-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw\" (UID: \"67bd64cc-880a-4f89-90ba-c8d803f5850a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" Apr 22 20:20:08.753836 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.753745 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zb9g\" (UniqueName: \"kubernetes.io/projected/67bd64cc-880a-4f89-90ba-c8d803f5850a-kube-api-access-7zb9g\") pod \"custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw\" (UID: \"67bd64cc-880a-4f89-90ba-c8d803f5850a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" Apr 22 20:20:08.753836 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.753787 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b08cb87a-d147-4183-b35c-6cffb7b46ba4-dshm\") pod \"custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6\" (UID: \"b08cb87a-d147-4183-b35c-6cffb7b46ba4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" Apr 22 20:20:08.753836 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.753820 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/67bd64cc-880a-4f89-90ba-c8d803f5850a-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw\" (UID: \"67bd64cc-880a-4f89-90ba-c8d803f5850a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" Apr 22 20:20:08.754051 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.753844 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b08cb87a-d147-4183-b35c-6cffb7b46ba4-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6\" (UID: \"b08cb87a-d147-4183-b35c-6cffb7b46ba4\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" Apr 22 20:20:08.754051 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.753890 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b08cb87a-d147-4183-b35c-6cffb7b46ba4-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6\" (UID: \"b08cb87a-d147-4183-b35c-6cffb7b46ba4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" Apr 22 20:20:08.754051 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.753920 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b08cb87a-d147-4183-b35c-6cffb7b46ba4-home\") pod \"custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6\" (UID: \"b08cb87a-d147-4183-b35c-6cffb7b46ba4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" Apr 22 20:20:08.754051 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.753962 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/67bd64cc-880a-4f89-90ba-c8d803f5850a-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw\" (UID: \"67bd64cc-880a-4f89-90ba-c8d803f5850a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" Apr 22 20:20:08.754051 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.753988 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/67bd64cc-880a-4f89-90ba-c8d803f5850a-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw\" (UID: \"67bd64cc-880a-4f89-90ba-c8d803f5850a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" Apr 22 20:20:08.754051 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.754007 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/67bd64cc-880a-4f89-90ba-c8d803f5850a-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw\" (UID: \"67bd64cc-880a-4f89-90ba-c8d803f5850a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" Apr 22 20:20:08.754051 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.754033 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9fmk\" (UniqueName: \"kubernetes.io/projected/b08cb87a-d147-4183-b35c-6cffb7b46ba4-kube-api-access-x9fmk\") pod \"custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6\" (UID: \"b08cb87a-d147-4183-b35c-6cffb7b46ba4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" Apr 22 20:20:08.754627 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.754063 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b08cb87a-d147-4183-b35c-6cffb7b46ba4-model-cache\") pod \"custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6\" (UID: \"b08cb87a-d147-4183-b35c-6cffb7b46ba4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" Apr 22 20:20:08.754627 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.754394 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/b08cb87a-d147-4183-b35c-6cffb7b46ba4-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6\" (UID: \"b08cb87a-d147-4183-b35c-6cffb7b46ba4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" Apr 22 20:20:08.754627 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.754416 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/67bd64cc-880a-4f89-90ba-c8d803f5850a-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw\" (UID: \"67bd64cc-880a-4f89-90ba-c8d803f5850a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" Apr 22 20:20:08.754627 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.754587 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b08cb87a-d147-4183-b35c-6cffb7b46ba4-home\") pod \"custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6\" (UID: \"b08cb87a-d147-4183-b35c-6cffb7b46ba4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" Apr 22 20:20:08.755048 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.755022 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/67bd64cc-880a-4f89-90ba-c8d803f5850a-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw\" (UID: \"67bd64cc-880a-4f89-90ba-c8d803f5850a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" Apr 22 20:20:08.756287 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.756264 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b08cb87a-d147-4183-b35c-6cffb7b46ba4-dshm\") pod \"custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6\" (UID: \"b08cb87a-d147-4183-b35c-6cffb7b46ba4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" Apr 22 20:20:08.756402 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.756287 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/67bd64cc-880a-4f89-90ba-c8d803f5850a-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw\" (UID: \"67bd64cc-880a-4f89-90ba-c8d803f5850a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" Apr 22 20:20:08.756810 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.756785 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b08cb87a-d147-4183-b35c-6cffb7b46ba4-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6\" (UID: \"b08cb87a-d147-4183-b35c-6cffb7b46ba4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" Apr 22 20:20:08.756892 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.756835 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/67bd64cc-880a-4f89-90ba-c8d803f5850a-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw\" (UID: \"67bd64cc-880a-4f89-90ba-c8d803f5850a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" Apr 22 20:20:08.762089 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.762066 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9fmk\" (UniqueName: \"kubernetes.io/projected/b08cb87a-d147-4183-b35c-6cffb7b46ba4-kube-api-access-x9fmk\") pod \"custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6\" (UID: \"b08cb87a-d147-4183-b35c-6cffb7b46ba4\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" Apr 22 20:20:08.762744 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.762724 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zb9g\" (UniqueName: \"kubernetes.io/projected/67bd64cc-880a-4f89-90ba-c8d803f5850a-kube-api-access-7zb9g\") pod \"custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw\" (UID: \"67bd64cc-880a-4f89-90ba-c8d803f5850a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" Apr 22 20:20:08.931384 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.931259 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" Apr 22 20:20:08.941247 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:08.941217 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" Apr 22 20:20:09.086363 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:09.086332 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6"] Apr 22 20:20:09.089092 ip-10-0-133-60 kubenswrapper[2575]: W0422 20:20:09.089063 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb08cb87a_d147_4183_b35c_6cffb7b46ba4.slice/crio-ae2efc2bc2653526bb80b598639b46c260a69c1fa10008b74643b784f172d76b WatchSource:0}: Error finding container ae2efc2bc2653526bb80b598639b46c260a69c1fa10008b74643b784f172d76b: Status 404 returned error can't find the container with id ae2efc2bc2653526bb80b598639b46c260a69c1fa10008b74643b784f172d76b Apr 22 20:20:09.090935 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:09.090912 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:20:09.102211 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:09.102191 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw"] Apr 22 20:20:09.104428 ip-10-0-133-60 kubenswrapper[2575]: W0422 20:20:09.104403 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67bd64cc_880a_4f89_90ba_c8d803f5850a.slice/crio-eb4b6eef0aaabe64bfad6d20dadb693db306f366c93a851a15758a03fa8c60dc WatchSource:0}: Error finding container eb4b6eef0aaabe64bfad6d20dadb693db306f366c93a851a15758a03fa8c60dc: Status 404 returned error can't find the container with id eb4b6eef0aaabe64bfad6d20dadb693db306f366c93a851a15758a03fa8c60dc Apr 22 20:20:09.135604 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:09.135573 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" event={"ID":"b08cb87a-d147-4183-b35c-6cffb7b46ba4","Type":"ContainerStarted","Data":"ae2efc2bc2653526bb80b598639b46c260a69c1fa10008b74643b784f172d76b"} Apr 22 20:20:09.136773 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:09.136744 2575 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" event={"ID":"67bd64cc-880a-4f89-90ba-c8d803f5850a","Type":"ContainerStarted","Data":"eb4b6eef0aaabe64bfad6d20dadb693db306f366c93a851a15758a03fa8c60dc"} Apr 22 20:20:10.144020 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:10.143984 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" event={"ID":"67bd64cc-880a-4f89-90ba-c8d803f5850a","Type":"ContainerStarted","Data":"a8847aa45d3e6a7c1e0902b244d105a08a91301d6a14852f92702e9bc95c886b"} Apr 22 20:20:11.150389 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:11.150350 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" event={"ID":"b08cb87a-d147-4183-b35c-6cffb7b46ba4","Type":"ContainerStarted","Data":"83b396a0b91174a14ad8701bed4a6f15edd9105ebd948cfac19df626a7499f38"} Apr 22 20:20:11.150829 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:11.150483 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" Apr 22 20:20:12.158498 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:12.158440 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" event={"ID":"b08cb87a-d147-4183-b35c-6cffb7b46ba4","Type":"ContainerStarted","Data":"47c89d8b95894f6ee482f04ff749d38188394f457565d82d252edaf2209d9dd1"} Apr 22 20:20:14.169070 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:14.169025 2575 generic.go:358] "Generic (PLEG): container finished" podID="67bd64cc-880a-4f89-90ba-c8d803f5850a" containerID="a8847aa45d3e6a7c1e0902b244d105a08a91301d6a14852f92702e9bc95c886b" exitCode=0 Apr 22 20:20:14.169543 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:14.169094 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" event={"ID":"67bd64cc-880a-4f89-90ba-c8d803f5850a","Type":"ContainerDied","Data":"a8847aa45d3e6a7c1e0902b244d105a08a91301d6a14852f92702e9bc95c886b"} Apr 22 20:20:15.175828 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:15.175792 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" event={"ID":"67bd64cc-880a-4f89-90ba-c8d803f5850a","Type":"ContainerStarted","Data":"554d8dbe6914c6bc1b3c62f0c26ad44b24c25cb4894c379fc86087b66d8fa263"} Apr 22 20:20:15.198710 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:15.198642 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" podStartSLOduration=7.19861968 podStartE2EDuration="7.19861968s" podCreationTimestamp="2026-04-22 20:20:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:20:15.196342197 +0000 UTC m=+1352.009727378" watchObservedRunningTime="2026-04-22 20:20:15.19861968 +0000 UTC m=+1352.012004864" Apr 22 20:20:16.181766 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:16.181734 2575 generic.go:358] "Generic (PLEG): container finished" podID="b08cb87a-d147-4183-b35c-6cffb7b46ba4" containerID="47c89d8b95894f6ee482f04ff749d38188394f457565d82d252edaf2209d9dd1" exitCode=0 Apr 22 20:20:16.182170 
ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:16.181806 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" event={"ID":"b08cb87a-d147-4183-b35c-6cffb7b46ba4","Type":"ContainerDied","Data":"47c89d8b95894f6ee482f04ff749d38188394f457565d82d252edaf2209d9dd1"} Apr 22 20:20:17.188037 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:17.187998 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" event={"ID":"b08cb87a-d147-4183-b35c-6cffb7b46ba4","Type":"ContainerStarted","Data":"be029aa9b76c22357b67117dd5eb526b3c8a4dcc62f8d33a4218ba17bbfdfeeb"} Apr 22 20:20:17.213099 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:17.213035 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" podStartSLOduration=8.208581304 podStartE2EDuration="9.213015333s" podCreationTimestamp="2026-04-22 20:20:08 +0000 UTC" firstStartedPulling="2026-04-22 20:20:09.091033686 +0000 UTC m=+1345.904418845" lastFinishedPulling="2026-04-22 20:20:10.095467701 +0000 UTC m=+1346.908852874" observedRunningTime="2026-04-22 20:20:17.211356807 +0000 UTC m=+1354.024741988" watchObservedRunningTime="2026-04-22 20:20:17.213015333 +0000 UTC m=+1354.026400521" Apr 22 20:20:18.931630 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:18.931583 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" Apr 22 20:20:18.931630 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:18.931632 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" Apr 22 20:20:18.933084 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:18.933040 2575 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" podUID="b08cb87a-d147-4183-b35c-6cffb7b46ba4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8001/health\": dial tcp 10.132.0.39:8001: connect: connection refused" Apr 22 20:20:18.941953 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:18.941910 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" Apr 22 20:20:18.942134 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:18.941966 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" Apr 22 20:20:18.943078 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:18.943042 2575 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" podUID="67bd64cc-880a-4f89-90ba-c8d803f5850a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.40:8000/health\": dial tcp 10.132.0.40:8000: connect: connection refused" Apr 22 20:20:28.932108 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:28.932058 2575 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" podUID="b08cb87a-d147-4183-b35c-6cffb7b46ba4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8001/health\": dial tcp 10.132.0.39:8001: connect: 
connection refused" Apr 22 20:20:28.942696 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:28.942652 2575 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" podUID="67bd64cc-880a-4f89-90ba-c8d803f5850a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.40:8000/health\": dial tcp 10.132.0.40:8000: connect: connection refused" Apr 22 20:20:28.950626 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:28.950598 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" Apr 22 20:20:35.545077 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:35.545035 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 22 20:20:35.545487 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:35.545421 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podUID="08565dad-48f1-4088-aed7-29e9df70f4d4" containerName="main" containerID="cri-o://a0a000459796902f41d9f1bcb7cb4da6dbc40797ec507432ba95ebce5029a781" gracePeriod=30 Apr 22 20:20:38.932052 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:38.931995 2575 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" podUID="b08cb87a-d147-4183-b35c-6cffb7b46ba4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8001/health\": dial tcp 10.132.0.39:8001: connect: connection refused" Apr 22 20:20:38.942137 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:38.942088 2575 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" podUID="67bd64cc-880a-4f89-90ba-c8d803f5850a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.40:8000/health\": dial tcp 10.132.0.40:8000: connect: connection refused" Apr 22 20:20:48.932369 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:48.932268 2575 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" podUID="b08cb87a-d147-4183-b35c-6cffb7b46ba4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8001/health\": dial tcp 10.132.0.39:8001: connect: connection refused" Apr 22 20:20:48.941869 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:48.941827 2575 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" podUID="67bd64cc-880a-4f89-90ba-c8d803f5850a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.40:8000/health\": dial tcp 10.132.0.40:8000: connect: connection refused" Apr 22 20:20:58.932556 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:58.932493 2575 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" podUID="b08cb87a-d147-4183-b35c-6cffb7b46ba4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8001/health\": dial tcp 10.132.0.39:8001: connect: connection refused" Apr 22 20:20:58.942629 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:20:58.942581 2575 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" podUID="67bd64cc-880a-4f89-90ba-c8d803f5850a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.40:8000/health\": dial tcp 10.132.0.40:8000: connect: connection refused" Apr 22 20:21:06.198294 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:06.198260 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1_08565dad-48f1-4088-aed7-29e9df70f4d4/main/0.log" Apr 22 20:21:06.198715 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:06.198685 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 20:21:06.293212 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:06.293167 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08565dad-48f1-4088-aed7-29e9df70f4d4-kserve-provision-location\") pod \"08565dad-48f1-4088-aed7-29e9df70f4d4\" (UID: \"08565dad-48f1-4088-aed7-29e9df70f4d4\") " Apr 22 20:21:06.293439 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:06.293257 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/08565dad-48f1-4088-aed7-29e9df70f4d4-model-cache\") pod \"08565dad-48f1-4088-aed7-29e9df70f4d4\" (UID: \"08565dad-48f1-4088-aed7-29e9df70f4d4\") " Apr 22 20:21:06.293439 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:06.293278 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v56h2\" (UniqueName: \"kubernetes.io/projected/08565dad-48f1-4088-aed7-29e9df70f4d4-kube-api-access-v56h2\") pod \"08565dad-48f1-4088-aed7-29e9df70f4d4\" (UID: \"08565dad-48f1-4088-aed7-29e9df70f4d4\") " Apr 22 20:21:06.293439 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:06.293298 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/08565dad-48f1-4088-aed7-29e9df70f4d4-tls-certs\") pod \"08565dad-48f1-4088-aed7-29e9df70f4d4\" (UID: \"08565dad-48f1-4088-aed7-29e9df70f4d4\") " Apr 22 20:21:06.293439 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:06.293354 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/08565dad-48f1-4088-aed7-29e9df70f4d4-home\") pod \"08565dad-48f1-4088-aed7-29e9df70f4d4\" (UID: \"08565dad-48f1-4088-aed7-29e9df70f4d4\") " Apr 22 20:21:06.293439 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:06.293392 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/08565dad-48f1-4088-aed7-29e9df70f4d4-dshm\") pod \"08565dad-48f1-4088-aed7-29e9df70f4d4\" (UID: \"08565dad-48f1-4088-aed7-29e9df70f4d4\") " Apr 22 20:21:06.293696 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:06.293593 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08565dad-48f1-4088-aed7-29e9df70f4d4-model-cache" (OuterVolumeSpecName: "model-cache") pod "08565dad-48f1-4088-aed7-29e9df70f4d4" (UID: "08565dad-48f1-4088-aed7-29e9df70f4d4"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:21:06.294013 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:06.293975 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08565dad-48f1-4088-aed7-29e9df70f4d4-home" (OuterVolumeSpecName: "home") pod "08565dad-48f1-4088-aed7-29e9df70f4d4" (UID: "08565dad-48f1-4088-aed7-29e9df70f4d4"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:21:06.295650 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:06.295620 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08565dad-48f1-4088-aed7-29e9df70f4d4-dshm" (OuterVolumeSpecName: "dshm") pod "08565dad-48f1-4088-aed7-29e9df70f4d4" (UID: "08565dad-48f1-4088-aed7-29e9df70f4d4"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:21:06.295921 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:06.295895 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08565dad-48f1-4088-aed7-29e9df70f4d4-kube-api-access-v56h2" (OuterVolumeSpecName: "kube-api-access-v56h2") pod "08565dad-48f1-4088-aed7-29e9df70f4d4" (UID: "08565dad-48f1-4088-aed7-29e9df70f4d4"). InnerVolumeSpecName "kube-api-access-v56h2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:21:06.295921 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:06.295897 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08565dad-48f1-4088-aed7-29e9df70f4d4-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "08565dad-48f1-4088-aed7-29e9df70f4d4" (UID: "08565dad-48f1-4088-aed7-29e9df70f4d4"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:21:06.347506 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:06.347394 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08565dad-48f1-4088-aed7-29e9df70f4d4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "08565dad-48f1-4088-aed7-29e9df70f4d4" (UID: "08565dad-48f1-4088-aed7-29e9df70f4d4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:21:06.371114 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:06.371077 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1_08565dad-48f1-4088-aed7-29e9df70f4d4/main/0.log" Apr 22 20:21:06.371417 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:06.371378 2575 generic.go:358] "Generic (PLEG): container finished" podID="08565dad-48f1-4088-aed7-29e9df70f4d4" containerID="a0a000459796902f41d9f1bcb7cb4da6dbc40797ec507432ba95ebce5029a781" exitCode=137 Apr 22 20:21:06.371541 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:06.371470 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 20:21:06.371541 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:06.371473 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"08565dad-48f1-4088-aed7-29e9df70f4d4","Type":"ContainerDied","Data":"a0a000459796902f41d9f1bcb7cb4da6dbc40797ec507432ba95ebce5029a781"} Apr 22 20:21:06.371541 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:06.371524 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"08565dad-48f1-4088-aed7-29e9df70f4d4","Type":"ContainerDied","Data":"ee81d0e10246e3a89321fb72115327ffd01c7101e5b06598c872e48259bec021"} Apr 22 20:21:06.371837 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:06.371546 2575 scope.go:117] "RemoveContainer" containerID="a0a000459796902f41d9f1bcb7cb4da6dbc40797ec507432ba95ebce5029a781" Apr 22 20:21:06.393571 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:06.393536 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 22 20:21:06.393995 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:06.393976 2575 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/08565dad-48f1-4088-aed7-29e9df70f4d4-model-cache\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:21:06.394041 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:06.393999 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v56h2\" (UniqueName: \"kubernetes.io/projected/08565dad-48f1-4088-aed7-29e9df70f4d4-kube-api-access-v56h2\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:21:06.394041 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:06.394010 2575 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/08565dad-48f1-4088-aed7-29e9df70f4d4-tls-certs\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:21:06.394041 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:06.394021 2575 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/08565dad-48f1-4088-aed7-29e9df70f4d4-home\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:21:06.394041 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:06.394029 2575 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/08565dad-48f1-4088-aed7-29e9df70f4d4-dshm\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:21:06.394187 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:06.394041 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/08565dad-48f1-4088-aed7-29e9df70f4d4-kserve-provision-location\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:21:06.398352 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:06.398305 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 22 20:21:06.399112 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:06.399090 2575 scope.go:117] "RemoveContainer" containerID="19ef7162093cdeb1297e028108e8a74a4232b72b2cbb6c899d9cb0dc04a27379" Apr 22 20:21:06.460235 ip-10-0-133-60 
kubenswrapper[2575]: I0422 20:21:06.460210 2575 scope.go:117] "RemoveContainer" containerID="a0a000459796902f41d9f1bcb7cb4da6dbc40797ec507432ba95ebce5029a781" Apr 22 20:21:06.460601 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:21:06.460578 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0a000459796902f41d9f1bcb7cb4da6dbc40797ec507432ba95ebce5029a781\": container with ID starting with a0a000459796902f41d9f1bcb7cb4da6dbc40797ec507432ba95ebce5029a781 not found: ID does not exist" containerID="a0a000459796902f41d9f1bcb7cb4da6dbc40797ec507432ba95ebce5029a781" Apr 22 20:21:06.460696 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:06.460614 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0a000459796902f41d9f1bcb7cb4da6dbc40797ec507432ba95ebce5029a781"} err="failed to get container status \"a0a000459796902f41d9f1bcb7cb4da6dbc40797ec507432ba95ebce5029a781\": rpc error: code = NotFound desc = could not find container \"a0a000459796902f41d9f1bcb7cb4da6dbc40797ec507432ba95ebce5029a781\": container with ID starting with a0a000459796902f41d9f1bcb7cb4da6dbc40797ec507432ba95ebce5029a781 not found: ID does not exist" Apr 22 20:21:06.460696 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:06.460640 2575 scope.go:117] "RemoveContainer" containerID="19ef7162093cdeb1297e028108e8a74a4232b72b2cbb6c899d9cb0dc04a27379" Apr 22 20:21:06.460905 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:21:06.460886 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19ef7162093cdeb1297e028108e8a74a4232b72b2cbb6c899d9cb0dc04a27379\": container with ID starting with 19ef7162093cdeb1297e028108e8a74a4232b72b2cbb6c899d9cb0dc04a27379 not found: ID does not exist" containerID="19ef7162093cdeb1297e028108e8a74a4232b72b2cbb6c899d9cb0dc04a27379" Apr 22 20:21:06.460964 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:06.460914 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19ef7162093cdeb1297e028108e8a74a4232b72b2cbb6c899d9cb0dc04a27379"} err="failed to get container status \"19ef7162093cdeb1297e028108e8a74a4232b72b2cbb6c899d9cb0dc04a27379\": rpc error: code = NotFound desc = could not find container \"19ef7162093cdeb1297e028108e8a74a4232b72b2cbb6c899d9cb0dc04a27379\": container with ID starting with 19ef7162093cdeb1297e028108e8a74a4232b72b2cbb6c899d9cb0dc04a27379 not found: ID does not exist" Apr 22 20:21:07.731259 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:07.731226 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08565dad-48f1-4088-aed7-29e9df70f4d4" path="/var/lib/kubelet/pods/08565dad-48f1-4088-aed7-29e9df70f4d4/volumes" Apr 22 20:21:08.932395 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:08.932337 2575 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" podUID="b08cb87a-d147-4183-b35c-6cffb7b46ba4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8001/health\": dial tcp 10.132.0.39:8001: connect: connection refused" Apr 22 20:21:08.942230 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:08.942190 2575 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" podUID="67bd64cc-880a-4f89-90ba-c8d803f5850a" containerName="main" probeResult="failure" output="Get 
\"https://10.132.0.40:8000/health\": dial tcp 10.132.0.40:8000: connect: connection refused" Apr 22 20:21:18.932295 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:18.932227 2575 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" podUID="b08cb87a-d147-4183-b35c-6cffb7b46ba4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8001/health\": dial tcp 10.132.0.39:8001: connect: connection refused" Apr 22 20:21:18.941668 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:18.941639 2575 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" podUID="67bd64cc-880a-4f89-90ba-c8d803f5850a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.40:8000/health\": dial tcp 10.132.0.40:8000: connect: connection refused" Apr 22 20:21:28.932400 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:28.932349 2575 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" podUID="b08cb87a-d147-4183-b35c-6cffb7b46ba4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8001/health\": dial tcp 10.132.0.39:8001: connect: connection refused" Apr 22 20:21:28.941728 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:28.941698 2575 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" podUID="67bd64cc-880a-4f89-90ba-c8d803f5850a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.40:8000/health\": dial tcp 10.132.0.40:8000: connect: connection refused" Apr 22 20:21:38.932543 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:38.932437 2575 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" podUID="b08cb87a-d147-4183-b35c-6cffb7b46ba4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8001/health\": dial tcp 10.132.0.39:8001: connect: connection refused" Apr 22 20:21:38.941810 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:38.941779 2575 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" podUID="67bd64cc-880a-4f89-90ba-c8d803f5850a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.40:8000/health\": dial tcp 10.132.0.40:8000: connect: connection refused" Apr 22 20:21:48.932566 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:48.932506 2575 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" podUID="b08cb87a-d147-4183-b35c-6cffb7b46ba4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8001/health\": dial tcp 10.132.0.39:8001: connect: connection refused" Apr 22 20:21:48.942048 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:48.942014 2575 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" podUID="67bd64cc-880a-4f89-90ba-c8d803f5850a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.40:8000/health\": dial tcp 10.132.0.40:8000: connect: connection refused" Apr 22 20:21:58.932439 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:58.932381 2575 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" podUID="b08cb87a-d147-4183-b35c-6cffb7b46ba4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8001/health\": dial tcp 10.132.0.39:8001: connect: connection refused" Apr 22 20:21:58.942191 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:21:58.942158 2575 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" podUID="67bd64cc-880a-4f89-90ba-c8d803f5850a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.40:8000/health\": dial tcp 10.132.0.40:8000: connect: connection refused" Apr 22 20:22:08.932697 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:22:08.932642 2575 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" podUID="b08cb87a-d147-4183-b35c-6cffb7b46ba4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8001/health\": dial tcp 10.132.0.39:8001: connect: connection refused" Apr 22 20:22:08.941768 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:22:08.941729 2575 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" podUID="67bd64cc-880a-4f89-90ba-c8d803f5850a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.40:8000/health\": dial tcp 10.132.0.40:8000: connect: connection refused" Apr 22 20:22:18.932446 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:22:18.932393 2575 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" podUID="b08cb87a-d147-4183-b35c-6cffb7b46ba4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8001/health\": dial tcp 10.132.0.39:8001: connect: connection refused" Apr 22 20:22:18.942056 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:22:18.942013 2575 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" podUID="67bd64cc-880a-4f89-90ba-c8d803f5850a" containerName="main" probeResult="failure" output="Get \"https://10.132.0.40:8000/health\": dial tcp 10.132.0.40:8000: connect: connection refused" Apr 22 20:22:28.932486 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:22:28.932437 2575 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" podUID="b08cb87a-d147-4183-b35c-6cffb7b46ba4" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8001/health\": dial tcp 10.132.0.39:8001: connect: connection refused" Apr 22 20:22:28.951939 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:22:28.951911 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" Apr 22 20:22:28.959783 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:22:28.959748 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" Apr 22 20:22:38.947113 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:22:38.947084 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" Apr 22 20:22:38.961298 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:22:38.961275 2575 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" Apr 22 20:22:43.734193 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:22:43.734165 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vfhht_338d333b-8626-46b3-b450-fbdd521183d8/console-operator/1.log" Apr 22 20:22:43.736596 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:22:43.736568 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vfhht_338d333b-8626-46b3-b450-fbdd521183d8/console-operator/1.log" Apr 22 20:22:50.013777 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:22:50.013738 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw"] Apr 22 20:22:50.014225 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:22:50.014061 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" podUID="67bd64cc-880a-4f89-90ba-c8d803f5850a" containerName="main" containerID="cri-o://554d8dbe6914c6bc1b3c62f0c26ad44b24c25cb4894c379fc86087b66d8fa263" gracePeriod=30 Apr 22 20:22:50.019273 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:22:50.019249 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6"] Apr 22 20:22:50.019588 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:22:50.019565 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" podUID="b08cb87a-d147-4183-b35c-6cffb7b46ba4" containerName="main" containerID="cri-o://be029aa9b76c22357b67117dd5eb526b3c8a4dcc62f8d33a4218ba17bbfdfeeb" gracePeriod=30 Apr 22 20:23:20.019864 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.019779 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" podUID="b08cb87a-d147-4183-b35c-6cffb7b46ba4" containerName="llm-d-routing-sidecar" containerID="cri-o://83b396a0b91174a14ad8701bed4a6f15edd9105ebd948cfac19df626a7499f38" gracePeriod=2 Apr 22 20:23:20.451067 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.451035 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6_b08cb87a-d147-4183-b35c-6cffb7b46ba4/main/0.log" Apr 22 20:23:20.451741 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.451722 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" Apr 22 20:23:20.454280 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.454266 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" Apr 22 20:23:20.543602 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.543514 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b08cb87a-d147-4183-b35c-6cffb7b46ba4-kserve-provision-location\") pod \"b08cb87a-d147-4183-b35c-6cffb7b46ba4\" (UID: \"b08cb87a-d147-4183-b35c-6cffb7b46ba4\") " Apr 22 20:23:20.543602 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.543574 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/67bd64cc-880a-4f89-90ba-c8d803f5850a-model-cache\") pod \"67bd64cc-880a-4f89-90ba-c8d803f5850a\" (UID: \"67bd64cc-880a-4f89-90ba-c8d803f5850a\") " Apr 22 20:23:20.543602 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.543596 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b08cb87a-d147-4183-b35c-6cffb7b46ba4-home\") pod \"b08cb87a-d147-4183-b35c-6cffb7b46ba4\" (UID: \"b08cb87a-d147-4183-b35c-6cffb7b46ba4\") " Apr 22 20:23:20.543908 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.543633 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/67bd64cc-880a-4f89-90ba-c8d803f5850a-kserve-provision-location\") pod \"67bd64cc-880a-4f89-90ba-c8d803f5850a\" (UID: \"67bd64cc-880a-4f89-90ba-c8d803f5850a\") " Apr 22 20:23:20.543908 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.543657 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/67bd64cc-880a-4f89-90ba-c8d803f5850a-dshm\") pod \"67bd64cc-880a-4f89-90ba-c8d803f5850a\" (UID: \"67bd64cc-880a-4f89-90ba-c8d803f5850a\") " Apr 22 20:23:20.543908 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.543687 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9fmk\" (UniqueName: \"kubernetes.io/projected/b08cb87a-d147-4183-b35c-6cffb7b46ba4-kube-api-access-x9fmk\") pod \"b08cb87a-d147-4183-b35c-6cffb7b46ba4\" (UID: \"b08cb87a-d147-4183-b35c-6cffb7b46ba4\") " Apr 22 20:23:20.543908 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.543713 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b08cb87a-d147-4183-b35c-6cffb7b46ba4-tls-certs\") pod \"b08cb87a-d147-4183-b35c-6cffb7b46ba4\" (UID: \"b08cb87a-d147-4183-b35c-6cffb7b46ba4\") " Apr 22 20:23:20.543908 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.543756 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b08cb87a-d147-4183-b35c-6cffb7b46ba4-dshm\") pod \"b08cb87a-d147-4183-b35c-6cffb7b46ba4\" (UID: \"b08cb87a-d147-4183-b35c-6cffb7b46ba4\") " Apr 22 20:23:20.543908 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.543809 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b08cb87a-d147-4183-b35c-6cffb7b46ba4-model-cache\") pod \"b08cb87a-d147-4183-b35c-6cffb7b46ba4\" (UID: \"b08cb87a-d147-4183-b35c-6cffb7b46ba4\") " Apr 22 20:23:20.543908 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.543833 2575 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-7zb9g\" (UniqueName: \"kubernetes.io/projected/67bd64cc-880a-4f89-90ba-c8d803f5850a-kube-api-access-7zb9g\") pod \"67bd64cc-880a-4f89-90ba-c8d803f5850a\" (UID: \"67bd64cc-880a-4f89-90ba-c8d803f5850a\") " Apr 22 20:23:20.543908 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.543861 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/67bd64cc-880a-4f89-90ba-c8d803f5850a-tls-certs\") pod \"67bd64cc-880a-4f89-90ba-c8d803f5850a\" (UID: \"67bd64cc-880a-4f89-90ba-c8d803f5850a\") " Apr 22 20:23:20.543908 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.543885 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/67bd64cc-880a-4f89-90ba-c8d803f5850a-home\") pod \"67bd64cc-880a-4f89-90ba-c8d803f5850a\" (UID: \"67bd64cc-880a-4f89-90ba-c8d803f5850a\") " Apr 22 20:23:20.543908 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.543903 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67bd64cc-880a-4f89-90ba-c8d803f5850a-model-cache" (OuterVolumeSpecName: "model-cache") pod "67bd64cc-880a-4f89-90ba-c8d803f5850a" (UID: "67bd64cc-880a-4f89-90ba-c8d803f5850a"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:23:20.544450 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.544124 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b08cb87a-d147-4183-b35c-6cffb7b46ba4-model-cache" (OuterVolumeSpecName: "model-cache") pod "b08cb87a-d147-4183-b35c-6cffb7b46ba4" (UID: "b08cb87a-d147-4183-b35c-6cffb7b46ba4"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:23:20.544450 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.544130 2575 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/67bd64cc-880a-4f89-90ba-c8d803f5850a-model-cache\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:23:20.544450 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.544402 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b08cb87a-d147-4183-b35c-6cffb7b46ba4-home" (OuterVolumeSpecName: "home") pod "b08cb87a-d147-4183-b35c-6cffb7b46ba4" (UID: "b08cb87a-d147-4183-b35c-6cffb7b46ba4"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:23:20.544612 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.544530 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67bd64cc-880a-4f89-90ba-c8d803f5850a-home" (OuterVolumeSpecName: "home") pod "67bd64cc-880a-4f89-90ba-c8d803f5850a" (UID: "67bd64cc-880a-4f89-90ba-c8d803f5850a"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:23:20.546298 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.546257 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b08cb87a-d147-4183-b35c-6cffb7b46ba4-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b08cb87a-d147-4183-b35c-6cffb7b46ba4" (UID: "b08cb87a-d147-4183-b35c-6cffb7b46ba4"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:23:20.546298 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.546269 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67bd64cc-880a-4f89-90ba-c8d803f5850a-dshm" (OuterVolumeSpecName: "dshm") pod "67bd64cc-880a-4f89-90ba-c8d803f5850a" (UID: "67bd64cc-880a-4f89-90ba-c8d803f5850a"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:23:20.547353 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.546644 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b08cb87a-d147-4183-b35c-6cffb7b46ba4-dshm" (OuterVolumeSpecName: "dshm") pod "b08cb87a-d147-4183-b35c-6cffb7b46ba4" (UID: "b08cb87a-d147-4183-b35c-6cffb7b46ba4"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:23:20.547353 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.546750 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67bd64cc-880a-4f89-90ba-c8d803f5850a-kube-api-access-7zb9g" (OuterVolumeSpecName: "kube-api-access-7zb9g") pod "67bd64cc-880a-4f89-90ba-c8d803f5850a" (UID: "67bd64cc-880a-4f89-90ba-c8d803f5850a"). InnerVolumeSpecName "kube-api-access-7zb9g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:23:20.547353 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.546877 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b08cb87a-d147-4183-b35c-6cffb7b46ba4-kube-api-access-x9fmk" (OuterVolumeSpecName: "kube-api-access-x9fmk") pod "b08cb87a-d147-4183-b35c-6cffb7b46ba4" (UID: "b08cb87a-d147-4183-b35c-6cffb7b46ba4"). InnerVolumeSpecName "kube-api-access-x9fmk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:23:20.547353 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.546925 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67bd64cc-880a-4f89-90ba-c8d803f5850a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "67bd64cc-880a-4f89-90ba-c8d803f5850a" (UID: "67bd64cc-880a-4f89-90ba-c8d803f5850a"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:23:20.604955 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.604911 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b08cb87a-d147-4183-b35c-6cffb7b46ba4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b08cb87a-d147-4183-b35c-6cffb7b46ba4" (UID: "b08cb87a-d147-4183-b35c-6cffb7b46ba4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:23:20.606290 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.606262 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67bd64cc-880a-4f89-90ba-c8d803f5850a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "67bd64cc-880a-4f89-90ba-c8d803f5850a" (UID: "67bd64cc-880a-4f89-90ba-c8d803f5850a"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:23:20.644827 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.644793 2575 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b08cb87a-d147-4183-b35c-6cffb7b46ba4-dshm\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:23:20.644827 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.644821 2575 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b08cb87a-d147-4183-b35c-6cffb7b46ba4-model-cache\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:23:20.644991 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.644835 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7zb9g\" (UniqueName: \"kubernetes.io/projected/67bd64cc-880a-4f89-90ba-c8d803f5850a-kube-api-access-7zb9g\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:23:20.644991 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.644845 2575 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/67bd64cc-880a-4f89-90ba-c8d803f5850a-tls-certs\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:23:20.644991 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.644855 2575 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/67bd64cc-880a-4f89-90ba-c8d803f5850a-home\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:23:20.644991 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.644864 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b08cb87a-d147-4183-b35c-6cffb7b46ba4-kserve-provision-location\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:23:20.644991 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.644872 2575 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b08cb87a-d147-4183-b35c-6cffb7b46ba4-home\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:23:20.644991 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.644880 2575 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/67bd64cc-880a-4f89-90ba-c8d803f5850a-kserve-provision-location\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:23:20.644991 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.644888 2575 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/67bd64cc-880a-4f89-90ba-c8d803f5850a-dshm\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:23:20.644991 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.644897 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x9fmk\" (UniqueName: \"kubernetes.io/projected/b08cb87a-d147-4183-b35c-6cffb7b46ba4-kube-api-access-x9fmk\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:23:20.644991 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.644905 2575 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b08cb87a-d147-4183-b35c-6cffb7b46ba4-tls-certs\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:23:20.807569 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.807479 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6_b08cb87a-d147-4183-b35c-6cffb7b46ba4/main/0.log" Apr 22 20:23:20.808090 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.808068 2575 generic.go:358] "Generic (PLEG): container finished" podID="b08cb87a-d147-4183-b35c-6cffb7b46ba4" containerID="be029aa9b76c22357b67117dd5eb526b3c8a4dcc62f8d33a4218ba17bbfdfeeb" exitCode=137 Apr 22 20:23:20.808090 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.808089 2575 generic.go:358] "Generic (PLEG): container finished" podID="b08cb87a-d147-4183-b35c-6cffb7b46ba4" containerID="83b396a0b91174a14ad8701bed4a6f15edd9105ebd948cfac19df626a7499f38" exitCode=0 Apr 22 20:23:20.808231 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.808140 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" Apr 22 20:23:20.808231 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.808158 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" event={"ID":"b08cb87a-d147-4183-b35c-6cffb7b46ba4","Type":"ContainerDied","Data":"be029aa9b76c22357b67117dd5eb526b3c8a4dcc62f8d33a4218ba17bbfdfeeb"} Apr 22 20:23:20.808231 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.808193 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" event={"ID":"b08cb87a-d147-4183-b35c-6cffb7b46ba4","Type":"ContainerDied","Data":"83b396a0b91174a14ad8701bed4a6f15edd9105ebd948cfac19df626a7499f38"} Apr 22 20:23:20.808231 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.808210 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6" event={"ID":"b08cb87a-d147-4183-b35c-6cffb7b46ba4","Type":"ContainerDied","Data":"ae2efc2bc2653526bb80b598639b46c260a69c1fa10008b74643b784f172d76b"} Apr 22 20:23:20.808231 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.808229 2575 scope.go:117] "RemoveContainer" containerID="be029aa9b76c22357b67117dd5eb526b3c8a4dcc62f8d33a4218ba17bbfdfeeb" Apr 22 20:23:20.809743 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.809711 2575 generic.go:358] "Generic (PLEG): container finished" podID="67bd64cc-880a-4f89-90ba-c8d803f5850a" containerID="554d8dbe6914c6bc1b3c62f0c26ad44b24c25cb4894c379fc86087b66d8fa263" exitCode=137 Apr 22 20:23:20.809863 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.809749 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" event={"ID":"67bd64cc-880a-4f89-90ba-c8d803f5850a","Type":"ContainerDied","Data":"554d8dbe6914c6bc1b3c62f0c26ad44b24c25cb4894c379fc86087b66d8fa263"} Apr 22 20:23:20.809863 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.809773 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" event={"ID":"67bd64cc-880a-4f89-90ba-c8d803f5850a","Type":"ContainerDied","Data":"eb4b6eef0aaabe64bfad6d20dadb693db306f366c93a851a15758a03fa8c60dc"} Apr 22 20:23:20.809863 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.809795 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw" Apr 22 20:23:20.829368 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.829348 2575 scope.go:117] "RemoveContainer" containerID="47c89d8b95894f6ee482f04ff749d38188394f457565d82d252edaf2209d9dd1" Apr 22 20:23:20.834393 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.834369 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6"] Apr 22 20:23:20.837521 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.837500 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-6b6978488d-7z2x6"] Apr 22 20:23:20.847374 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.847350 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw"] Apr 22 20:23:20.852219 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.852187 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-7584c7665-gs4hw"] Apr 22 20:23:20.892230 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.892208 2575 scope.go:117] "RemoveContainer" containerID="83b396a0b91174a14ad8701bed4a6f15edd9105ebd948cfac19df626a7499f38" Apr 22 20:23:20.899428 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.899403 2575 scope.go:117] "RemoveContainer" containerID="be029aa9b76c22357b67117dd5eb526b3c8a4dcc62f8d33a4218ba17bbfdfeeb" Apr 22 20:23:20.899666 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:23:20.899648 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be029aa9b76c22357b67117dd5eb526b3c8a4dcc62f8d33a4218ba17bbfdfeeb\": container with ID starting with be029aa9b76c22357b67117dd5eb526b3c8a4dcc62f8d33a4218ba17bbfdfeeb not found: ID does not exist" containerID="be029aa9b76c22357b67117dd5eb526b3c8a4dcc62f8d33a4218ba17bbfdfeeb" Apr 22 20:23:20.899731 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.899677 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be029aa9b76c22357b67117dd5eb526b3c8a4dcc62f8d33a4218ba17bbfdfeeb"} err="failed to get container status \"be029aa9b76c22357b67117dd5eb526b3c8a4dcc62f8d33a4218ba17bbfdfeeb\": rpc error: code = NotFound desc = could not find container \"be029aa9b76c22357b67117dd5eb526b3c8a4dcc62f8d33a4218ba17bbfdfeeb\": container with ID starting with be029aa9b76c22357b67117dd5eb526b3c8a4dcc62f8d33a4218ba17bbfdfeeb not found: ID does not exist" Apr 22 20:23:20.899731 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.899702 2575 scope.go:117] "RemoveContainer" containerID="47c89d8b95894f6ee482f04ff749d38188394f457565d82d252edaf2209d9dd1" Apr 22 20:23:20.899939 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:23:20.899920 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47c89d8b95894f6ee482f04ff749d38188394f457565d82d252edaf2209d9dd1\": container with ID starting with 47c89d8b95894f6ee482f04ff749d38188394f457565d82d252edaf2209d9dd1 not found: ID does not exist" containerID="47c89d8b95894f6ee482f04ff749d38188394f457565d82d252edaf2209d9dd1" Apr 22 20:23:20.899987 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.899946 2575 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"47c89d8b95894f6ee482f04ff749d38188394f457565d82d252edaf2209d9dd1"} err="failed to get container status \"47c89d8b95894f6ee482f04ff749d38188394f457565d82d252edaf2209d9dd1\": rpc error: code = NotFound desc = could not find container \"47c89d8b95894f6ee482f04ff749d38188394f457565d82d252edaf2209d9dd1\": container with ID starting with 47c89d8b95894f6ee482f04ff749d38188394f457565d82d252edaf2209d9dd1 not found: ID does not exist" Apr 22 20:23:20.899987 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.899966 2575 scope.go:117] "RemoveContainer" containerID="83b396a0b91174a14ad8701bed4a6f15edd9105ebd948cfac19df626a7499f38" Apr 22 20:23:20.900174 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:23:20.900158 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83b396a0b91174a14ad8701bed4a6f15edd9105ebd948cfac19df626a7499f38\": container with ID starting with 83b396a0b91174a14ad8701bed4a6f15edd9105ebd948cfac19df626a7499f38 not found: ID does not exist" containerID="83b396a0b91174a14ad8701bed4a6f15edd9105ebd948cfac19df626a7499f38" Apr 22 20:23:20.900214 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.900179 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83b396a0b91174a14ad8701bed4a6f15edd9105ebd948cfac19df626a7499f38"} err="failed to get container status \"83b396a0b91174a14ad8701bed4a6f15edd9105ebd948cfac19df626a7499f38\": rpc error: code = NotFound desc = could not find container \"83b396a0b91174a14ad8701bed4a6f15edd9105ebd948cfac19df626a7499f38\": container with ID starting with 83b396a0b91174a14ad8701bed4a6f15edd9105ebd948cfac19df626a7499f38 not found: ID does not exist" Apr 22 20:23:20.900214 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.900192 2575 scope.go:117] "RemoveContainer" containerID="be029aa9b76c22357b67117dd5eb526b3c8a4dcc62f8d33a4218ba17bbfdfeeb" Apr 22 20:23:20.900471 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.900438 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be029aa9b76c22357b67117dd5eb526b3c8a4dcc62f8d33a4218ba17bbfdfeeb"} err="failed to get container status \"be029aa9b76c22357b67117dd5eb526b3c8a4dcc62f8d33a4218ba17bbfdfeeb\": rpc error: code = NotFound desc = could not find container \"be029aa9b76c22357b67117dd5eb526b3c8a4dcc62f8d33a4218ba17bbfdfeeb\": container with ID starting with be029aa9b76c22357b67117dd5eb526b3c8a4dcc62f8d33a4218ba17bbfdfeeb not found: ID does not exist" Apr 22 20:23:20.900522 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.900472 2575 scope.go:117] "RemoveContainer" containerID="47c89d8b95894f6ee482f04ff749d38188394f457565d82d252edaf2209d9dd1" Apr 22 20:23:20.900665 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.900650 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47c89d8b95894f6ee482f04ff749d38188394f457565d82d252edaf2209d9dd1"} err="failed to get container status \"47c89d8b95894f6ee482f04ff749d38188394f457565d82d252edaf2209d9dd1\": rpc error: code = NotFound desc = could not find container \"47c89d8b95894f6ee482f04ff749d38188394f457565d82d252edaf2209d9dd1\": container with ID starting with 47c89d8b95894f6ee482f04ff749d38188394f457565d82d252edaf2209d9dd1 not found: ID does not exist" Apr 22 20:23:20.900710 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.900666 2575 scope.go:117] "RemoveContainer" 
containerID="83b396a0b91174a14ad8701bed4a6f15edd9105ebd948cfac19df626a7499f38" Apr 22 20:23:20.900855 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.900839 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83b396a0b91174a14ad8701bed4a6f15edd9105ebd948cfac19df626a7499f38"} err="failed to get container status \"83b396a0b91174a14ad8701bed4a6f15edd9105ebd948cfac19df626a7499f38\": rpc error: code = NotFound desc = could not find container \"83b396a0b91174a14ad8701bed4a6f15edd9105ebd948cfac19df626a7499f38\": container with ID starting with 83b396a0b91174a14ad8701bed4a6f15edd9105ebd948cfac19df626a7499f38 not found: ID does not exist" Apr 22 20:23:20.900903 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.900855 2575 scope.go:117] "RemoveContainer" containerID="554d8dbe6914c6bc1b3c62f0c26ad44b24c25cb4894c379fc86087b66d8fa263" Apr 22 20:23:20.918136 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.918121 2575 scope.go:117] "RemoveContainer" containerID="a8847aa45d3e6a7c1e0902b244d105a08a91301d6a14852f92702e9bc95c886b" Apr 22 20:23:20.978082 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.978026 2575 scope.go:117] "RemoveContainer" containerID="554d8dbe6914c6bc1b3c62f0c26ad44b24c25cb4894c379fc86087b66d8fa263" Apr 22 20:23:20.978362 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:23:20.978339 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"554d8dbe6914c6bc1b3c62f0c26ad44b24c25cb4894c379fc86087b66d8fa263\": container with ID starting with 554d8dbe6914c6bc1b3c62f0c26ad44b24c25cb4894c379fc86087b66d8fa263 not found: ID does not exist" containerID="554d8dbe6914c6bc1b3c62f0c26ad44b24c25cb4894c379fc86087b66d8fa263" Apr 22 20:23:20.978425 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.978371 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"554d8dbe6914c6bc1b3c62f0c26ad44b24c25cb4894c379fc86087b66d8fa263"} err="failed to get container status \"554d8dbe6914c6bc1b3c62f0c26ad44b24c25cb4894c379fc86087b66d8fa263\": rpc error: code = NotFound desc = could not find container \"554d8dbe6914c6bc1b3c62f0c26ad44b24c25cb4894c379fc86087b66d8fa263\": container with ID starting with 554d8dbe6914c6bc1b3c62f0c26ad44b24c25cb4894c379fc86087b66d8fa263 not found: ID does not exist" Apr 22 20:23:20.978425 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.978392 2575 scope.go:117] "RemoveContainer" containerID="a8847aa45d3e6a7c1e0902b244d105a08a91301d6a14852f92702e9bc95c886b" Apr 22 20:23:20.978679 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:23:20.978661 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8847aa45d3e6a7c1e0902b244d105a08a91301d6a14852f92702e9bc95c886b\": container with ID starting with a8847aa45d3e6a7c1e0902b244d105a08a91301d6a14852f92702e9bc95c886b not found: ID does not exist" containerID="a8847aa45d3e6a7c1e0902b244d105a08a91301d6a14852f92702e9bc95c886b" Apr 22 20:23:20.978719 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:20.978686 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8847aa45d3e6a7c1e0902b244d105a08a91301d6a14852f92702e9bc95c886b"} err="failed to get container status \"a8847aa45d3e6a7c1e0902b244d105a08a91301d6a14852f92702e9bc95c886b\": rpc error: code = NotFound desc = could not find container \"a8847aa45d3e6a7c1e0902b244d105a08a91301d6a14852f92702e9bc95c886b\": container with 
ID starting with a8847aa45d3e6a7c1e0902b244d105a08a91301d6a14852f92702e9bc95c886b not found: ID does not exist" Apr 22 20:23:21.730814 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:21.730778 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67bd64cc-880a-4f89-90ba-c8d803f5850a" path="/var/lib/kubelet/pods/67bd64cc-880a-4f89-90ba-c8d803f5850a/volumes" Apr 22 20:23:21.731211 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:23:21.731189 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b08cb87a-d147-4183-b35c-6cffb7b46ba4" path="/var/lib/kubelet/pods/b08cb87a-d147-4183-b35c-6cffb7b46ba4/volumes" Apr 22 20:25:44.448091 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:44.448053 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6mnvm/must-gather-5mrr9"] Apr 22 20:25:44.448597 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:44.448351 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b08cb87a-d147-4183-b35c-6cffb7b46ba4" containerName="storage-initializer" Apr 22 20:25:44.448597 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:44.448365 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b08cb87a-d147-4183-b35c-6cffb7b46ba4" containerName="storage-initializer" Apr 22 20:25:44.448597 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:44.448376 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="67bd64cc-880a-4f89-90ba-c8d803f5850a" containerName="storage-initializer" Apr 22 20:25:44.448597 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:44.448383 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="67bd64cc-880a-4f89-90ba-c8d803f5850a" containerName="storage-initializer" Apr 22 20:25:44.448597 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:44.448391 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b08cb87a-d147-4183-b35c-6cffb7b46ba4" containerName="llm-d-routing-sidecar" Apr 22 20:25:44.448597 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:44.448397 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b08cb87a-d147-4183-b35c-6cffb7b46ba4" containerName="llm-d-routing-sidecar" Apr 22 20:25:44.448597 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:44.448407 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b08cb87a-d147-4183-b35c-6cffb7b46ba4" containerName="main" Apr 22 20:25:44.448597 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:44.448412 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b08cb87a-d147-4183-b35c-6cffb7b46ba4" containerName="main" Apr 22 20:25:44.448597 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:44.448424 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="08565dad-48f1-4088-aed7-29e9df70f4d4" containerName="storage-initializer" Apr 22 20:25:44.448597 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:44.448428 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="08565dad-48f1-4088-aed7-29e9df70f4d4" containerName="storage-initializer" Apr 22 20:25:44.448597 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:44.448434 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="08565dad-48f1-4088-aed7-29e9df70f4d4" containerName="main" Apr 22 20:25:44.448597 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:44.448439 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="08565dad-48f1-4088-aed7-29e9df70f4d4" containerName="main" Apr 22 20:25:44.448597 ip-10-0-133-60 
kubenswrapper[2575]: I0422 20:25:44.448448 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="67bd64cc-880a-4f89-90ba-c8d803f5850a" containerName="main" Apr 22 20:25:44.448597 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:44.448453 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="67bd64cc-880a-4f89-90ba-c8d803f5850a" containerName="main" Apr 22 20:25:44.448597 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:44.448499 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="67bd64cc-880a-4f89-90ba-c8d803f5850a" containerName="main" Apr 22 20:25:44.448597 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:44.448505 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="b08cb87a-d147-4183-b35c-6cffb7b46ba4" containerName="llm-d-routing-sidecar" Apr 22 20:25:44.448597 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:44.448513 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="08565dad-48f1-4088-aed7-29e9df70f4d4" containerName="main" Apr 22 20:25:44.448597 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:44.448520 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="b08cb87a-d147-4183-b35c-6cffb7b46ba4" containerName="main" Apr 22 20:25:44.451262 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:44.451247 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6mnvm/must-gather-5mrr9" Apr 22 20:25:44.453558 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:44.453537 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6mnvm\"/\"kube-root-ca.crt\"" Apr 22 20:25:44.454394 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:44.454376 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-6mnvm\"/\"default-dockercfg-5nh74\"" Apr 22 20:25:44.454462 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:44.454425 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6mnvm\"/\"openshift-service-ca.crt\"" Apr 22 20:25:44.459423 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:44.459398 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6mnvm/must-gather-5mrr9"] Apr 22 20:25:44.638015 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:44.637971 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2-must-gather-output\") pod \"must-gather-5mrr9\" (UID: \"dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2\") " pod="openshift-must-gather-6mnvm/must-gather-5mrr9" Apr 22 20:25:44.638206 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:44.638077 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsg99\" (UniqueName: \"kubernetes.io/projected/dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2-kube-api-access-wsg99\") pod \"must-gather-5mrr9\" (UID: \"dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2\") " pod="openshift-must-gather-6mnvm/must-gather-5mrr9" Apr 22 20:25:44.738759 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:44.738666 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2-must-gather-output\") pod \"must-gather-5mrr9\" (UID: \"dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2\") " 
pod="openshift-must-gather-6mnvm/must-gather-5mrr9" Apr 22 20:25:44.738759 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:44.738732 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wsg99\" (UniqueName: \"kubernetes.io/projected/dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2-kube-api-access-wsg99\") pod \"must-gather-5mrr9\" (UID: \"dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2\") " pod="openshift-must-gather-6mnvm/must-gather-5mrr9" Apr 22 20:25:44.739089 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:44.739065 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2-must-gather-output\") pod \"must-gather-5mrr9\" (UID: \"dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2\") " pod="openshift-must-gather-6mnvm/must-gather-5mrr9" Apr 22 20:25:44.746304 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:44.746276 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsg99\" (UniqueName: \"kubernetes.io/projected/dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2-kube-api-access-wsg99\") pod \"must-gather-5mrr9\" (UID: \"dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2\") " pod="openshift-must-gather-6mnvm/must-gather-5mrr9" Apr 22 20:25:44.760175 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:44.760147 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6mnvm/must-gather-5mrr9" Apr 22 20:25:44.880557 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:44.880522 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6mnvm/must-gather-5mrr9"] Apr 22 20:25:44.883709 ip-10-0-133-60 kubenswrapper[2575]: W0422 20:25:44.883678 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd3b514c_ec9e_44b3_99c2_6dc2d2c896b2.slice/crio-1e58cb75204d2245d4a59e2bca41f41fb6cc352e41805fbd67651feb111879a1 WatchSource:0}: Error finding container 1e58cb75204d2245d4a59e2bca41f41fb6cc352e41805fbd67651feb111879a1: Status 404 returned error can't find the container with id 1e58cb75204d2245d4a59e2bca41f41fb6cc352e41805fbd67651feb111879a1 Apr 22 20:25:44.885490 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:44.885469 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:25:45.238296 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:45.238261 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6mnvm/must-gather-5mrr9" event={"ID":"dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2","Type":"ContainerStarted","Data":"1e58cb75204d2245d4a59e2bca41f41fb6cc352e41805fbd67651feb111879a1"} Apr 22 20:25:49.253636 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:49.253600 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6mnvm/must-gather-5mrr9" event={"ID":"dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2","Type":"ContainerStarted","Data":"10cce1361ed6dd48fd1b9814430a9938efacd2170b94064b7648e098370ef6e0"} Apr 22 20:25:49.253636 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:25:49.253640 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6mnvm/must-gather-5mrr9" event={"ID":"dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2","Type":"ContainerStarted","Data":"9944a3b2e598e1d4e16d571fcfbbe02865fa23ed783d1932da4088ed2f392717"} Apr 22 20:26:12.138987 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:12.138908 2575 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-ingress_router-default-5ff55498bb-hwmkq_9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9/router/0.log" Apr 22 20:26:12.935248 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:12.935216 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5ff55498bb-hwmkq_9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9/router/0.log" Apr 22 20:26:13.693023 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:13.692991 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-8xzwg_041e72fc-4d39-42df-a5e4-ebae82c88495/authorino/0.log" Apr 22 20:26:13.759029 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:13.759005 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-k62gr_aa225954-d141-4b64-9c9d-08f387ce34e0/kuadrant-console-plugin/0.log" Apr 22 20:26:15.340176 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:15.340145 2575 generic.go:358] "Generic (PLEG): container finished" podID="dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2" containerID="9944a3b2e598e1d4e16d571fcfbbe02865fa23ed783d1932da4088ed2f392717" exitCode=0 Apr 22 20:26:15.340625 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:15.340221 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6mnvm/must-gather-5mrr9" event={"ID":"dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2","Type":"ContainerDied","Data":"9944a3b2e598e1d4e16d571fcfbbe02865fa23ed783d1932da4088ed2f392717"} Apr 22 20:26:15.340625 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:15.340579 2575 scope.go:117] "RemoveContainer" containerID="9944a3b2e598e1d4e16d571fcfbbe02865fa23ed783d1932da4088ed2f392717" Apr 22 20:26:15.524331 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:15.524289 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6mnvm_must-gather-5mrr9_dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2/gather/0.log" Apr 22 20:26:16.126998 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:16.126965 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7cxtv/must-gather-dpwtm"] Apr 22 20:26:16.131529 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:16.131499 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7cxtv/must-gather-dpwtm" Apr 22 20:26:16.133798 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:16.133776 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7cxtv\"/\"kube-root-ca.crt\"" Apr 22 20:26:16.134549 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:16.134532 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-7cxtv\"/\"default-dockercfg-phjf8\"" Apr 22 20:26:16.134619 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:16.134548 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7cxtv\"/\"openshift-service-ca.crt\"" Apr 22 20:26:16.136993 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:16.136971 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7cxtv/must-gather-dpwtm"] Apr 22 20:26:16.211497 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:16.211462 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c84c17c1-625c-42c9-a961-77336e35cedb-must-gather-output\") pod \"must-gather-dpwtm\" (UID: \"c84c17c1-625c-42c9-a961-77336e35cedb\") " pod="openshift-must-gather-7cxtv/must-gather-dpwtm" Apr 22 20:26:16.211497 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:16.211503 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n2vd\" (UniqueName: \"kubernetes.io/projected/c84c17c1-625c-42c9-a961-77336e35cedb-kube-api-access-7n2vd\") pod \"must-gather-dpwtm\" (UID: \"c84c17c1-625c-42c9-a961-77336e35cedb\") " pod="openshift-must-gather-7cxtv/must-gather-dpwtm" Apr 22 20:26:16.312224 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:16.312185 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7n2vd\" (UniqueName: \"kubernetes.io/projected/c84c17c1-625c-42c9-a961-77336e35cedb-kube-api-access-7n2vd\") pod \"must-gather-dpwtm\" (UID: \"c84c17c1-625c-42c9-a961-77336e35cedb\") " pod="openshift-must-gather-7cxtv/must-gather-dpwtm" Apr 22 20:26:16.312444 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:16.312278 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c84c17c1-625c-42c9-a961-77336e35cedb-must-gather-output\") pod \"must-gather-dpwtm\" (UID: \"c84c17c1-625c-42c9-a961-77336e35cedb\") " pod="openshift-must-gather-7cxtv/must-gather-dpwtm" Apr 22 20:26:16.312578 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:16.312561 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c84c17c1-625c-42c9-a961-77336e35cedb-must-gather-output\") pod \"must-gather-dpwtm\" (UID: \"c84c17c1-625c-42c9-a961-77336e35cedb\") " pod="openshift-must-gather-7cxtv/must-gather-dpwtm" Apr 22 20:26:16.324680 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:16.324649 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n2vd\" (UniqueName: \"kubernetes.io/projected/c84c17c1-625c-42c9-a961-77336e35cedb-kube-api-access-7n2vd\") pod \"must-gather-dpwtm\" (UID: \"c84c17c1-625c-42c9-a961-77336e35cedb\") " pod="openshift-must-gather-7cxtv/must-gather-dpwtm" Apr 22 20:26:16.442235 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:16.442152 2575 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-must-gather-7cxtv/must-gather-dpwtm" Apr 22 20:26:16.564390 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:16.564358 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7cxtv/must-gather-dpwtm"] Apr 22 20:26:16.567483 ip-10-0-133-60 kubenswrapper[2575]: W0422 20:26:16.567450 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc84c17c1_625c_42c9_a961_77336e35cedb.slice/crio-3684926400fb03e13f579e265d83572faef767e06674c69306b2779858ce518c WatchSource:0}: Error finding container 3684926400fb03e13f579e265d83572faef767e06674c69306b2779858ce518c: Status 404 returned error can't find the container with id 3684926400fb03e13f579e265d83572faef767e06674c69306b2779858ce518c Apr 22 20:26:17.347430 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:17.347388 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7cxtv/must-gather-dpwtm" event={"ID":"c84c17c1-625c-42c9-a961-77336e35cedb","Type":"ContainerStarted","Data":"3684926400fb03e13f579e265d83572faef767e06674c69306b2779858ce518c"} Apr 22 20:26:18.353367 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:18.353306 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7cxtv/must-gather-dpwtm" event={"ID":"c84c17c1-625c-42c9-a961-77336e35cedb","Type":"ContainerStarted","Data":"fabe575a076171c7f1ddada5b4c1023dbcb04d6a79a05c04c07064b605cdb31b"} Apr 22 20:26:18.353367 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:18.353376 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7cxtv/must-gather-dpwtm" event={"ID":"c84c17c1-625c-42c9-a961-77336e35cedb","Type":"ContainerStarted","Data":"0a46151c7ac31ab11851406a5275b8190fde85f8b3fc65a3536a005ec845fab7"} Apr 22 20:26:18.370288 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:18.370213 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7cxtv/must-gather-dpwtm" podStartSLOduration=1.661461052 podStartE2EDuration="2.370192635s" podCreationTimestamp="2026-04-22 20:26:16 +0000 UTC" firstStartedPulling="2026-04-22 20:26:16.569369656 +0000 UTC m=+1713.382754814" lastFinishedPulling="2026-04-22 20:26:17.278101224 +0000 UTC m=+1714.091486397" observedRunningTime="2026-04-22 20:26:18.367035291 +0000 UTC m=+1715.180420473" watchObservedRunningTime="2026-04-22 20:26:18.370192635 +0000 UTC m=+1715.183577815" Apr 22 20:26:18.762008 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:18.761916 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-cvg2s_cba227e9-dcf5-4cf7-9c4b-83013a0b20fb/global-pull-secret-syncer/0.log" Apr 22 20:26:18.873415 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:18.873384 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-w7bfn_0961e7e6-a8cb-43cc-82a5-7a080e47aae5/konnectivity-agent/0.log" Apr 22 20:26:18.916439 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:18.916406 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-60.ec2.internal_cd407a21e2749828c73fd67e5ee13311/haproxy/0.log" Apr 22 20:26:20.981037 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:20.981001 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6mnvm/must-gather-5mrr9"] Apr 22 20:26:20.981946 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:20.981891 2575 
kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-6mnvm/must-gather-5mrr9" podUID="dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2" containerName="copy" containerID="cri-o://10cce1361ed6dd48fd1b9814430a9938efacd2170b94064b7648e098370ef6e0" gracePeriod=2 Apr 22 20:26:20.983536 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:20.982840 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6mnvm/must-gather-5mrr9"] Apr 22 20:26:20.984053 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:20.984014 2575 status_manager.go:895] "Failed to get status for pod" podUID="dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2" pod="openshift-must-gather-6mnvm/must-gather-5mrr9" err="pods \"must-gather-5mrr9\" is forbidden: User \"system:node:ip-10-0-133-60.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-6mnvm\": no relationship found between node 'ip-10-0-133-60.ec2.internal' and this object" Apr 22 20:26:21.345195 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:21.345098 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6mnvm_must-gather-5mrr9_dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2/copy/0.log" Apr 22 20:26:21.346036 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:21.345730 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6mnvm/must-gather-5mrr9" Apr 22 20:26:21.353341 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:21.352516 2575 status_manager.go:895] "Failed to get status for pod" podUID="dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2" pod="openshift-must-gather-6mnvm/must-gather-5mrr9" err="pods \"must-gather-5mrr9\" is forbidden: User \"system:node:ip-10-0-133-60.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-6mnvm\": no relationship found between node 'ip-10-0-133-60.ec2.internal' and this object" Apr 22 20:26:21.367113 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:21.367020 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6mnvm_must-gather-5mrr9_dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2/copy/0.log" Apr 22 20:26:21.368249 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:21.367662 2575 generic.go:358] "Generic (PLEG): container finished" podID="dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2" containerID="10cce1361ed6dd48fd1b9814430a9938efacd2170b94064b7648e098370ef6e0" exitCode=143 Apr 22 20:26:21.368249 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:21.367786 2575 scope.go:117] "RemoveContainer" containerID="10cce1361ed6dd48fd1b9814430a9938efacd2170b94064b7648e098370ef6e0" Apr 22 20:26:21.368249 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:21.367924 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6mnvm/must-gather-5mrr9" Apr 22 20:26:21.370040 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:21.369993 2575 status_manager.go:895] "Failed to get status for pod" podUID="dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2" pod="openshift-must-gather-6mnvm/must-gather-5mrr9" err="pods \"must-gather-5mrr9\" is forbidden: User \"system:node:ip-10-0-133-60.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-6mnvm\": no relationship found between node 'ip-10-0-133-60.ec2.internal' and this object" Apr 22 20:26:21.386548 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:21.386525 2575 scope.go:117] "RemoveContainer" containerID="9944a3b2e598e1d4e16d571fcfbbe02865fa23ed783d1932da4088ed2f392717" Apr 22 20:26:21.417020 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:21.416985 2575 scope.go:117] "RemoveContainer" containerID="10cce1361ed6dd48fd1b9814430a9938efacd2170b94064b7648e098370ef6e0" Apr 22 20:26:21.417479 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:26:21.417447 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10cce1361ed6dd48fd1b9814430a9938efacd2170b94064b7648e098370ef6e0\": container with ID starting with 10cce1361ed6dd48fd1b9814430a9938efacd2170b94064b7648e098370ef6e0 not found: ID does not exist" containerID="10cce1361ed6dd48fd1b9814430a9938efacd2170b94064b7648e098370ef6e0" Apr 22 20:26:21.417676 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:21.417490 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10cce1361ed6dd48fd1b9814430a9938efacd2170b94064b7648e098370ef6e0"} err="failed to get container status \"10cce1361ed6dd48fd1b9814430a9938efacd2170b94064b7648e098370ef6e0\": rpc error: code = NotFound desc = could not find container \"10cce1361ed6dd48fd1b9814430a9938efacd2170b94064b7648e098370ef6e0\": container with ID starting with 10cce1361ed6dd48fd1b9814430a9938efacd2170b94064b7648e098370ef6e0 not found: ID does not exist" Apr 22 20:26:21.417676 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:21.417519 2575 scope.go:117] "RemoveContainer" containerID="9944a3b2e598e1d4e16d571fcfbbe02865fa23ed783d1932da4088ed2f392717" Apr 22 20:26:21.417842 ip-10-0-133-60 kubenswrapper[2575]: E0422 20:26:21.417787 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9944a3b2e598e1d4e16d571fcfbbe02865fa23ed783d1932da4088ed2f392717\": container with ID starting with 9944a3b2e598e1d4e16d571fcfbbe02865fa23ed783d1932da4088ed2f392717 not found: ID does not exist" containerID="9944a3b2e598e1d4e16d571fcfbbe02865fa23ed783d1932da4088ed2f392717" Apr 22 20:26:21.417842 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:21.417815 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9944a3b2e598e1d4e16d571fcfbbe02865fa23ed783d1932da4088ed2f392717"} err="failed to get container status \"9944a3b2e598e1d4e16d571fcfbbe02865fa23ed783d1932da4088ed2f392717\": rpc error: code = NotFound desc = could not find container \"9944a3b2e598e1d4e16d571fcfbbe02865fa23ed783d1932da4088ed2f392717\": container with ID starting with 9944a3b2e598e1d4e16d571fcfbbe02865fa23ed783d1932da4088ed2f392717 not found: ID does not exist" Apr 22 20:26:21.458439 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:21.458384 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-wsg99\" (UniqueName: \"kubernetes.io/projected/dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2-kube-api-access-wsg99\") pod \"dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2\" (UID: \"dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2\") " Apr 22 20:26:21.458641 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:21.458480 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2-must-gather-output\") pod \"dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2\" (UID: \"dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2\") " Apr 22 20:26:21.473823 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:21.468119 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2" (UID: "dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:26:21.480275 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:21.480233 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2-kube-api-access-wsg99" (OuterVolumeSpecName: "kube-api-access-wsg99") pod "dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2" (UID: "dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2"). InnerVolumeSpecName "kube-api-access-wsg99". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:26:21.559661 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:21.559586 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wsg99\" (UniqueName: \"kubernetes.io/projected/dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2-kube-api-access-wsg99\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:26:21.559661 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:21.559630 2575 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2-must-gather-output\") on node \"ip-10-0-133-60.ec2.internal\" DevicePath \"\"" Apr 22 20:26:21.682818 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:21.682772 2575 status_manager.go:895] "Failed to get status for pod" podUID="dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2" pod="openshift-must-gather-6mnvm/must-gather-5mrr9" err="pods \"must-gather-5mrr9\" is forbidden: User \"system:node:ip-10-0-133-60.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-6mnvm\": no relationship found between node 'ip-10-0-133-60.ec2.internal' and this object" Apr 22 20:26:21.746698 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:21.746653 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2" path="/var/lib/kubelet/pods/dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2/volumes" Apr 22 20:26:23.148480 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:23.148447 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-8xzwg_041e72fc-4d39-42df-a5e4-ebae82c88495/authorino/0.log" Apr 22 20:26:23.246514 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:23.246483 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-k62gr_aa225954-d141-4b64-9c9d-08f387ce34e0/kuadrant-console-plugin/0.log" Apr 22 20:26:24.488506 ip-10-0-133-60 kubenswrapper[2575]: I0422 
20:26:24.488465 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-b8884_ab3165ee-b810-41ae-b9dc-8e3198db6bc1/cluster-monitoring-operator/0.log" Apr 22 20:26:24.726034 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:24.726002 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mmt2k_b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e/node-exporter/0.log" Apr 22 20:26:24.746789 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:24.746746 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mmt2k_b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e/kube-rbac-proxy/0.log" Apr 22 20:26:24.766146 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:24.766111 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-mmt2k_b5e55ed1-358f-40a9-b0e5-1ce0a7ed253e/init-textfile/0.log" Apr 22 20:26:26.586515 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:26.586479 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-f7qdl_ec41f5a3-a7a6-4607-b14e-49402afefbe2/networking-console-plugin/0.log" Apr 22 20:26:27.108816 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:27.108789 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vfhht_338d333b-8626-46b3-b450-fbdd521183d8/console-operator/1.log" Apr 22 20:26:27.114198 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:27.114152 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-vfhht_338d333b-8626-46b3-b450-fbdd521183d8/console-operator/2.log" Apr 22 20:26:27.760577 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:27.760539 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7cxtv/perf-node-gather-daemonset-v96fl"] Apr 22 20:26:27.761047 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:27.760968 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2" containerName="copy" Apr 22 20:26:27.761047 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:27.760984 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2" containerName="copy" Apr 22 20:26:27.761047 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:27.760998 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2" containerName="gather" Apr 22 20:26:27.761047 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:27.761005 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2" containerName="gather" Apr 22 20:26:27.761276 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:27.761088 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2" containerName="gather" Apr 22 20:26:27.761276 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:27.761098 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd3b514c-ec9e-44b3-99c2-6dc2d2c896b2" containerName="copy" Apr 22 20:26:27.764945 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:27.764924 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-v96fl" Apr 22 20:26:27.771635 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:27.771606 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7cxtv/perf-node-gather-daemonset-v96fl"] Apr 22 20:26:27.920784 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:27.920736 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hwcg\" (UniqueName: \"kubernetes.io/projected/cc3688bf-9063-4f00-8073-b38da6cc3aeb-kube-api-access-2hwcg\") pod \"perf-node-gather-daemonset-v96fl\" (UID: \"cc3688bf-9063-4f00-8073-b38da6cc3aeb\") " pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-v96fl" Apr 22 20:26:27.920971 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:27.920814 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cc3688bf-9063-4f00-8073-b38da6cc3aeb-proc\") pod \"perf-node-gather-daemonset-v96fl\" (UID: \"cc3688bf-9063-4f00-8073-b38da6cc3aeb\") " pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-v96fl" Apr 22 20:26:27.920971 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:27.920838 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cc3688bf-9063-4f00-8073-b38da6cc3aeb-podres\") pod \"perf-node-gather-daemonset-v96fl\" (UID: \"cc3688bf-9063-4f00-8073-b38da6cc3aeb\") " pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-v96fl" Apr 22 20:26:27.920971 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:27.920879 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cc3688bf-9063-4f00-8073-b38da6cc3aeb-sys\") pod \"perf-node-gather-daemonset-v96fl\" (UID: \"cc3688bf-9063-4f00-8073-b38da6cc3aeb\") " pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-v96fl" Apr 22 20:26:27.920971 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:27.920914 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cc3688bf-9063-4f00-8073-b38da6cc3aeb-lib-modules\") pod \"perf-node-gather-daemonset-v96fl\" (UID: \"cc3688bf-9063-4f00-8073-b38da6cc3aeb\") " pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-v96fl" Apr 22 20:26:28.021630 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:28.021557 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cc3688bf-9063-4f00-8073-b38da6cc3aeb-sys\") pod \"perf-node-gather-daemonset-v96fl\" (UID: \"cc3688bf-9063-4f00-8073-b38da6cc3aeb\") " pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-v96fl" Apr 22 20:26:28.021630 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:28.021600 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cc3688bf-9063-4f00-8073-b38da6cc3aeb-lib-modules\") pod \"perf-node-gather-daemonset-v96fl\" (UID: \"cc3688bf-9063-4f00-8073-b38da6cc3aeb\") " pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-v96fl" Apr 22 20:26:28.021811 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:28.021671 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/cc3688bf-9063-4f00-8073-b38da6cc3aeb-sys\") pod \"perf-node-gather-daemonset-v96fl\" (UID: \"cc3688bf-9063-4f00-8073-b38da6cc3aeb\") " pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-v96fl" Apr 22 20:26:28.021811 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:28.021674 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2hwcg\" (UniqueName: \"kubernetes.io/projected/cc3688bf-9063-4f00-8073-b38da6cc3aeb-kube-api-access-2hwcg\") pod \"perf-node-gather-daemonset-v96fl\" (UID: \"cc3688bf-9063-4f00-8073-b38da6cc3aeb\") " pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-v96fl" Apr 22 20:26:28.021811 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:28.021716 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cc3688bf-9063-4f00-8073-b38da6cc3aeb-proc\") pod \"perf-node-gather-daemonset-v96fl\" (UID: \"cc3688bf-9063-4f00-8073-b38da6cc3aeb\") " pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-v96fl" Apr 22 20:26:28.021811 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:28.021733 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cc3688bf-9063-4f00-8073-b38da6cc3aeb-podres\") pod \"perf-node-gather-daemonset-v96fl\" (UID: \"cc3688bf-9063-4f00-8073-b38da6cc3aeb\") " pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-v96fl" Apr 22 20:26:28.021811 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:28.021797 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cc3688bf-9063-4f00-8073-b38da6cc3aeb-lib-modules\") pod \"perf-node-gather-daemonset-v96fl\" (UID: \"cc3688bf-9063-4f00-8073-b38da6cc3aeb\") " pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-v96fl" Apr 22 20:26:28.021977 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:28.021818 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cc3688bf-9063-4f00-8073-b38da6cc3aeb-proc\") pod \"perf-node-gather-daemonset-v96fl\" (UID: \"cc3688bf-9063-4f00-8073-b38da6cc3aeb\") " pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-v96fl" Apr 22 20:26:28.021977 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:28.021854 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cc3688bf-9063-4f00-8073-b38da6cc3aeb-podres\") pod \"perf-node-gather-daemonset-v96fl\" (UID: \"cc3688bf-9063-4f00-8073-b38da6cc3aeb\") " pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-v96fl" Apr 22 20:26:28.029060 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:28.029029 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hwcg\" (UniqueName: \"kubernetes.io/projected/cc3688bf-9063-4f00-8073-b38da6cc3aeb-kube-api-access-2hwcg\") pod \"perf-node-gather-daemonset-v96fl\" (UID: \"cc3688bf-9063-4f00-8073-b38da6cc3aeb\") " pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-v96fl" Apr 22 20:26:28.032062 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:28.032038 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-dqgqv_ccc07e82-60c3-4ce8-9255-15e80fff83d9/volume-data-source-validator/0.log" Apr 22 20:26:28.076112 ip-10-0-133-60 kubenswrapper[2575]: I0422 
20:26:28.076071 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-v96fl" Apr 22 20:26:28.219627 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:28.219589 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7cxtv/perf-node-gather-daemonset-v96fl"] Apr 22 20:26:28.223365 ip-10-0-133-60 kubenswrapper[2575]: W0422 20:26:28.223286 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podcc3688bf_9063_4f00_8073_b38da6cc3aeb.slice/crio-1acc02fad01491a5ee212e18fad057cf5770ca6304db99cb8833cb4ead7affb8 WatchSource:0}: Error finding container 1acc02fad01491a5ee212e18fad057cf5770ca6304db99cb8833cb4ead7affb8: Status 404 returned error can't find the container with id 1acc02fad01491a5ee212e18fad057cf5770ca6304db99cb8833cb4ead7affb8 Apr 22 20:26:28.398793 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:28.398751 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-v96fl" event={"ID":"cc3688bf-9063-4f00-8073-b38da6cc3aeb","Type":"ContainerStarted","Data":"60078b3f09660320bb0a8c75546ef6e4a821c7dc57813dd4503b0dbd2eda93dc"} Apr 22 20:26:28.398994 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:28.398799 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-v96fl" event={"ID":"cc3688bf-9063-4f00-8073-b38da6cc3aeb","Type":"ContainerStarted","Data":"1acc02fad01491a5ee212e18fad057cf5770ca6304db99cb8833cb4ead7affb8"} Apr 22 20:26:28.399340 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:28.399297 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-v96fl" Apr 22 20:26:28.415552 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:28.415504 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-v96fl" podStartSLOduration=1.4154867740000001 podStartE2EDuration="1.415486774s" podCreationTimestamp="2026-04-22 20:26:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:26:28.413364901 +0000 UTC m=+1725.226750083" watchObservedRunningTime="2026-04-22 20:26:28.415486774 +0000 UTC m=+1725.228872001" Apr 22 20:26:28.792815 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:28.792788 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fpzrf_7561fdeb-e97b-4652-ba5c-1e555b68f4aa/dns/0.log" Apr 22 20:26:28.812616 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:28.812589 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fpzrf_7561fdeb-e97b-4652-ba5c-1e555b68f4aa/kube-rbac-proxy/0.log" Apr 22 20:26:28.885016 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:28.884991 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-jlns8_0a9f8642-7eca-4b0a-bf48-893987e02188/dns-node-resolver/0.log" Apr 22 20:26:29.414200 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:29.414172 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-lnz7b_c63ee2c0-b298-42a6-bcd1-b05ffc7971f2/node-ca/0.log" Apr 22 20:26:30.358067 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:30.358031 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_router-default-5ff55498bb-hwmkq_9e041c4f-74bf-46d4-bd4a-0c9e8952bdb9/router/0.log" Apr 22 20:26:30.812093 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:30.812066 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-ggr2l_efcdc84a-302d-4ab3-b70a-269b931a9634/serve-healthcheck-canary/0.log" Apr 22 20:26:31.253290 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:31.253257 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-46lb7_7fa0c2d8-cebb-4563-88fb-c974a238cc8d/insights-operator/0.log" Apr 22 20:26:31.253624 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:31.253434 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-46lb7_7fa0c2d8-cebb-4563-88fb-c974a238cc8d/insights-operator/1.log" Apr 22 20:26:31.338949 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:31.338921 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bcrgz_5efb288f-2e5e-41f2-b49e-ac95f1491b32/kube-rbac-proxy/0.log" Apr 22 20:26:31.361886 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:31.361858 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bcrgz_5efb288f-2e5e-41f2-b49e-ac95f1491b32/exporter/0.log" Apr 22 20:26:31.381389 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:31.381358 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bcrgz_5efb288f-2e5e-41f2-b49e-ac95f1491b32/extractor/0.log" Apr 22 20:26:33.978586 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:33.978557 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5846f88986-sb9lm_c5b8fee2-7d89-406d-8086-b029e86eaa75/manager/0.log" Apr 22 20:26:34.412416 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:34.412389 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-7cxtv/perf-node-gather-daemonset-v96fl" Apr 22 20:26:39.806227 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:39.806189 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-zxkxh_41ae5fd4-d9cf-4314-b727-acf1d473957e/kube-storage-version-migrator-operator/1.log" Apr 22 20:26:39.807421 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:39.807400 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-zxkxh_41ae5fd4-d9cf-4314-b727-acf1d473957e/kube-storage-version-migrator-operator/0.log" Apr 22 20:26:40.811462 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:40.811405 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7qp87_8a3063cb-ff79-4232-8126-f9de4b63a839/kube-multus/0.log" Apr 22 20:26:40.833583 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:40.833558 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cfjtn_153f74bd-9a2c-4a02-88c9-243b60b35439/kube-multus-additional-cni-plugins/0.log" Apr 22 20:26:40.853572 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:40.853551 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cfjtn_153f74bd-9a2c-4a02-88c9-243b60b35439/egress-router-binary-copy/0.log" Apr 22 20:26:40.875406 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:40.875380 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cfjtn_153f74bd-9a2c-4a02-88c9-243b60b35439/cni-plugins/0.log" Apr 22 20:26:40.897779 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:40.897757 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cfjtn_153f74bd-9a2c-4a02-88c9-243b60b35439/bond-cni-plugin/0.log" Apr 22 20:26:40.917417 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:40.917393 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cfjtn_153f74bd-9a2c-4a02-88c9-243b60b35439/routeoverride-cni/0.log" Apr 22 20:26:40.936371 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:40.936346 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cfjtn_153f74bd-9a2c-4a02-88c9-243b60b35439/whereabouts-cni-bincopy/0.log" Apr 22 20:26:40.956398 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:40.956380 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cfjtn_153f74bd-9a2c-4a02-88c9-243b60b35439/whereabouts-cni/0.log" Apr 22 20:26:41.455966 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:41.455942 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-v8fph_d7bee1d4-9229-4b17-8ec5-e19b53d61c9d/network-metrics-daemon/0.log" Apr 22 20:26:41.476253 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:41.476230 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-v8fph_d7bee1d4-9229-4b17-8ec5-e19b53d61c9d/kube-rbac-proxy/0.log" Apr 22 20:26:42.263294 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:42.263263 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jf5c_e92d8d1a-4d78-4b32-8e69-32db4468f373/ovn-controller/0.log" Apr 22 20:26:42.290567 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:42.290539 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jf5c_e92d8d1a-4d78-4b32-8e69-32db4468f373/ovn-acl-logging/0.log" Apr 22 20:26:42.307371 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:42.307339 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jf5c_e92d8d1a-4d78-4b32-8e69-32db4468f373/kube-rbac-proxy-node/0.log" Apr 22 20:26:42.327620 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:42.327592 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jf5c_e92d8d1a-4d78-4b32-8e69-32db4468f373/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 20:26:42.346580 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:42.346556 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jf5c_e92d8d1a-4d78-4b32-8e69-32db4468f373/northd/0.log" Apr 22 20:26:42.365663 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:42.365640 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jf5c_e92d8d1a-4d78-4b32-8e69-32db4468f373/nbdb/0.log" Apr 22 20:26:42.384460 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:42.384431 2575 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jf5c_e92d8d1a-4d78-4b32-8e69-32db4468f373/sbdb/0.log" Apr 22 20:26:42.528513 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:42.528434 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6jf5c_e92d8d1a-4d78-4b32-8e69-32db4468f373/ovnkube-controller/0.log" Apr 22 20:26:44.163597 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:44.163559 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-4wqkr_240239de-ebef-48d8-bfd9-2171161b364f/check-endpoints/0.log" Apr 22 20:26:44.185462 ip-10-0-133-60 kubenswrapper[2575]: I0422 20:26:44.185435 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-ncvmn_f55cb5a5-4e42-4775-bf9f-5f92344b63ff/network-check-target-container/0.log"