Apr 16 22:10:50.061858 ip-10-0-133-72 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 22:10:50.061869 ip-10-0-133-72 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 22:10:50.061876 ip-10-0-133-72 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 22:10:50.062099 ip-10-0-133-72 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 22:11:00.183830 ip-10-0-133-72 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 22:11:00.183850 ip-10-0-133-72 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 5ba47e05a2154b0aacb857dd001ea329 --
Apr 16 22:13:29.497028 ip-10-0-133-72 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 22:13:29.991232 ip-10-0-133-72 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:29.991232 ip-10-0-133-72 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 22:13:29.991232 ip-10-0-133-72 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:29.991232 ip-10-0-133-72 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 22:13:29.991232 ip-10-0-133-72 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:29.993031 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:29.992922 2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 22:13:29.998963 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.998940 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:29.998963 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.998959 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:29.998963 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.998964 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:29.998963 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.998968 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:29.998963 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.998973 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:29.999264 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.998977 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:29.999264 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.998981 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:29.999264 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.998986 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:29.999264 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.998990 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:29.999264 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.998993 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:29.999264 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.998997 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:29.999264 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999000 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:29.999264 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999004 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:29.999264 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999009 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:29.999264 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999014 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:29.999264 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999018 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:29.999264 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999021 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:29.999264 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999025 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:29.999264 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999029 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:29.999264 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999032 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:29.999264 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999043 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:29.999264 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999047 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:29.999264 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999051 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:29.999264 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999056 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:30.000039 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999060 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:30.000039 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999064 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:30.000039 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999067 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:30.000039 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999072 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:30.000039 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999075 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:30.000039 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999079 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:30.000039 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999084 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:30.000039 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999089 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:30.000039 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999093 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:30.000039 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999097 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:30.000039 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999101 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:30.000039 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999105 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:30.000039 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999109 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:30.000039 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999113 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:30.000039 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999119 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:30.000039 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999123 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:30.000039 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999127 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:30.000039 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999132 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:30.000039 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999139 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:30.000523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999144 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:30.000523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999150 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:30.000523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999155 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:30.000523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999160 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:30.000523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999164 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:30.000523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999168 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:30.000523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999173 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:30.000523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999177 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:30.000523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999181 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:30.000523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999185 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:30.000523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999189 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:30.000523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999194 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:30.000523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999198 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:30.000523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999202 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:30.000523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999206 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:30.000523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999210 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:30.000523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999217 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:30.000523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999221 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:30.000523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999225 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:30.000523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999229 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:30.001047 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999234 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:30.001047 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999239 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:30.001047 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999243 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:30.001047 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999248 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:30.001047 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999254 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:30.001047 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999258 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:30.001047 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999261 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:30.001047 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999266 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:30.001047 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999270 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:30.001047 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999273 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:30.001047 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999277 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:30.001047 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999282 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:30.001047 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999286 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:30.001047 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999290 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:30.001047 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999294 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:30.001047 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999298 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:30.001047 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999302 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:30.001047 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999307 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:30.001047 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999311 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:30.001047 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999315 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:30.001523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999319 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:30.001523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999323 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:30.001523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:29.999328 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:30.001523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000070 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:30.001523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000084 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:30.001523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000087 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:30.001523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000091 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:30.001523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000094 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:30.001523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000097 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:30.001523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000100 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:30.001523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000103 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:30.001523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000106 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:30.001523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000109 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:30.001523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000113 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:30.001523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000116 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:30.001523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000120 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:30.001523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000126 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:30.001523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000129 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:30.001523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000132 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:30.001523 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000135 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:30.002057 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000138 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:30.002057 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000142 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:30.002057 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000145 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:30.002057 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000148 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:30.002057 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000153 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:30.002057 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000157 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:30.002057 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000160 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:30.002057 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000163 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:30.002057 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000166 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:30.002057 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000168 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:30.002057 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000171 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:30.002057 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000173 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:30.002057 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000176 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:30.002057 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000179 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:30.002057 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000181 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:30.002057 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000186 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:30.002057 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000190 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:30.002057 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000193 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:30.002057 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000196 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:30.002525 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000198 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:30.002525 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000201 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:30.002525 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000203 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:30.002525 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000206 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:30.002525 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000209 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:30.002525 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000211 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:30.002525 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000214 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:30.002525 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000217 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:30.002525 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000219 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:30.002525 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000222 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:30.002525 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000224 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:30.002525 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000227 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:30.002525 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000229 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:30.002525 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000232 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:30.002525 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000234 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:30.002525 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000237 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:30.002525 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000239 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:30.002525 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000242 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:30.002525 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000245 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:30.002525 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000248 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:30.003023 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000251 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:30.003023 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000254 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:30.003023 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000256 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:30.003023 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000259 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:30.003023 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000261 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:30.003023 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000264 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:30.003023 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000267 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:30.003023 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000269 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:30.003023 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000273 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:30.003023 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000276 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:30.003023 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000279 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:30.003023 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000282 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:30.003023 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000285 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:30.003023 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000287 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:30.003023 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000290 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:30.003023 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000293 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:30.003023 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000295 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:30.003023 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000298 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:30.003023 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000300 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:30.003023 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000303 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:30.003513 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000305 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:30.003513 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000308 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:30.003513 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000311 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:30.003513 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000314 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:30.003513 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000316 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:30.003513 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000318 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:30.003513 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000321 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:30.003513 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000323 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:30.003513 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000326 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:30.003513 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.000328 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:30.003513 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001333 2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 22:13:30.003513 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001344 2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 22:13:30.003513 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001351 2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 22:13:30.003513 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001357 2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 22:13:30.003513 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001362 2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 22:13:30.003513 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001365 2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 22:13:30.003513 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001370 2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 22:13:30.003513 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001374 2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 22:13:30.003513 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001378 2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 22:13:30.003513 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001381 2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 22:13:30.003513 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001384 2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 22:13:30.004035 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001389 2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 22:13:30.004035 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001392 2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 22:13:30.004035 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001396 2574 flags.go:64] FLAG: --cgroup-root=""
Apr 16 22:13:30.004035 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001399 2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 22:13:30.004035 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001402 2574 flags.go:64] FLAG: --client-ca-file=""
Apr 16 22:13:30.004035 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001405 2574 flags.go:64] FLAG: --cloud-config=""
Apr 16 22:13:30.004035 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001408 2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 22:13:30.004035 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001411 2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 22:13:30.004035 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001415 2574 flags.go:64] FLAG: --cluster-domain=""
Apr 16 22:13:30.004035 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001418 2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 22:13:30.004035 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001421 2574 flags.go:64] FLAG: --config-dir=""
Apr 16 22:13:30.004035 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001424 2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 22:13:30.004035 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001428 2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 22:13:30.004035 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001432 2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 22:13:30.004035 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001436 2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 22:13:30.004035 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001439 2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 22:13:30.004035 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001442 2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 22:13:30.004035 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001445 2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 22:13:30.004035 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001448 2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 22:13:30.004035 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001452 2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 22:13:30.004035 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001455 2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 22:13:30.004035 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001458 2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 22:13:30.004035 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001463 2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 22:13:30.004035 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001466 2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 22:13:30.004035 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001469 2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 22:13:30.004641 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001472 2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 22:13:30.004641 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001475 2574 flags.go:64] FLAG: --enable-server="true"
Apr 16 22:13:30.004641 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001478 2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 22:13:30.004641 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001485 2574 flags.go:64] FLAG: --event-burst="100"
Apr 16 22:13:30.004641 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001488 2574 flags.go:64] FLAG: --event-qps="50"
Apr 16 22:13:30.004641 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001491 2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 22:13:30.004641 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001494 2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 22:13:30.004641 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001498 2574 flags.go:64] FLAG: --eviction-hard=""
Apr 16 22:13:30.004641 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001502 2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 22:13:30.004641 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001505 2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 22:13:30.004641 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001508 2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 22:13:30.004641 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001511 2574 flags.go:64] FLAG: --eviction-soft=""
Apr 16 22:13:30.004641 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001514 2574 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 22:13:30.004641 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001518 2574 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 22:13:30.004641 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001521 2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 22:13:30.004641 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001524 2574 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 22:13:30.004641 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001527 2574 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 22:13:30.004641 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001530 2574 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 22:13:30.004641 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001534 2574 flags.go:64] FLAG: --feature-gates=""
Apr 16 22:13:30.004641 ip-10-0-133-72 kubenswrapper[2574]: I0416
22:13:30.001538 2574 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 22:13:30.004641 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001541 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 22:13:30.004641 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001545 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 22:13:30.004641 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001548 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 22:13:30.004641 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001551 2574 flags.go:64] FLAG: --healthz-port="10248" Apr 16 22:13:30.004641 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001555 2574 flags.go:64] FLAG: --help="false" Apr 16 22:13:30.005262 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001558 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-133-72.ec2.internal" Apr 16 22:13:30.005262 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001562 2574 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 22:13:30.005262 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001565 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 22:13:30.005262 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001568 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 22:13:30.005262 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001571 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 22:13:30.005262 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001575 2574 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 22:13:30.005262 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001578 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 22:13:30.005262 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001581 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 22:13:30.005262 
ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001583 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 22:13:30.005262 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001586 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 22:13:30.005262 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001589 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 22:13:30.005262 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001592 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 22:13:30.005262 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001595 2574 flags.go:64] FLAG: --kube-reserved="" Apr 16 22:13:30.005262 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001598 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 22:13:30.005262 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001601 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 22:13:30.005262 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001605 2574 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 22:13:30.005262 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001607 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 22:13:30.005262 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001610 2574 flags.go:64] FLAG: --lock-file="" Apr 16 22:13:30.005262 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001614 2574 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 22:13:30.005262 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001617 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 22:13:30.005262 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001620 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 22:13:30.005262 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001625 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 22:13:30.005262 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001628 2574 flags.go:64] FLAG: 
--log-text-info-buffer-size="0" Apr 16 22:13:30.005848 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001631 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 22:13:30.005848 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001634 2574 flags.go:64] FLAG: --logging-format="text" Apr 16 22:13:30.005848 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001637 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 22:13:30.005848 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001641 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 22:13:30.005848 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001643 2574 flags.go:64] FLAG: --manifest-url="" Apr 16 22:13:30.005848 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001647 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 16 22:13:30.005848 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001651 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 22:13:30.005848 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001654 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 22:13:30.005848 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001659 2574 flags.go:64] FLAG: --max-pods="110" Apr 16 22:13:30.005848 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001662 2574 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 22:13:30.005848 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001665 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 22:13:30.005848 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001668 2574 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 22:13:30.005848 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001671 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 22:13:30.005848 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001674 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 22:13:30.005848 ip-10-0-133-72 
kubenswrapper[2574]: I0416 22:13:30.001677 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 22:13:30.005848 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001681 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 22:13:30.005848 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001690 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 22:13:30.005848 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001693 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 22:13:30.005848 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001696 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 22:13:30.005848 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001699 2574 flags.go:64] FLAG: --pod-cidr="" Apr 16 22:13:30.005848 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001702 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 22:13:30.005848 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001707 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 22:13:30.005848 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001710 2574 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 22:13:30.005848 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001714 2574 flags.go:64] FLAG: --pods-per-core="0" Apr 16 22:13:30.006504 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001717 2574 flags.go:64] FLAG: --port="10250" Apr 16 22:13:30.006504 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001720 2574 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 22:13:30.006504 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001723 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-00506c1f1618ce4ef" Apr 16 22:13:30.006504 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001726 2574 flags.go:64] FLAG: --qos-reserved="" Apr 16 22:13:30.006504 
ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001730 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 16 22:13:30.006504 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001733 2574 flags.go:64] FLAG: --register-node="true" Apr 16 22:13:30.006504 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001751 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 16 22:13:30.006504 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001755 2574 flags.go:64] FLAG: --register-with-taints="" Apr 16 22:13:30.006504 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001759 2574 flags.go:64] FLAG: --registry-burst="10" Apr 16 22:13:30.006504 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001762 2574 flags.go:64] FLAG: --registry-qps="5" Apr 16 22:13:30.006504 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001765 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 16 22:13:30.006504 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001768 2574 flags.go:64] FLAG: --reserved-memory="" Apr 16 22:13:30.006504 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001772 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 22:13:30.006504 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001775 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 22:13:30.006504 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001778 2574 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 22:13:30.006504 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001781 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 22:13:30.006504 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001784 2574 flags.go:64] FLAG: --runonce="false" Apr 16 22:13:30.006504 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001787 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 22:13:30.006504 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001790 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 22:13:30.006504 
ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001793 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 16 22:13:30.006504 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001796 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 22:13:30.006504 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001799 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 22:13:30.006504 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001803 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 22:13:30.006504 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001806 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 22:13:30.006504 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001809 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 22:13:30.006504 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001812 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 22:13:30.007183 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001815 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 22:13:30.007183 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001818 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 22:13:30.007183 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001821 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 22:13:30.007183 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001824 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 22:13:30.007183 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001827 2574 flags.go:64] FLAG: --system-cgroups="" Apr 16 22:13:30.007183 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001830 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 22:13:30.007183 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001860 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 22:13:30.007183 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001864 
2574 flags.go:64] FLAG: --tls-cert-file="" Apr 16 22:13:30.007183 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001867 2574 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 22:13:30.007183 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001871 2574 flags.go:64] FLAG: --tls-min-version="" Apr 16 22:13:30.007183 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001874 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 22:13:30.007183 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001885 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 22:13:30.007183 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001888 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 22:13:30.007183 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001891 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 22:13:30.007183 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001895 2574 flags.go:64] FLAG: --v="2" Apr 16 22:13:30.007183 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001903 2574 flags.go:64] FLAG: --version="false" Apr 16 22:13:30.007183 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001908 2574 flags.go:64] FLAG: --vmodule="" Apr 16 22:13:30.007183 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001913 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 22:13:30.007183 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.001917 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 22:13:30.007183 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002012 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 22:13:30.007183 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002016 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 22:13:30.007183 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002019 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 
22:13:30.007183 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002021 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 22:13:30.007183 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002024 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 22:13:30.007783 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002027 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 22:13:30.007783 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002029 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 22:13:30.007783 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002032 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 22:13:30.007783 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002034 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 22:13:30.007783 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002037 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 22:13:30.007783 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002040 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 22:13:30.007783 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002043 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 22:13:30.007783 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002046 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 22:13:30.007783 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002048 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 22:13:30.007783 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002051 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 22:13:30.007783 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002053 2574 feature_gate.go:328] unrecognized feature 
gate: MultiArchInstallAzure Apr 16 22:13:30.007783 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002056 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 22:13:30.007783 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002063 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 22:13:30.007783 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002066 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 22:13:30.007783 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002069 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 22:13:30.007783 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002071 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 22:13:30.007783 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002074 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 22:13:30.007783 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002077 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 22:13:30.007783 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002079 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 22:13:30.007783 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002082 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 22:13:30.008304 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002085 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 22:13:30.008304 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002088 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 22:13:30.008304 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002091 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 22:13:30.008304 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002093 2574 
feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 22:13:30.008304 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002096 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 22:13:30.008304 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002099 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 22:13:30.008304 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002101 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 22:13:30.008304 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002104 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 22:13:30.008304 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002107 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 22:13:30.008304 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002109 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 22:13:30.008304 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002112 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 22:13:30.008304 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002114 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 22:13:30.008304 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002117 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 22:13:30.008304 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002120 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 22:13:30.008304 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002122 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 22:13:30.008304 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002124 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 22:13:30.008304 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002127 
2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 22:13:30.008304 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002130 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 22:13:30.008304 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002132 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 22:13:30.008304 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002134 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 22:13:30.008800 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002137 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 22:13:30.008800 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002139 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 22:13:30.008800 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002142 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 22:13:30.008800 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002145 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 22:13:30.008800 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002149 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 22:13:30.008800 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002152 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 22:13:30.008800 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002155 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 22:13:30.008800 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002159 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 22:13:30.008800 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002163 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 22:13:30.008800 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002166 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 22:13:30.008800 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002168 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 22:13:30.008800 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002171 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 22:13:30.008800 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002174 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 22:13:30.008800 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002177 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 22:13:30.008800 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002179 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 22:13:30.008800 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002182 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 22:13:30.008800 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002184 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 22:13:30.008800 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002187 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 22:13:30.008800 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002189 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 22:13:30.009267 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002192 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 22:13:30.009267 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002194 2574 
feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 22:13:30.009267 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002197 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 22:13:30.009267 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002199 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 22:13:30.009267 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002202 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 22:13:30.009267 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002205 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 22:13:30.009267 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002208 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 22:13:30.009267 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002210 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 22:13:30.009267 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002213 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 22:13:30.009267 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002215 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 22:13:30.009267 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002218 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 22:13:30.009267 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002220 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 22:13:30.009267 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002223 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 22:13:30.009267 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002226 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 16 22:13:30.009267 ip-10-0-133-72 
kubenswrapper[2574]: W0416 22:13:30.002228 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:30.009267 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002231 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:30.009267 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002233 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:30.009267 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002237 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:30.009267 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002240 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:30.009719 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002244 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:30.009719 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002247 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:30.009719 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.002249 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:30.009719 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.002255 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 22:13:30.009719 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.009262 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 22:13:30.009719 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.009280 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 22:13:30.009719 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009328 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:30.009719 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009333 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:30.009719 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009339 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:30.009719 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009342 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:30.009719 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009345 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:30.009719 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009347 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:30.009719 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009350 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:30.009719 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009353 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:30.009719 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009355 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:30.009719 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009358 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:30.010123 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009361 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:30.010123 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009363 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:30.010123 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009366 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:30.010123 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009368 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:30.010123 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009371 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:30.010123 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009374 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:30.010123 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009376 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:30.010123 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009379 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:30.010123 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009383 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:30.010123 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009387 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:30.010123 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009390 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:30.010123 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009393 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:30.010123 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009395 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:30.010123 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009398 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:30.010123 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009401 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:30.010123 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009403 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:30.010123 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009406 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:30.010123 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009408 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:30.010123 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009411 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:30.010123 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009414 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:30.010643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009416 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:30.010643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009420 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:30.010643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009422 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:30.010643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009425 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:30.010643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009429 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:30.010643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009432 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:30.010643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009435 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:30.010643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009437 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:30.010643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009440 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:30.010643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009443 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:30.010643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009445 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:30.010643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009448 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:30.010643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009450 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:30.010643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009453 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:30.010643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009456 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:30.010643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009458 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:30.010643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009461 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:30.010643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009463 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:30.010643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009466 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:30.010643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009468 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:30.011182 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009471 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:30.011182 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009473 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:30.011182 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009476 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:30.011182 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009479 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:30.011182 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009482 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:30.011182 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009485 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:30.011182 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009487 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:30.011182 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009490 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:30.011182 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009492 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:30.011182 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009495 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:30.011182 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009498 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:30.011182 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009501 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:30.011182 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009503 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:30.011182 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009506 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:30.011182 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009509 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:30.011182 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009512 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:30.011182 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009516 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:30.011182 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009520 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:30.011182 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009523 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:30.011643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009526 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:30.011643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009529 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:30.011643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009532 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:30.011643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009535 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:30.011643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009537 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:30.011643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009540 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:30.011643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009543 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:30.011643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009545 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:30.011643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009548 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:30.011643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009550 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:30.011643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009553 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:30.011643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009555 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:30.011643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009558 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:30.011643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009560 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:30.011643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009563 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:30.011643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009565 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:30.011643 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009568 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:30.012172 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.009574 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 22:13:30.012172 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009694 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:30.012172 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009700 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:30.012172 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009703 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:30.012172 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009706 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:30.012172 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009709 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:30.012172 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009711 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:30.012172 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009714 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:30.012172 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009716 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:30.012172 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009720 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:30.012172 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009722 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:30.012172 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009725 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:30.012172 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009728 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:30.012172 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009730 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:30.012172 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009734 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:30.012543 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009755 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:30.012543 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009759 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:30.012543 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009762 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:30.012543 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009765 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:30.012543 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009767 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:30.012543 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009770 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:30.012543 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009773 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:30.012543 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009775 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:30.012543 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009778 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:30.012543 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009780 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:30.012543 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009783 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:30.012543 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009785 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:30.012543 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009788 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:30.012543 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009791 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:30.012543 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009793 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:30.012543 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009796 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:30.012543 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009798 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:30.012543 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009802 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:30.012543 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009806 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:30.013027 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009808 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:30.013027 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009811 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:30.013027 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009814 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:30.013027 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009816 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:30.013027 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009818 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:30.013027 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009821 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:30.013027 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009824 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:30.013027 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009826 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:30.013027 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009829 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:30.013027 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009831 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:30.013027 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009834 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:30.013027 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009836 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:30.013027 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009839 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:30.013027 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009843 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:30.013027 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009845 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:30.013027 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009848 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:30.013027 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009851 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:30.013027 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009854 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:30.013027 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009856 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:30.013027 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009859 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:30.013501 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009861 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:30.013501 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009864 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:30.013501 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009866 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:30.013501 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009869 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:30.013501 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009872 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:30.013501 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009876 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:30.013501 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009879 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:30.013501 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009882 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:30.013501 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009884 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:30.013501 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009887 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:30.013501 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009890 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:30.013501 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009893 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:30.013501 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009895 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:30.013501 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009898 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:30.013501 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009900 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:30.013501 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009902 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:30.013501 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009905 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:30.013501 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009908 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:30.013501 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009910 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:30.013501 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009913 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:30.014040 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009932 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:30.014040 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009935 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:30.014040 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009939 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:30.014040 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009942 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:30.014040 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009945 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:30.014040 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009947 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:30.014040 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009950 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:30.014040 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009953 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:30.014040 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009956 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:30.014040 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009958 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:30.014040 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009961 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:30.014040 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009963 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:30.014040 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:30.009966 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:30.014040 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.009970 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 22:13:30.014040 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.010753 2574 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 22:13:30.014404 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.013019 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 22:13:30.014404 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.013994 2574 server.go:1019] "Starting client certificate rotation"
Apr 16 22:13:30.014404 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.014093 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 22:13:30.014404 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.014135 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 22:13:30.039584 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.039566 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 22:13:30.048268 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.048203 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 22:13:30.069116 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.069091 2574 log.go:25] "Validated CRI v1 runtime API"
Apr 16 22:13:30.074313 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.074278 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 22:13:30.075849 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.075836 2574 log.go:25] "Validated CRI v1 image API"
Apr 16 22:13:30.077344 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.077324 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 22:13:30.081663 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.081643 2574 fs.go:135] Filesystem UUIDs: map[09076e67-64ce-4d13-8434-f6735adee9e3:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 b0e220e5-2d14-4298-9f57-55b312169a2e:/dev/nvme0n1p3]
Apr 16 22:13:30.081717 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.081664 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 22:13:30.088281 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.088164 2574 manager.go:217] Machine: {Timestamp:2026-04-16 22:13:30.086020213 +0000 UTC m=+0.454426356 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100266 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2d25b30e57072a36fbf6a434ca92a5 SystemUUID:ec2d25b3-0e57-072a-36fb-f6a434ca92a5 BootID:5ba47e05-a215-4b0a-acb8-57dd001ea329 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:7b:d9:ab:32:33 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:7b:d9:ab:32:33 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:66:e9:9e:a6:f5:1f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 22:13:30.088281 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.088275 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 22:13:30.088392 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.088361 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 22:13:30.089480 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.089455 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 22:13:30.089613 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.089482 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-72.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 22:13:30.089660 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.089623 2574 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 22:13:30.089660 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.089632 2574 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 22:13:30.089660 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.089645 2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 22:13:30.090883 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.090871 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 22:13:30.091716 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.091707 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 16 22:13:30.091844 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.091835 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 22:13:30.094841 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.094831 2574 kubelet.go:491] "Attempting to sync node with API server" Apr 16 22:13:30.094879 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.094847 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 22:13:30.094879 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.094863 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 22:13:30.094879 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.094875 2574 kubelet.go:397] "Adding apiserver pod source" Apr 16 22:13:30.094978 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.094886 2574 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 16 22:13:30.095638 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.095606 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-bcc5w" Apr 16 22:13:30.096058 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.096047 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 22:13:30.096093 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.096064 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 22:13:30.100046 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.100031 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 22:13:30.101355 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.101342 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 22:13:30.102931 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.102913 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-bcc5w" Apr 16 22:13:30.103025 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.103010 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 22:13:30.103080 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.103035 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 22:13:30.103080 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.103042 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 22:13:30.103080 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.103049 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 22:13:30.103165 ip-10-0-133-72 
kubenswrapper[2574]: I0416 22:13:30.103085 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 22:13:30.103165 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.103092 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 22:13:30.103165 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.103098 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 22:13:30.103165 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.103103 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 22:13:30.103165 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.103109 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 22:13:30.103165 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.103115 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 22:13:30.103165 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.103130 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 22:13:30.103165 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.103139 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 22:13:30.105043 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.105031 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 22:13:30.105043 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.105043 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 22:13:30.109341 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.109316 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 22:13:30.109534 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.109506 2574 server.go:1295] "Started kubelet" Apr 16 22:13:30.109598 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.109555 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 
burstTokens=10 Apr 16 22:13:30.109651 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.109615 2574 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 22:13:30.111386 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.111279 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 22:13:30.111484 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.111441 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 22:13:30.111797 ip-10-0-133-72 systemd[1]: Started Kubernetes Kubelet. Apr 16 22:13:30.111929 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.111801 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:30.112626 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.112603 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:30.114161 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.114144 2574 server.go:317] "Adding debug handlers to kubelet server" Apr 16 22:13:30.117758 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.117717 2574 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-72.ec2.internal" not found Apr 16 22:13:30.118363 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.118344 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 22:13:30.118363 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.118360 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 22:13:30.119272 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.119091 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 22:13:30.119272 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.119092 2574 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 22:13:30.119412 
ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.119288 2574 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 22:13:30.119412 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:30.119196 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-72.ec2.internal\" not found" Apr 16 22:13:30.119412 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.119174 2574 factory.go:55] Registering systemd factory Apr 16 22:13:30.119547 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.119425 2574 reconstruct.go:97] "Volume reconstruction finished" Apr 16 22:13:30.119547 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.119435 2574 reconciler.go:26] "Reconciler: start to sync state" Apr 16 22:13:30.119547 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.119426 2574 factory.go:223] Registration of the systemd container factory successfully Apr 16 22:13:30.119894 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.119876 2574 factory.go:153] Registering CRI-O factory Apr 16 22:13:30.119894 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.119896 2574 factory.go:223] Registration of the crio container factory successfully Apr 16 22:13:30.120018 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.120009 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 22:13:30.120100 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.120035 2574 factory.go:103] Registering Raw factory Apr 16 22:13:30.120100 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.120051 2574 manager.go:1196] Started watching for new ooms in manager Apr 16 22:13:30.120602 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.120589 2574 manager.go:319] Starting recovery of all containers Apr 16 22:13:30.120965 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.120938 
2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:30.121417 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:30.121392 2574 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 22:13:30.124011 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:30.123987 2574 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-133-72.ec2.internal\" not found" node="ip-10-0-133-72.ec2.internal" Apr 16 22:13:30.131488 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.131474 2574 manager.go:324] Recovery completed Apr 16 22:13:30.137057 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.137037 2574 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-72.ec2.internal" not found Apr 16 22:13:30.137127 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.137111 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:13:30.139211 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.139195 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-72.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:13:30.139262 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.139224 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-72.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:13:30.139262 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.139236 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-72.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:13:30.139725 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.139712 2574 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 22:13:30.139725 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.139723 2574 
cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 22:13:30.139834 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.139754 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 16 22:13:30.142034 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.142023 2574 policy_none.go:49] "None policy: Start" Apr 16 22:13:30.142074 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.142037 2574 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 22:13:30.142074 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.142047 2574 state_mem.go:35] "Initializing new in-memory state store" Apr 16 22:13:30.181334 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.181321 2574 manager.go:341] "Starting Device Plugin manager" Apr 16 22:13:30.191060 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:30.181392 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 22:13:30.191060 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.181402 2574 server.go:85] "Starting device plugin registration server" Apr 16 22:13:30.191060 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.181618 2574 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 22:13:30.191060 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.181631 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 22:13:30.191060 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.181723 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 22:13:30.191060 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.181861 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 22:13:30.191060 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.181870 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 22:13:30.191060 ip-10-0-133-72 kubenswrapper[2574]: 
E0416 22:13:30.182374 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 22:13:30.191060 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:30.182406 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-72.ec2.internal\" not found" Apr 16 22:13:30.194324 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.194309 2574 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-72.ec2.internal" not found Apr 16 22:13:30.256241 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.256165 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 22:13:30.257377 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.257359 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 22:13:30.257463 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.257390 2574 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 22:13:30.257463 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.257417 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 22:13:30.257463 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.257428 2574 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 22:13:30.257589 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:30.257475 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 22:13:30.260040 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.260012 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:30.282534 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.282521 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:13:30.283343 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.283326 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-72.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:13:30.283411 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.283356 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-72.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:13:30.283411 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.283367 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-72.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:13:30.283411 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.283392 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-72.ec2.internal" Apr 16 22:13:30.293521 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.293504 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-72.ec2.internal" Apr 16 22:13:30.293611 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:30.293529 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-72.ec2.internal\": node \"ip-10-0-133-72.ec2.internal\" not found" Apr 16 22:13:30.358074 
ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.358046 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-72.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-72.ec2.internal"] Apr 16 22:13:30.360351 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.360333 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-72.ec2.internal" Apr 16 22:13:30.360437 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.360342 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-72.ec2.internal" Apr 16 22:13:30.389375 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.389355 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-72.ec2.internal" Apr 16 22:13:30.393590 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.393574 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-72.ec2.internal" Apr 16 22:13:30.402456 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.402440 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 22:13:30.405208 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.405193 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 22:13:30.420589 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.420570 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b707a3c20ff0ea56ea13d746a8edc26b-config\") pod 
\"kube-apiserver-proxy-ip-10-0-133-72.ec2.internal\" (UID: \"b707a3c20ff0ea56ea13d746a8edc26b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-72.ec2.internal" Apr 16 22:13:30.521511 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.521418 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7d0edda7066ec5684bc2bd9c10fb4784-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-72.ec2.internal\" (UID: \"7d0edda7066ec5684bc2bd9c10fb4784\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-72.ec2.internal" Apr 16 22:13:30.521511 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.521479 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7d0edda7066ec5684bc2bd9c10fb4784-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-72.ec2.internal\" (UID: \"7d0edda7066ec5684bc2bd9c10fb4784\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-72.ec2.internal" Apr 16 22:13:30.521670 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.521523 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b707a3c20ff0ea56ea13d746a8edc26b-config\") pod \"kube-apiserver-proxy-ip-10-0-133-72.ec2.internal\" (UID: \"b707a3c20ff0ea56ea13d746a8edc26b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-72.ec2.internal" Apr 16 22:13:30.521670 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.521568 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b707a3c20ff0ea56ea13d746a8edc26b-config\") pod \"kube-apiserver-proxy-ip-10-0-133-72.ec2.internal\" (UID: \"b707a3c20ff0ea56ea13d746a8edc26b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-72.ec2.internal" Apr 16 22:13:30.622586 ip-10-0-133-72 
kubenswrapper[2574]: I0416 22:13:30.622559 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7d0edda7066ec5684bc2bd9c10fb4784-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-72.ec2.internal\" (UID: \"7d0edda7066ec5684bc2bd9c10fb4784\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-72.ec2.internal" Apr 16 22:13:30.622586 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.622519 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7d0edda7066ec5684bc2bd9c10fb4784-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-72.ec2.internal\" (UID: \"7d0edda7066ec5684bc2bd9c10fb4784\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-72.ec2.internal" Apr 16 22:13:30.622789 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.622618 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7d0edda7066ec5684bc2bd9c10fb4784-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-72.ec2.internal\" (UID: \"7d0edda7066ec5684bc2bd9c10fb4784\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-72.ec2.internal" Apr 16 22:13:30.622789 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.622652 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7d0edda7066ec5684bc2bd9c10fb4784-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-72.ec2.internal\" (UID: \"7d0edda7066ec5684bc2bd9c10fb4784\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-72.ec2.internal" Apr 16 22:13:30.704774 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.704727 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-72.ec2.internal" Apr 16 22:13:30.708358 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:30.708334 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-72.ec2.internal" Apr 16 22:13:31.013532 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.013501 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 22:13:31.014203 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.013638 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 22:13:31.014203 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.013671 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 22:13:31.014203 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.013689 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 22:13:31.095886 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.095861 2574 apiserver.go:52] "Watching apiserver" Apr 16 22:13:31.101687 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.101667 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 22:13:31.102069 ip-10-0-133-72 kubenswrapper[2574]: I0416 
22:13:31.102049 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-zfz79","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-72.ec2.internal","openshift-multus/multus-8pm2c","openshift-network-operator/iptables-alerter-mcqg2","openshift-ovn-kubernetes/ovnkube-node-hm4t5","kube-system/konnectivity-agent-9hrlh","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jcgzh","openshift-dns/node-resolver-9cdfs","openshift-multus/multus-additional-cni-plugins-dx88j","openshift-multus/network-metrics-daemon-wvq6s","openshift-network-diagnostics/network-check-target-4pbqm","kube-system/kube-apiserver-proxy-ip-10-0-133-72.ec2.internal","openshift-cluster-node-tuning-operator/tuned-mxzxv"] Apr 16 22:13:31.104344 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.104313 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 22:08:30 +0000 UTC" deadline="2027-12-28 13:02:55.400982576 +0000 UTC" Apr 16 22:13:31.104389 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.104346 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14894h49m24.296640077s" Apr 16 22:13:31.104760 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.104731 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-zfz79" Apr 16 22:13:31.106165 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.106066 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8pm2c" Apr 16 22:13:31.106165 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.106163 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-mcqg2" Apr 16 22:13:31.107992 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.107648 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-prx6s\"" Apr 16 22:13:31.107992 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.107848 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 22:13:31.107992 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.107920 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 22:13:31.107992 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.107980 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 22:13:31.108245 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.108069 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 22:13:31.108332 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.108314 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-7v4r4\"" Apr 16 22:13:31.108790 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.108760 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 22:13:31.109863 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.109824 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-9hrlh" Apr 16 22:13:31.110002 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.109977 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:31.110095 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.110062 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 22:13:31.110153 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.110105 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 22:13:31.110567 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.110544 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-rh7fg\"" Apr 16 22:13:31.110662 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.110568 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:13:31.110719 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.110567 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 22:13:31.110855 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.110833 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 22:13:31.112123 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.112016 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 22:13:31.112123 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.112101 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 22:13:31.112314 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.112299 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-dpgrh\"" Apr 
16 22:13:31.112474 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.112461 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 22:13:31.112753 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.112729 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 22:13:31.112826 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.112797 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 22:13:31.112885 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.112836 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 22:13:31.112936 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.112915 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-swkrs\"" Apr 16 22:13:31.113416 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.113398 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 22:13:31.113817 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.113799 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 22:13:31.115312 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.115288 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jcgzh" Apr 16 22:13:31.115401 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.115377 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-9cdfs" Apr 16 22:13:31.117157 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.117141 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dx88j" Apr 16 22:13:31.117258 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.117226 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 22:13:31.117319 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.117287 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 22:13:31.117416 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.117230 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 22:13:31.117618 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.117590 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 22:13:31.117618 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.117598 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-k442x\"" Apr 16 22:13:31.117800 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.117640 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 22:13:31.117800 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.117664 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-vmjgr\"" Apr 16 22:13:31.118588 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.118574 2574 certificate_manager.go:566] "Rotating certificates" 
logger="kubernetes.io/kubelet-serving" Apr 16 22:13:31.119191 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.119173 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvq6s" Apr 16 22:13:31.119280 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.119247 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 22:13:31.119333 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:31.119261 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvq6s" podUID="c220c5af-4b42-4b44-a789-17aa37d44b90" Apr 16 22:13:31.119333 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.119293 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 22:13:31.119333 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.119283 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-tlmhv\"" Apr 16 22:13:31.121413 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.121398 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4pbqm" Apr 16 22:13:31.121486 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:31.121449 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4pbqm" podUID="65a893c9-3b9b-48c6-a82b-6236d443cacf" Apr 16 22:13:31.123271 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.123248 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.124698 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.124682 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-host-var-lib-cni-multus\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c" Apr 16 22:13:31.124783 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.124705 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-system-cni-dir\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c" Apr 16 22:13:31.124783 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.124722 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-host-run-k8s-cni-cncf-io\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c" Apr 16 22:13:31.124783 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.124760 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-host-var-lib-kubelet\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c" Apr 16 22:13:31.124907 
ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.124828 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/38e7a963-729e-40af-91e7-9fa6910bc258-system-cni-dir\") pod \"multus-additional-cni-plugins-dx88j\" (UID: \"38e7a963-729e-40af-91e7-9fa6910bc258\") " pod="openshift-multus/multus-additional-cni-plugins-dx88j" Apr 16 22:13:31.124907 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.124864 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8485072b-91cf-42a7-978b-e530b3a7b911-iptables-alerter-script\") pod \"iptables-alerter-mcqg2\" (UID: \"8485072b-91cf-42a7-978b-e530b3a7b911\") " pod="openshift-network-operator/iptables-alerter-mcqg2" Apr 16 22:13:31.124907 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.124888 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jcgzh\" (UID: \"ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jcgzh" Apr 16 22:13:31.125004 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.124911 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/46cb706e-dcb9-4950-aa25-14e582448ea8-serviceca\") pod \"node-ca-zfz79\" (UID: \"46cb706e-dcb9-4950-aa25-14e582448ea8\") " pod="openshift-image-registry/node-ca-zfz79" Apr 16 22:13:31.125004 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.124958 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-host-slash\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:31.125004 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.124995 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-run-systemd\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:31.125090 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.125022 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5f4b\" (UniqueName: \"kubernetes.io/projected/8485072b-91cf-42a7-978b-e530b3a7b911-kube-api-access-v5f4b\") pod \"iptables-alerter-mcqg2\" (UID: \"8485072b-91cf-42a7-978b-e530b3a7b911\") " pod="openshift-network-operator/iptables-alerter-mcqg2" Apr 16 22:13:31.125090 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.125059 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-multus-cni-dir\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c" Apr 16 22:13:31.125090 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.125069 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:13:31.125090 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.125081 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-hostroot\") pod 
\"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c" Apr 16 22:13:31.125255 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.125135 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cdab8cce-9b55-478d-b1b5-740aa9746143-ovnkube-config\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:31.125255 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.125165 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-os-release\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c" Apr 16 22:13:31.125255 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.125174 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 22:13:31.125255 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.125200 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ede793eb-64b1-4045-a60c-349b6c07e08b-cni-binary-copy\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c" Apr 16 22:13:31.125255 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.125216 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-systemd-units\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:31.125255 
ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.125230 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e515cbfe-da1c-405f-8e8d-a7ddc73de30a-konnectivity-ca\") pod \"konnectivity-agent-9hrlh\" (UID: \"e515cbfe-da1c-405f-8e8d-a7ddc73de30a\") " pod="kube-system/konnectivity-agent-9hrlh" Apr 16 22:13:31.125509 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.125251 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd-sys-fs\") pod \"aws-ebs-csi-driver-node-jcgzh\" (UID: \"ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jcgzh" Apr 16 22:13:31.125509 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.125306 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-host-run-multus-certs\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c" Apr 16 22:13:31.125509 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.125283 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-b9t9q\"" Apr 16 22:13:31.125509 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.125345 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqlln\" (UniqueName: \"kubernetes.io/projected/c220c5af-4b42-4b44-a789-17aa37d44b90-kube-api-access-gqlln\") pod \"network-metrics-daemon-wvq6s\" (UID: \"c220c5af-4b42-4b44-a789-17aa37d44b90\") " pod="openshift-multus/network-metrics-daemon-wvq6s" Apr 16 22:13:31.125509 ip-10-0-133-72 kubenswrapper[2574]: I0416 
22:13:31.125383 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-run-ovn\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:31.125509 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.125402 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-log-socket\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:31.125509 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.125421 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-host-cni-bin\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:31.125509 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.125456 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k8zs\" (UniqueName: \"kubernetes.io/projected/cdab8cce-9b55-478d-b1b5-740aa9746143-kube-api-access-8k8zs\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:31.125509 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.125490 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/38e7a963-729e-40af-91e7-9fa6910bc258-cnibin\") pod \"multus-additional-cni-plugins-dx88j\" (UID: \"38e7a963-729e-40af-91e7-9fa6910bc258\") " 
pod="openshift-multus/multus-additional-cni-plugins-dx88j" Apr 16 22:13:31.125902 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.125520 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77r28\" (UniqueName: \"kubernetes.io/projected/76fa2914-d925-4555-87be-d9837e6295d8-kube-api-access-77r28\") pod \"node-resolver-9cdfs\" (UID: \"76fa2914-d925-4555-87be-d9837e6295d8\") " pod="openshift-dns/node-resolver-9cdfs" Apr 16 22:13:31.125902 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.125546 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/38e7a963-729e-40af-91e7-9fa6910bc258-cni-binary-copy\") pod \"multus-additional-cni-plugins-dx88j\" (UID: \"38e7a963-729e-40af-91e7-9fa6910bc258\") " pod="openshift-multus/multus-additional-cni-plugins-dx88j" Apr 16 22:13:31.125902 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.125572 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8485072b-91cf-42a7-978b-e530b3a7b911-host-slash\") pod \"iptables-alerter-mcqg2\" (UID: \"8485072b-91cf-42a7-978b-e530b3a7b911\") " pod="openshift-network-operator/iptables-alerter-mcqg2" Apr 16 22:13:31.125902 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.125599 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd-registration-dir\") pod \"aws-ebs-csi-driver-node-jcgzh\" (UID: \"ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jcgzh" Apr 16 22:13:31.125902 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.125646 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-gp5gn\" (UniqueName: \"kubernetes.io/projected/ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd-kube-api-access-gp5gn\") pod \"aws-ebs-csi-driver-node-jcgzh\" (UID: \"ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jcgzh" Apr 16 22:13:31.125902 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.125664 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ede793eb-64b1-4045-a60c-349b6c07e08b-multus-daemon-config\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c" Apr 16 22:13:31.125902 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.125680 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/76fa2914-d925-4555-87be-d9837e6295d8-tmp-dir\") pod \"node-resolver-9cdfs\" (UID: \"76fa2914-d925-4555-87be-d9837e6295d8\") " pod="openshift-dns/node-resolver-9cdfs" Apr 16 22:13:31.125902 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.125695 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-host-kubelet\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:31.125902 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.125729 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-host-run-netns\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:31.125902 ip-10-0-133-72 kubenswrapper[2574]: I0416 
22:13:31.125772 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-etc-openvswitch\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:31.125902 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.125795 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-run-openvswitch\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:31.125902 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.125817 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-multus-socket-dir-parent\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c" Apr 16 22:13:31.125902 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.125842 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/38e7a963-729e-40af-91e7-9fa6910bc258-os-release\") pod \"multus-additional-cni-plugins-dx88j\" (UID: \"38e7a963-729e-40af-91e7-9fa6910bc258\") " pod="openshift-multus/multus-additional-cni-plugins-dx88j" Apr 16 22:13:31.125902 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.125884 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-multus-conf-dir\") pod \"multus-8pm2c\" (UID: 
\"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c" Apr 16 22:13:31.126424 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.125911 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c220c5af-4b42-4b44-a789-17aa37d44b90-metrics-certs\") pod \"network-metrics-daemon-wvq6s\" (UID: \"c220c5af-4b42-4b44-a789-17aa37d44b90\") " pod="openshift-multus/network-metrics-daemon-wvq6s" Apr 16 22:13:31.126424 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.125941 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/46cb706e-dcb9-4950-aa25-14e582448ea8-host\") pod \"node-ca-zfz79\" (UID: \"46cb706e-dcb9-4950-aa25-14e582448ea8\") " pod="openshift-image-registry/node-ca-zfz79" Apr 16 22:13:31.126424 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.125975 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cdab8cce-9b55-478d-b1b5-740aa9746143-ovnkube-script-lib\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:31.126424 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.126003 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-host-run-netns\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c" Apr 16 22:13:31.126424 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.126024 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/38e7a963-729e-40af-91e7-9fa6910bc258-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dx88j\" (UID: \"38e7a963-729e-40af-91e7-9fa6910bc258\") " pod="openshift-multus/multus-additional-cni-plugins-dx88j" Apr 16 22:13:31.126424 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.126038 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-node-log\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:31.126424 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.126063 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-host-cni-netd\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:31.126424 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.126081 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cdab8cce-9b55-478d-b1b5-740aa9746143-env-overrides\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:31.126424 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.126111 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:31.126424 ip-10-0-133-72 
kubenswrapper[2574]: I0416 22:13:31.126161 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xscg5\" (UniqueName: \"kubernetes.io/projected/65a893c9-3b9b-48c6-a82b-6236d443cacf-kube-api-access-xscg5\") pod \"network-check-target-4pbqm\" (UID: \"65a893c9-3b9b-48c6-a82b-6236d443cacf\") " pod="openshift-network-diagnostics/network-check-target-4pbqm" Apr 16 22:13:31.126424 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.126194 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/76fa2914-d925-4555-87be-d9837e6295d8-hosts-file\") pod \"node-resolver-9cdfs\" (UID: \"76fa2914-d925-4555-87be-d9837e6295d8\") " pod="openshift-dns/node-resolver-9cdfs" Apr 16 22:13:31.126424 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.126224 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/38e7a963-729e-40af-91e7-9fa6910bc258-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dx88j\" (UID: \"38e7a963-729e-40af-91e7-9fa6910bc258\") " pod="openshift-multus/multus-additional-cni-plugins-dx88j" Apr 16 22:13:31.126424 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.126250 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjdj2\" (UniqueName: \"kubernetes.io/projected/38e7a963-729e-40af-91e7-9fa6910bc258-kube-api-access-tjdj2\") pod \"multus-additional-cni-plugins-dx88j\" (UID: \"38e7a963-729e-40af-91e7-9fa6910bc258\") " pod="openshift-multus/multus-additional-cni-plugins-dx88j" Apr 16 22:13:31.126424 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.126276 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-var-lib-openvswitch\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5"
Apr 16 22:13:31.126424 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.126299 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cdab8cce-9b55-478d-b1b5-740aa9746143-ovn-node-metrics-cert\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5"
Apr 16 22:13:31.126424 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.126322 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd-socket-dir\") pod \"aws-ebs-csi-driver-node-jcgzh\" (UID: \"ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jcgzh"
Apr 16 22:13:31.126952 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.126367 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d457\" (UniqueName: \"kubernetes.io/projected/ede793eb-64b1-4045-a60c-349b6c07e08b-kube-api-access-4d457\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c"
Apr 16 22:13:31.126952 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.126389 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-host-run-ovn-kubernetes\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5"
Apr 16 22:13:31.126952 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.126405 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e515cbfe-da1c-405f-8e8d-a7ddc73de30a-agent-certs\") pod \"konnectivity-agent-9hrlh\" (UID: \"e515cbfe-da1c-405f-8e8d-a7ddc73de30a\") " pod="kube-system/konnectivity-agent-9hrlh"
Apr 16 22:13:31.126952 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.126424 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd-etc-selinux\") pod \"aws-ebs-csi-driver-node-jcgzh\" (UID: \"ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jcgzh"
Apr 16 22:13:31.126952 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.126444 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-host-var-lib-cni-bin\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c"
Apr 16 22:13:31.126952 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.126457 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-cnibin\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c"
Apr 16 22:13:31.126952 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.126472 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/38e7a963-729e-40af-91e7-9fa6910bc258-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dx88j\" (UID: \"38e7a963-729e-40af-91e7-9fa6910bc258\") " pod="openshift-multus/multus-additional-cni-plugins-dx88j"
Apr 16 22:13:31.126952 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.126513 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8htr6\" (UniqueName: \"kubernetes.io/projected/46cb706e-dcb9-4950-aa25-14e582448ea8-kube-api-access-8htr6\") pod \"node-ca-zfz79\" (UID: \"46cb706e-dcb9-4950-aa25-14e582448ea8\") " pod="openshift-image-registry/node-ca-zfz79"
Apr 16 22:13:31.126952 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.126529 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd-device-dir\") pod \"aws-ebs-csi-driver-node-jcgzh\" (UID: \"ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jcgzh"
Apr 16 22:13:31.126952 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.126543 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-etc-kubernetes\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c"
Apr 16 22:13:31.130592 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.130571 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 22:13:31.150239 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.150223 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-vk4r8"
Apr 16 22:13:31.159297 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.159278 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-vk4r8"
Apr 16 22:13:31.202050 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:31.202026 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d0edda7066ec5684bc2bd9c10fb4784.slice/crio-3ae2435fbb8d0abd0fffe66c93e1f3c78bb1f67802198fde6ecd78989a1e9880 WatchSource:0}: Error finding container 3ae2435fbb8d0abd0fffe66c93e1f3c78bb1f67802198fde6ecd78989a1e9880: Status 404 returned error can't find the container with id 3ae2435fbb8d0abd0fffe66c93e1f3c78bb1f67802198fde6ecd78989a1e9880
Apr 16 22:13:31.208376 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.208358 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 22:13:31.211634 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:31.211602 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb707a3c20ff0ea56ea13d746a8edc26b.slice/crio-14d315dd9a9002bd451fb7ec843973bda4805c4d80a40a3701fa9da20159ddaa WatchSource:0}: Error finding container 14d315dd9a9002bd451fb7ec843973bda4805c4d80a40a3701fa9da20159ddaa: Status 404 returned error can't find the container with id 14d315dd9a9002bd451fb7ec843973bda4805c4d80a40a3701fa9da20159ddaa
Apr 16 22:13:31.220342 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.220322 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 22:13:31.227087 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227069 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-hostroot\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c"
Apr 16 22:13:31.227154 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227095 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cdab8cce-9b55-478d-b1b5-740aa9746143-ovnkube-config\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5"
Apr 16 22:13:31.227154 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227111 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-os-release\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c"
Apr 16 22:13:31.227154 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227133 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ede793eb-64b1-4045-a60c-349b6c07e08b-cni-binary-copy\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c"
Apr 16 22:13:31.227299 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227158 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-systemd-units\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5"
Apr 16 22:13:31.227299 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227160 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-hostroot\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c"
Apr 16 22:13:31.227299 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227182 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e515cbfe-da1c-405f-8e8d-a7ddc73de30a-konnectivity-ca\") pod \"konnectivity-agent-9hrlh\" (UID: \"e515cbfe-da1c-405f-8e8d-a7ddc73de30a\") " pod="kube-system/konnectivity-agent-9hrlh"
Apr 16 22:13:31.227299 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227205 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-os-release\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c"
Apr 16 22:13:31.227299 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227215 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-systemd-units\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5"
Apr 16 22:13:31.227299 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227210 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd-sys-fs\") pod \"aws-ebs-csi-driver-node-jcgzh\" (UID: \"ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jcgzh"
Apr 16 22:13:31.227299 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227259 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1742d26f-b66b-429d-b5c4-b398bdc141a1-run\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv"
Apr 16 22:13:31.227299 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227269 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd-sys-fs\") pod \"aws-ebs-csi-driver-node-jcgzh\" (UID: \"ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jcgzh"
Apr 16 22:13:31.227299 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227278 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-host-run-multus-certs\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c"
Apr 16 22:13:31.227299 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227298 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqlln\" (UniqueName: \"kubernetes.io/projected/c220c5af-4b42-4b44-a789-17aa37d44b90-kube-api-access-gqlln\") pod \"network-metrics-daemon-wvq6s\" (UID: \"c220c5af-4b42-4b44-a789-17aa37d44b90\") " pod="openshift-multus/network-metrics-daemon-wvq6s"
Apr 16 22:13:31.227776 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227321 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-run-ovn\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5"
Apr 16 22:13:31.227776 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227344 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-log-socket\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5"
Apr 16 22:13:31.227776 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227365 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-host-run-multus-certs\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c"
Apr 16 22:13:31.227776 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227392 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-host-cni-bin\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5"
Apr 16 22:13:31.227776 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227405 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-run-ovn\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5"
Apr 16 22:13:31.227776 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227416 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8k8zs\" (UniqueName: \"kubernetes.io/projected/cdab8cce-9b55-478d-b1b5-740aa9746143-kube-api-access-8k8zs\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5"
Apr 16 22:13:31.227776 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227425 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-log-socket\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5"
Apr 16 22:13:31.227776 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227437 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/38e7a963-729e-40af-91e7-9fa6910bc258-cnibin\") pod \"multus-additional-cni-plugins-dx88j\" (UID: \"38e7a963-729e-40af-91e7-9fa6910bc258\") " pod="openshift-multus/multus-additional-cni-plugins-dx88j"
Apr 16 22:13:31.227776 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227453 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-77r28\" (UniqueName: \"kubernetes.io/projected/76fa2914-d925-4555-87be-d9837e6295d8-kube-api-access-77r28\") pod \"node-resolver-9cdfs\" (UID: \"76fa2914-d925-4555-87be-d9837e6295d8\") " pod="openshift-dns/node-resolver-9cdfs"
Apr 16 22:13:31.227776 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227452 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-host-cni-bin\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5"
Apr 16 22:13:31.227776 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227502 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/38e7a963-729e-40af-91e7-9fa6910bc258-cnibin\") pod \"multus-additional-cni-plugins-dx88j\" (UID: \"38e7a963-729e-40af-91e7-9fa6910bc258\") " pod="openshift-multus/multus-additional-cni-plugins-dx88j"
Apr 16 22:13:31.227776 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227650 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/38e7a963-729e-40af-91e7-9fa6910bc258-cni-binary-copy\") pod \"multus-additional-cni-plugins-dx88j\" (UID: \"38e7a963-729e-40af-91e7-9fa6910bc258\") " pod="openshift-multus/multus-additional-cni-plugins-dx88j"
Apr 16 22:13:31.227776 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227683 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8485072b-91cf-42a7-978b-e530b3a7b911-host-slash\") pod \"iptables-alerter-mcqg2\" (UID: \"8485072b-91cf-42a7-978b-e530b3a7b911\") " pod="openshift-network-operator/iptables-alerter-mcqg2"
Apr 16 22:13:31.227776 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227708 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd-registration-dir\") pod \"aws-ebs-csi-driver-node-jcgzh\" (UID: \"ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jcgzh"
Apr 16 22:13:31.227776 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227735 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gp5gn\" (UniqueName: \"kubernetes.io/projected/ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd-kube-api-access-gp5gn\") pod \"aws-ebs-csi-driver-node-jcgzh\" (UID: \"ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jcgzh"
Apr 16 22:13:31.227776 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227761 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e515cbfe-da1c-405f-8e8d-a7ddc73de30a-konnectivity-ca\") pod \"konnectivity-agent-9hrlh\" (UID: \"e515cbfe-da1c-405f-8e8d-a7ddc73de30a\") " pod="kube-system/konnectivity-agent-9hrlh"
Apr 16 22:13:31.227776 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227772 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8485072b-91cf-42a7-978b-e530b3a7b911-host-slash\") pod \"iptables-alerter-mcqg2\" (UID: \"8485072b-91cf-42a7-978b-e530b3a7b911\") " pod="openshift-network-operator/iptables-alerter-mcqg2"
Apr 16 22:13:31.228528 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227781 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1742d26f-b66b-429d-b5c4-b398bdc141a1-host\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv"
Apr 16 22:13:31.228528 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227789 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ede793eb-64b1-4045-a60c-349b6c07e08b-cni-binary-copy\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c"
Apr 16 22:13:31.228528 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227832 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1742d26f-b66b-429d-b5c4-b398bdc141a1-etc-tuned\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv"
Apr 16 22:13:31.228528 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227863 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ede793eb-64b1-4045-a60c-349b6c07e08b-multus-daemon-config\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c"
Apr 16 22:13:31.228528 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227844 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd-registration-dir\") pod \"aws-ebs-csi-driver-node-jcgzh\" (UID: \"ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jcgzh"
Apr 16 22:13:31.228528 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227892 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/76fa2914-d925-4555-87be-d9837e6295d8-tmp-dir\") pod \"node-resolver-9cdfs\" (UID: \"76fa2914-d925-4555-87be-d9837e6295d8\") " pod="openshift-dns/node-resolver-9cdfs"
Apr 16 22:13:31.228528 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227916 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-host-kubelet\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5"
Apr 16 22:13:31.228528 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227939 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-host-run-netns\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5"
Apr 16 22:13:31.228528 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227996 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-host-kubelet\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5"
Apr 16 22:13:31.228528 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.227998 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-host-run-netns\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5"
Apr 16 22:13:31.228528 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228018 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-etc-openvswitch\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5"
Apr 16 22:13:31.228528 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228049 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-etc-openvswitch\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5"
Apr 16 22:13:31.228528 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228056 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-run-openvswitch\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5"
Apr 16 22:13:31.228528 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228084 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1742d26f-b66b-429d-b5c4-b398bdc141a1-etc-sysctl-d\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv"
Apr 16 22:13:31.228528 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228088 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-run-openvswitch\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5"
Apr 16 22:13:31.228528 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228056 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cdab8cce-9b55-478d-b1b5-740aa9746143-ovnkube-config\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5"
Apr 16 22:13:31.228528 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228110 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-multus-socket-dir-parent\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c"
Apr 16 22:13:31.229226 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228143 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/38e7a963-729e-40af-91e7-9fa6910bc258-os-release\") pod \"multus-additional-cni-plugins-dx88j\" (UID: \"38e7a963-729e-40af-91e7-9fa6910bc258\") " pod="openshift-multus/multus-additional-cni-plugins-dx88j"
Apr 16 22:13:31.229226 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228172 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1742d26f-b66b-429d-b5c4-b398bdc141a1-tmp\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv"
Apr 16 22:13:31.229226 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228198 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-multus-conf-dir\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c"
Apr 16 22:13:31.229226 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228215 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-multus-socket-dir-parent\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c"
Apr 16 22:13:31.229226 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228226 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c220c5af-4b42-4b44-a789-17aa37d44b90-metrics-certs\") pod \"network-metrics-daemon-wvq6s\" (UID: \"c220c5af-4b42-4b44-a789-17aa37d44b90\") " pod="openshift-multus/network-metrics-daemon-wvq6s"
Apr 16 22:13:31.229226 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228246 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/76fa2914-d925-4555-87be-d9837e6295d8-tmp-dir\") pod \"node-resolver-9cdfs\" (UID: \"76fa2914-d925-4555-87be-d9837e6295d8\") " pod="openshift-dns/node-resolver-9cdfs"
Apr 16 22:13:31.229226 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228251 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/38e7a963-729e-40af-91e7-9fa6910bc258-cni-binary-copy\") pod \"multus-additional-cni-plugins-dx88j\" (UID: \"38e7a963-729e-40af-91e7-9fa6910bc258\") " pod="openshift-multus/multus-additional-cni-plugins-dx88j"
Apr 16 22:13:31.229226 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228260 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/46cb706e-dcb9-4950-aa25-14e582448ea8-host\") pod \"node-ca-zfz79\" (UID: \"46cb706e-dcb9-4950-aa25-14e582448ea8\") " pod="openshift-image-registry/node-ca-zfz79"
Apr 16 22:13:31.229226 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228264 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/38e7a963-729e-40af-91e7-9fa6910bc258-os-release\") pod \"multus-additional-cni-plugins-dx88j\" (UID: \"38e7a963-729e-40af-91e7-9fa6910bc258\") " pod="openshift-multus/multus-additional-cni-plugins-dx88j"
Apr 16 22:13:31.229226 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228253 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-multus-conf-dir\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c"
Apr 16 22:13:31.229226 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228288 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cdab8cce-9b55-478d-b1b5-740aa9746143-ovnkube-script-lib\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5"
Apr 16 22:13:31.229226 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:31.228306 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:31.229226 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228314 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1742d26f-b66b-429d-b5c4-b398bdc141a1-etc-modprobe-d\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv"
Apr 16 22:13:31.229226 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228333 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-host-run-netns\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c"
Apr 16 22:13:31.229226 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228312 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/46cb706e-dcb9-4950-aa25-14e582448ea8-host\") pod \"node-ca-zfz79\" (UID: \"46cb706e-dcb9-4950-aa25-14e582448ea8\") " pod="openshift-image-registry/node-ca-zfz79"
Apr 16 22:13:31.229226 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228350 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ede793eb-64b1-4045-a60c-349b6c07e08b-multus-daemon-config\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c"
Apr 16 22:13:31.229226 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:31.228387 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c220c5af-4b42-4b44-a789-17aa37d44b90-metrics-certs podName:c220c5af-4b42-4b44-a789-17aa37d44b90 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:31.72834395 +0000 UTC m=+2.096750092 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c220c5af-4b42-4b44-a789-17aa37d44b90-metrics-certs") pod "network-metrics-daemon-wvq6s" (UID: "c220c5af-4b42-4b44-a789-17aa37d44b90") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:31.229959 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228405 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/38e7a963-729e-40af-91e7-9fa6910bc258-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dx88j\" (UID: \"38e7a963-729e-40af-91e7-9fa6910bc258\") " pod="openshift-multus/multus-additional-cni-plugins-dx88j"
Apr 16 22:13:31.229959 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228424 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-node-log\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5"
Apr 16 22:13:31.229959 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228462 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-node-log\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5"
Apr 16 22:13:31.229959 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228473 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-host-run-netns\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c"
Apr 16 22:13:31.229959 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228486 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-host-cni-netd\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5"
Apr 16 22:13:31.229959 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228503 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cdab8cce-9b55-478d-b1b5-740aa9746143-env-overrides\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5"
Apr 16 22:13:31.229959 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228543 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-host-cni-netd\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5"
Apr 16 22:13:31.229959 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228574 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1742d26f-b66b-429d-b5c4-b398bdc141a1-etc-sysconfig\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv"
Apr 16 22:13:31.229959 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228599 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1742d26f-b66b-429d-b5c4-b398bdc141a1-etc-kubernetes\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv"
Apr 16 22:13:31.229959 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228626 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5"
Apr 16 22:13:31.229959 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228654 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xscg5\" (UniqueName: \"kubernetes.io/projected/65a893c9-3b9b-48c6-a82b-6236d443cacf-kube-api-access-xscg5\") pod \"network-check-target-4pbqm\" (UID: \"65a893c9-3b9b-48c6-a82b-6236d443cacf\") " pod="openshift-network-diagnostics/network-check-target-4pbqm"
Apr 16 22:13:31.229959 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228679 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/76fa2914-d925-4555-87be-d9837e6295d8-hosts-file\") pod \"node-resolver-9cdfs\" (UID: \"76fa2914-d925-4555-87be-d9837e6295d8\") " pod="openshift-dns/node-resolver-9cdfs"
Apr 16 22:13:31.229959 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228706 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/38e7a963-729e-40af-91e7-9fa6910bc258-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dx88j\" (UID: \"38e7a963-729e-40af-91e7-9fa6910bc258\") " pod="openshift-multus/multus-additional-cni-plugins-dx88j"
Apr 16 22:13:31.229959 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228733 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tjdj2\" (UniqueName: \"kubernetes.io/projected/38e7a963-729e-40af-91e7-9fa6910bc258-kube-api-access-tjdj2\") pod \"multus-additional-cni-plugins-dx88j\" (UID: \"38e7a963-729e-40af-91e7-9fa6910bc258\") " pod="openshift-multus/multus-additional-cni-plugins-dx88j"
Apr 16 22:13:31.229959 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228772 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/76fa2914-d925-4555-87be-d9837e6295d8-hosts-file\") pod \"node-resolver-9cdfs\" (UID: \"76fa2914-d925-4555-87be-d9837e6295d8\") " pod="openshift-dns/node-resolver-9cdfs"
Apr 16 22:13:31.229959 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228779 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-var-lib-openvswitch\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5"
Apr 16 22:13:31.229959 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228793 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cdab8cce-9b55-478d-b1b5-740aa9746143-ovnkube-script-lib\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5"
Apr 16 22:13:31.230579 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228704 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5"
Apr 16 22:13:31.230579 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228816 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName:
\"kubernetes.io/secret/cdab8cce-9b55-478d-b1b5-740aa9746143-ovn-node-metrics-cert\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:31.230579 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228822 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-var-lib-openvswitch\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:31.230579 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228872 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/38e7a963-729e-40af-91e7-9fa6910bc258-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dx88j\" (UID: \"38e7a963-729e-40af-91e7-9fa6910bc258\") " pod="openshift-multus/multus-additional-cni-plugins-dx88j" Apr 16 22:13:31.230579 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228904 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd-socket-dir\") pod \"aws-ebs-csi-driver-node-jcgzh\" (UID: \"ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jcgzh" Apr 16 22:13:31.230579 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228937 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4d457\" (UniqueName: \"kubernetes.io/projected/ede793eb-64b1-4045-a60c-349b6c07e08b-kube-api-access-4d457\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c" Apr 16 22:13:31.230579 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228973 2574 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cdab8cce-9b55-478d-b1b5-740aa9746143-env-overrides\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:31.230579 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228977 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-host-run-ovn-kubernetes\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:31.230579 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229012 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e515cbfe-da1c-405f-8e8d-a7ddc73de30a-agent-certs\") pod \"konnectivity-agent-9hrlh\" (UID: \"e515cbfe-da1c-405f-8e8d-a7ddc73de30a\") " pod="kube-system/konnectivity-agent-9hrlh" Apr 16 22:13:31.230579 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.228937 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/38e7a963-729e-40af-91e7-9fa6910bc258-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dx88j\" (UID: \"38e7a963-729e-40af-91e7-9fa6910bc258\") " pod="openshift-multus/multus-additional-cni-plugins-dx88j" Apr 16 22:13:31.230579 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229030 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd-socket-dir\") pod \"aws-ebs-csi-driver-node-jcgzh\" (UID: \"ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jcgzh" Apr 16 22:13:31.230579 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229038 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd-etc-selinux\") pod \"aws-ebs-csi-driver-node-jcgzh\" (UID: \"ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jcgzh" Apr 16 22:13:31.230579 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229072 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1742d26f-b66b-429d-b5c4-b398bdc141a1-lib-modules\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.230579 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229013 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-host-run-ovn-kubernetes\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:31.230579 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229093 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd-etc-selinux\") pod \"aws-ebs-csi-driver-node-jcgzh\" (UID: \"ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jcgzh" Apr 16 22:13:31.230579 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229128 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-host-var-lib-cni-bin\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c" Apr 16 22:13:31.230579 
ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229169 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-host-var-lib-cni-bin\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c" Apr 16 22:13:31.231049 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229201 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-cnibin\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c" Apr 16 22:13:31.231049 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229213 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 22:13:31.231049 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229231 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/38e7a963-729e-40af-91e7-9fa6910bc258-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dx88j\" (UID: \"38e7a963-729e-40af-91e7-9fa6910bc258\") " pod="openshift-multus/multus-additional-cni-plugins-dx88j" Apr 16 22:13:31.231049 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229255 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-cnibin\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c" Apr 16 22:13:31.231049 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229257 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8htr6\" (UniqueName: \"kubernetes.io/projected/46cb706e-dcb9-4950-aa25-14e582448ea8-kube-api-access-8htr6\") pod \"node-ca-zfz79\" (UID: \"46cb706e-dcb9-4950-aa25-14e582448ea8\") " pod="openshift-image-registry/node-ca-zfz79" Apr 16 22:13:31.231049 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229333 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd-device-dir\") pod \"aws-ebs-csi-driver-node-jcgzh\" (UID: \"ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jcgzh" Apr 16 22:13:31.231049 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229365 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4nmm\" (UniqueName: \"kubernetes.io/projected/1742d26f-b66b-429d-b5c4-b398bdc141a1-kube-api-access-r4nmm\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.231049 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229392 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-etc-kubernetes\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c" Apr 16 22:13:31.231049 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229418 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1742d26f-b66b-429d-b5c4-b398bdc141a1-sys\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.231049 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229421 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd-device-dir\") pod \"aws-ebs-csi-driver-node-jcgzh\" (UID: \"ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jcgzh" Apr 16 22:13:31.231049 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229442 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1742d26f-b66b-429d-b5c4-b398bdc141a1-var-lib-kubelet\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.231049 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229467 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-host-var-lib-cni-multus\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c" Apr 16 22:13:31.231049 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229472 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-etc-kubernetes\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c" Apr 16 22:13:31.231049 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229505 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-system-cni-dir\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c" Apr 16 22:13:31.231049 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229512 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-host-var-lib-cni-multus\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c" Apr 16 22:13:31.231049 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229574 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-host-run-k8s-cni-cncf-io\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c" Apr 16 22:13:31.231049 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229607 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-host-run-k8s-cni-cncf-io\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c" Apr 16 22:13:31.231049 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229620 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-host-var-lib-kubelet\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c" Apr 16 22:13:31.231515 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229647 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-system-cni-dir\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c" Apr 16 22:13:31.231515 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229682 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/38e7a963-729e-40af-91e7-9fa6910bc258-system-cni-dir\") pod \"multus-additional-cni-plugins-dx88j\" (UID: \"38e7a963-729e-40af-91e7-9fa6910bc258\") " pod="openshift-multus/multus-additional-cni-plugins-dx88j" Apr 16 22:13:31.231515 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229681 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/38e7a963-729e-40af-91e7-9fa6910bc258-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dx88j\" (UID: \"38e7a963-729e-40af-91e7-9fa6910bc258\") " pod="openshift-multus/multus-additional-cni-plugins-dx88j" Apr 16 22:13:31.231515 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229684 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-host-var-lib-kubelet\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c" Apr 16 22:13:31.231515 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229724 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8485072b-91cf-42a7-978b-e530b3a7b911-iptables-alerter-script\") pod \"iptables-alerter-mcqg2\" (UID: \"8485072b-91cf-42a7-978b-e530b3a7b911\") " pod="openshift-network-operator/iptables-alerter-mcqg2" Apr 16 22:13:31.231515 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229757 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/38e7a963-729e-40af-91e7-9fa6910bc258-system-cni-dir\") pod \"multus-additional-cni-plugins-dx88j\" (UID: \"38e7a963-729e-40af-91e7-9fa6910bc258\") " pod="openshift-multus/multus-additional-cni-plugins-dx88j" Apr 16 22:13:31.231515 ip-10-0-133-72 
kubenswrapper[2574]: I0416 22:13:31.229773 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jcgzh\" (UID: \"ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jcgzh" Apr 16 22:13:31.231515 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229797 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1742d26f-b66b-429d-b5c4-b398bdc141a1-etc-sysctl-conf\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.231515 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229813 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/46cb706e-dcb9-4950-aa25-14e582448ea8-serviceca\") pod \"node-ca-zfz79\" (UID: \"46cb706e-dcb9-4950-aa25-14e582448ea8\") " pod="openshift-image-registry/node-ca-zfz79" Apr 16 22:13:31.231515 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229843 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-host-slash\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:31.231515 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229905 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-host-slash\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:31.231515 
ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229903 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jcgzh\" (UID: \"ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jcgzh" Apr 16 22:13:31.231515 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.229973 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-run-systemd\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:31.231515 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.230043 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1742d26f-b66b-429d-b5c4-b398bdc141a1-etc-systemd\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.231515 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.230072 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5f4b\" (UniqueName: \"kubernetes.io/projected/8485072b-91cf-42a7-978b-e530b3a7b911-kube-api-access-v5f4b\") pod \"iptables-alerter-mcqg2\" (UID: \"8485072b-91cf-42a7-978b-e530b3a7b911\") " pod="openshift-network-operator/iptables-alerter-mcqg2" Apr 16 22:13:31.231515 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.230088 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cdab8cce-9b55-478d-b1b5-740aa9746143-run-systemd\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:31.231515 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.230097 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-multus-cni-dir\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c" Apr 16 22:13:31.232143 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.230183 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ede793eb-64b1-4045-a60c-349b6c07e08b-multus-cni-dir\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c" Apr 16 22:13:31.232143 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.230772 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/46cb706e-dcb9-4950-aa25-14e582448ea8-serviceca\") pod \"node-ca-zfz79\" (UID: \"46cb706e-dcb9-4950-aa25-14e582448ea8\") " pod="openshift-image-registry/node-ca-zfz79" Apr 16 22:13:31.232143 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.230866 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8485072b-91cf-42a7-978b-e530b3a7b911-iptables-alerter-script\") pod \"iptables-alerter-mcqg2\" (UID: \"8485072b-91cf-42a7-978b-e530b3a7b911\") " pod="openshift-network-operator/iptables-alerter-mcqg2" Apr 16 22:13:31.232143 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.231967 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cdab8cce-9b55-478d-b1b5-740aa9746143-ovn-node-metrics-cert\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:31.232270 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.232144 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e515cbfe-da1c-405f-8e8d-a7ddc73de30a-agent-certs\") pod \"konnectivity-agent-9hrlh\" (UID: \"e515cbfe-da1c-405f-8e8d-a7ddc73de30a\") " pod="kube-system/konnectivity-agent-9hrlh" Apr 16 22:13:31.236656 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.236632 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-77r28\" (UniqueName: \"kubernetes.io/projected/76fa2914-d925-4555-87be-d9837e6295d8-kube-api-access-77r28\") pod \"node-resolver-9cdfs\" (UID: \"76fa2914-d925-4555-87be-d9837e6295d8\") " pod="openshift-dns/node-resolver-9cdfs" Apr 16 22:13:31.237346 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.237332 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqlln\" (UniqueName: \"kubernetes.io/projected/c220c5af-4b42-4b44-a789-17aa37d44b90-kube-api-access-gqlln\") pod \"network-metrics-daemon-wvq6s\" (UID: \"c220c5af-4b42-4b44-a789-17aa37d44b90\") " pod="openshift-multus/network-metrics-daemon-wvq6s" Apr 16 22:13:31.240769 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:31.240750 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:31.240854 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:31.240775 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:31.240854 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:31.240789 2574 projected.go:194] Error preparing data for projected volume kube-api-access-xscg5 for pod openshift-network-diagnostics/network-check-target-4pbqm: 
[object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:31.240854 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:31.240846 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/65a893c9-3b9b-48c6-a82b-6236d443cacf-kube-api-access-xscg5 podName:65a893c9-3b9b-48c6-a82b-6236d443cacf nodeName:}" failed. No retries permitted until 2026-04-16 22:13:31.740826584 +0000 UTC m=+2.109232717 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-xscg5" (UniqueName: "kubernetes.io/projected/65a893c9-3b9b-48c6-a82b-6236d443cacf-kube-api-access-xscg5") pod "network-check-target-4pbqm" (UID: "65a893c9-3b9b-48c6-a82b-6236d443cacf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:31.242986 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.242966 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d457\" (UniqueName: \"kubernetes.io/projected/ede793eb-64b1-4045-a60c-349b6c07e08b-kube-api-access-4d457\") pod \"multus-8pm2c\" (UID: \"ede793eb-64b1-4045-a60c-349b6c07e08b\") " pod="openshift-multus/multus-8pm2c" Apr 16 22:13:31.243150 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.243131 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8htr6\" (UniqueName: \"kubernetes.io/projected/46cb706e-dcb9-4950-aa25-14e582448ea8-kube-api-access-8htr6\") pod \"node-ca-zfz79\" (UID: \"46cb706e-dcb9-4950-aa25-14e582448ea8\") " pod="openshift-image-registry/node-ca-zfz79" Apr 16 22:13:31.243205 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.243135 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5f4b\" (UniqueName: 
\"kubernetes.io/projected/8485072b-91cf-42a7-978b-e530b3a7b911-kube-api-access-v5f4b\") pod \"iptables-alerter-mcqg2\" (UID: \"8485072b-91cf-42a7-978b-e530b3a7b911\") " pod="openshift-network-operator/iptables-alerter-mcqg2" Apr 16 22:13:31.243624 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.243605 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp5gn\" (UniqueName: \"kubernetes.io/projected/ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd-kube-api-access-gp5gn\") pod \"aws-ebs-csi-driver-node-jcgzh\" (UID: \"ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jcgzh" Apr 16 22:13:31.245477 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.245459 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjdj2\" (UniqueName: \"kubernetes.io/projected/38e7a963-729e-40af-91e7-9fa6910bc258-kube-api-access-tjdj2\") pod \"multus-additional-cni-plugins-dx88j\" (UID: \"38e7a963-729e-40af-91e7-9fa6910bc258\") " pod="openshift-multus/multus-additional-cni-plugins-dx88j" Apr 16 22:13:31.251189 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.251172 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k8zs\" (UniqueName: \"kubernetes.io/projected/cdab8cce-9b55-478d-b1b5-740aa9746143-kube-api-access-8k8zs\") pod \"ovnkube-node-hm4t5\" (UID: \"cdab8cce-9b55-478d-b1b5-740aa9746143\") " pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:31.260002 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.259966 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-72.ec2.internal" event={"ID":"b707a3c20ff0ea56ea13d746a8edc26b","Type":"ContainerStarted","Data":"14d315dd9a9002bd451fb7ec843973bda4805c4d80a40a3701fa9da20159ddaa"} Apr 16 22:13:31.260815 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.260799 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-72.ec2.internal" event={"ID":"7d0edda7066ec5684bc2bd9c10fb4784","Type":"ContainerStarted","Data":"3ae2435fbb8d0abd0fffe66c93e1f3c78bb1f67802198fde6ecd78989a1e9880"} Apr 16 22:13:31.331141 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.331068 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1742d26f-b66b-429d-b5c4-b398bdc141a1-lib-modules\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.331141 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.331100 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r4nmm\" (UniqueName: \"kubernetes.io/projected/1742d26f-b66b-429d-b5c4-b398bdc141a1-kube-api-access-r4nmm\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.331141 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.331121 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1742d26f-b66b-429d-b5c4-b398bdc141a1-sys\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.331354 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.331144 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1742d26f-b66b-429d-b5c4-b398bdc141a1-var-lib-kubelet\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.331354 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.331172 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1742d26f-b66b-429d-b5c4-b398bdc141a1-etc-sysctl-conf\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.331354 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.331191 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1742d26f-b66b-429d-b5c4-b398bdc141a1-sys\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.331354 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.331194 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1742d26f-b66b-429d-b5c4-b398bdc141a1-etc-systemd\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.331354 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.331224 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1742d26f-b66b-429d-b5c4-b398bdc141a1-lib-modules\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.331354 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.331236 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1742d26f-b66b-429d-b5c4-b398bdc141a1-etc-systemd\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.331354 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.331241 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/1742d26f-b66b-429d-b5c4-b398bdc141a1-run\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.331354 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.331272 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1742d26f-b66b-429d-b5c4-b398bdc141a1-host\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.331354 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.331271 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1742d26f-b66b-429d-b5c4-b398bdc141a1-var-lib-kubelet\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.331354 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.331288 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1742d26f-b66b-429d-b5c4-b398bdc141a1-etc-tuned\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.331354 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.331308 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1742d26f-b66b-429d-b5c4-b398bdc141a1-etc-sysctl-d\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.331354 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.331316 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/1742d26f-b66b-429d-b5c4-b398bdc141a1-run\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.331354 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.331338 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1742d26f-b66b-429d-b5c4-b398bdc141a1-tmp\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.331354 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.331354 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1742d26f-b66b-429d-b5c4-b398bdc141a1-host\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.331821 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.331376 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1742d26f-b66b-429d-b5c4-b398bdc141a1-etc-modprobe-d\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.331821 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.331391 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1742d26f-b66b-429d-b5c4-b398bdc141a1-etc-sysctl-conf\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.331821 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.331404 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/1742d26f-b66b-429d-b5c4-b398bdc141a1-etc-sysconfig\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.331821 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.331466 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1742d26f-b66b-429d-b5c4-b398bdc141a1-etc-kubernetes\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.331821 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.331493 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1742d26f-b66b-429d-b5c4-b398bdc141a1-etc-sysconfig\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.331821 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.331498 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1742d26f-b66b-429d-b5c4-b398bdc141a1-etc-modprobe-d\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.331821 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.331505 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1742d26f-b66b-429d-b5c4-b398bdc141a1-etc-sysctl-d\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.331821 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.331530 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/1742d26f-b66b-429d-b5c4-b398bdc141a1-etc-kubernetes\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.333356 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.333327 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1742d26f-b66b-429d-b5c4-b398bdc141a1-etc-tuned\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.333836 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.333820 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1742d26f-b66b-429d-b5c4-b398bdc141a1-tmp\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.339029 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.339013 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4nmm\" (UniqueName: \"kubernetes.io/projected/1742d26f-b66b-429d-b5c4-b398bdc141a1-kube-api-access-r4nmm\") pod \"tuned-mxzxv\" (UID: \"1742d26f-b66b-429d-b5c4-b398bdc141a1\") " pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.428966 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.428931 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-zfz79" Apr 16 22:13:31.434883 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:31.434855 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46cb706e_dcb9_4950_aa25_14e582448ea8.slice/crio-cf6b4fbf0954e670e65a4c3256e3a6c17d69c362a32f0941e89f6b3077f64be9 WatchSource:0}: Error finding container cf6b4fbf0954e670e65a4c3256e3a6c17d69c362a32f0941e89f6b3077f64be9: Status 404 returned error can't find the container with id cf6b4fbf0954e670e65a4c3256e3a6c17d69c362a32f0941e89f6b3077f64be9 Apr 16 22:13:31.443401 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.443383 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8pm2c" Apr 16 22:13:31.450136 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.450113 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-mcqg2" Apr 16 22:13:31.450488 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:31.450463 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podede793eb_64b1_4045_a60c_349b6c07e08b.slice/crio-9bd414c6d8bcba3b44df79cddfeae87396ce368d239c9bf0525c25eeb8f64d43 WatchSource:0}: Error finding container 9bd414c6d8bcba3b44df79cddfeae87396ce368d239c9bf0525c25eeb8f64d43: Status 404 returned error can't find the container with id 9bd414c6d8bcba3b44df79cddfeae87396ce368d239c9bf0525c25eeb8f64d43 Apr 16 22:13:31.457651 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:31.457626 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8485072b_91cf_42a7_978b_e530b3a7b911.slice/crio-c0bfb8614ea27f47de64416f830b83400863b18352e334b16b8e039c057cad34 WatchSource:0}: Error finding container 
c0bfb8614ea27f47de64416f830b83400863b18352e334b16b8e039c057cad34: Status 404 returned error can't find the container with id c0bfb8614ea27f47de64416f830b83400863b18352e334b16b8e039c057cad34 Apr 16 22:13:31.467628 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.467608 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-9hrlh" Apr 16 22:13:31.474405 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:31.474239 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode515cbfe_da1c_405f_8e8d_a7ddc73de30a.slice/crio-7edd41bfe03b0c5eb9f51dfe0f19d1a20ca5e766de2e3d410084002bce8e049b WatchSource:0}: Error finding container 7edd41bfe03b0c5eb9f51dfe0f19d1a20ca5e766de2e3d410084002bce8e049b: Status 404 returned error can't find the container with id 7edd41bfe03b0c5eb9f51dfe0f19d1a20ca5e766de2e3d410084002bce8e049b Apr 16 22:13:31.480010 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.479995 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:31.485488 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:31.485462 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdab8cce_9b55_478d_b1b5_740aa9746143.slice/crio-ec9c315e514743cbb751e270dbfbce69a1b619259232059102730b86dcdbf011 WatchSource:0}: Error finding container ec9c315e514743cbb751e270dbfbce69a1b619259232059102730b86dcdbf011: Status 404 returned error can't find the container with id ec9c315e514743cbb751e270dbfbce69a1b619259232059102730b86dcdbf011 Apr 16 22:13:31.499451 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.499430 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jcgzh" Apr 16 22:13:31.505961 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.505937 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9cdfs" Apr 16 22:13:31.506693 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:31.506661 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee6f8a37_fb77_4ed8_87ce_8346f1f80fbd.slice/crio-6160ee02aaca869316bf7c03ce445d644a2cc1a4dc1739188eed87c3bf68bcf5 WatchSource:0}: Error finding container 6160ee02aaca869316bf7c03ce445d644a2cc1a4dc1739188eed87c3bf68bcf5: Status 404 returned error can't find the container with id 6160ee02aaca869316bf7c03ce445d644a2cc1a4dc1739188eed87c3bf68bcf5 Apr 16 22:13:31.512137 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.512121 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dx88j" Apr 16 22:13:31.513227 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:31.513202 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76fa2914_d925_4555_87be_d9837e6295d8.slice/crio-4ce0e2f91dbc182add4ca19c93e4aee15f127c9e47d6388fd2a55cfe95ccf035 WatchSource:0}: Error finding container 4ce0e2f91dbc182add4ca19c93e4aee15f127c9e47d6388fd2a55cfe95ccf035: Status 404 returned error can't find the container with id 4ce0e2f91dbc182add4ca19c93e4aee15f127c9e47d6388fd2a55cfe95ccf035 Apr 16 22:13:31.516911 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.516894 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" Apr 16 22:13:31.519498 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:31.519471 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38e7a963_729e_40af_91e7_9fa6910bc258.slice/crio-3ae62ddb011118cef2a8e672bdc532115610d38e4a19fcc570e3292b46dcaa9a WatchSource:0}: Error finding container 3ae62ddb011118cef2a8e672bdc532115610d38e4a19fcc570e3292b46dcaa9a: Status 404 returned error can't find the container with id 3ae62ddb011118cef2a8e672bdc532115610d38e4a19fcc570e3292b46dcaa9a Apr 16 22:13:31.524957 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:13:31.524928 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1742d26f_b66b_429d_b5c4_b398bdc141a1.slice/crio-6acea664697932ea265c11a59e50032cd46a7942cd11872859e03d490605330b WatchSource:0}: Error finding container 6acea664697932ea265c11a59e50032cd46a7942cd11872859e03d490605330b: Status 404 returned error can't find the container with id 6acea664697932ea265c11a59e50032cd46a7942cd11872859e03d490605330b Apr 16 22:13:31.734562 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.734440 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c220c5af-4b42-4b44-a789-17aa37d44b90-metrics-certs\") pod \"network-metrics-daemon-wvq6s\" (UID: \"c220c5af-4b42-4b44-a789-17aa37d44b90\") " pod="openshift-multus/network-metrics-daemon-wvq6s" Apr 16 22:13:31.734717 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:31.734631 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:31.734717 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:31.734700 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/c220c5af-4b42-4b44-a789-17aa37d44b90-metrics-certs podName:c220c5af-4b42-4b44-a789-17aa37d44b90 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:32.734680544 +0000 UTC m=+3.103086698 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c220c5af-4b42-4b44-a789-17aa37d44b90-metrics-certs") pod "network-metrics-daemon-wvq6s" (UID: "c220c5af-4b42-4b44-a789-17aa37d44b90") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:31.835309 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.835184 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xscg5\" (UniqueName: \"kubernetes.io/projected/65a893c9-3b9b-48c6-a82b-6236d443cacf-kube-api-access-xscg5\") pod \"network-check-target-4pbqm\" (UID: \"65a893c9-3b9b-48c6-a82b-6236d443cacf\") " pod="openshift-network-diagnostics/network-check-target-4pbqm" Apr 16 22:13:31.835480 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:31.835365 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:31.835480 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:31.835384 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:31.835480 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:31.835396 2574 projected.go:194] Error preparing data for projected volume kube-api-access-xscg5 for pod openshift-network-diagnostics/network-check-target-4pbqm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:31.835480 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:31.835451 2574 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/65a893c9-3b9b-48c6-a82b-6236d443cacf-kube-api-access-xscg5 podName:65a893c9-3b9b-48c6-a82b-6236d443cacf nodeName:}" failed. No retries permitted until 2026-04-16 22:13:32.835432696 +0000 UTC m=+3.203838838 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-xscg5" (UniqueName: "kubernetes.io/projected/65a893c9-3b9b-48c6-a82b-6236d443cacf-kube-api-access-xscg5") pod "network-check-target-4pbqm" (UID: "65a893c9-3b9b-48c6-a82b-6236d443cacf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:31.928503 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:31.928313 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:32.160193 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:32.160138 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 22:08:31 +0000 UTC" deadline="2027-12-03 14:02:12.209813502 +0000 UTC" Apr 16 22:13:32.160193 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:32.160174 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14295h48m40.049642287s" Apr 16 22:13:32.284966 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:32.284896 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zfz79" event={"ID":"46cb706e-dcb9-4950-aa25-14e582448ea8","Type":"ContainerStarted","Data":"cf6b4fbf0954e670e65a4c3256e3a6c17d69c362a32f0941e89f6b3077f64be9"} Apr 16 22:13:32.292505 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:32.292455 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dx88j" 
event={"ID":"38e7a963-729e-40af-91e7-9fa6910bc258","Type":"ContainerStarted","Data":"3ae62ddb011118cef2a8e672bdc532115610d38e4a19fcc570e3292b46dcaa9a"} Apr 16 22:13:32.307781 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:32.306650 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8pm2c" event={"ID":"ede793eb-64b1-4045-a60c-349b6c07e08b","Type":"ContainerStarted","Data":"9bd414c6d8bcba3b44df79cddfeae87396ce368d239c9bf0525c25eeb8f64d43"} Apr 16 22:13:32.317834 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:32.317802 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" event={"ID":"1742d26f-b66b-429d-b5c4-b398bdc141a1","Type":"ContainerStarted","Data":"6acea664697932ea265c11a59e50032cd46a7942cd11872859e03d490605330b"} Apr 16 22:13:32.329447 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:32.329421 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9cdfs" event={"ID":"76fa2914-d925-4555-87be-d9837e6295d8","Type":"ContainerStarted","Data":"4ce0e2f91dbc182add4ca19c93e4aee15f127c9e47d6388fd2a55cfe95ccf035"} Apr 16 22:13:32.333100 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:32.333077 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jcgzh" event={"ID":"ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd","Type":"ContainerStarted","Data":"6160ee02aaca869316bf7c03ce445d644a2cc1a4dc1739188eed87c3bf68bcf5"} Apr 16 22:13:32.346013 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:32.345990 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" event={"ID":"cdab8cce-9b55-478d-b1b5-740aa9746143","Type":"ContainerStarted","Data":"ec9c315e514743cbb751e270dbfbce69a1b619259232059102730b86dcdbf011"} Apr 16 22:13:32.353092 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:32.353061 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/konnectivity-agent-9hrlh" event={"ID":"e515cbfe-da1c-405f-8e8d-a7ddc73de30a","Type":"ContainerStarted","Data":"7edd41bfe03b0c5eb9f51dfe0f19d1a20ca5e766de2e3d410084002bce8e049b"} Apr 16 22:13:32.363012 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:32.362988 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-mcqg2" event={"ID":"8485072b-91cf-42a7-978b-e530b3a7b911","Type":"ContainerStarted","Data":"c0bfb8614ea27f47de64416f830b83400863b18352e334b16b8e039c057cad34"} Apr 16 22:13:32.410978 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:32.410788 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:32.605004 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:32.604973 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:32.742935 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:32.742889 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c220c5af-4b42-4b44-a789-17aa37d44b90-metrics-certs\") pod \"network-metrics-daemon-wvq6s\" (UID: \"c220c5af-4b42-4b44-a789-17aa37d44b90\") " pod="openshift-multus/network-metrics-daemon-wvq6s" Apr 16 22:13:32.743116 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:32.743070 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:32.743176 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:32.743133 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c220c5af-4b42-4b44-a789-17aa37d44b90-metrics-certs podName:c220c5af-4b42-4b44-a789-17aa37d44b90 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:34.743114085 +0000 UTC m=+5.111520232 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c220c5af-4b42-4b44-a789-17aa37d44b90-metrics-certs") pod "network-metrics-daemon-wvq6s" (UID: "c220c5af-4b42-4b44-a789-17aa37d44b90") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:32.844464 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:32.843781 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xscg5\" (UniqueName: \"kubernetes.io/projected/65a893c9-3b9b-48c6-a82b-6236d443cacf-kube-api-access-xscg5\") pod \"network-check-target-4pbqm\" (UID: \"65a893c9-3b9b-48c6-a82b-6236d443cacf\") " pod="openshift-network-diagnostics/network-check-target-4pbqm" Apr 16 22:13:32.844464 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:32.843974 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:32.844464 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:32.843994 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:32.844464 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:32.844008 2574 projected.go:194] Error preparing data for projected volume kube-api-access-xscg5 for pod openshift-network-diagnostics/network-check-target-4pbqm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:32.844464 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:32.844067 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/65a893c9-3b9b-48c6-a82b-6236d443cacf-kube-api-access-xscg5 podName:65a893c9-3b9b-48c6-a82b-6236d443cacf nodeName:}" failed. 
No retries permitted until 2026-04-16 22:13:34.844047328 +0000 UTC m=+5.212453481 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-xscg5" (UniqueName: "kubernetes.io/projected/65a893c9-3b9b-48c6-a82b-6236d443cacf-kube-api-access-xscg5") pod "network-check-target-4pbqm" (UID: "65a893c9-3b9b-48c6-a82b-6236d443cacf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:33.161039 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:33.160932 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 22:08:31 +0000 UTC" deadline="2027-10-26 16:15:31.981846352 +0000 UTC" Apr 16 22:13:33.161039 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:33.160981 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13386h1m58.820869253s" Apr 16 22:13:33.258146 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:33.258108 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4pbqm" Apr 16 22:13:33.258302 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:33.258240 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4pbqm" podUID="65a893c9-3b9b-48c6-a82b-6236d443cacf" Apr 16 22:13:33.258714 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:33.258691 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvq6s"
Apr 16 22:13:33.258847 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:33.258821 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvq6s" podUID="c220c5af-4b42-4b44-a789-17aa37d44b90"
Apr 16 22:13:34.759444 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:34.759405 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c220c5af-4b42-4b44-a789-17aa37d44b90-metrics-certs\") pod \"network-metrics-daemon-wvq6s\" (UID: \"c220c5af-4b42-4b44-a789-17aa37d44b90\") " pod="openshift-multus/network-metrics-daemon-wvq6s"
Apr 16 22:13:34.759919 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:34.759589 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:34.759919 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:34.759662 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c220c5af-4b42-4b44-a789-17aa37d44b90-metrics-certs podName:c220c5af-4b42-4b44-a789-17aa37d44b90 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:38.75964114 +0000 UTC m=+9.128047283 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c220c5af-4b42-4b44-a789-17aa37d44b90-metrics-certs") pod "network-metrics-daemon-wvq6s" (UID: "c220c5af-4b42-4b44-a789-17aa37d44b90") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:34.859977 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:34.859937 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xscg5\" (UniqueName: \"kubernetes.io/projected/65a893c9-3b9b-48c6-a82b-6236d443cacf-kube-api-access-xscg5\") pod \"network-check-target-4pbqm\" (UID: \"65a893c9-3b9b-48c6-a82b-6236d443cacf\") " pod="openshift-network-diagnostics/network-check-target-4pbqm"
Apr 16 22:13:34.860154 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:34.860109 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:13:34.860154 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:34.860130 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:13:34.860154 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:34.860143 2574 projected.go:194] Error preparing data for projected volume kube-api-access-xscg5 for pod openshift-network-diagnostics/network-check-target-4pbqm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:34.860329 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:34.860199 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/65a893c9-3b9b-48c6-a82b-6236d443cacf-kube-api-access-xscg5 podName:65a893c9-3b9b-48c6-a82b-6236d443cacf nodeName:}" failed. No retries permitted until 2026-04-16 22:13:38.860181724 +0000 UTC m=+9.228587868 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-xscg5" (UniqueName: "kubernetes.io/projected/65a893c9-3b9b-48c6-a82b-6236d443cacf-kube-api-access-xscg5") pod "network-check-target-4pbqm" (UID: "65a893c9-3b9b-48c6-a82b-6236d443cacf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:35.258091 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:35.258054 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvq6s"
Apr 16 22:13:35.258281 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:35.258207 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvq6s" podUID="c220c5af-4b42-4b44-a789-17aa37d44b90"
Apr 16 22:13:35.258571 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:35.258548 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4pbqm"
Apr 16 22:13:35.258682 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:35.258651 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4pbqm" podUID="65a893c9-3b9b-48c6-a82b-6236d443cacf"
Apr 16 22:13:37.258889 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:37.258122 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvq6s"
Apr 16 22:13:37.258889 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:37.258276 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvq6s" podUID="c220c5af-4b42-4b44-a789-17aa37d44b90"
Apr 16 22:13:37.258889 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:37.258441 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4pbqm"
Apr 16 22:13:37.258889 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:37.258520 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4pbqm" podUID="65a893c9-3b9b-48c6-a82b-6236d443cacf"
Apr 16 22:13:38.792669 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:38.792573 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c220c5af-4b42-4b44-a789-17aa37d44b90-metrics-certs\") pod \"network-metrics-daemon-wvq6s\" (UID: \"c220c5af-4b42-4b44-a789-17aa37d44b90\") " pod="openshift-multus/network-metrics-daemon-wvq6s"
Apr 16 22:13:38.793156 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:38.792717 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:38.793156 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:38.792798 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c220c5af-4b42-4b44-a789-17aa37d44b90-metrics-certs podName:c220c5af-4b42-4b44-a789-17aa37d44b90 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:46.79277655 +0000 UTC m=+17.161182692 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c220c5af-4b42-4b44-a789-17aa37d44b90-metrics-certs") pod "network-metrics-daemon-wvq6s" (UID: "c220c5af-4b42-4b44-a789-17aa37d44b90") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:38.893009 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:38.892976 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xscg5\" (UniqueName: \"kubernetes.io/projected/65a893c9-3b9b-48c6-a82b-6236d443cacf-kube-api-access-xscg5\") pod \"network-check-target-4pbqm\" (UID: \"65a893c9-3b9b-48c6-a82b-6236d443cacf\") " pod="openshift-network-diagnostics/network-check-target-4pbqm"
Apr 16 22:13:38.893186 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:38.893168 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:13:38.893257 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:38.893192 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:13:38.893257 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:38.893206 2574 projected.go:194] Error preparing data for projected volume kube-api-access-xscg5 for pod openshift-network-diagnostics/network-check-target-4pbqm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:38.893361 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:38.893267 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/65a893c9-3b9b-48c6-a82b-6236d443cacf-kube-api-access-xscg5 podName:65a893c9-3b9b-48c6-a82b-6236d443cacf nodeName:}" failed. No retries permitted until 2026-04-16 22:13:46.893248731 +0000 UTC m=+17.261654874 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-xscg5" (UniqueName: "kubernetes.io/projected/65a893c9-3b9b-48c6-a82b-6236d443cacf-kube-api-access-xscg5") pod "network-check-target-4pbqm" (UID: "65a893c9-3b9b-48c6-a82b-6236d443cacf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:39.258544 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:39.258509 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4pbqm"
Apr 16 22:13:39.258705 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:39.258550 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvq6s"
Apr 16 22:13:39.258705 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:39.258647 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4pbqm" podUID="65a893c9-3b9b-48c6-a82b-6236d443cacf"
Apr 16 22:13:39.258919 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:39.258883 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvq6s" podUID="c220c5af-4b42-4b44-a789-17aa37d44b90"
Apr 16 22:13:40.625800 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:40.625763 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-7m5s7"]
Apr 16 22:13:40.632676 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:40.632647 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7m5s7"
Apr 16 22:13:40.632822 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:40.632733 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7m5s7" podUID="f60e0c7e-0c9f-4696-ba69-04969deb255d"
Apr 16 22:13:40.706882 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:40.706843 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f60e0c7e-0c9f-4696-ba69-04969deb255d-dbus\") pod \"global-pull-secret-syncer-7m5s7\" (UID: \"f60e0c7e-0c9f-4696-ba69-04969deb255d\") " pod="kube-system/global-pull-secret-syncer-7m5s7"
Apr 16 22:13:40.707060 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:40.706893 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f60e0c7e-0c9f-4696-ba69-04969deb255d-kubelet-config\") pod \"global-pull-secret-syncer-7m5s7\" (UID: \"f60e0c7e-0c9f-4696-ba69-04969deb255d\") " pod="kube-system/global-pull-secret-syncer-7m5s7"
Apr 16 22:13:40.707060 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:40.706924 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f60e0c7e-0c9f-4696-ba69-04969deb255d-original-pull-secret\") pod \"global-pull-secret-syncer-7m5s7\" (UID: \"f60e0c7e-0c9f-4696-ba69-04969deb255d\") " pod="kube-system/global-pull-secret-syncer-7m5s7"
Apr 16 22:13:40.807858 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:40.807828 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f60e0c7e-0c9f-4696-ba69-04969deb255d-dbus\") pod \"global-pull-secret-syncer-7m5s7\" (UID: \"f60e0c7e-0c9f-4696-ba69-04969deb255d\") " pod="kube-system/global-pull-secret-syncer-7m5s7"
Apr 16 22:13:40.808025 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:40.807871 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f60e0c7e-0c9f-4696-ba69-04969deb255d-kubelet-config\") pod \"global-pull-secret-syncer-7m5s7\" (UID: \"f60e0c7e-0c9f-4696-ba69-04969deb255d\") " pod="kube-system/global-pull-secret-syncer-7m5s7"
Apr 16 22:13:40.808025 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:40.807903 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f60e0c7e-0c9f-4696-ba69-04969deb255d-original-pull-secret\") pod \"global-pull-secret-syncer-7m5s7\" (UID: \"f60e0c7e-0c9f-4696-ba69-04969deb255d\") " pod="kube-system/global-pull-secret-syncer-7m5s7"
Apr 16 22:13:40.808130 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:40.808021 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f60e0c7e-0c9f-4696-ba69-04969deb255d-dbus\") pod \"global-pull-secret-syncer-7m5s7\" (UID: \"f60e0c7e-0c9f-4696-ba69-04969deb255d\") " pod="kube-system/global-pull-secret-syncer-7m5s7"
Apr 16 22:13:40.808130 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:40.808031 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f60e0c7e-0c9f-4696-ba69-04969deb255d-kubelet-config\") pod \"global-pull-secret-syncer-7m5s7\" (UID: \"f60e0c7e-0c9f-4696-ba69-04969deb255d\") " pod="kube-system/global-pull-secret-syncer-7m5s7"
Apr 16 22:13:40.808130 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:40.808081 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:40.808254 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:40.808145 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f60e0c7e-0c9f-4696-ba69-04969deb255d-original-pull-secret podName:f60e0c7e-0c9f-4696-ba69-04969deb255d nodeName:}" failed. No retries permitted until 2026-04-16 22:13:41.308126672 +0000 UTC m=+11.676532815 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f60e0c7e-0c9f-4696-ba69-04969deb255d-original-pull-secret") pod "global-pull-secret-syncer-7m5s7" (UID: "f60e0c7e-0c9f-4696-ba69-04969deb255d") : object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:41.257835 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:41.257805 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4pbqm"
Apr 16 22:13:41.257835 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:41.257819 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvq6s"
Apr 16 22:13:41.258033 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:41.257915 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4pbqm" podUID="65a893c9-3b9b-48c6-a82b-6236d443cacf"
Apr 16 22:13:41.258110 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:41.258088 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvq6s" podUID="c220c5af-4b42-4b44-a789-17aa37d44b90"
Apr 16 22:13:41.312056 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:41.312022 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f60e0c7e-0c9f-4696-ba69-04969deb255d-original-pull-secret\") pod \"global-pull-secret-syncer-7m5s7\" (UID: \"f60e0c7e-0c9f-4696-ba69-04969deb255d\") " pod="kube-system/global-pull-secret-syncer-7m5s7"
Apr 16 22:13:41.312207 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:41.312147 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:41.312207 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:41.312206 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f60e0c7e-0c9f-4696-ba69-04969deb255d-original-pull-secret podName:f60e0c7e-0c9f-4696-ba69-04969deb255d nodeName:}" failed. No retries permitted until 2026-04-16 22:13:42.312187877 +0000 UTC m=+12.680594020 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f60e0c7e-0c9f-4696-ba69-04969deb255d-original-pull-secret") pod "global-pull-secret-syncer-7m5s7" (UID: "f60e0c7e-0c9f-4696-ba69-04969deb255d") : object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:42.258561 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:42.258521 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7m5s7"
Apr 16 22:13:42.259018 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:42.258669 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7m5s7" podUID="f60e0c7e-0c9f-4696-ba69-04969deb255d"
Apr 16 22:13:42.317910 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:42.317878 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f60e0c7e-0c9f-4696-ba69-04969deb255d-original-pull-secret\") pod \"global-pull-secret-syncer-7m5s7\" (UID: \"f60e0c7e-0c9f-4696-ba69-04969deb255d\") " pod="kube-system/global-pull-secret-syncer-7m5s7"
Apr 16 22:13:42.318064 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:42.318024 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:42.318119 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:42.318098 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f60e0c7e-0c9f-4696-ba69-04969deb255d-original-pull-secret podName:f60e0c7e-0c9f-4696-ba69-04969deb255d nodeName:}" failed. No retries permitted until 2026-04-16 22:13:44.31807682 +0000 UTC m=+14.686482953 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f60e0c7e-0c9f-4696-ba69-04969deb255d-original-pull-secret") pod "global-pull-secret-syncer-7m5s7" (UID: "f60e0c7e-0c9f-4696-ba69-04969deb255d") : object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:43.258014 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:43.257982 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4pbqm"
Apr 16 22:13:43.258188 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:43.257983 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvq6s"
Apr 16 22:13:43.258188 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:43.258096 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4pbqm" podUID="65a893c9-3b9b-48c6-a82b-6236d443cacf"
Apr 16 22:13:43.258279 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:43.258194 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvq6s" podUID="c220c5af-4b42-4b44-a789-17aa37d44b90"
Apr 16 22:13:44.257988 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:44.257955 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7m5s7"
Apr 16 22:13:44.258450 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:44.258088 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7m5s7" podUID="f60e0c7e-0c9f-4696-ba69-04969deb255d"
Apr 16 22:13:44.334545 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:44.334507 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f60e0c7e-0c9f-4696-ba69-04969deb255d-original-pull-secret\") pod \"global-pull-secret-syncer-7m5s7\" (UID: \"f60e0c7e-0c9f-4696-ba69-04969deb255d\") " pod="kube-system/global-pull-secret-syncer-7m5s7"
Apr 16 22:13:44.334716 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:44.334693 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:44.334806 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:44.334794 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f60e0c7e-0c9f-4696-ba69-04969deb255d-original-pull-secret podName:f60e0c7e-0c9f-4696-ba69-04969deb255d nodeName:}" failed. No retries permitted until 2026-04-16 22:13:48.334773787 +0000 UTC m=+18.703179929 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f60e0c7e-0c9f-4696-ba69-04969deb255d-original-pull-secret") pod "global-pull-secret-syncer-7m5s7" (UID: "f60e0c7e-0c9f-4696-ba69-04969deb255d") : object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:45.258547 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:45.258519 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4pbqm"
Apr 16 22:13:45.258970 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:45.258523 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvq6s"
Apr 16 22:13:45.258970 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:45.258629 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4pbqm" podUID="65a893c9-3b9b-48c6-a82b-6236d443cacf"
Apr 16 22:13:45.258970 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:45.258700 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvq6s" podUID="c220c5af-4b42-4b44-a789-17aa37d44b90"
Apr 16 22:13:46.257793 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:46.257754 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7m5s7"
Apr 16 22:13:46.257983 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:46.257894 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7m5s7" podUID="f60e0c7e-0c9f-4696-ba69-04969deb255d"
Apr 16 22:13:46.852093 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:46.852057 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c220c5af-4b42-4b44-a789-17aa37d44b90-metrics-certs\") pod \"network-metrics-daemon-wvq6s\" (UID: \"c220c5af-4b42-4b44-a789-17aa37d44b90\") " pod="openshift-multus/network-metrics-daemon-wvq6s"
Apr 16 22:13:46.852513 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:46.852216 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:46.852513 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:46.852286 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c220c5af-4b42-4b44-a789-17aa37d44b90-metrics-certs podName:c220c5af-4b42-4b44-a789-17aa37d44b90 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:02.852270428 +0000 UTC m=+33.220676561 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c220c5af-4b42-4b44-a789-17aa37d44b90-metrics-certs") pod "network-metrics-daemon-wvq6s" (UID: "c220c5af-4b42-4b44-a789-17aa37d44b90") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:46.952397 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:46.952366 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xscg5\" (UniqueName: \"kubernetes.io/projected/65a893c9-3b9b-48c6-a82b-6236d443cacf-kube-api-access-xscg5\") pod \"network-check-target-4pbqm\" (UID: \"65a893c9-3b9b-48c6-a82b-6236d443cacf\") " pod="openshift-network-diagnostics/network-check-target-4pbqm"
Apr 16 22:13:46.952571 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:46.952529 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:13:46.952571 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:46.952556 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:13:46.952571 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:46.952569 2574 projected.go:194] Error preparing data for projected volume kube-api-access-xscg5 for pod openshift-network-diagnostics/network-check-target-4pbqm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:46.952712 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:46.952633 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/65a893c9-3b9b-48c6-a82b-6236d443cacf-kube-api-access-xscg5 podName:65a893c9-3b9b-48c6-a82b-6236d443cacf nodeName:}" failed. No retries permitted until 2026-04-16 22:14:02.952614143 +0000 UTC m=+33.321020275 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-xscg5" (UniqueName: "kubernetes.io/projected/65a893c9-3b9b-48c6-a82b-6236d443cacf-kube-api-access-xscg5") pod "network-check-target-4pbqm" (UID: "65a893c9-3b9b-48c6-a82b-6236d443cacf") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:47.258331 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:47.258287 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4pbqm"
Apr 16 22:13:47.258526 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:47.258296 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvq6s"
Apr 16 22:13:47.258526 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:47.258404 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4pbqm" podUID="65a893c9-3b9b-48c6-a82b-6236d443cacf"
Apr 16 22:13:47.258526 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:47.258515 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvq6s" podUID="c220c5af-4b42-4b44-a789-17aa37d44b90"
Apr 16 22:13:48.257754 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:48.257697 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7m5s7"
Apr 16 22:13:48.258214 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:48.257849 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7m5s7" podUID="f60e0c7e-0c9f-4696-ba69-04969deb255d"
Apr 16 22:13:48.363319 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:48.363278 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f60e0c7e-0c9f-4696-ba69-04969deb255d-original-pull-secret\") pod \"global-pull-secret-syncer-7m5s7\" (UID: \"f60e0c7e-0c9f-4696-ba69-04969deb255d\") " pod="kube-system/global-pull-secret-syncer-7m5s7"
Apr 16 22:13:48.363514 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:48.363406 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:48.363514 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:48.363466 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f60e0c7e-0c9f-4696-ba69-04969deb255d-original-pull-secret podName:f60e0c7e-0c9f-4696-ba69-04969deb255d nodeName:}" failed. No retries permitted until 2026-04-16 22:13:56.363451751 +0000 UTC m=+26.731857880 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f60e0c7e-0c9f-4696-ba69-04969deb255d-original-pull-secret") pod "global-pull-secret-syncer-7m5s7" (UID: "f60e0c7e-0c9f-4696-ba69-04969deb255d") : object "kube-system"/"original-pull-secret" not registered
Apr 16 22:13:49.258161 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:49.258134 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4pbqm"
Apr 16 22:13:49.258553 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:49.258161 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvq6s"
Apr 16 22:13:49.258553 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:49.258236 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4pbqm" podUID="65a893c9-3b9b-48c6-a82b-6236d443cacf"
Apr 16 22:13:49.258553 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:49.258354 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvq6s" podUID="c220c5af-4b42-4b44-a789-17aa37d44b90"
Apr 16 22:13:49.397996 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:49.397695 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-72.ec2.internal" event={"ID":"b707a3c20ff0ea56ea13d746a8edc26b","Type":"ContainerStarted","Data":"f8c053aafb001c4a527ca26a506ab59875887af7ea4f0832e747b4efa38162f0"}
Apr 16 22:13:49.399835 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:49.399799 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" event={"ID":"1742d26f-b66b-429d-b5c4-b398bdc141a1","Type":"ContainerStarted","Data":"55aced06639000ddee478fc657fc374f9ad5046f779927ef6534a31e2521fcf2"}
Apr 16 22:13:49.402214 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:49.402180 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" event={"ID":"cdab8cce-9b55-478d-b1b5-740aa9746143","Type":"ContainerStarted","Data":"6ddab334d442e475cc837b090568ea526121a70ebd0b45e91a029ac202e15e46"}
Apr 16 22:13:49.413601 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:49.413550 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-72.ec2.internal" podStartSLOduration=19.413537068 podStartE2EDuration="19.413537068s" podCreationTimestamp="2026-04-16 22:13:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:13:49.412912057 +0000 UTC m=+19.781318209" watchObservedRunningTime="2026-04-16 22:13:49.413537068 +0000 UTC m=+19.781943215"
Apr 16 22:13:49.429698 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:49.429470 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-mxzxv" podStartSLOduration=1.802537268 podStartE2EDuration="19.429454836s" podCreationTimestamp="2026-04-16 22:13:30 +0000 UTC" firstStartedPulling="2026-04-16 22:13:31.528722278 +0000 UTC m=+1.897128412" lastFinishedPulling="2026-04-16 22:13:49.155639835 +0000 UTC m=+19.524045980" observedRunningTime="2026-04-16 22:13:49.429318011 +0000 UTC m=+19.797724158" watchObservedRunningTime="2026-04-16 22:13:49.429454836 +0000 UTC m=+19.797860990"
Apr 16 22:13:50.258534 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:50.258353 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7m5s7"
Apr 16 22:13:50.259180 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:50.258595 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7m5s7" podUID="f60e0c7e-0c9f-4696-ba69-04969deb255d"
Apr 16 22:13:50.408931 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:50.408842 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hm4t5_cdab8cce-9b55-478d-b1b5-740aa9746143/ovn-acl-logging/0.log"
Apr 16 22:13:50.409278 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:50.409240 2574 generic.go:358] "Generic (PLEG): container finished" podID="cdab8cce-9b55-478d-b1b5-740aa9746143" containerID="810d1c3b0dc875df810991022bdea82f29248af5bf5d7010ba9d10605f05cb1c" exitCode=1
Apr 16 22:13:50.409348 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:50.409308 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" event={"ID":"cdab8cce-9b55-478d-b1b5-740aa9746143","Type":"ContainerDied","Data":"810d1c3b0dc875df810991022bdea82f29248af5bf5d7010ba9d10605f05cb1c"}
Apr 16 22:13:50.409348 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:50.409342 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" event={"ID":"cdab8cce-9b55-478d-b1b5-740aa9746143","Type":"ContainerStarted","Data":"03cabe392c4e33531abe00b7c7cc9d42937035e6ceaddc548122ef6111e2ea75"}
Apr 16 22:13:50.409451 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:50.409357 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" event={"ID":"cdab8cce-9b55-478d-b1b5-740aa9746143","Type":"ContainerStarted","Data":"709b07a098f4aff7361e748e133d8e3ecf4980a676b3da001ac701c42ab66595"}
Apr 16 22:13:50.409451 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:50.409370 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" event={"ID":"cdab8cce-9b55-478d-b1b5-740aa9746143","Type":"ContainerStarted","Data":"a8e1eb70b4dde5210ea1532b594c2393edbdd22be80d1de4862f95fefd1ddcae"}
Apr 16 22:13:50.409451 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:50.409385 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" event={"ID":"cdab8cce-9b55-478d-b1b5-740aa9746143","Type":"ContainerStarted","Data":"ef6d763e15b07c06e9749ade3c4c89f676aad9606e4e3c25eca044518997a46f"}
Apr 16 22:13:50.411265 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:50.411237 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9hrlh" event={"ID":"e515cbfe-da1c-405f-8e8d-a7ddc73de30a","Type":"ContainerStarted","Data":"62684640b6191d49dbb247693a883373e7c108f738d1a931765a162390aa96cd"}
Apr 16 22:13:50.413392 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:50.413367 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zfz79" event={"ID":"46cb706e-dcb9-4950-aa25-14e582448ea8","Type":"ContainerStarted","Data":"87a84bb053e28bf6b7939850c6646ac18008f37699eb6c65c57e425327a04fea"} Apr 16 
22:13:50.415109 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:50.415020 2574 generic.go:358] "Generic (PLEG): container finished" podID="7d0edda7066ec5684bc2bd9c10fb4784" containerID="f16dff5cb76336264cb0e731b66468ee1ad450819139560a941d3a3f9bedae2d" exitCode=0 Apr 16 22:13:50.415221 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:50.415103 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-72.ec2.internal" event={"ID":"7d0edda7066ec5684bc2bd9c10fb4784","Type":"ContainerDied","Data":"f16dff5cb76336264cb0e731b66468ee1ad450819139560a941d3a3f9bedae2d"} Apr 16 22:13:50.416451 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:50.416431 2574 generic.go:358] "Generic (PLEG): container finished" podID="38e7a963-729e-40af-91e7-9fa6910bc258" containerID="28d3b4b13c5eaabf3cd26ff5dea7a20a8358ae28956e89775e8cf3f8db746a7d" exitCode=0 Apr 16 22:13:50.416554 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:50.416460 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dx88j" event={"ID":"38e7a963-729e-40af-91e7-9fa6910bc258","Type":"ContainerDied","Data":"28d3b4b13c5eaabf3cd26ff5dea7a20a8358ae28956e89775e8cf3f8db746a7d"} Apr 16 22:13:50.418818 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:50.418736 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8pm2c" event={"ID":"ede793eb-64b1-4045-a60c-349b6c07e08b","Type":"ContainerStarted","Data":"6a3ac9c86f86aeac45f3ef7d8e773587525e867651b4b4dcb2ec6d0717111bad"} Apr 16 22:13:50.420363 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:50.420326 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9cdfs" event={"ID":"76fa2914-d925-4555-87be-d9837e6295d8","Type":"ContainerStarted","Data":"f8309cd14fa1b217735f68df0234dee6643e647d196ef0ed4ab3a848dd9fe022"} Apr 16 22:13:50.421620 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:50.421597 2574 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jcgzh" event={"ID":"ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd","Type":"ContainerStarted","Data":"073b762f4a8c46706000c9f83d472c2fb74a377ebbd8ac9bfda5e7b64d93ea4d"} Apr 16 22:13:50.430600 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:50.430561 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-9hrlh" podStartSLOduration=2.771865197 podStartE2EDuration="20.430546465s" podCreationTimestamp="2026-04-16 22:13:30 +0000 UTC" firstStartedPulling="2026-04-16 22:13:31.475557372 +0000 UTC m=+1.843963504" lastFinishedPulling="2026-04-16 22:13:49.134238635 +0000 UTC m=+19.502644772" observedRunningTime="2026-04-16 22:13:50.429964765 +0000 UTC m=+20.798370918" watchObservedRunningTime="2026-04-16 22:13:50.430546465 +0000 UTC m=+20.798952616" Apr 16 22:13:50.445159 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:50.445101 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-zfz79" podStartSLOduration=2.7472629079999997 podStartE2EDuration="20.445088234s" podCreationTimestamp="2026-04-16 22:13:30 +0000 UTC" firstStartedPulling="2026-04-16 22:13:31.436348365 +0000 UTC m=+1.804754498" lastFinishedPulling="2026-04-16 22:13:49.134173688 +0000 UTC m=+19.502579824" observedRunningTime="2026-04-16 22:13:50.444396205 +0000 UTC m=+20.812802384" watchObservedRunningTime="2026-04-16 22:13:50.445088234 +0000 UTC m=+20.813494385" Apr 16 22:13:50.460779 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:50.460709 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9cdfs" podStartSLOduration=2.82142748 podStartE2EDuration="20.460691092s" podCreationTimestamp="2026-04-16 22:13:30 +0000 UTC" firstStartedPulling="2026-04-16 22:13:31.515459191 +0000 UTC m=+1.883865322" lastFinishedPulling="2026-04-16 22:13:49.154722793 +0000 UTC 
m=+19.523128934" observedRunningTime="2026-04-16 22:13:50.459983192 +0000 UTC m=+20.828389343" watchObservedRunningTime="2026-04-16 22:13:50.460691092 +0000 UTC m=+20.829097244" Apr 16 22:13:50.478713 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:50.478652 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8pm2c" podStartSLOduration=2.59754659 podStartE2EDuration="20.478633383s" podCreationTimestamp="2026-04-16 22:13:30 +0000 UTC" firstStartedPulling="2026-04-16 22:13:31.453569716 +0000 UTC m=+1.821975849" lastFinishedPulling="2026-04-16 22:13:49.334656512 +0000 UTC m=+19.703062642" observedRunningTime="2026-04-16 22:13:50.47850791 +0000 UTC m=+20.846914062" watchObservedRunningTime="2026-04-16 22:13:50.478633383 +0000 UTC m=+20.847039512" Apr 16 22:13:50.895162 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:50.895130 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 22:13:51.194656 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:51.194500 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T22:13:50.895156417Z","UUID":"e3b4f8aa-1512-46ae-bb71-46339ae73cfa","Handler":null,"Name":"","Endpoint":""} Apr 16 22:13:51.196434 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:51.196408 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 22:13:51.196434 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:51.196439 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 22:13:51.258091 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:51.258048 2574 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4pbqm" Apr 16 22:13:51.258244 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:51.258062 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvq6s" Apr 16 22:13:51.258244 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:51.258191 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4pbqm" podUID="65a893c9-3b9b-48c6-a82b-6236d443cacf" Apr 16 22:13:51.258331 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:51.258276 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wvq6s" podUID="c220c5af-4b42-4b44-a789-17aa37d44b90" Apr 16 22:13:51.425712 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:51.425670 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jcgzh" event={"ID":"ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd","Type":"ContainerStarted","Data":"8dc9e2c2749f879996251446d1c37ba6a5c38b562e613c62c3c6a4fd3ef79fa8"} Apr 16 22:13:51.427451 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:51.427422 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-mcqg2" event={"ID":"8485072b-91cf-42a7-978b-e530b3a7b911","Type":"ContainerStarted","Data":"40bb290e4afc41d1270c8e1212dfeac0cb9b996a15fcfc1f7be4cd10c610b8f9"} Apr 16 22:13:51.454801 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:51.454680 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-mcqg2" podStartSLOduration=3.779513174 podStartE2EDuration="21.454659812s" podCreationTimestamp="2026-04-16 22:13:30 +0000 UTC" firstStartedPulling="2026-04-16 22:13:31.459026329 +0000 UTC m=+1.827432460" lastFinishedPulling="2026-04-16 22:13:49.134172963 +0000 UTC m=+19.502579098" observedRunningTime="2026-04-16 22:13:51.454260641 +0000 UTC m=+21.822666792" watchObservedRunningTime="2026-04-16 22:13:51.454659812 +0000 UTC m=+21.823065966" Apr 16 22:13:52.261431 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:52.261202 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7m5s7" Apr 16 22:13:52.261649 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:52.261515 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7m5s7" podUID="f60e0c7e-0c9f-4696-ba69-04969deb255d" Apr 16 22:13:52.431261 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:52.431223 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jcgzh" event={"ID":"ee6f8a37-fb77-4ed8-87ce-8346f1f80fbd","Type":"ContainerStarted","Data":"604b6f384cb694fb8851d4595db2acc5bf7ab6ebd10d52f60969725ebd63c409"} Apr 16 22:13:52.434426 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:52.434402 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hm4t5_cdab8cce-9b55-478d-b1b5-740aa9746143/ovn-acl-logging/0.log" Apr 16 22:13:52.434802 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:52.434773 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" event={"ID":"cdab8cce-9b55-478d-b1b5-740aa9746143","Type":"ContainerStarted","Data":"f8c3ba56bc88dff18be8221813150162715ecbb0b3f48e2b653c750a44f321d4"} Apr 16 22:13:52.436480 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:52.436451 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-72.ec2.internal" event={"ID":"7d0edda7066ec5684bc2bd9c10fb4784","Type":"ContainerStarted","Data":"41316f2b7893f156134e5e87c524c0ca93b698aa8b0c72471a74e77ba1381cdc"} Apr 16 22:13:52.451547 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:52.451493 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jcgzh" podStartSLOduration=2.266425897 podStartE2EDuration="22.451477519s" podCreationTimestamp="2026-04-16 22:13:30 +0000 UTC" firstStartedPulling="2026-04-16 22:13:31.50948855 +0000 UTC m=+1.877894685" lastFinishedPulling="2026-04-16 22:13:51.69454016 +0000 UTC m=+22.062946307" observedRunningTime="2026-04-16 22:13:52.451207629 +0000 
UTC m=+22.819613782" watchObservedRunningTime="2026-04-16 22:13:52.451477519 +0000 UTC m=+22.819883671" Apr 16 22:13:52.468329 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:52.468287 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-72.ec2.internal" podStartSLOduration=22.468275124 podStartE2EDuration="22.468275124s" podCreationTimestamp="2026-04-16 22:13:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:13:52.467717287 +0000 UTC m=+22.836123439" watchObservedRunningTime="2026-04-16 22:13:52.468275124 +0000 UTC m=+22.836681274" Apr 16 22:13:53.257953 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:53.257918 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvq6s" Apr 16 22:13:53.258145 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:53.257918 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4pbqm" Apr 16 22:13:53.258145 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:53.258062 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvq6s" podUID="c220c5af-4b42-4b44-a789-17aa37d44b90" Apr 16 22:13:53.258145 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:53.258135 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4pbqm" podUID="65a893c9-3b9b-48c6-a82b-6236d443cacf" Apr 16 22:13:54.079302 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:54.079255 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-9hrlh" Apr 16 22:13:54.080035 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:54.079908 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-9hrlh" Apr 16 22:13:54.258298 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:54.258266 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7m5s7" Apr 16 22:13:54.258482 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:54.258387 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-7m5s7" podUID="f60e0c7e-0c9f-4696-ba69-04969deb255d" Apr 16 22:13:54.444145 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:54.443844 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hm4t5_cdab8cce-9b55-478d-b1b5-740aa9746143/ovn-acl-logging/0.log" Apr 16 22:13:54.444494 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:54.444373 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" event={"ID":"cdab8cce-9b55-478d-b1b5-740aa9746143","Type":"ContainerStarted","Data":"1cc2629a69ac233b1b70a17ca16b3ac1fb9d985f502b40e10142cd536f554dce"} Apr 16 22:13:54.444775 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:54.444575 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-9hrlh" Apr 16 22:13:54.445239 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:54.445119 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-9hrlh" Apr 16 22:13:55.258522 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:55.258490 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4pbqm" Apr 16 22:13:55.258998 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:55.258533 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvq6s" Apr 16 22:13:55.258998 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:55.258613 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wvq6s" podUID="c220c5af-4b42-4b44-a789-17aa37d44b90" Apr 16 22:13:55.258998 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:55.258720 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4pbqm" podUID="65a893c9-3b9b-48c6-a82b-6236d443cacf" Apr 16 22:13:55.447666 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:55.447637 2574 generic.go:358] "Generic (PLEG): container finished" podID="38e7a963-729e-40af-91e7-9fa6910bc258" containerID="f575e922c10a3608de8c8f8630a661132fb1fc3f79c0fd1c66eb158436c721a8" exitCode=0 Apr 16 22:13:55.447889 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:55.447716 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dx88j" event={"ID":"38e7a963-729e-40af-91e7-9fa6910bc258","Type":"ContainerDied","Data":"f575e922c10a3608de8c8f8630a661132fb1fc3f79c0fd1c66eb158436c721a8"} Apr 16 22:13:55.448209 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:55.448180 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:55.448209 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:55.448209 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:55.448351 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:55.448220 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:55.449223 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:55.448390 2574 scope.go:117] "RemoveContainer" 
containerID="810d1c3b0dc875df810991022bdea82f29248af5bf5d7010ba9d10605f05cb1c" Apr 16 22:13:55.463577 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:55.463559 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:55.464384 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:55.464366 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:13:56.257965 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:56.257795 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7m5s7" Apr 16 22:13:56.258108 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:56.258037 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7m5s7" podUID="f60e0c7e-0c9f-4696-ba69-04969deb255d" Apr 16 22:13:56.382011 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:56.381936 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7m5s7"] Apr 16 22:13:56.386291 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:56.386265 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wvq6s"] Apr 16 22:13:56.386412 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:56.386399 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvq6s" Apr 16 22:13:56.386539 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:56.386517 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvq6s" podUID="c220c5af-4b42-4b44-a789-17aa37d44b90" Apr 16 22:13:56.386964 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:56.386946 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4pbqm"] Apr 16 22:13:56.387057 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:56.387029 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4pbqm" Apr 16 22:13:56.387116 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:56.387095 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4pbqm" podUID="65a893c9-3b9b-48c6-a82b-6236d443cacf" Apr 16 22:13:56.430436 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:56.430409 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f60e0c7e-0c9f-4696-ba69-04969deb255d-original-pull-secret\") pod \"global-pull-secret-syncer-7m5s7\" (UID: \"f60e0c7e-0c9f-4696-ba69-04969deb255d\") " pod="kube-system/global-pull-secret-syncer-7m5s7" Apr 16 22:13:56.430570 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:56.430539 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:56.430610 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:56.430593 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f60e0c7e-0c9f-4696-ba69-04969deb255d-original-pull-secret podName:f60e0c7e-0c9f-4696-ba69-04969deb255d nodeName:}" failed. No retries permitted until 2026-04-16 22:14:12.430577272 +0000 UTC m=+42.798983402 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f60e0c7e-0c9f-4696-ba69-04969deb255d-original-pull-secret") pod "global-pull-secret-syncer-7m5s7" (UID: "f60e0c7e-0c9f-4696-ba69-04969deb255d") : object "kube-system"/"original-pull-secret" not registered Apr 16 22:13:56.452368 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:56.452345 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hm4t5_cdab8cce-9b55-478d-b1b5-740aa9746143/ovn-acl-logging/0.log" Apr 16 22:13:56.452707 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:56.452680 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" event={"ID":"cdab8cce-9b55-478d-b1b5-740aa9746143","Type":"ContainerStarted","Data":"5af1c78aa7c264fda7e258d915948777c269f32fd203b54cc958d0767012acc9"} Apr 16 22:13:56.454468 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:56.454440 2574 generic.go:358] "Generic (PLEG): container finished" podID="38e7a963-729e-40af-91e7-9fa6910bc258" containerID="3c8362a8a2ffe082495e0edce633a45b05e76722a63d6ef47e890c29cdac04fb" exitCode=0 Apr 16 22:13:56.454577 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:56.454513 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-7m5s7" Apr 16 22:13:56.454577 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:56.454518 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dx88j" event={"ID":"38e7a963-729e-40af-91e7-9fa6910bc258","Type":"ContainerDied","Data":"3c8362a8a2ffe082495e0edce633a45b05e76722a63d6ef47e890c29cdac04fb"} Apr 16 22:13:56.454867 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:56.454843 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7m5s7" podUID="f60e0c7e-0c9f-4696-ba69-04969deb255d" Apr 16 22:13:56.479696 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:56.479656 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" podStartSLOduration=8.771459626 podStartE2EDuration="26.479642865s" podCreationTimestamp="2026-04-16 22:13:30 +0000 UTC" firstStartedPulling="2026-04-16 22:13:31.486858039 +0000 UTC m=+1.855264171" lastFinishedPulling="2026-04-16 22:13:49.195041282 +0000 UTC m=+19.563447410" observedRunningTime="2026-04-16 22:13:56.479439515 +0000 UTC m=+26.847845667" watchObservedRunningTime="2026-04-16 22:13:56.479642865 +0000 UTC m=+26.848049186" Apr 16 22:13:57.459664 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:57.459576 2574 generic.go:358] "Generic (PLEG): container finished" podID="38e7a963-729e-40af-91e7-9fa6910bc258" containerID="719a2ba5c39a996153164a256aa578a9e885771b504d300cbebab273c369e28a" exitCode=0 Apr 16 22:13:57.459664 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:57.459631 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-dx88j" event={"ID":"38e7a963-729e-40af-91e7-9fa6910bc258","Type":"ContainerDied","Data":"719a2ba5c39a996153164a256aa578a9e885771b504d300cbebab273c369e28a"}
Apr 16 22:13:58.258547 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:58.258514 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4pbqm"
Apr 16 22:13:58.258732 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:58.258514 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7m5s7"
Apr 16 22:13:58.258732 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:58.258644 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4pbqm" podUID="65a893c9-3b9b-48c6-a82b-6236d443cacf"
Apr 16 22:13:58.258858 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:58.258755 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7m5s7" podUID="f60e0c7e-0c9f-4696-ba69-04969deb255d"
Apr 16 22:13:58.258858 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:13:58.258515 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvq6s"
Apr 16 22:13:58.258939 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:13:58.258880 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvq6s" podUID="c220c5af-4b42-4b44-a789-17aa37d44b90"
Apr 16 22:14:00.258890 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:00.258856 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4pbqm"
Apr 16 22:14:00.259509 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:00.258955 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvq6s"
Apr 16 22:14:00.259509 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:14:00.258987 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4pbqm" podUID="65a893c9-3b9b-48c6-a82b-6236d443cacf"
Apr 16 22:14:00.259509 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:14:00.259023 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wvq6s" podUID="c220c5af-4b42-4b44-a789-17aa37d44b90"
Apr 16 22:14:00.259509 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:00.259050 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7m5s7"
Apr 16 22:14:00.259509 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:14:00.259101 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7m5s7" podUID="f60e0c7e-0c9f-4696-ba69-04969deb255d"
Apr 16 22:14:01.947924 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:01.947712 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-72.ec2.internal" event="NodeReady"
Apr 16 22:14:01.948338 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:01.948051 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 22:14:01.999949 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:01.999912 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-ffxpx"]
Apr 16 22:14:02.036982 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.036953 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-xmqcp"]
Apr 16 22:14:02.037151 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.037126 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ffxpx"
Apr 16 22:14:02.039576 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.039513 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-skfrk\""
Apr 16 22:14:02.039576 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.039513 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 22:14:02.039817 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.039637 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 22:14:02.039817 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.039771 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 22:14:02.051324 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.051303 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ffxpx"]
Apr 16 22:14:02.051324 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.051325 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xmqcp"]
Apr 16 22:14:02.051492 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.051419 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xmqcp"
Apr 16 22:14:02.053716 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.053694 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 22:14:02.053847 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.053704 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 22:14:02.054184 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.054167 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-nlsdl\""
Apr 16 22:14:02.176701 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.176661 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-tmp-dir\") pod \"dns-default-xmqcp\" (UID: \"cbfccc2d-f25a-4d8a-bd22-25a929f12d64\") " pod="openshift-dns/dns-default-xmqcp"
Apr 16 22:14:02.176894 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.176755 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vxh6\" (UniqueName: \"kubernetes.io/projected/d146454c-862c-4665-b411-fd4c29e30335-kube-api-access-4vxh6\") pod \"ingress-canary-ffxpx\" (UID: \"d146454c-862c-4665-b411-fd4c29e30335\") " pod="openshift-ingress-canary/ingress-canary-ffxpx"
Apr 16 22:14:02.176894 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.176791 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4785\" (UniqueName: \"kubernetes.io/projected/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-kube-api-access-t4785\") pod \"dns-default-xmqcp\" (UID: \"cbfccc2d-f25a-4d8a-bd22-25a929f12d64\") " pod="openshift-dns/dns-default-xmqcp"
Apr 16 22:14:02.176894 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.176835 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d146454c-862c-4665-b411-fd4c29e30335-cert\") pod \"ingress-canary-ffxpx\" (UID: \"d146454c-862c-4665-b411-fd4c29e30335\") " pod="openshift-ingress-canary/ingress-canary-ffxpx"
Apr 16 22:14:02.176894 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.176867 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-metrics-tls\") pod \"dns-default-xmqcp\" (UID: \"cbfccc2d-f25a-4d8a-bd22-25a929f12d64\") " pod="openshift-dns/dns-default-xmqcp"
Apr 16 22:14:02.176894 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.176893 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-config-volume\") pod \"dns-default-xmqcp\" (UID: \"cbfccc2d-f25a-4d8a-bd22-25a929f12d64\") " pod="openshift-dns/dns-default-xmqcp"
Apr 16 22:14:02.257750 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.257666 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4pbqm"
Apr 16 22:14:02.257910 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.257666 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7m5s7"
Apr 16 22:14:02.257910 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.257666 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvq6s"
Apr 16 22:14:02.260409 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.260387 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 22:14:02.260409 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.260398 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-28226\""
Apr 16 22:14:02.260582 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.260426 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 22:14:02.260582 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.260428 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 22:14:02.260684 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.260647 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 22:14:02.260684 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.260677 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-hqwt5\""
Apr 16 22:14:02.277303 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.277200 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-config-volume\") pod \"dns-default-xmqcp\" (UID: \"cbfccc2d-f25a-4d8a-bd22-25a929f12d64\") " pod="openshift-dns/dns-default-xmqcp"
Apr 16 22:14:02.277303 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.277234 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-tmp-dir\") pod \"dns-default-xmqcp\" (UID: \"cbfccc2d-f25a-4d8a-bd22-25a929f12d64\") " pod="openshift-dns/dns-default-xmqcp"
Apr 16 22:14:02.277434 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.277302 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4vxh6\" (UniqueName: \"kubernetes.io/projected/d146454c-862c-4665-b411-fd4c29e30335-kube-api-access-4vxh6\") pod \"ingress-canary-ffxpx\" (UID: \"d146454c-862c-4665-b411-fd4c29e30335\") " pod="openshift-ingress-canary/ingress-canary-ffxpx"
Apr 16 22:14:02.277434 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.277337 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t4785\" (UniqueName: \"kubernetes.io/projected/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-kube-api-access-t4785\") pod \"dns-default-xmqcp\" (UID: \"cbfccc2d-f25a-4d8a-bd22-25a929f12d64\") " pod="openshift-dns/dns-default-xmqcp"
Apr 16 22:14:02.277434 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.277386 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d146454c-862c-4665-b411-fd4c29e30335-cert\") pod \"ingress-canary-ffxpx\" (UID: \"d146454c-862c-4665-b411-fd4c29e30335\") " pod="openshift-ingress-canary/ingress-canary-ffxpx"
Apr 16 22:14:02.277434 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.277419 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-metrics-tls\") pod \"dns-default-xmqcp\" (UID: \"cbfccc2d-f25a-4d8a-bd22-25a929f12d64\") " pod="openshift-dns/dns-default-xmqcp"
Apr 16 22:14:02.277623 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:14:02.277522 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:02.277623 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:14:02.277567 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-metrics-tls podName:cbfccc2d-f25a-4d8a-bd22-25a929f12d64 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:02.777553174 +0000 UTC m=+33.145959303 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-metrics-tls") pod "dns-default-xmqcp" (UID: "cbfccc2d-f25a-4d8a-bd22-25a929f12d64") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:02.277730 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.277705 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-tmp-dir\") pod \"dns-default-xmqcp\" (UID: \"cbfccc2d-f25a-4d8a-bd22-25a929f12d64\") " pod="openshift-dns/dns-default-xmqcp"
Apr 16 22:14:02.277837 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:14:02.277812 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:02.277951 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:14:02.277880 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d146454c-862c-4665-b411-fd4c29e30335-cert podName:d146454c-862c-4665-b411-fd4c29e30335 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:02.777861403 +0000 UTC m=+33.146267543 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d146454c-862c-4665-b411-fd4c29e30335-cert") pod "ingress-canary-ffxpx" (UID: "d146454c-862c-4665-b411-fd4c29e30335") : secret "canary-serving-cert" not found
Apr 16 22:14:02.277951 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.277936 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-config-volume\") pod \"dns-default-xmqcp\" (UID: \"cbfccc2d-f25a-4d8a-bd22-25a929f12d64\") " pod="openshift-dns/dns-default-xmqcp"
Apr 16 22:14:02.290544 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.290521 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vxh6\" (UniqueName: \"kubernetes.io/projected/d146454c-862c-4665-b411-fd4c29e30335-kube-api-access-4vxh6\") pod \"ingress-canary-ffxpx\" (UID: \"d146454c-862c-4665-b411-fd4c29e30335\") " pod="openshift-ingress-canary/ingress-canary-ffxpx"
Apr 16 22:14:02.300043 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.300022 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4785\" (UniqueName: \"kubernetes.io/projected/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-kube-api-access-t4785\") pod \"dns-default-xmqcp\" (UID: \"cbfccc2d-f25a-4d8a-bd22-25a929f12d64\") " pod="openshift-dns/dns-default-xmqcp"
Apr 16 22:14:02.781758 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.781698 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d146454c-862c-4665-b411-fd4c29e30335-cert\") pod \"ingress-canary-ffxpx\" (UID: \"d146454c-862c-4665-b411-fd4c29e30335\") " pod="openshift-ingress-canary/ingress-canary-ffxpx"
Apr 16 22:14:02.781954 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.781780 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-metrics-tls\") pod \"dns-default-xmqcp\" (UID: \"cbfccc2d-f25a-4d8a-bd22-25a929f12d64\") " pod="openshift-dns/dns-default-xmqcp"
Apr 16 22:14:02.781954 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:14:02.781918 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:02.782062 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:14:02.781998 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-metrics-tls podName:cbfccc2d-f25a-4d8a-bd22-25a929f12d64 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:03.781978208 +0000 UTC m=+34.150384343 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-metrics-tls") pod "dns-default-xmqcp" (UID: "cbfccc2d-f25a-4d8a-bd22-25a929f12d64") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:02.782062 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:14:02.781919 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:02.782062 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:14:02.782037 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d146454c-862c-4665-b411-fd4c29e30335-cert podName:d146454c-862c-4665-b411-fd4c29e30335 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:03.782026793 +0000 UTC m=+34.150432936 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d146454c-862c-4665-b411-fd4c29e30335-cert") pod "ingress-canary-ffxpx" (UID: "d146454c-862c-4665-b411-fd4c29e30335") : secret "canary-serving-cert" not found
Apr 16 22:14:02.882136 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.882099 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c220c5af-4b42-4b44-a789-17aa37d44b90-metrics-certs\") pod \"network-metrics-daemon-wvq6s\" (UID: \"c220c5af-4b42-4b44-a789-17aa37d44b90\") " pod="openshift-multus/network-metrics-daemon-wvq6s"
Apr 16 22:14:02.882273 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:14:02.882220 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 22:14:02.882323 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:14:02.882285 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c220c5af-4b42-4b44-a789-17aa37d44b90-metrics-certs podName:c220c5af-4b42-4b44-a789-17aa37d44b90 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:34.882269346 +0000 UTC m=+65.250675475 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c220c5af-4b42-4b44-a789-17aa37d44b90-metrics-certs") pod "network-metrics-daemon-wvq6s" (UID: "c220c5af-4b42-4b44-a789-17aa37d44b90") : secret "metrics-daemon-secret" not found
Apr 16 22:14:02.983160 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.983120 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xscg5\" (UniqueName: \"kubernetes.io/projected/65a893c9-3b9b-48c6-a82b-6236d443cacf-kube-api-access-xscg5\") pod \"network-check-target-4pbqm\" (UID: \"65a893c9-3b9b-48c6-a82b-6236d443cacf\") " pod="openshift-network-diagnostics/network-check-target-4pbqm"
Apr 16 22:14:02.985561 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:02.985537 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xscg5\" (UniqueName: \"kubernetes.io/projected/65a893c9-3b9b-48c6-a82b-6236d443cacf-kube-api-access-xscg5\") pod \"network-check-target-4pbqm\" (UID: \"65a893c9-3b9b-48c6-a82b-6236d443cacf\") " pod="openshift-network-diagnostics/network-check-target-4pbqm"
Apr 16 22:14:03.174094 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:03.174014 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4pbqm"
Apr 16 22:14:03.435398 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:03.435334 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4pbqm"]
Apr 16 22:14:03.440313 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:14:03.440283 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65a893c9_3b9b_48c6_a82b_6236d443cacf.slice/crio-595d9f82ae03fbea4b146ead8a4320da6dd983e777eaa9ee5be830a8396eb7fb WatchSource:0}: Error finding container 595d9f82ae03fbea4b146ead8a4320da6dd983e777eaa9ee5be830a8396eb7fb: Status 404 returned error can't find the container with id 595d9f82ae03fbea4b146ead8a4320da6dd983e777eaa9ee5be830a8396eb7fb
Apr 16 22:14:03.473906 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:03.473729 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dx88j" event={"ID":"38e7a963-729e-40af-91e7-9fa6910bc258","Type":"ContainerStarted","Data":"a65de0dde9dbdf836bada6f8db67f5a14795eebe8f8fe70d52b521457d97e7d7"}
Apr 16 22:14:03.474773 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:03.474722 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4pbqm" event={"ID":"65a893c9-3b9b-48c6-a82b-6236d443cacf","Type":"ContainerStarted","Data":"595d9f82ae03fbea4b146ead8a4320da6dd983e777eaa9ee5be830a8396eb7fb"}
Apr 16 22:14:03.789144 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:03.789111 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d146454c-862c-4665-b411-fd4c29e30335-cert\") pod \"ingress-canary-ffxpx\" (UID: \"d146454c-862c-4665-b411-fd4c29e30335\") " pod="openshift-ingress-canary/ingress-canary-ffxpx"
Apr 16 22:14:03.789298 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:03.789158 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-metrics-tls\") pod \"dns-default-xmqcp\" (UID: \"cbfccc2d-f25a-4d8a-bd22-25a929f12d64\") " pod="openshift-dns/dns-default-xmqcp"
Apr 16 22:14:03.789298 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:14:03.789254 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:03.789368 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:14:03.789315 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d146454c-862c-4665-b411-fd4c29e30335-cert podName:d146454c-862c-4665-b411-fd4c29e30335 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:05.789300159 +0000 UTC m=+36.157706291 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d146454c-862c-4665-b411-fd4c29e30335-cert") pod "ingress-canary-ffxpx" (UID: "d146454c-862c-4665-b411-fd4c29e30335") : secret "canary-serving-cert" not found
Apr 16 22:14:03.789368 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:14:03.789262 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:03.789368 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:14:03.789347 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-metrics-tls podName:cbfccc2d-f25a-4d8a-bd22-25a929f12d64 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:05.78934053 +0000 UTC m=+36.157746664 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-metrics-tls") pod "dns-default-xmqcp" (UID: "cbfccc2d-f25a-4d8a-bd22-25a929f12d64") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:04.479608 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:04.479572 2574 generic.go:358] "Generic (PLEG): container finished" podID="38e7a963-729e-40af-91e7-9fa6910bc258" containerID="a65de0dde9dbdf836bada6f8db67f5a14795eebe8f8fe70d52b521457d97e7d7" exitCode=0
Apr 16 22:14:04.480073 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:04.479628 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dx88j" event={"ID":"38e7a963-729e-40af-91e7-9fa6910bc258","Type":"ContainerDied","Data":"a65de0dde9dbdf836bada6f8db67f5a14795eebe8f8fe70d52b521457d97e7d7"}
Apr 16 22:14:05.484657 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:05.484617 2574 generic.go:358] "Generic (PLEG): container finished" podID="38e7a963-729e-40af-91e7-9fa6910bc258" containerID="f30de077c38b07496258b6d5122582f99ed59f6d2b6a3b3e79df9df958359371" exitCode=0
Apr 16 22:14:05.485177 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:05.484696 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dx88j" event={"ID":"38e7a963-729e-40af-91e7-9fa6910bc258","Type":"ContainerDied","Data":"f30de077c38b07496258b6d5122582f99ed59f6d2b6a3b3e79df9df958359371"}
Apr 16 22:14:05.805215 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:05.805169 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d146454c-862c-4665-b411-fd4c29e30335-cert\") pod \"ingress-canary-ffxpx\" (UID: \"d146454c-862c-4665-b411-fd4c29e30335\") " pod="openshift-ingress-canary/ingress-canary-ffxpx"
Apr 16 22:14:05.805215 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:05.805218 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-metrics-tls\") pod \"dns-default-xmqcp\" (UID: \"cbfccc2d-f25a-4d8a-bd22-25a929f12d64\") " pod="openshift-dns/dns-default-xmqcp"
Apr 16 22:14:05.805451 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:14:05.805326 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:05.805451 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:14:05.805354 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:05.805451 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:14:05.805407 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d146454c-862c-4665-b411-fd4c29e30335-cert podName:d146454c-862c-4665-b411-fd4c29e30335 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:09.805388241 +0000 UTC m=+40.173794372 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d146454c-862c-4665-b411-fd4c29e30335-cert") pod "ingress-canary-ffxpx" (UID: "d146454c-862c-4665-b411-fd4c29e30335") : secret "canary-serving-cert" not found
Apr 16 22:14:05.805451 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:14:05.805423 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-metrics-tls podName:cbfccc2d-f25a-4d8a-bd22-25a929f12d64 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:09.805416894 +0000 UTC m=+40.173823023 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-metrics-tls") pod "dns-default-xmqcp" (UID: "cbfccc2d-f25a-4d8a-bd22-25a929f12d64") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:06.491129 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:06.491105 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dx88j" event={"ID":"38e7a963-729e-40af-91e7-9fa6910bc258","Type":"ContainerStarted","Data":"9c3f05a3c1dc62a70bd350e4772c36864ccfafe58ecc9db03c4d3e92a98e7161"}
Apr 16 22:14:06.514351 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:06.514267 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-dx88j" podStartSLOduration=4.762134543 podStartE2EDuration="36.514254538s" podCreationTimestamp="2026-04-16 22:13:30 +0000 UTC" firstStartedPulling="2026-04-16 22:13:31.521612309 +0000 UTC m=+1.890018441" lastFinishedPulling="2026-04-16 22:14:03.273732289 +0000 UTC m=+33.642138436" observedRunningTime="2026-04-16 22:14:06.512999981 +0000 UTC m=+36.881406125" watchObservedRunningTime="2026-04-16 22:14:06.514254538 +0000 UTC m=+36.882660688"
Apr 16 22:14:07.495207 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:07.495173 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4pbqm" event={"ID":"65a893c9-3b9b-48c6-a82b-6236d443cacf","Type":"ContainerStarted","Data":"1de5db32c784b5da3cfb1d44ed53a057d98b72862ea98609f8e17b3427e4851d"}
Apr 16 22:14:07.495783 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:07.495500 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-4pbqm"
Apr 16 22:14:07.512637 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:07.512590 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-4pbqm" podStartSLOduration=34.558592656 podStartE2EDuration="37.512577742s" podCreationTimestamp="2026-04-16 22:13:30 +0000 UTC" firstStartedPulling="2026-04-16 22:14:03.442451583 +0000 UTC m=+33.810857726" lastFinishedPulling="2026-04-16 22:14:06.396436667 +0000 UTC m=+36.764842812" observedRunningTime="2026-04-16 22:14:07.511432103 +0000 UTC m=+37.879838255" watchObservedRunningTime="2026-04-16 22:14:07.512577742 +0000 UTC m=+37.880983893"
Apr 16 22:14:09.835503 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:09.835464 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d146454c-862c-4665-b411-fd4c29e30335-cert\") pod \"ingress-canary-ffxpx\" (UID: \"d146454c-862c-4665-b411-fd4c29e30335\") " pod="openshift-ingress-canary/ingress-canary-ffxpx"
Apr 16 22:14:09.835898 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:09.835509 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-metrics-tls\") pod \"dns-default-xmqcp\" (UID: \"cbfccc2d-f25a-4d8a-bd22-25a929f12d64\") " pod="openshift-dns/dns-default-xmqcp"
Apr 16 22:14:09.835898 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:14:09.835599 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:09.835898 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:14:09.835605 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:09.835898 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:14:09.835660 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-metrics-tls podName:cbfccc2d-f25a-4d8a-bd22-25a929f12d64 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:17.835644234 +0000 UTC m=+48.204050367 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-metrics-tls") pod "dns-default-xmqcp" (UID: "cbfccc2d-f25a-4d8a-bd22-25a929f12d64") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:09.835898 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:14:09.835673 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d146454c-862c-4665-b411-fd4c29e30335-cert podName:d146454c-862c-4665-b411-fd4c29e30335 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:17.835667234 +0000 UTC m=+48.204073363 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d146454c-862c-4665-b411-fd4c29e30335-cert") pod "ingress-canary-ffxpx" (UID: "d146454c-862c-4665-b411-fd4c29e30335") : secret "canary-serving-cert" not found
Apr 16 22:14:10.962933 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:10.962897 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-789c5c85c5-nmwzr"]
Apr 16 22:14:10.999713 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:10.999686 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-bfc757575-sd5j9"]
Apr 16 22:14:10.999871 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:10.999844 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-789c5c85c5-nmwzr"
Apr 16 22:14:11.002209 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:11.002183 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 16 22:14:11.002390 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:11.002296 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 16 22:14:11.002390 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:11.002325 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 16 22:14:11.002390 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:11.002296 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 16 22:14:11.021160 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:11.021135 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-bfc757575-sd5j9"]
Apr 16 22:14:11.021160 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:11.021161 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-789c5c85c5-nmwzr"]
Apr 16 22:14:11.021313 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:11.021256 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-bfc757575-sd5j9"
Apr 16 22:14:11.023483 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:11.023465 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-2hlzk\""
Apr 16 22:14:11.023561 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:11.023519 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 16 22:14:11.143252 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:11.143211 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/684d3477-17da-4111-9ac7-84233c26da51-klusterlet-config\") pod \"klusterlet-addon-workmgr-789c5c85c5-nmwzr\" (UID: \"684d3477-17da-4111-9ac7-84233c26da51\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-789c5c85c5-nmwzr"
Apr 16 22:14:11.143426 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:11.143256 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98jvd\" (UniqueName: \"kubernetes.io/projected/8c091a7d-1fb3-42d3-992d-581e854f4fd6-kube-api-access-98jvd\") pod \"managed-serviceaccount-addon-agent-bfc757575-sd5j9\" (UID: \"8c091a7d-1fb3-42d3-992d-581e854f4fd6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-bfc757575-sd5j9"
Apr 16 22:14:11.143426 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:11.143288 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xprn\" (UniqueName: \"kubernetes.io/projected/684d3477-17da-4111-9ac7-84233c26da51-kube-api-access-4xprn\") pod \"klusterlet-addon-workmgr-789c5c85c5-nmwzr\" (UID: \"684d3477-17da-4111-9ac7-84233c26da51\")
" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-789c5c85c5-nmwzr" Apr 16 22:14:11.143426 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:11.143365 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8c091a7d-1fb3-42d3-992d-581e854f4fd6-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-bfc757575-sd5j9\" (UID: \"8c091a7d-1fb3-42d3-992d-581e854f4fd6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-bfc757575-sd5j9" Apr 16 22:14:11.143426 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:11.143423 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/684d3477-17da-4111-9ac7-84233c26da51-tmp\") pod \"klusterlet-addon-workmgr-789c5c85c5-nmwzr\" (UID: \"684d3477-17da-4111-9ac7-84233c26da51\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-789c5c85c5-nmwzr" Apr 16 22:14:11.244580 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:11.244553 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/684d3477-17da-4111-9ac7-84233c26da51-klusterlet-config\") pod \"klusterlet-addon-workmgr-789c5c85c5-nmwzr\" (UID: \"684d3477-17da-4111-9ac7-84233c26da51\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-789c5c85c5-nmwzr" Apr 16 22:14:11.244718 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:11.244587 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98jvd\" (UniqueName: \"kubernetes.io/projected/8c091a7d-1fb3-42d3-992d-581e854f4fd6-kube-api-access-98jvd\") pod \"managed-serviceaccount-addon-agent-bfc757575-sd5j9\" (UID: \"8c091a7d-1fb3-42d3-992d-581e854f4fd6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-bfc757575-sd5j9" Apr 16 
22:14:11.244718 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:11.244606 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4xprn\" (UniqueName: \"kubernetes.io/projected/684d3477-17da-4111-9ac7-84233c26da51-kube-api-access-4xprn\") pod \"klusterlet-addon-workmgr-789c5c85c5-nmwzr\" (UID: \"684d3477-17da-4111-9ac7-84233c26da51\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-789c5c85c5-nmwzr" Apr 16 22:14:11.244718 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:11.244647 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8c091a7d-1fb3-42d3-992d-581e854f4fd6-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-bfc757575-sd5j9\" (UID: \"8c091a7d-1fb3-42d3-992d-581e854f4fd6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-bfc757575-sd5j9" Apr 16 22:14:11.244906 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:11.244717 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/684d3477-17da-4111-9ac7-84233c26da51-tmp\") pod \"klusterlet-addon-workmgr-789c5c85c5-nmwzr\" (UID: \"684d3477-17da-4111-9ac7-84233c26da51\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-789c5c85c5-nmwzr" Apr 16 22:14:11.245098 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:11.245078 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/684d3477-17da-4111-9ac7-84233c26da51-tmp\") pod \"klusterlet-addon-workmgr-789c5c85c5-nmwzr\" (UID: \"684d3477-17da-4111-9ac7-84233c26da51\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-789c5c85c5-nmwzr" Apr 16 22:14:11.247986 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:11.247963 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/8c091a7d-1fb3-42d3-992d-581e854f4fd6-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-bfc757575-sd5j9\" (UID: \"8c091a7d-1fb3-42d3-992d-581e854f4fd6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-bfc757575-sd5j9" Apr 16 22:14:11.248068 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:11.248047 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/684d3477-17da-4111-9ac7-84233c26da51-klusterlet-config\") pod \"klusterlet-addon-workmgr-789c5c85c5-nmwzr\" (UID: \"684d3477-17da-4111-9ac7-84233c26da51\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-789c5c85c5-nmwzr" Apr 16 22:14:11.252576 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:11.252547 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xprn\" (UniqueName: \"kubernetes.io/projected/684d3477-17da-4111-9ac7-84233c26da51-kube-api-access-4xprn\") pod \"klusterlet-addon-workmgr-789c5c85c5-nmwzr\" (UID: \"684d3477-17da-4111-9ac7-84233c26da51\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-789c5c85c5-nmwzr" Apr 16 22:14:11.252716 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:11.252698 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-98jvd\" (UniqueName: \"kubernetes.io/projected/8c091a7d-1fb3-42d3-992d-581e854f4fd6-kube-api-access-98jvd\") pod \"managed-serviceaccount-addon-agent-bfc757575-sd5j9\" (UID: \"8c091a7d-1fb3-42d3-992d-581e854f4fd6\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-bfc757575-sd5j9" Apr 16 22:14:11.309622 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:11.309585 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-789c5c85c5-nmwzr" Apr 16 22:14:11.340598 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:11.340569 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-bfc757575-sd5j9" Apr 16 22:14:11.439699 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:11.439651 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-789c5c85c5-nmwzr"] Apr 16 22:14:11.443669 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:14:11.443635 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod684d3477_17da_4111_9ac7_84233c26da51.slice/crio-ef3f4b630a7da835a5a0095d3cb8e80db3f11d96365efded0728475c1f23d7cd WatchSource:0}: Error finding container ef3f4b630a7da835a5a0095d3cb8e80db3f11d96365efded0728475c1f23d7cd: Status 404 returned error can't find the container with id ef3f4b630a7da835a5a0095d3cb8e80db3f11d96365efded0728475c1f23d7cd Apr 16 22:14:11.480480 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:11.480452 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-bfc757575-sd5j9"] Apr 16 22:14:11.499533 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:14:11.499482 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c091a7d_1fb3_42d3_992d_581e854f4fd6.slice/crio-551a3527cf7b178d2c194db722425bbadfaaa8b60b399a4a8554d651bafe46ac WatchSource:0}: Error finding container 551a3527cf7b178d2c194db722425bbadfaaa8b60b399a4a8554d651bafe46ac: Status 404 returned error can't find the container with id 551a3527cf7b178d2c194db722425bbadfaaa8b60b399a4a8554d651bafe46ac Apr 16 22:14:11.503110 ip-10-0-133-72 kubenswrapper[2574]: I0416 
22:14:11.503081 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-bfc757575-sd5j9" event={"ID":"8c091a7d-1fb3-42d3-992d-581e854f4fd6","Type":"ContainerStarted","Data":"551a3527cf7b178d2c194db722425bbadfaaa8b60b399a4a8554d651bafe46ac"} Apr 16 22:14:11.503993 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:11.503969 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-789c5c85c5-nmwzr" event={"ID":"684d3477-17da-4111-9ac7-84233c26da51","Type":"ContainerStarted","Data":"ef3f4b630a7da835a5a0095d3cb8e80db3f11d96365efded0728475c1f23d7cd"} Apr 16 22:14:12.453385 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:12.453351 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f60e0c7e-0c9f-4696-ba69-04969deb255d-original-pull-secret\") pod \"global-pull-secret-syncer-7m5s7\" (UID: \"f60e0c7e-0c9f-4696-ba69-04969deb255d\") " pod="kube-system/global-pull-secret-syncer-7m5s7" Apr 16 22:14:12.457118 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:12.457092 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f60e0c7e-0c9f-4696-ba69-04969deb255d-original-pull-secret\") pod \"global-pull-secret-syncer-7m5s7\" (UID: \"f60e0c7e-0c9f-4696-ba69-04969deb255d\") " pod="kube-system/global-pull-secret-syncer-7m5s7" Apr 16 22:14:12.479882 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:12.479846 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-7m5s7" Apr 16 22:14:12.615659 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:12.615626 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7m5s7"] Apr 16 22:14:12.625461 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:14:12.625429 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf60e0c7e_0c9f_4696_ba69_04969deb255d.slice/crio-3619bbe116b4fbc5c9aa29c619813696548b5c5b46f8b6dfec3bf21c39e173aa WatchSource:0}: Error finding container 3619bbe116b4fbc5c9aa29c619813696548b5c5b46f8b6dfec3bf21c39e173aa: Status 404 returned error can't find the container with id 3619bbe116b4fbc5c9aa29c619813696548b5c5b46f8b6dfec3bf21c39e173aa Apr 16 22:14:13.511366 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:13.511329 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7m5s7" event={"ID":"f60e0c7e-0c9f-4696-ba69-04969deb255d","Type":"ContainerStarted","Data":"3619bbe116b4fbc5c9aa29c619813696548b5c5b46f8b6dfec3bf21c39e173aa"} Apr 16 22:14:17.896574 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:17.896541 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d146454c-862c-4665-b411-fd4c29e30335-cert\") pod \"ingress-canary-ffxpx\" (UID: \"d146454c-862c-4665-b411-fd4c29e30335\") " pod="openshift-ingress-canary/ingress-canary-ffxpx" Apr 16 22:14:17.896574 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:17.896579 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-metrics-tls\") pod \"dns-default-xmqcp\" (UID: \"cbfccc2d-f25a-4d8a-bd22-25a929f12d64\") " pod="openshift-dns/dns-default-xmqcp" Apr 16 22:14:17.897038 ip-10-0-133-72 kubenswrapper[2574]: E0416 
22:14:17.896680 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:17.897038 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:14:17.896703 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:17.897038 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:14:17.896736 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-metrics-tls podName:cbfccc2d-f25a-4d8a-bd22-25a929f12d64 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:33.896719261 +0000 UTC m=+64.265125392 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-metrics-tls") pod "dns-default-xmqcp" (UID: "cbfccc2d-f25a-4d8a-bd22-25a929f12d64") : secret "dns-default-metrics-tls" not found Apr 16 22:14:17.897038 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:14:17.896788 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d146454c-862c-4665-b411-fd4c29e30335-cert podName:d146454c-862c-4665-b411-fd4c29e30335 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:33.896770264 +0000 UTC m=+64.265176409 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d146454c-862c-4665-b411-fd4c29e30335-cert") pod "ingress-canary-ffxpx" (UID: "d146454c-862c-4665-b411-fd4c29e30335") : secret "canary-serving-cert" not found Apr 16 22:14:18.523596 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:18.523562 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-789c5c85c5-nmwzr" event={"ID":"684d3477-17da-4111-9ac7-84233c26da51","Type":"ContainerStarted","Data":"445a2e79c60c865fa187f508e24f6b5ae375bbb6c72b4735d2ed6b96ed73a4f7"} Apr 16 22:14:18.523825 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:18.523704 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-789c5c85c5-nmwzr" Apr 16 22:14:18.524913 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:18.524870 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-bfc757575-sd5j9" event={"ID":"8c091a7d-1fb3-42d3-992d-581e854f4fd6","Type":"ContainerStarted","Data":"1907217c064553a77c288e54a46f57de636591a417b153f995101d7644fa9af4"} Apr 16 22:14:18.525661 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:18.525640 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-789c5c85c5-nmwzr" Apr 16 22:14:18.526136 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:18.526113 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7m5s7" event={"ID":"f60e0c7e-0c9f-4696-ba69-04969deb255d","Type":"ContainerStarted","Data":"df51592cc0dd006579137a945e94fe160d5df0519da63621f8802e3b549d55d1"} Apr 16 22:14:18.540785 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:18.540726 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-789c5c85c5-nmwzr" podStartSLOduration=1.810469097 podStartE2EDuration="8.540714721s" podCreationTimestamp="2026-04-16 22:14:10 +0000 UTC" firstStartedPulling="2026-04-16 22:14:11.445631419 +0000 UTC m=+41.814037560" lastFinishedPulling="2026-04-16 22:14:18.175877053 +0000 UTC m=+48.544283184" observedRunningTime="2026-04-16 22:14:18.539892304 +0000 UTC m=+48.908298454" watchObservedRunningTime="2026-04-16 22:14:18.540714721 +0000 UTC m=+48.909120880" Apr 16 22:14:18.561203 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:18.561163 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-7m5s7" podStartSLOduration=33.000954145 podStartE2EDuration="38.56115205s" podCreationTimestamp="2026-04-16 22:13:40 +0000 UTC" firstStartedPulling="2026-04-16 22:14:12.627793133 +0000 UTC m=+42.996199273" lastFinishedPulling="2026-04-16 22:14:18.187991049 +0000 UTC m=+48.556397178" observedRunningTime="2026-04-16 22:14:18.560267767 +0000 UTC m=+48.928673918" watchObservedRunningTime="2026-04-16 22:14:18.56115205 +0000 UTC m=+48.929558202" Apr 16 22:14:18.596320 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:18.596273 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-bfc757575-sd5j9" podStartSLOduration=1.922270242 podStartE2EDuration="8.59625834s" podCreationTimestamp="2026-04-16 22:14:10 +0000 UTC" firstStartedPulling="2026-04-16 22:14:11.501436564 +0000 UTC m=+41.869842707" lastFinishedPulling="2026-04-16 22:14:18.175424671 +0000 UTC m=+48.543830805" observedRunningTime="2026-04-16 22:14:18.595862987 +0000 UTC m=+48.964269142" watchObservedRunningTime="2026-04-16 22:14:18.59625834 +0000 UTC m=+48.964664490" Apr 16 22:14:27.470379 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:27.470350 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-hm4t5" Apr 16 22:14:33.905419 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:33.905381 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d146454c-862c-4665-b411-fd4c29e30335-cert\") pod \"ingress-canary-ffxpx\" (UID: \"d146454c-862c-4665-b411-fd4c29e30335\") " pod="openshift-ingress-canary/ingress-canary-ffxpx" Apr 16 22:14:33.905419 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:33.905424 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-metrics-tls\") pod \"dns-default-xmqcp\" (UID: \"cbfccc2d-f25a-4d8a-bd22-25a929f12d64\") " pod="openshift-dns/dns-default-xmqcp" Apr 16 22:14:33.905900 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:14:33.905533 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:14:33.905900 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:14:33.905588 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-metrics-tls podName:cbfccc2d-f25a-4d8a-bd22-25a929f12d64 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:05.905574462 +0000 UTC m=+96.273980595 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-metrics-tls") pod "dns-default-xmqcp" (UID: "cbfccc2d-f25a-4d8a-bd22-25a929f12d64") : secret "dns-default-metrics-tls" not found Apr 16 22:14:33.905900 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:14:33.905534 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:33.905900 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:14:33.905678 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d146454c-862c-4665-b411-fd4c29e30335-cert podName:d146454c-862c-4665-b411-fd4c29e30335 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:05.905666796 +0000 UTC m=+96.274072929 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d146454c-862c-4665-b411-fd4c29e30335-cert") pod "ingress-canary-ffxpx" (UID: "d146454c-862c-4665-b411-fd4c29e30335") : secret "canary-serving-cert" not found Apr 16 22:14:34.912364 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:34.912326 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c220c5af-4b42-4b44-a789-17aa37d44b90-metrics-certs\") pod \"network-metrics-daemon-wvq6s\" (UID: \"c220c5af-4b42-4b44-a789-17aa37d44b90\") " pod="openshift-multus/network-metrics-daemon-wvq6s" Apr 16 22:14:34.912814 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:14:34.912470 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 22:14:34.912814 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:14:34.912543 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c220c5af-4b42-4b44-a789-17aa37d44b90-metrics-certs podName:c220c5af-4b42-4b44-a789-17aa37d44b90 nodeName:}" failed. 
No retries permitted until 2026-04-16 22:15:38.912525295 +0000 UTC m=+129.280931429 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c220c5af-4b42-4b44-a789-17aa37d44b90-metrics-certs") pod "network-metrics-daemon-wvq6s" (UID: "c220c5af-4b42-4b44-a789-17aa37d44b90") : secret "metrics-daemon-secret" not found Apr 16 22:14:38.499300 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:14:38.499268 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-4pbqm" Apr 16 22:15:05.926599 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:05.926569 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d146454c-862c-4665-b411-fd4c29e30335-cert\") pod \"ingress-canary-ffxpx\" (UID: \"d146454c-862c-4665-b411-fd4c29e30335\") " pod="openshift-ingress-canary/ingress-canary-ffxpx" Apr 16 22:15:05.926933 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:05.926609 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-metrics-tls\") pod \"dns-default-xmqcp\" (UID: \"cbfccc2d-f25a-4d8a-bd22-25a929f12d64\") " pod="openshift-dns/dns-default-xmqcp" Apr 16 22:15:05.926933 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:15:05.926719 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:15:05.926933 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:15:05.926723 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:15:05.926933 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:15:05.926799 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-metrics-tls 
podName:cbfccc2d-f25a-4d8a-bd22-25a929f12d64 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:09.926783413 +0000 UTC m=+160.295189546 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-metrics-tls") pod "dns-default-xmqcp" (UID: "cbfccc2d-f25a-4d8a-bd22-25a929f12d64") : secret "dns-default-metrics-tls" not found Apr 16 22:15:05.926933 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:15:05.926812 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d146454c-862c-4665-b411-fd4c29e30335-cert podName:d146454c-862c-4665-b411-fd4c29e30335 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:09.926806128 +0000 UTC m=+160.295212257 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d146454c-862c-4665-b411-fd4c29e30335-cert") pod "ingress-canary-ffxpx" (UID: "d146454c-862c-4665-b411-fd4c29e30335") : secret "canary-serving-cert" not found Apr 16 22:15:20.928755 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:20.928710 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5fb7ff9769-w4x49"] Apr 16 22:15:20.931449 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:20.931427 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-v4dxt"] Apr 16 22:15:20.931594 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:20.931574 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49" Apr 16 22:15:20.933834 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:20.933813 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 22:15:20.934108 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:20.934084 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-v4dxt" Apr 16 22:15:20.934624 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:20.934608 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 22:15:20.934710 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:20.934635 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 22:15:20.934710 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:20.934652 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-s46c2\"" Apr 16 22:15:20.938362 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:20.937820 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-l5bqs\"" Apr 16 22:15:20.938362 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:20.937884 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 22:15:20.939131 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:20.939107 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:15:20.939428 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:20.939404 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 22:15:20.939515 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:20.939500 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 22:15:20.944513 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:20.944493 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 22:15:20.945298 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:20.945282 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 22:15:20.949376 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:20.949357 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5fb7ff9769-w4x49"] Apr 16 22:15:20.950125 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:20.950104 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-v4dxt"] Apr 16 22:15:21.026230 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.026200 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2c4c4eef-b842-4810-b506-7094264f295f-ca-trust-extracted\") pod \"image-registry-5fb7ff9769-w4x49\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") " pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49" Apr 16 22:15:21.026230 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.026233 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2c4c4eef-b842-4810-b506-7094264f295f-image-registry-private-configuration\") pod \"image-registry-5fb7ff9769-w4x49\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") " 
pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49"
Apr 16 22:15:21.026408 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.026251 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c4c4eef-b842-4810-b506-7094264f295f-registry-tls\") pod \"image-registry-5fb7ff9769-w4x49\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") " pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49"
Apr 16 22:15:21.026408 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.026276 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqj5c\" (UniqueName: \"kubernetes.io/projected/dba969ac-c53c-4354-ab6e-cc853b8c1449-kube-api-access-nqj5c\") pod \"console-operator-9d4b6777b-v4dxt\" (UID: \"dba969ac-c53c-4354-ab6e-cc853b8c1449\") " pod="openshift-console-operator/console-operator-9d4b6777b-v4dxt"
Apr 16 22:15:21.026408 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.026312 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2c4c4eef-b842-4810-b506-7094264f295f-installation-pull-secrets\") pod \"image-registry-5fb7ff9769-w4x49\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") " pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49"
Apr 16 22:15:21.026408 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.026345 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2f5j\" (UniqueName: \"kubernetes.io/projected/2c4c4eef-b842-4810-b506-7094264f295f-kube-api-access-r2f5j\") pod \"image-registry-5fb7ff9769-w4x49\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") " pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49"
Apr 16 22:15:21.026408 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.026391 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dba969ac-c53c-4354-ab6e-cc853b8c1449-config\") pod \"console-operator-9d4b6777b-v4dxt\" (UID: \"dba969ac-c53c-4354-ab6e-cc853b8c1449\") " pod="openshift-console-operator/console-operator-9d4b6777b-v4dxt"
Apr 16 22:15:21.026408 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.026407 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c4c4eef-b842-4810-b506-7094264f295f-bound-sa-token\") pod \"image-registry-5fb7ff9769-w4x49\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") " pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49"
Apr 16 22:15:21.026582 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.026426 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dba969ac-c53c-4354-ab6e-cc853b8c1449-serving-cert\") pod \"console-operator-9d4b6777b-v4dxt\" (UID: \"dba969ac-c53c-4354-ab6e-cc853b8c1449\") " pod="openshift-console-operator/console-operator-9d4b6777b-v4dxt"
Apr 16 22:15:21.026582 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.026441 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c4c4eef-b842-4810-b506-7094264f295f-trusted-ca\") pod \"image-registry-5fb7ff9769-w4x49\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") " pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49"
Apr 16 22:15:21.026582 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.026490 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dba969ac-c53c-4354-ab6e-cc853b8c1449-trusted-ca\") pod \"console-operator-9d4b6777b-v4dxt\" (UID: \"dba969ac-c53c-4354-ab6e-cc853b8c1449\") " pod="openshift-console-operator/console-operator-9d4b6777b-v4dxt"
Apr 16 22:15:21.026582 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.026509 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2c4c4eef-b842-4810-b506-7094264f295f-registry-certificates\") pod \"image-registry-5fb7ff9769-w4x49\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") " pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49"
Apr 16 22:15:21.127660 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.127634 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2c4c4eef-b842-4810-b506-7094264f295f-ca-trust-extracted\") pod \"image-registry-5fb7ff9769-w4x49\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") " pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49"
Apr 16 22:15:21.127792 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.127665 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2c4c4eef-b842-4810-b506-7094264f295f-image-registry-private-configuration\") pod \"image-registry-5fb7ff9769-w4x49\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") " pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49"
Apr 16 22:15:21.127792 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.127685 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c4c4eef-b842-4810-b506-7094264f295f-registry-tls\") pod \"image-registry-5fb7ff9769-w4x49\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") " pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49"
Apr 16 22:15:21.127792 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.127703 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqj5c\" (UniqueName: \"kubernetes.io/projected/dba969ac-c53c-4354-ab6e-cc853b8c1449-kube-api-access-nqj5c\") pod \"console-operator-9d4b6777b-v4dxt\" (UID: \"dba969ac-c53c-4354-ab6e-cc853b8c1449\") " pod="openshift-console-operator/console-operator-9d4b6777b-v4dxt"
Apr 16 22:15:21.127792 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.127727 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2c4c4eef-b842-4810-b506-7094264f295f-installation-pull-secrets\") pod \"image-registry-5fb7ff9769-w4x49\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") " pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49"
Apr 16 22:15:21.127983 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.127805 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2f5j\" (UniqueName: \"kubernetes.io/projected/2c4c4eef-b842-4810-b506-7094264f295f-kube-api-access-r2f5j\") pod \"image-registry-5fb7ff9769-w4x49\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") " pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49"
Apr 16 22:15:21.127983 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:15:21.127818 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 22:15:21.127983 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:15:21.127835 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5fb7ff9769-w4x49: secret "image-registry-tls" not found
Apr 16 22:15:21.127983 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.127844 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dba969ac-c53c-4354-ab6e-cc853b8c1449-config\") pod \"console-operator-9d4b6777b-v4dxt\" (UID: \"dba969ac-c53c-4354-ab6e-cc853b8c1449\") " pod="openshift-console-operator/console-operator-9d4b6777b-v4dxt"
Apr 16 22:15:21.127983 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.127868 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c4c4eef-b842-4810-b506-7094264f295f-bound-sa-token\") pod \"image-registry-5fb7ff9769-w4x49\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") " pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49"
Apr 16 22:15:21.127983 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:15:21.127895 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c4c4eef-b842-4810-b506-7094264f295f-registry-tls podName:2c4c4eef-b842-4810-b506-7094264f295f nodeName:}" failed. No retries permitted until 2026-04-16 22:15:21.627873103 +0000 UTC m=+111.996279255 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2c4c4eef-b842-4810-b506-7094264f295f-registry-tls") pod "image-registry-5fb7ff9769-w4x49" (UID: "2c4c4eef-b842-4810-b506-7094264f295f") : secret "image-registry-tls" not found
Apr 16 22:15:21.127983 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.127926 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dba969ac-c53c-4354-ab6e-cc853b8c1449-serving-cert\") pod \"console-operator-9d4b6777b-v4dxt\" (UID: \"dba969ac-c53c-4354-ab6e-cc853b8c1449\") " pod="openshift-console-operator/console-operator-9d4b6777b-v4dxt"
Apr 16 22:15:21.128383 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.128055 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2c4c4eef-b842-4810-b506-7094264f295f-ca-trust-extracted\") pod \"image-registry-5fb7ff9769-w4x49\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") " pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49"
Apr 16 22:15:21.128383 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.128070 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c4c4eef-b842-4810-b506-7094264f295f-trusted-ca\") pod \"image-registry-5fb7ff9769-w4x49\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") " pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49"
Apr 16 22:15:21.128383 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.128156 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dba969ac-c53c-4354-ab6e-cc853b8c1449-trusted-ca\") pod \"console-operator-9d4b6777b-v4dxt\" (UID: \"dba969ac-c53c-4354-ab6e-cc853b8c1449\") " pod="openshift-console-operator/console-operator-9d4b6777b-v4dxt"
Apr 16 22:15:21.128383
ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.128286 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2c4c4eef-b842-4810-b506-7094264f295f-registry-certificates\") pod \"image-registry-5fb7ff9769-w4x49\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") " pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49"
Apr 16 22:15:21.128937 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.128908 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dba969ac-c53c-4354-ab6e-cc853b8c1449-config\") pod \"console-operator-9d4b6777b-v4dxt\" (UID: \"dba969ac-c53c-4354-ab6e-cc853b8c1449\") " pod="openshift-console-operator/console-operator-9d4b6777b-v4dxt"
Apr 16 22:15:21.129045 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.128990 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2c4c4eef-b842-4810-b506-7094264f295f-registry-certificates\") pod \"image-registry-5fb7ff9769-w4x49\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") " pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49"
Apr 16 22:15:21.129110 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.129065 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dba969ac-c53c-4354-ab6e-cc853b8c1449-trusted-ca\") pod \"console-operator-9d4b6777b-v4dxt\" (UID: \"dba969ac-c53c-4354-ab6e-cc853b8c1449\") " pod="openshift-console-operator/console-operator-9d4b6777b-v4dxt"
Apr 16 22:15:21.129164 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.129123 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c4c4eef-b842-4810-b506-7094264f295f-trusted-ca\") pod \"image-registry-5fb7ff9769-w4x49\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") " pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49"
Apr 16 22:15:21.130299 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.130269 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dba969ac-c53c-4354-ab6e-cc853b8c1449-serving-cert\") pod \"console-operator-9d4b6777b-v4dxt\" (UID: \"dba969ac-c53c-4354-ab6e-cc853b8c1449\") " pod="openshift-console-operator/console-operator-9d4b6777b-v4dxt"
Apr 16 22:15:21.130545 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.130527 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2c4c4eef-b842-4810-b506-7094264f295f-installation-pull-secrets\") pod \"image-registry-5fb7ff9769-w4x49\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") " pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49"
Apr 16 22:15:21.130649 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.130607 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2c4c4eef-b842-4810-b506-7094264f295f-image-registry-private-configuration\") pod \"image-registry-5fb7ff9769-w4x49\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") " pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49"
Apr 16 22:15:21.135689 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.135664 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c4c4eef-b842-4810-b506-7094264f295f-bound-sa-token\") pod \"image-registry-5fb7ff9769-w4x49\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") " pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49"
Apr 16 22:15:21.136582 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.136564 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqj5c\" (UniqueName: \"kubernetes.io/projected/dba969ac-c53c-4354-ab6e-cc853b8c1449-kube-api-access-nqj5c\") pod \"console-operator-9d4b6777b-v4dxt\" (UID: \"dba969ac-c53c-4354-ab6e-cc853b8c1449\") " pod="openshift-console-operator/console-operator-9d4b6777b-v4dxt"
Apr 16 22:15:21.137103 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.137084 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2f5j\" (UniqueName: \"kubernetes.io/projected/2c4c4eef-b842-4810-b506-7094264f295f-kube-api-access-r2f5j\") pod \"image-registry-5fb7ff9769-w4x49\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") " pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49"
Apr 16 22:15:21.249492 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.249466 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-v4dxt"
Apr 16 22:15:21.361186 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.361123 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-v4dxt"]
Apr 16 22:15:21.363555 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:15:21.363524 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddba969ac_c53c_4354_ab6e_cc853b8c1449.slice/crio-f2e0d1b42f5f6c8764f38d8678829d82a62eeff4eb7db7e130e6dc517b6df2aa WatchSource:0}: Error finding container f2e0d1b42f5f6c8764f38d8678829d82a62eeff4eb7db7e130e6dc517b6df2aa: Status 404 returned error can't find the container with id f2e0d1b42f5f6c8764f38d8678829d82a62eeff4eb7db7e130e6dc517b6df2aa
Apr 16 22:15:21.631982 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.631888 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c4c4eef-b842-4810-b506-7094264f295f-registry-tls\") pod \"image-registry-5fb7ff9769-w4x49\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") " pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49"
Apr 16 22:15:21.632133 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:15:21.632034 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 22:15:21.632133 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:15:21.632055 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5fb7ff9769-w4x49: secret "image-registry-tls" not found
Apr 16 22:15:21.632133 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:15:21.632111 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c4c4eef-b842-4810-b506-7094264f295f-registry-tls podName:2c4c4eef-b842-4810-b506-7094264f295f nodeName:}" failed. No retries permitted until 2026-04-16 22:15:22.632094657 +0000 UTC m=+113.000500785 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2c4c4eef-b842-4810-b506-7094264f295f-registry-tls") pod "image-registry-5fb7ff9769-w4x49" (UID: "2c4c4eef-b842-4810-b506-7094264f295f") : secret "image-registry-tls" not found
Apr 16 22:15:21.644867 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:21.644836 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-v4dxt" event={"ID":"dba969ac-c53c-4354-ab6e-cc853b8c1449","Type":"ContainerStarted","Data":"f2e0d1b42f5f6c8764f38d8678829d82a62eeff4eb7db7e130e6dc517b6df2aa"}
Apr 16 22:15:22.640341 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:22.640302 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c4c4eef-b842-4810-b506-7094264f295f-registry-tls\") pod \"image-registry-5fb7ff9769-w4x49\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") " pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49"
Apr 16 22:15:22.640712 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:15:22.640426 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 22:15:22.640712 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:15:22.640438 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5fb7ff9769-w4x49: secret "image-registry-tls" not found
Apr 16 22:15:22.640712 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:15:22.640489 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c4c4eef-b842-4810-b506-7094264f295f-registry-tls podName:2c4c4eef-b842-4810-b506-7094264f295f nodeName:}" failed. No retries permitted until 2026-04-16 22:15:24.640472436 +0000 UTC m=+115.008878564 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2c4c4eef-b842-4810-b506-7094264f295f-registry-tls") pod "image-registry-5fb7ff9769-w4x49" (UID: "2c4c4eef-b842-4810-b506-7094264f295f") : secret "image-registry-tls" not found
Apr 16 22:15:23.653344 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:23.653267 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v4dxt_dba969ac-c53c-4354-ab6e-cc853b8c1449/console-operator/0.log"
Apr 16 22:15:23.653344 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:23.653310 2574 generic.go:358] "Generic (PLEG): container finished" podID="dba969ac-c53c-4354-ab6e-cc853b8c1449" containerID="985c9ec44b6e8bfd7952609e2091f2fbb0ceb694d1671a0e987faaf8047f4473" exitCode=255
Apr 16 22:15:23.653714 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:23.653344 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-v4dxt" event={"ID":"dba969ac-c53c-4354-ab6e-cc853b8c1449","Type":"ContainerDied","Data":"985c9ec44b6e8bfd7952609e2091f2fbb0ceb694d1671a0e987faaf8047f4473"}
Apr 16 22:15:23.653714 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:23.653578 2574 scope.go:117] "RemoveContainer" containerID="985c9ec44b6e8bfd7952609e2091f2fbb0ceb694d1671a0e987faaf8047f4473"
Apr 16 22:15:24.656401 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:24.656360 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c4c4eef-b842-4810-b506-7094264f295f-registry-tls\") pod \"image-registry-5fb7ff9769-w4x49\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") " pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49"
Apr 16 22:15:24.656805 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:15:24.656468 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16
22:15:24.656805 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:15:24.656484 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5fb7ff9769-w4x49: secret "image-registry-tls" not found
Apr 16 22:15:24.656805 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:15:24.656542 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c4c4eef-b842-4810-b506-7094264f295f-registry-tls podName:2c4c4eef-b842-4810-b506-7094264f295f nodeName:}" failed. No retries permitted until 2026-04-16 22:15:28.656527682 +0000 UTC m=+119.024933815 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2c4c4eef-b842-4810-b506-7094264f295f-registry-tls") pod "image-registry-5fb7ff9769-w4x49" (UID: "2c4c4eef-b842-4810-b506-7094264f295f") : secret "image-registry-tls" not found
Apr 16 22:15:24.656934 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:24.656924 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v4dxt_dba969ac-c53c-4354-ab6e-cc853b8c1449/console-operator/1.log"
Apr 16 22:15:24.657271 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:24.657256 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v4dxt_dba969ac-c53c-4354-ab6e-cc853b8c1449/console-operator/0.log"
Apr 16 22:15:24.657314 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:24.657290 2574 generic.go:358] "Generic (PLEG): container finished" podID="dba969ac-c53c-4354-ab6e-cc853b8c1449" containerID="22691f19e0c5e57931f1f272d2ab559aa1e63c0cdb300e4c6441c5b43cb526ec" exitCode=255
Apr 16 22:15:24.657347 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:24.657317 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-v4dxt" event={"ID":"dba969ac-c53c-4354-ab6e-cc853b8c1449","Type":"ContainerDied","Data":"22691f19e0c5e57931f1f272d2ab559aa1e63c0cdb300e4c6441c5b43cb526ec"}
Apr 16 22:15:24.657382 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:24.657353 2574 scope.go:117] "RemoveContainer" containerID="985c9ec44b6e8bfd7952609e2091f2fbb0ceb694d1671a0e987faaf8047f4473"
Apr 16 22:15:24.657576 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:24.657558 2574 scope.go:117] "RemoveContainer" containerID="22691f19e0c5e57931f1f272d2ab559aa1e63c0cdb300e4c6441c5b43cb526ec"
Apr 16 22:15:24.657780 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:15:24.657762 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-v4dxt_openshift-console-operator(dba969ac-c53c-4354-ab6e-cc853b8c1449)\"" pod="openshift-console-operator/console-operator-9d4b6777b-v4dxt" podUID="dba969ac-c53c-4354-ab6e-cc853b8c1449"
Apr 16 22:15:25.660888 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:25.660860 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v4dxt_dba969ac-c53c-4354-ab6e-cc853b8c1449/console-operator/1.log"
Apr 16 22:15:25.661244 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:25.661193 2574 scope.go:117] "RemoveContainer" containerID="22691f19e0c5e57931f1f272d2ab559aa1e63c0cdb300e4c6441c5b43cb526ec"
Apr 16 22:15:25.661374 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:15:25.661356 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-v4dxt_openshift-console-operator(dba969ac-c53c-4354-ab6e-cc853b8c1449)\"" pod="openshift-console-operator/console-operator-9d4b6777b-v4dxt" podUID="dba969ac-c53c-4354-ab6e-cc853b8c1449"
Apr 16 22:15:27.698108 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:27.698077 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9cdfs_76fa2914-d925-4555-87be-d9837e6295d8/dns-node-resolver/0.log"
Apr 16 22:15:28.684497 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:28.684467 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c4c4eef-b842-4810-b506-7094264f295f-registry-tls\") pod \"image-registry-5fb7ff9769-w4x49\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") " pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49"
Apr 16 22:15:28.684658 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:15:28.684578 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 22:15:28.684658 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:15:28.684589 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5fb7ff9769-w4x49: secret "image-registry-tls" not found
Apr 16 22:15:28.684658 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:15:28.684638 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c4c4eef-b842-4810-b506-7094264f295f-registry-tls podName:2c4c4eef-b842-4810-b506-7094264f295f nodeName:}" failed. No retries permitted until 2026-04-16 22:15:36.684623564 +0000 UTC m=+127.053029693 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2c4c4eef-b842-4810-b506-7094264f295f-registry-tls") pod "image-registry-5fb7ff9769-w4x49" (UID: "2c4c4eef-b842-4810-b506-7094264f295f") : secret "image-registry-tls" not found
Apr 16 22:15:29.098244 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:29.098215 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-zfz79_46cb706e-dcb9-4950-aa25-14e582448ea8/node-ca/0.log"
Apr 16 22:15:31.249776 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:31.249727 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-v4dxt"
Apr 16 22:15:31.249776 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:31.249775 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-v4dxt"
Apr 16 22:15:31.250148 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:31.250132 2574 scope.go:117] "RemoveContainer" containerID="22691f19e0c5e57931f1f272d2ab559aa1e63c0cdb300e4c6441c5b43cb526ec"
Apr 16 22:15:31.250311 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:15:31.250292 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-v4dxt_openshift-console-operator(dba969ac-c53c-4354-ab6e-cc853b8c1449)\"" pod="openshift-console-operator/console-operator-9d4b6777b-v4dxt" podUID="dba969ac-c53c-4354-ab6e-cc853b8c1449"
Apr 16 22:15:36.746163 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:36.746122 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c4c4eef-b842-4810-b506-7094264f295f-registry-tls\") pod \"image-registry-5fb7ff9769-w4x49\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") " pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49"
Apr 16 22:15:36.748621 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:36.748595 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c4c4eef-b842-4810-b506-7094264f295f-registry-tls\") pod \"image-registry-5fb7ff9769-w4x49\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") " pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49"
Apr 16 22:15:36.844962 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:36.844938 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-s46c2\""
Apr 16 22:15:36.852802 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:36.852784 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49"
Apr 16 22:15:36.965855 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:36.965824 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5fb7ff9769-w4x49"]
Apr 16 22:15:36.969212 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:15:36.969185 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c4c4eef_b842_4810_b506_7094264f295f.slice/crio-96a45c5abe9b0ac6959f56b09569a3fb550db57e71ee8fcae5eee4932da0edcd WatchSource:0}: Error finding container 96a45c5abe9b0ac6959f56b09569a3fb550db57e71ee8fcae5eee4932da0edcd: Status 404 returned error can't find the container with id 96a45c5abe9b0ac6959f56b09569a3fb550db57e71ee8fcae5eee4932da0edcd
Apr 16 22:15:37.689253 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:37.689213 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49" event={"ID":"2c4c4eef-b842-4810-b506-7094264f295f","Type":"ContainerStarted","Data":"37dfb12a85401b78e7299f78bdddd655814e345e636e5096f8e95399f0b409d5"}
Apr 16 22:15:37.689253 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:37.689253 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49" event={"ID":"2c4c4eef-b842-4810-b506-7094264f295f","Type":"ContainerStarted","Data":"96a45c5abe9b0ac6959f56b09569a3fb550db57e71ee8fcae5eee4932da0edcd"}
Apr 16 22:15:37.689456 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:37.689277 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49"
Apr 16 22:15:37.706873 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:37.706836 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49" podStartSLOduration=17.706823528 podStartE2EDuration="17.706823528s" podCreationTimestamp="2026-04-16 22:15:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:15:37.705875122 +0000 UTC m=+128.074281287" watchObservedRunningTime="2026-04-16 22:15:37.706823528 +0000 UTC m=+128.075229678"
Apr 16 22:15:38.964400 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:38.964369 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c220c5af-4b42-4b44-a789-17aa37d44b90-metrics-certs\") pod \"network-metrics-daemon-wvq6s\" (UID: \"c220c5af-4b42-4b44-a789-17aa37d44b90\") " pod="openshift-multus/network-metrics-daemon-wvq6s"
Apr 16 22:15:38.966625 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:38.966596 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c220c5af-4b42-4b44-a789-17aa37d44b90-metrics-certs\") pod \"network-metrics-daemon-wvq6s\" (UID: \"c220c5af-4b42-4b44-a789-17aa37d44b90\") " pod="openshift-multus/network-metrics-daemon-wvq6s"
Apr 16 22:15:39.181845 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:39.181815 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-hqwt5\""
Apr 16 22:15:39.190480 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:39.190460 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wvq6s"
Apr 16 22:15:39.301512 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:39.301351 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wvq6s"]
Apr 16 22:15:39.303810 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:15:39.303785 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc220c5af_4b42_4b44_a789_17aa37d44b90.slice/crio-91933bc01748c6094f734ebbf428509b2f6a5072b89565ff217b66491e7db6ba WatchSource:0}: Error finding container 91933bc01748c6094f734ebbf428509b2f6a5072b89565ff217b66491e7db6ba: Status 404 returned error can't find the container with id 91933bc01748c6094f734ebbf428509b2f6a5072b89565ff217b66491e7db6ba
Apr 16 22:15:39.696576 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:39.696544 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wvq6s" event={"ID":"c220c5af-4b42-4b44-a789-17aa37d44b90","Type":"ContainerStarted","Data":"91933bc01748c6094f734ebbf428509b2f6a5072b89565ff217b66491e7db6ba"}
Apr 16 22:15:41.703128 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:41.703093 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wvq6s" event={"ID":"c220c5af-4b42-4b44-a789-17aa37d44b90","Type":"ContainerStarted","Data":"afc5919a172f8b1c3f1b1699c6da0578a86b7c53a0e4ecfb1e45ae29feed6c18"}
Apr 16 22:15:41.703128 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:41.703132 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wvq6s" event={"ID":"c220c5af-4b42-4b44-a789-17aa37d44b90","Type":"ContainerStarted","Data":"3ce5a051aada114502bd6083146b116af5fb5b3f140890798e411fd892b58ecd"}
Apr 16 22:15:41.718566 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:41.718525 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-wvq6s" podStartSLOduration=130.388389297 podStartE2EDuration="2m11.718513241s" podCreationTimestamp="2026-04-16 22:13:30 +0000 UTC" firstStartedPulling="2026-04-16 22:15:39.305668009 +0000 UTC m=+129.674074144" lastFinishedPulling="2026-04-16 22:15:40.635791937 +0000 UTC m=+131.004198088" observedRunningTime="2026-04-16 22:15:41.71769937 +0000 UTC m=+132.086105521" watchObservedRunningTime="2026-04-16 22:15:41.718513241 +0000 UTC m=+132.086919414"
Apr 16 22:15:42.258900 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:42.258866 2574 scope.go:117] "RemoveContainer" containerID="22691f19e0c5e57931f1f272d2ab559aa1e63c0cdb300e4c6441c5b43cb526ec"
Apr 16 22:15:42.707166 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:42.707089 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v4dxt_dba969ac-c53c-4354-ab6e-cc853b8c1449/console-operator/1.log"
Apr 16 22:15:42.707574 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:42.707210 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-v4dxt" event={"ID":"dba969ac-c53c-4354-ab6e-cc853b8c1449","Type":"ContainerStarted","Data":"899dc34bb526c82d2bef6ab4a5a4f9b55cebcf841d1997a9b153025eec569d74"}
Apr 16 22:15:42.707574 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:42.707546 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-v4dxt"
Apr 16 22:15:42.724692 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:42.724649 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-v4dxt" podStartSLOduration=20.796029409 podStartE2EDuration="22.724638305s" podCreationTimestamp="2026-04-16 22:15:20 +0000 UTC" firstStartedPulling="2026-04-16 22:15:21.365304727 +0000 UTC m=+111.733710872" lastFinishedPulling="2026-04-16 22:15:23.293913639 +0000 UTC m=+113.662319768" observedRunningTime="2026-04-16 22:15:42.724308218 +0000 UTC m=+133.092714369" watchObservedRunningTime="2026-04-16 22:15:42.724638305 +0000 UTC m=+133.093044456"
Apr 16 22:15:43.180685 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:43.180657 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-v4dxt"
Apr 16 22:15:51.889781 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:51.889735 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-vdsxv"]
Apr 16 22:15:51.892933 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:51.892908 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-vdsxv"
Apr 16 22:15:51.896336 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:51.896316 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 22:15:51.896449 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:51.896317 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 22:15:51.897089 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:51.897067 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-bblwd\""
Apr 16 22:15:51.897198 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:51.897104 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 22:15:51.897198 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:51.897141 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 22:15:51.906475 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:51.906454 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-vdsxv"]
Apr 16 22:15:51.919228 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:51.919207 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5fb7ff9769-w4x49"]
Apr 16 22:15:51.956368 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:51.956345 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/031bb873-9d06-48ea-b341-4885a796a0eb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vdsxv\" (UID: \"031bb873-9d06-48ea-b341-4885a796a0eb\") "
pod="openshift-insights/insights-runtime-extractor-vdsxv" Apr 16 22:15:51.956482 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:51.956376 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlws6\" (UniqueName: \"kubernetes.io/projected/031bb873-9d06-48ea-b341-4885a796a0eb-kube-api-access-mlws6\") pod \"insights-runtime-extractor-vdsxv\" (UID: \"031bb873-9d06-48ea-b341-4885a796a0eb\") " pod="openshift-insights/insights-runtime-extractor-vdsxv" Apr 16 22:15:51.956482 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:51.956399 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/031bb873-9d06-48ea-b341-4885a796a0eb-crio-socket\") pod \"insights-runtime-extractor-vdsxv\" (UID: \"031bb873-9d06-48ea-b341-4885a796a0eb\") " pod="openshift-insights/insights-runtime-extractor-vdsxv" Apr 16 22:15:51.956482 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:51.956427 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/031bb873-9d06-48ea-b341-4885a796a0eb-data-volume\") pod \"insights-runtime-extractor-vdsxv\" (UID: \"031bb873-9d06-48ea-b341-4885a796a0eb\") " pod="openshift-insights/insights-runtime-extractor-vdsxv" Apr 16 22:15:51.956614 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:51.956504 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/031bb873-9d06-48ea-b341-4885a796a0eb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vdsxv\" (UID: \"031bb873-9d06-48ea-b341-4885a796a0eb\") " pod="openshift-insights/insights-runtime-extractor-vdsxv" Apr 16 22:15:52.056978 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:52.056948 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/031bb873-9d06-48ea-b341-4885a796a0eb-data-volume\") pod \"insights-runtime-extractor-vdsxv\" (UID: \"031bb873-9d06-48ea-b341-4885a796a0eb\") " pod="openshift-insights/insights-runtime-extractor-vdsxv" Apr 16 22:15:52.056978 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:52.056982 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/031bb873-9d06-48ea-b341-4885a796a0eb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vdsxv\" (UID: \"031bb873-9d06-48ea-b341-4885a796a0eb\") " pod="openshift-insights/insights-runtime-extractor-vdsxv" Apr 16 22:15:52.057223 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:52.057040 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/031bb873-9d06-48ea-b341-4885a796a0eb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vdsxv\" (UID: \"031bb873-9d06-48ea-b341-4885a796a0eb\") " pod="openshift-insights/insights-runtime-extractor-vdsxv" Apr 16 22:15:52.057223 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:52.057061 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mlws6\" (UniqueName: \"kubernetes.io/projected/031bb873-9d06-48ea-b341-4885a796a0eb-kube-api-access-mlws6\") pod \"insights-runtime-extractor-vdsxv\" (UID: \"031bb873-9d06-48ea-b341-4885a796a0eb\") " pod="openshift-insights/insights-runtime-extractor-vdsxv" Apr 16 22:15:52.057223 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:52.057077 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/031bb873-9d06-48ea-b341-4885a796a0eb-crio-socket\") pod \"insights-runtime-extractor-vdsxv\" (UID: \"031bb873-9d06-48ea-b341-4885a796a0eb\") " 
pod="openshift-insights/insights-runtime-extractor-vdsxv" Apr 16 22:15:52.057223 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:52.057159 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/031bb873-9d06-48ea-b341-4885a796a0eb-crio-socket\") pod \"insights-runtime-extractor-vdsxv\" (UID: \"031bb873-9d06-48ea-b341-4885a796a0eb\") " pod="openshift-insights/insights-runtime-extractor-vdsxv" Apr 16 22:15:52.057416 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:52.057368 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/031bb873-9d06-48ea-b341-4885a796a0eb-data-volume\") pod \"insights-runtime-extractor-vdsxv\" (UID: \"031bb873-9d06-48ea-b341-4885a796a0eb\") " pod="openshift-insights/insights-runtime-extractor-vdsxv" Apr 16 22:15:52.057581 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:52.057562 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/031bb873-9d06-48ea-b341-4885a796a0eb-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vdsxv\" (UID: \"031bb873-9d06-48ea-b341-4885a796a0eb\") " pod="openshift-insights/insights-runtime-extractor-vdsxv" Apr 16 22:15:52.059261 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:52.059244 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/031bb873-9d06-48ea-b341-4885a796a0eb-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vdsxv\" (UID: \"031bb873-9d06-48ea-b341-4885a796a0eb\") " pod="openshift-insights/insights-runtime-extractor-vdsxv" Apr 16 22:15:52.065277 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:52.065257 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlws6\" (UniqueName: 
\"kubernetes.io/projected/031bb873-9d06-48ea-b341-4885a796a0eb-kube-api-access-mlws6\") pod \"insights-runtime-extractor-vdsxv\" (UID: \"031bb873-9d06-48ea-b341-4885a796a0eb\") " pod="openshift-insights/insights-runtime-extractor-vdsxv" Apr 16 22:15:52.201574 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:52.201502 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-vdsxv" Apr 16 22:15:52.314467 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:52.314434 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-vdsxv"] Apr 16 22:15:52.317437 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:15:52.317407 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod031bb873_9d06_48ea_b341_4885a796a0eb.slice/crio-2990df6968598fcb5bb91364bd0add083dc14967d65fd46679d62c98da68f7b4 WatchSource:0}: Error finding container 2990df6968598fcb5bb91364bd0add083dc14967d65fd46679d62c98da68f7b4: Status 404 returned error can't find the container with id 2990df6968598fcb5bb91364bd0add083dc14967d65fd46679d62c98da68f7b4 Apr 16 22:15:52.735708 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:52.735672 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vdsxv" event={"ID":"031bb873-9d06-48ea-b341-4885a796a0eb","Type":"ContainerStarted","Data":"1bc80736fd53a01910e43fac4c151a425343d0612a8f2acf51344b8b038a8fb8"} Apr 16 22:15:52.735708 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:52.735712 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vdsxv" event={"ID":"031bb873-9d06-48ea-b341-4885a796a0eb","Type":"ContainerStarted","Data":"2990df6968598fcb5bb91364bd0add083dc14967d65fd46679d62c98da68f7b4"} Apr 16 22:15:53.740133 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:53.740096 2574 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vdsxv" event={"ID":"031bb873-9d06-48ea-b341-4885a796a0eb","Type":"ContainerStarted","Data":"741d26de492fa753753937ae9ee88fe771331b44c64c3344fc79f5d9967e00a3"} Apr 16 22:15:54.746482 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:54.746444 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vdsxv" event={"ID":"031bb873-9d06-48ea-b341-4885a796a0eb","Type":"ContainerStarted","Data":"541178fb8a4bcd641662bbed1de3f2fe4b887332a876c19d797ac0bcb3f7e86a"} Apr 16 22:15:54.766805 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:54.766755 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-vdsxv" podStartSLOduration=1.7323357719999999 podStartE2EDuration="3.766724165s" podCreationTimestamp="2026-04-16 22:15:51 +0000 UTC" firstStartedPulling="2026-04-16 22:15:52.372626233 +0000 UTC m=+142.741032365" lastFinishedPulling="2026-04-16 22:15:54.407014629 +0000 UTC m=+144.775420758" observedRunningTime="2026-04-16 22:15:54.76615384 +0000 UTC m=+145.134560005" watchObservedRunningTime="2026-04-16 22:15:54.766724165 +0000 UTC m=+145.135130317" Apr 16 22:15:55.473611 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:55.473584 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-h4l4q"] Apr 16 22:15:55.476543 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:55.476528 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-h4l4q" Apr 16 22:15:55.478623 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:55.478605 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 16 22:15:55.478693 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:55.478665 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-kw8qt\"" Apr 16 22:15:55.486558 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:55.486537 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-h4l4q"] Apr 16 22:15:55.583164 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:55.583133 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/82c95629-2e19-4874-bb54-906974bfced4-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-h4l4q\" (UID: \"82c95629-2e19-4874-bb54-906974bfced4\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-h4l4q" Apr 16 22:15:55.683618 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:55.683585 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/82c95629-2e19-4874-bb54-906974bfced4-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-h4l4q\" (UID: \"82c95629-2e19-4874-bb54-906974bfced4\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-h4l4q" Apr 16 22:15:55.686216 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:55.686189 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: 
\"kubernetes.io/secret/82c95629-2e19-4874-bb54-906974bfced4-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-h4l4q\" (UID: \"82c95629-2e19-4874-bb54-906974bfced4\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-h4l4q" Apr 16 22:15:55.784817 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:55.784789 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-h4l4q" Apr 16 22:15:55.895038 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:55.895007 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-h4l4q"] Apr 16 22:15:55.898907 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:15:55.898879 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82c95629_2e19_4874_bb54_906974bfced4.slice/crio-8efd4e09ea26f26edf7b0cee3519e44021cc743f67bbfb8ede2510ab2f00a2c6 WatchSource:0}: Error finding container 8efd4e09ea26f26edf7b0cee3519e44021cc743f67bbfb8ede2510ab2f00a2c6: Status 404 returned error can't find the container with id 8efd4e09ea26f26edf7b0cee3519e44021cc743f67bbfb8ede2510ab2f00a2c6 Apr 16 22:15:56.753203 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:56.753160 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-h4l4q" event={"ID":"82c95629-2e19-4874-bb54-906974bfced4","Type":"ContainerStarted","Data":"8efd4e09ea26f26edf7b0cee3519e44021cc743f67bbfb8ede2510ab2f00a2c6"} Apr 16 22:15:57.756690 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:57.756655 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-h4l4q" 
event={"ID":"82c95629-2e19-4874-bb54-906974bfced4","Type":"ContainerStarted","Data":"95dda0032e723a85089a92805b7b5bfdb85c1e841a1a943c628e9c2c31858cb1"} Apr 16 22:15:57.757099 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:57.756873 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-h4l4q" Apr 16 22:15:57.762057 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:57.762035 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-h4l4q" Apr 16 22:15:57.771722 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:57.771682 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-h4l4q" podStartSLOduration=1.802930852 podStartE2EDuration="2.771666534s" podCreationTimestamp="2026-04-16 22:15:55 +0000 UTC" firstStartedPulling="2026-04-16 22:15:55.901132671 +0000 UTC m=+146.269538804" lastFinishedPulling="2026-04-16 22:15:56.869868339 +0000 UTC m=+147.238274486" observedRunningTime="2026-04-16 22:15:57.770773681 +0000 UTC m=+148.139179832" watchObservedRunningTime="2026-04-16 22:15:57.771666534 +0000 UTC m=+148.140072686" Apr 16 22:15:58.533240 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:58.533208 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-hw9tm"] Apr 16 22:15:58.535831 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:58.535813 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-hw9tm" Apr 16 22:15:58.538762 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:58.538713 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 16 22:15:58.538762 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:58.538727 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 22:15:58.538928 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:58.538765 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 16 22:15:58.538928 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:58.538773 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 22:15:58.538928 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:58.538805 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-b5xhd\"" Apr 16 22:15:58.538928 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:58.538728 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 22:15:58.547285 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:58.547247 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-hw9tm"] Apr 16 22:15:58.602717 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:58.602690 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/89f42b6c-a082-4cb0-8a46-a52958adcbac-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-hw9tm\" (UID: 
\"89f42b6c-a082-4cb0-8a46-a52958adcbac\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hw9tm" Apr 16 22:15:58.602867 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:58.602724 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89f42b6c-a082-4cb0-8a46-a52958adcbac-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-hw9tm\" (UID: \"89f42b6c-a082-4cb0-8a46-a52958adcbac\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hw9tm" Apr 16 22:15:58.602867 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:58.602762 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p27sn\" (UniqueName: \"kubernetes.io/projected/89f42b6c-a082-4cb0-8a46-a52958adcbac-kube-api-access-p27sn\") pod \"prometheus-operator-5676c8c784-hw9tm\" (UID: \"89f42b6c-a082-4cb0-8a46-a52958adcbac\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hw9tm" Apr 16 22:15:58.602867 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:58.602834 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/89f42b6c-a082-4cb0-8a46-a52958adcbac-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-hw9tm\" (UID: \"89f42b6c-a082-4cb0-8a46-a52958adcbac\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hw9tm" Apr 16 22:15:58.703887 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:58.703842 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/89f42b6c-a082-4cb0-8a46-a52958adcbac-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-hw9tm\" (UID: \"89f42b6c-a082-4cb0-8a46-a52958adcbac\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hw9tm" Apr 16 
22:15:58.704096 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:58.703899 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89f42b6c-a082-4cb0-8a46-a52958adcbac-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-hw9tm\" (UID: \"89f42b6c-a082-4cb0-8a46-a52958adcbac\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hw9tm" Apr 16 22:15:58.704096 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:58.703931 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p27sn\" (UniqueName: \"kubernetes.io/projected/89f42b6c-a082-4cb0-8a46-a52958adcbac-kube-api-access-p27sn\") pod \"prometheus-operator-5676c8c784-hw9tm\" (UID: \"89f42b6c-a082-4cb0-8a46-a52958adcbac\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hw9tm" Apr 16 22:15:58.704096 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:58.703967 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/89f42b6c-a082-4cb0-8a46-a52958adcbac-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-hw9tm\" (UID: \"89f42b6c-a082-4cb0-8a46-a52958adcbac\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hw9tm" Apr 16 22:15:58.704096 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:15:58.704057 2574 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 16 22:15:58.704236 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:15:58.704122 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89f42b6c-a082-4cb0-8a46-a52958adcbac-prometheus-operator-tls podName:89f42b6c-a082-4cb0-8a46-a52958adcbac nodeName:}" failed. No retries permitted until 2026-04-16 22:15:59.204102245 +0000 UTC m=+149.572508375 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/89f42b6c-a082-4cb0-8a46-a52958adcbac-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-hw9tm" (UID: "89f42b6c-a082-4cb0-8a46-a52958adcbac") : secret "prometheus-operator-tls" not found Apr 16 22:15:58.705150 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:58.705129 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89f42b6c-a082-4cb0-8a46-a52958adcbac-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-hw9tm\" (UID: \"89f42b6c-a082-4cb0-8a46-a52958adcbac\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hw9tm" Apr 16 22:15:58.706114 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:58.706096 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/89f42b6c-a082-4cb0-8a46-a52958adcbac-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-hw9tm\" (UID: \"89f42b6c-a082-4cb0-8a46-a52958adcbac\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hw9tm" Apr 16 22:15:58.712332 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:58.712308 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p27sn\" (UniqueName: \"kubernetes.io/projected/89f42b6c-a082-4cb0-8a46-a52958adcbac-kube-api-access-p27sn\") pod \"prometheus-operator-5676c8c784-hw9tm\" (UID: \"89f42b6c-a082-4cb0-8a46-a52958adcbac\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hw9tm" Apr 16 22:15:59.207984 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:59.207948 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/89f42b6c-a082-4cb0-8a46-a52958adcbac-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-hw9tm\" (UID: 
\"89f42b6c-a082-4cb0-8a46-a52958adcbac\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hw9tm" Apr 16 22:15:59.211090 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:59.211061 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/89f42b6c-a082-4cb0-8a46-a52958adcbac-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-hw9tm\" (UID: \"89f42b6c-a082-4cb0-8a46-a52958adcbac\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-hw9tm" Apr 16 22:15:59.443734 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:59.443689 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-hw9tm" Apr 16 22:15:59.558140 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:59.558111 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-hw9tm"] Apr 16 22:15:59.561302 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:15:59.561267 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89f42b6c_a082_4cb0_8a46_a52958adcbac.slice/crio-d5e042a5dad52f1577626c1017ff134d6b6f491f9b13ac5a1f4618ec7f0f6503 WatchSource:0}: Error finding container d5e042a5dad52f1577626c1017ff134d6b6f491f9b13ac5a1f4618ec7f0f6503: Status 404 returned error can't find the container with id d5e042a5dad52f1577626c1017ff134d6b6f491f9b13ac5a1f4618ec7f0f6503 Apr 16 22:15:59.762572 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:15:59.762537 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-hw9tm" event={"ID":"89f42b6c-a082-4cb0-8a46-a52958adcbac","Type":"ContainerStarted","Data":"d5e042a5dad52f1577626c1017ff134d6b6f491f9b13ac5a1f4618ec7f0f6503"} Apr 16 22:16:01.272608 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.272576 2574 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-console/console-97f84ccf4-2cnfj"] Apr 16 22:16:01.274652 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.274635 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-97f84ccf4-2cnfj" Apr 16 22:16:01.277064 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.277034 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 22:16:01.278001 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.277935 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-kf4z8\"" Apr 16 22:16:01.278001 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.277963 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 22:16:01.278001 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.277973 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 22:16:01.278211 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.278026 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 22:16:01.278211 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.278070 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 22:16:01.278211 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.278167 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 22:16:01.278314 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.278268 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 22:16:01.284289 ip-10-0-133-72 kubenswrapper[2574]: I0416 
22:16:01.284268 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-97f84ccf4-2cnfj"] Apr 16 22:16:01.324833 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.324811 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/12f32bdf-eadc-4da6-b3d3-55490357709e-service-ca\") pod \"console-97f84ccf4-2cnfj\" (UID: \"12f32bdf-eadc-4da6-b3d3-55490357709e\") " pod="openshift-console/console-97f84ccf4-2cnfj" Apr 16 22:16:01.324919 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.324836 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/12f32bdf-eadc-4da6-b3d3-55490357709e-console-config\") pod \"console-97f84ccf4-2cnfj\" (UID: \"12f32bdf-eadc-4da6-b3d3-55490357709e\") " pod="openshift-console/console-97f84ccf4-2cnfj" Apr 16 22:16:01.324919 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.324890 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/12f32bdf-eadc-4da6-b3d3-55490357709e-console-serving-cert\") pod \"console-97f84ccf4-2cnfj\" (UID: \"12f32bdf-eadc-4da6-b3d3-55490357709e\") " pod="openshift-console/console-97f84ccf4-2cnfj" Apr 16 22:16:01.324919 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.324908 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhhbw\" (UniqueName: \"kubernetes.io/projected/12f32bdf-eadc-4da6-b3d3-55490357709e-kube-api-access-lhhbw\") pod \"console-97f84ccf4-2cnfj\" (UID: \"12f32bdf-eadc-4da6-b3d3-55490357709e\") " pod="openshift-console/console-97f84ccf4-2cnfj" Apr 16 22:16:01.325018 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.324940 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/12f32bdf-eadc-4da6-b3d3-55490357709e-console-oauth-config\") pod \"console-97f84ccf4-2cnfj\" (UID: \"12f32bdf-eadc-4da6-b3d3-55490357709e\") " pod="openshift-console/console-97f84ccf4-2cnfj" Apr 16 22:16:01.325061 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.325033 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/12f32bdf-eadc-4da6-b3d3-55490357709e-oauth-serving-cert\") pod \"console-97f84ccf4-2cnfj\" (UID: \"12f32bdf-eadc-4da6-b3d3-55490357709e\") " pod="openshift-console/console-97f84ccf4-2cnfj" Apr 16 22:16:01.426193 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.426158 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/12f32bdf-eadc-4da6-b3d3-55490357709e-console-serving-cert\") pod \"console-97f84ccf4-2cnfj\" (UID: \"12f32bdf-eadc-4da6-b3d3-55490357709e\") " pod="openshift-console/console-97f84ccf4-2cnfj" Apr 16 22:16:01.426365 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.426196 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhhbw\" (UniqueName: \"kubernetes.io/projected/12f32bdf-eadc-4da6-b3d3-55490357709e-kube-api-access-lhhbw\") pod \"console-97f84ccf4-2cnfj\" (UID: \"12f32bdf-eadc-4da6-b3d3-55490357709e\") " pod="openshift-console/console-97f84ccf4-2cnfj" Apr 16 22:16:01.426365 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.426237 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/12f32bdf-eadc-4da6-b3d3-55490357709e-console-oauth-config\") pod \"console-97f84ccf4-2cnfj\" (UID: \"12f32bdf-eadc-4da6-b3d3-55490357709e\") " pod="openshift-console/console-97f84ccf4-2cnfj" Apr 16 22:16:01.426365 ip-10-0-133-72 
kubenswrapper[2574]: I0416 22:16:01.426296 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/12f32bdf-eadc-4da6-b3d3-55490357709e-oauth-serving-cert\") pod \"console-97f84ccf4-2cnfj\" (UID: \"12f32bdf-eadc-4da6-b3d3-55490357709e\") " pod="openshift-console/console-97f84ccf4-2cnfj" Apr 16 22:16:01.426365 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.426335 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/12f32bdf-eadc-4da6-b3d3-55490357709e-service-ca\") pod \"console-97f84ccf4-2cnfj\" (UID: \"12f32bdf-eadc-4da6-b3d3-55490357709e\") " pod="openshift-console/console-97f84ccf4-2cnfj" Apr 16 22:16:01.426365 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.426359 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/12f32bdf-eadc-4da6-b3d3-55490357709e-console-config\") pod \"console-97f84ccf4-2cnfj\" (UID: \"12f32bdf-eadc-4da6-b3d3-55490357709e\") " pod="openshift-console/console-97f84ccf4-2cnfj" Apr 16 22:16:01.427088 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.427063 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/12f32bdf-eadc-4da6-b3d3-55490357709e-service-ca\") pod \"console-97f84ccf4-2cnfj\" (UID: \"12f32bdf-eadc-4da6-b3d3-55490357709e\") " pod="openshift-console/console-97f84ccf4-2cnfj" Apr 16 22:16:01.427221 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.427063 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/12f32bdf-eadc-4da6-b3d3-55490357709e-oauth-serving-cert\") pod \"console-97f84ccf4-2cnfj\" (UID: \"12f32bdf-eadc-4da6-b3d3-55490357709e\") " pod="openshift-console/console-97f84ccf4-2cnfj" Apr 16 22:16:01.427221 
ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.427171 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/12f32bdf-eadc-4da6-b3d3-55490357709e-console-config\") pod \"console-97f84ccf4-2cnfj\" (UID: \"12f32bdf-eadc-4da6-b3d3-55490357709e\") " pod="openshift-console/console-97f84ccf4-2cnfj" Apr 16 22:16:01.428678 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.428650 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/12f32bdf-eadc-4da6-b3d3-55490357709e-console-serving-cert\") pod \"console-97f84ccf4-2cnfj\" (UID: \"12f32bdf-eadc-4da6-b3d3-55490357709e\") " pod="openshift-console/console-97f84ccf4-2cnfj" Apr 16 22:16:01.428678 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.428658 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/12f32bdf-eadc-4da6-b3d3-55490357709e-console-oauth-config\") pod \"console-97f84ccf4-2cnfj\" (UID: \"12f32bdf-eadc-4da6-b3d3-55490357709e\") " pod="openshift-console/console-97f84ccf4-2cnfj" Apr 16 22:16:01.433757 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.433723 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhhbw\" (UniqueName: \"kubernetes.io/projected/12f32bdf-eadc-4da6-b3d3-55490357709e-kube-api-access-lhhbw\") pod \"console-97f84ccf4-2cnfj\" (UID: \"12f32bdf-eadc-4da6-b3d3-55490357709e\") " pod="openshift-console/console-97f84ccf4-2cnfj" Apr 16 22:16:01.584674 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.584592 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-97f84ccf4-2cnfj" Apr 16 22:16:01.698321 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.698289 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-97f84ccf4-2cnfj"] Apr 16 22:16:01.701386 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:16:01.701359 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12f32bdf_eadc_4da6_b3d3_55490357709e.slice/crio-9aab182ce3e9480cd50ecf3405a193c6fe680dc599abf3eaa43ff590c5934f4f WatchSource:0}: Error finding container 9aab182ce3e9480cd50ecf3405a193c6fe680dc599abf3eaa43ff590c5934f4f: Status 404 returned error can't find the container with id 9aab182ce3e9480cd50ecf3405a193c6fe680dc599abf3eaa43ff590c5934f4f Apr 16 22:16:01.768649 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.768616 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-hw9tm" event={"ID":"89f42b6c-a082-4cb0-8a46-a52958adcbac","Type":"ContainerStarted","Data":"652505d1bb6fc658d00775cb986b3773b1b5d2a7b7f5ccbf045560d781ef8e53"} Apr 16 22:16:01.768779 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.768654 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-hw9tm" event={"ID":"89f42b6c-a082-4cb0-8a46-a52958adcbac","Type":"ContainerStarted","Data":"23528252eb171f5a6dc7d6355682e41b94039cc1b8e9ea720b915fc15ccc3905"} Apr 16 22:16:01.769682 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.769658 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-97f84ccf4-2cnfj" event={"ID":"12f32bdf-eadc-4da6-b3d3-55490357709e","Type":"ContainerStarted","Data":"9aab182ce3e9480cd50ecf3405a193c6fe680dc599abf3eaa43ff590c5934f4f"} Apr 16 22:16:01.788732 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.788689 2574 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-hw9tm" podStartSLOduration=2.492327429 podStartE2EDuration="3.78867622s" podCreationTimestamp="2026-04-16 22:15:58 +0000 UTC" firstStartedPulling="2026-04-16 22:15:59.562997892 +0000 UTC m=+149.931404028" lastFinishedPulling="2026-04-16 22:16:00.85934669 +0000 UTC m=+151.227752819" observedRunningTime="2026-04-16 22:16:01.787208808 +0000 UTC m=+152.155614960" watchObservedRunningTime="2026-04-16 22:16:01.78867622 +0000 UTC m=+152.157082371" Apr 16 22:16:01.924424 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.924348 2574 patch_prober.go:28] interesting pod/image-registry-5fb7ff9769-w4x49 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 22:16:01.924546 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:01.924407 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49" podUID="2c4c4eef-b842-4810-b506-7094264f295f" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:16:03.141033 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:03.140999 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-84d7d95f67-wn52j"] Apr 16 22:16:03.142949 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:03.142933 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-84d7d95f67-wn52j" Apr 16 22:16:03.152790 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:03.152768 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 22:16:03.156107 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:03.155514 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84d7d95f67-wn52j"] Apr 16 22:16:03.239507 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:03.239474 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-trusted-ca-bundle\") pod \"console-84d7d95f67-wn52j\" (UID: \"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\") " pod="openshift-console/console-84d7d95f67-wn52j" Apr 16 22:16:03.239730 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:03.239520 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-oauth-serving-cert\") pod \"console-84d7d95f67-wn52j\" (UID: \"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\") " pod="openshift-console/console-84d7d95f67-wn52j" Apr 16 22:16:03.239730 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:03.239573 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c25g7\" (UniqueName: \"kubernetes.io/projected/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-kube-api-access-c25g7\") pod \"console-84d7d95f67-wn52j\" (UID: \"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\") " pod="openshift-console/console-84d7d95f67-wn52j" Apr 16 22:16:03.239730 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:03.239662 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-console-config\") pod \"console-84d7d95f67-wn52j\" (UID: \"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\") " pod="openshift-console/console-84d7d95f67-wn52j" Apr 16 22:16:03.239730 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:03.239690 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-console-serving-cert\") pod \"console-84d7d95f67-wn52j\" (UID: \"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\") " pod="openshift-console/console-84d7d95f67-wn52j" Apr 16 22:16:03.239977 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:03.239774 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-console-oauth-config\") pod \"console-84d7d95f67-wn52j\" (UID: \"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\") " pod="openshift-console/console-84d7d95f67-wn52j" Apr 16 22:16:03.239977 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:03.239807 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-service-ca\") pod \"console-84d7d95f67-wn52j\" (UID: \"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\") " pod="openshift-console/console-84d7d95f67-wn52j" Apr 16 22:16:03.341012 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:03.340981 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-trusted-ca-bundle\") pod \"console-84d7d95f67-wn52j\" (UID: \"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\") " pod="openshift-console/console-84d7d95f67-wn52j" Apr 16 22:16:03.341012 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:03.341022 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-oauth-serving-cert\") pod \"console-84d7d95f67-wn52j\" (UID: \"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\") " pod="openshift-console/console-84d7d95f67-wn52j" Apr 16 22:16:03.341262 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:03.341056 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c25g7\" (UniqueName: \"kubernetes.io/projected/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-kube-api-access-c25g7\") pod \"console-84d7d95f67-wn52j\" (UID: \"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\") " pod="openshift-console/console-84d7d95f67-wn52j" Apr 16 22:16:03.341262 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:03.341128 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-console-config\") pod \"console-84d7d95f67-wn52j\" (UID: \"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\") " pod="openshift-console/console-84d7d95f67-wn52j" Apr 16 22:16:03.341375 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:03.341308 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-console-serving-cert\") pod \"console-84d7d95f67-wn52j\" (UID: \"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\") " pod="openshift-console/console-84d7d95f67-wn52j" Apr 16 22:16:03.341428 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:03.341389 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-console-oauth-config\") pod \"console-84d7d95f67-wn52j\" (UID: \"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\") " pod="openshift-console/console-84d7d95f67-wn52j" Apr 16 
22:16:03.341428 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:03.341418 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-service-ca\") pod \"console-84d7d95f67-wn52j\" (UID: \"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\") " pod="openshift-console/console-84d7d95f67-wn52j" Apr 16 22:16:03.341866 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:03.341839 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-oauth-serving-cert\") pod \"console-84d7d95f67-wn52j\" (UID: \"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\") " pod="openshift-console/console-84d7d95f67-wn52j" Apr 16 22:16:03.341976 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:03.341862 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-console-config\") pod \"console-84d7d95f67-wn52j\" (UID: \"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\") " pod="openshift-console/console-84d7d95f67-wn52j" Apr 16 22:16:03.342368 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:03.342343 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-trusted-ca-bundle\") pod \"console-84d7d95f67-wn52j\" (UID: \"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\") " pod="openshift-console/console-84d7d95f67-wn52j" Apr 16 22:16:03.342951 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:03.342933 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-service-ca\") pod \"console-84d7d95f67-wn52j\" (UID: \"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\") " pod="openshift-console/console-84d7d95f67-wn52j" Apr 
16 22:16:03.344186 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:03.344161 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-console-oauth-config\") pod \"console-84d7d95f67-wn52j\" (UID: \"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\") " pod="openshift-console/console-84d7d95f67-wn52j" Apr 16 22:16:03.344291 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:03.344162 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-console-serving-cert\") pod \"console-84d7d95f67-wn52j\" (UID: \"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\") " pod="openshift-console/console-84d7d95f67-wn52j" Apr 16 22:16:03.351111 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:03.351075 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c25g7\" (UniqueName: \"kubernetes.io/projected/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-kube-api-access-c25g7\") pod \"console-84d7d95f67-wn52j\" (UID: \"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\") " pod="openshift-console/console-84d7d95f67-wn52j" Apr 16 22:16:03.451550 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:03.451465 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-84d7d95f67-wn52j" Apr 16 22:16:03.610241 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:03.610206 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84d7d95f67-wn52j"] Apr 16 22:16:03.615865 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:16:03.615830 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabe770ce_fdb9_4d0e_a512_ee45b0c236d7.slice/crio-d3c6e875388c450b454ed97c7510023a4d89e2f0208747314afe5aadaab71f61 WatchSource:0}: Error finding container d3c6e875388c450b454ed97c7510023a4d89e2f0208747314afe5aadaab71f61: Status 404 returned error can't find the container with id d3c6e875388c450b454ed97c7510023a4d89e2f0208747314afe5aadaab71f61 Apr 16 22:16:03.775394 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:03.775347 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84d7d95f67-wn52j" event={"ID":"abe770ce-fdb9-4d0e-a512-ee45b0c236d7","Type":"ContainerStarted","Data":"d3c6e875388c450b454ed97c7510023a4d89e2f0208747314afe5aadaab71f61"} Apr 16 22:16:03.991652 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:03.991612 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-d7wxq"] Apr 16 22:16:03.994127 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:03.994105 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-d7wxq" Apr 16 22:16:04.000367 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.000342 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 22:16:04.002442 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.002157 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 22:16:04.002442 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.002300 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 22:16:04.002442 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.002331 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-9rs8n\"" Apr 16 22:16:04.015554 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.015531 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-d7wxq"] Apr 16 22:16:04.036529 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.036445 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-lr78b"] Apr 16 22:16:04.039710 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.039262 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-lr78b" Apr 16 22:16:04.043981 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.043830 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-4vs7f\"" Apr 16 22:16:04.044420 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.044401 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 22:16:04.044572 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.044552 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 22:16:04.045886 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.045400 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 22:16:04.048111 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.047592 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ead46a40-67d9-4343-a6dd-14f5815b264c-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-d7wxq\" (UID: \"ead46a40-67d9-4343-a6dd-14f5815b264c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d7wxq" Apr 16 22:16:04.048111 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.047645 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ead46a40-67d9-4343-a6dd-14f5815b264c-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-d7wxq\" (UID: \"ead46a40-67d9-4343-a6dd-14f5815b264c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d7wxq" Apr 16 22:16:04.048111 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.047698 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ead46a40-67d9-4343-a6dd-14f5815b264c-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-d7wxq\" (UID: \"ead46a40-67d9-4343-a6dd-14f5815b264c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d7wxq" Apr 16 22:16:04.048111 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.047821 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/ead46a40-67d9-4343-a6dd-14f5815b264c-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-d7wxq\" (UID: \"ead46a40-67d9-4343-a6dd-14f5815b264c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d7wxq" Apr 16 22:16:04.048111 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.047995 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7jw6\" (UniqueName: \"kubernetes.io/projected/ead46a40-67d9-4343-a6dd-14f5815b264c-kube-api-access-b7jw6\") pod \"kube-state-metrics-69db897b98-d7wxq\" (UID: \"ead46a40-67d9-4343-a6dd-14f5815b264c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d7wxq" Apr 16 22:16:04.048111 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.048037 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/ead46a40-67d9-4343-a6dd-14f5815b264c-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-d7wxq\" (UID: \"ead46a40-67d9-4343-a6dd-14f5815b264c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d7wxq" Apr 16 22:16:04.148627 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.148544 2574 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/ead46a40-67d9-4343-a6dd-14f5815b264c-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-d7wxq\" (UID: \"ead46a40-67d9-4343-a6dd-14f5815b264c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d7wxq" Apr 16 22:16:04.148627 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.148618 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/430b1bf1-900c-4030-8524-0be782a10fc1-node-exporter-textfile\") pod \"node-exporter-lr78b\" (UID: \"430b1bf1-900c-4030-8524-0be782a10fc1\") " pod="openshift-monitoring/node-exporter-lr78b" Apr 16 22:16:04.149180 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.148660 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ead46a40-67d9-4343-a6dd-14f5815b264c-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-d7wxq\" (UID: \"ead46a40-67d9-4343-a6dd-14f5815b264c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d7wxq" Apr 16 22:16:04.149180 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.148685 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ead46a40-67d9-4343-a6dd-14f5815b264c-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-d7wxq\" (UID: \"ead46a40-67d9-4343-a6dd-14f5815b264c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d7wxq" Apr 16 22:16:04.149180 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.148758 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/430b1bf1-900c-4030-8524-0be782a10fc1-node-exporter-wtmp\") pod \"node-exporter-lr78b\" (UID: \"430b1bf1-900c-4030-8524-0be782a10fc1\") " 
pod="openshift-monitoring/node-exporter-lr78b" Apr 16 22:16:04.149180 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.148797 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/430b1bf1-900c-4030-8524-0be782a10fc1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lr78b\" (UID: \"430b1bf1-900c-4030-8524-0be782a10fc1\") " pod="openshift-monitoring/node-exporter-lr78b" Apr 16 22:16:04.149180 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.148832 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/430b1bf1-900c-4030-8524-0be782a10fc1-metrics-client-ca\") pod \"node-exporter-lr78b\" (UID: \"430b1bf1-900c-4030-8524-0be782a10fc1\") " pod="openshift-monitoring/node-exporter-lr78b" Apr 16 22:16:04.149180 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.148879 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ead46a40-67d9-4343-a6dd-14f5815b264c-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-d7wxq\" (UID: \"ead46a40-67d9-4343-a6dd-14f5815b264c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d7wxq" Apr 16 22:16:04.149180 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:16:04.148914 2574 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 16 22:16:04.149180 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.148904 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/430b1bf1-900c-4030-8524-0be782a10fc1-sys\") pod \"node-exporter-lr78b\" (UID: \"430b1bf1-900c-4030-8524-0be782a10fc1\") " 
pod="openshift-monitoring/node-exporter-lr78b" Apr 16 22:16:04.149180 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.148983 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/ead46a40-67d9-4343-a6dd-14f5815b264c-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-d7wxq\" (UID: \"ead46a40-67d9-4343-a6dd-14f5815b264c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d7wxq" Apr 16 22:16:04.149180 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:16:04.148992 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ead46a40-67d9-4343-a6dd-14f5815b264c-kube-state-metrics-tls podName:ead46a40-67d9-4343-a6dd-14f5815b264c nodeName:}" failed. No retries permitted until 2026-04-16 22:16:04.648971621 +0000 UTC m=+155.017377749 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/ead46a40-67d9-4343-a6dd-14f5815b264c-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-d7wxq" (UID: "ead46a40-67d9-4343-a6dd-14f5815b264c") : secret "kube-state-metrics-tls" not found Apr 16 22:16:04.149180 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.149032 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42lfn\" (UniqueName: \"kubernetes.io/projected/430b1bf1-900c-4030-8524-0be782a10fc1-kube-api-access-42lfn\") pod \"node-exporter-lr78b\" (UID: \"430b1bf1-900c-4030-8524-0be782a10fc1\") " pod="openshift-monitoring/node-exporter-lr78b" Apr 16 22:16:04.149756 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.149196 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/ead46a40-67d9-4343-a6dd-14f5815b264c-kube-state-metrics-custom-resource-state-configmap\") pod 
\"kube-state-metrics-69db897b98-d7wxq\" (UID: \"ead46a40-67d9-4343-a6dd-14f5815b264c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d7wxq" Apr 16 22:16:04.149756 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.149236 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/430b1bf1-900c-4030-8524-0be782a10fc1-node-exporter-accelerators-collector-config\") pod \"node-exporter-lr78b\" (UID: \"430b1bf1-900c-4030-8524-0be782a10fc1\") " pod="openshift-monitoring/node-exporter-lr78b" Apr 16 22:16:04.149756 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.149295 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b7jw6\" (UniqueName: \"kubernetes.io/projected/ead46a40-67d9-4343-a6dd-14f5815b264c-kube-api-access-b7jw6\") pod \"kube-state-metrics-69db897b98-d7wxq\" (UID: \"ead46a40-67d9-4343-a6dd-14f5815b264c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d7wxq" Apr 16 22:16:04.149756 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.149322 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/430b1bf1-900c-4030-8524-0be782a10fc1-node-exporter-tls\") pod \"node-exporter-lr78b\" (UID: \"430b1bf1-900c-4030-8524-0be782a10fc1\") " pod="openshift-monitoring/node-exporter-lr78b" Apr 16 22:16:04.149756 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.149352 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/430b1bf1-900c-4030-8524-0be782a10fc1-root\") pod \"node-exporter-lr78b\" (UID: \"430b1bf1-900c-4030-8524-0be782a10fc1\") " pod="openshift-monitoring/node-exporter-lr78b" Apr 16 22:16:04.149756 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.149530 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ead46a40-67d9-4343-a6dd-14f5815b264c-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-d7wxq\" (UID: \"ead46a40-67d9-4343-a6dd-14f5815b264c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d7wxq" Apr 16 22:16:04.150028 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.149881 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/ead46a40-67d9-4343-a6dd-14f5815b264c-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-d7wxq\" (UID: \"ead46a40-67d9-4343-a6dd-14f5815b264c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d7wxq" Apr 16 22:16:04.152058 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.152035 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ead46a40-67d9-4343-a6dd-14f5815b264c-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-d7wxq\" (UID: \"ead46a40-67d9-4343-a6dd-14f5815b264c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d7wxq" Apr 16 22:16:04.208633 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.208601 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7jw6\" (UniqueName: \"kubernetes.io/projected/ead46a40-67d9-4343-a6dd-14f5815b264c-kube-api-access-b7jw6\") pod \"kube-state-metrics-69db897b98-d7wxq\" (UID: \"ead46a40-67d9-4343-a6dd-14f5815b264c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d7wxq" Apr 16 22:16:04.249893 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.249811 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/430b1bf1-900c-4030-8524-0be782a10fc1-node-exporter-accelerators-collector-config\") pod \"node-exporter-lr78b\" (UID: \"430b1bf1-900c-4030-8524-0be782a10fc1\") " pod="openshift-monitoring/node-exporter-lr78b" Apr 16 22:16:04.250074 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.249986 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/430b1bf1-900c-4030-8524-0be782a10fc1-node-exporter-tls\") pod \"node-exporter-lr78b\" (UID: \"430b1bf1-900c-4030-8524-0be782a10fc1\") " pod="openshift-monitoring/node-exporter-lr78b" Apr 16 22:16:04.250074 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.250038 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/430b1bf1-900c-4030-8524-0be782a10fc1-root\") pod \"node-exporter-lr78b\" (UID: \"430b1bf1-900c-4030-8524-0be782a10fc1\") " pod="openshift-monitoring/node-exporter-lr78b" Apr 16 22:16:04.250183 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.250095 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/430b1bf1-900c-4030-8524-0be782a10fc1-node-exporter-textfile\") pod \"node-exporter-lr78b\" (UID: \"430b1bf1-900c-4030-8524-0be782a10fc1\") " pod="openshift-monitoring/node-exporter-lr78b" Apr 16 22:16:04.250241 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:16:04.250180 2574 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 22:16:04.250241 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.250224 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/430b1bf1-900c-4030-8524-0be782a10fc1-node-exporter-wtmp\") pod \"node-exporter-lr78b\" (UID: \"430b1bf1-900c-4030-8524-0be782a10fc1\") " 
pod="openshift-monitoring/node-exporter-lr78b" Apr 16 22:16:04.250338 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:16:04.250248 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/430b1bf1-900c-4030-8524-0be782a10fc1-node-exporter-tls podName:430b1bf1-900c-4030-8524-0be782a10fc1 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:04.750227612 +0000 UTC m=+155.118633754 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/430b1bf1-900c-4030-8524-0be782a10fc1-node-exporter-tls") pod "node-exporter-lr78b" (UID: "430b1bf1-900c-4030-8524-0be782a10fc1") : secret "node-exporter-tls" not found Apr 16 22:16:04.250338 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.250282 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/430b1bf1-900c-4030-8524-0be782a10fc1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lr78b\" (UID: \"430b1bf1-900c-4030-8524-0be782a10fc1\") " pod="openshift-monitoring/node-exporter-lr78b" Apr 16 22:16:04.250338 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.250319 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/430b1bf1-900c-4030-8524-0be782a10fc1-metrics-client-ca\") pod \"node-exporter-lr78b\" (UID: \"430b1bf1-900c-4030-8524-0be782a10fc1\") " pod="openshift-monitoring/node-exporter-lr78b" Apr 16 22:16:04.250485 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.250345 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/430b1bf1-900c-4030-8524-0be782a10fc1-node-exporter-wtmp\") pod \"node-exporter-lr78b\" (UID: \"430b1bf1-900c-4030-8524-0be782a10fc1\") " pod="openshift-monitoring/node-exporter-lr78b" Apr 16 22:16:04.250485 ip-10-0-133-72 
kubenswrapper[2574]: I0416 22:16:04.250361 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/430b1bf1-900c-4030-8524-0be782a10fc1-sys\") pod \"node-exporter-lr78b\" (UID: \"430b1bf1-900c-4030-8524-0be782a10fc1\") " pod="openshift-monitoring/node-exporter-lr78b" Apr 16 22:16:04.250485 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.250387 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42lfn\" (UniqueName: \"kubernetes.io/projected/430b1bf1-900c-4030-8524-0be782a10fc1-kube-api-access-42lfn\") pod \"node-exporter-lr78b\" (UID: \"430b1bf1-900c-4030-8524-0be782a10fc1\") " pod="openshift-monitoring/node-exporter-lr78b" Apr 16 22:16:04.250485 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.250401 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/430b1bf1-900c-4030-8524-0be782a10fc1-node-exporter-textfile\") pod \"node-exporter-lr78b\" (UID: \"430b1bf1-900c-4030-8524-0be782a10fc1\") " pod="openshift-monitoring/node-exporter-lr78b" Apr 16 22:16:04.250485 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.250177 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/430b1bf1-900c-4030-8524-0be782a10fc1-root\") pod \"node-exporter-lr78b\" (UID: \"430b1bf1-900c-4030-8524-0be782a10fc1\") " pod="openshift-monitoring/node-exporter-lr78b" Apr 16 22:16:04.250700 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.250501 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/430b1bf1-900c-4030-8524-0be782a10fc1-node-exporter-accelerators-collector-config\") pod \"node-exporter-lr78b\" (UID: \"430b1bf1-900c-4030-8524-0be782a10fc1\") " pod="openshift-monitoring/node-exporter-lr78b" Apr 16 
22:16:04.250700 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.250563 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/430b1bf1-900c-4030-8524-0be782a10fc1-sys\") pod \"node-exporter-lr78b\" (UID: \"430b1bf1-900c-4030-8524-0be782a10fc1\") " pod="openshift-monitoring/node-exporter-lr78b" Apr 16 22:16:04.250849 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.250830 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/430b1bf1-900c-4030-8524-0be782a10fc1-metrics-client-ca\") pod \"node-exporter-lr78b\" (UID: \"430b1bf1-900c-4030-8524-0be782a10fc1\") " pod="openshift-monitoring/node-exporter-lr78b" Apr 16 22:16:04.253179 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.253154 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/430b1bf1-900c-4030-8524-0be782a10fc1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lr78b\" (UID: \"430b1bf1-900c-4030-8524-0be782a10fc1\") " pod="openshift-monitoring/node-exporter-lr78b" Apr 16 22:16:04.277219 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.277150 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-42lfn\" (UniqueName: \"kubernetes.io/projected/430b1bf1-900c-4030-8524-0be782a10fc1-kube-api-access-42lfn\") pod \"node-exporter-lr78b\" (UID: \"430b1bf1-900c-4030-8524-0be782a10fc1\") " pod="openshift-monitoring/node-exporter-lr78b" Apr 16 22:16:04.654354 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.654317 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ead46a40-67d9-4343-a6dd-14f5815b264c-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-d7wxq\" (UID: \"ead46a40-67d9-4343-a6dd-14f5815b264c\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-d7wxq" Apr 16 22:16:04.657658 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.657630 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ead46a40-67d9-4343-a6dd-14f5815b264c-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-d7wxq\" (UID: \"ead46a40-67d9-4343-a6dd-14f5815b264c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-d7wxq" Apr 16 22:16:04.755432 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.755388 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/430b1bf1-900c-4030-8524-0be782a10fc1-node-exporter-tls\") pod \"node-exporter-lr78b\" (UID: \"430b1bf1-900c-4030-8524-0be782a10fc1\") " pod="openshift-monitoring/node-exporter-lr78b" Apr 16 22:16:04.758454 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.758428 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/430b1bf1-900c-4030-8524-0be782a10fc1-node-exporter-tls\") pod \"node-exporter-lr78b\" (UID: \"430b1bf1-900c-4030-8524-0be782a10fc1\") " pod="openshift-monitoring/node-exporter-lr78b" Apr 16 22:16:04.906591 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.906508 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-d7wxq" Apr 16 22:16:04.952088 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:04.952056 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-lr78b" Apr 16 22:16:05.050246 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:16:05.050201 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-ffxpx" podUID="d146454c-862c-4665-b411-fd4c29e30335" Apr 16 22:16:05.060426 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:16:05.060386 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-xmqcp" podUID="cbfccc2d-f25a-4d8a-bd22-25a929f12d64" Apr 16 22:16:05.552863 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:16:05.552735 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podead46a40_67d9_4343_a6dd_14f5815b264c.slice/crio-91a24cf26dcb24ff715ce9c2db493d95fc3b7e4755d5ef69a4d001c0c9b73dc4 WatchSource:0}: Error finding container 91a24cf26dcb24ff715ce9c2db493d95fc3b7e4755d5ef69a4d001c0c9b73dc4: Status 404 returned error can't find the container with id 91a24cf26dcb24ff715ce9c2db493d95fc3b7e4755d5ef69a4d001c0c9b73dc4 Apr 16 22:16:05.554612 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:05.554592 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-d7wxq"] Apr 16 22:16:05.782014 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:05.781977 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84d7d95f67-wn52j" event={"ID":"abe770ce-fdb9-4d0e-a512-ee45b0c236d7","Type":"ContainerStarted","Data":"c9e2d781686ff3be4ff14eb1b971c0dab80f17909a28bd1246c828e7cfaf931a"} Apr 16 22:16:05.783346 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:05.783318 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-97f84ccf4-2cnfj" event={"ID":"12f32bdf-eadc-4da6-b3d3-55490357709e","Type":"ContainerStarted","Data":"301b19d4c9376ac4c77b677928496c778c5fa38abd9b78c965d062db07ea6cad"} Apr 16 22:16:05.784327 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:05.784306 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-d7wxq" event={"ID":"ead46a40-67d9-4343-a6dd-14f5815b264c","Type":"ContainerStarted","Data":"91a24cf26dcb24ff715ce9c2db493d95fc3b7e4755d5ef69a4d001c0c9b73dc4"} Apr 16 22:16:05.785342 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:05.785304 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lr78b" event={"ID":"430b1bf1-900c-4030-8524-0be782a10fc1","Type":"ContainerStarted","Data":"cc17ca4aa1086ef7272253e92842e40a16f3a0e0181b574f9533709048da11eb"} Apr 16 22:16:05.785434 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:05.785324 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ffxpx" Apr 16 22:16:05.806693 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:05.806605 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-84d7d95f67-wn52j" podStartSLOduration=1.137533829 podStartE2EDuration="2.806591528s" podCreationTimestamp="2026-04-16 22:16:03 +0000 UTC" firstStartedPulling="2026-04-16 22:16:03.618031601 +0000 UTC m=+153.986437733" lastFinishedPulling="2026-04-16 22:16:05.287089289 +0000 UTC m=+155.655495432" observedRunningTime="2026-04-16 22:16:05.805839526 +0000 UTC m=+156.174245667" watchObservedRunningTime="2026-04-16 22:16:05.806591528 +0000 UTC m=+156.174997679" Apr 16 22:16:05.837320 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:05.837279 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-97f84ccf4-2cnfj" podStartSLOduration=1.2624210310000001 podStartE2EDuration="4.837264149s" podCreationTimestamp="2026-04-16 22:16:01 +0000 UTC" firstStartedPulling="2026-04-16 22:16:01.703234856 +0000 UTC m=+152.071640988" lastFinishedPulling="2026-04-16 22:16:05.278077974 +0000 UTC m=+155.646484106" observedRunningTime="2026-04-16 22:16:05.834814317 +0000 UTC m=+156.203220469" watchObservedRunningTime="2026-04-16 22:16:05.837264149 +0000 UTC m=+156.205670301" Apr 16 22:16:06.789147 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:06.789115 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-d7wxq" event={"ID":"ead46a40-67d9-4343-a6dd-14f5815b264c","Type":"ContainerStarted","Data":"2333e81d472fce3e590368ec4f0118e5c791f15db6981a94c1851ca380aa4118"} Apr 16 22:16:06.790585 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:06.790558 2574 generic.go:358] "Generic (PLEG): container finished" podID="430b1bf1-900c-4030-8524-0be782a10fc1" containerID="1c09172ea7b081088759c39a7ce9aacd77250cf054b0316995cda6d708630c7a" exitCode=0 Apr 
16 22:16:06.790692 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:06.790641 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lr78b" event={"ID":"430b1bf1-900c-4030-8524-0be782a10fc1","Type":"ContainerDied","Data":"1c09172ea7b081088759c39a7ce9aacd77250cf054b0316995cda6d708630c7a"} Apr 16 22:16:07.795572 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:07.795540 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-d7wxq" event={"ID":"ead46a40-67d9-4343-a6dd-14f5815b264c","Type":"ContainerStarted","Data":"b00f2002c1499fbf8931bfeb7514153528b7193c050fc3a878c9ffdc5447a6c3"} Apr 16 22:16:07.795986 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:07.795577 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-d7wxq" event={"ID":"ead46a40-67d9-4343-a6dd-14f5815b264c","Type":"ContainerStarted","Data":"ff133d5e41cb1bdd4fd2e3c47f949bc90438c5ae45c99c23fe4a06eba8691909"} Apr 16 22:16:07.797383 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:07.797357 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lr78b" event={"ID":"430b1bf1-900c-4030-8524-0be782a10fc1","Type":"ContainerStarted","Data":"060ba3d6c3d5c8fdf6b8990d490d1a88bfa1330d432a42b71b936a6120f4b073"} Apr 16 22:16:07.797383 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:07.797385 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lr78b" event={"ID":"430b1bf1-900c-4030-8524-0be782a10fc1","Type":"ContainerStarted","Data":"395cc1a60e766c02f3eab4d022583b5b6393428d1098540f754b3f2f04fafaff"} Apr 16 22:16:07.871991 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:07.871946 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-d7wxq" podStartSLOduration=3.750594422 podStartE2EDuration="4.871931977s" 
podCreationTimestamp="2026-04-16 22:16:03 +0000 UTC" firstStartedPulling="2026-04-16 22:16:05.555820472 +0000 UTC m=+155.924226616" lastFinishedPulling="2026-04-16 22:16:06.677158034 +0000 UTC m=+157.045564171" observedRunningTime="2026-04-16 22:16:07.834352137 +0000 UTC m=+158.202758287" watchObservedRunningTime="2026-04-16 22:16:07.871931977 +0000 UTC m=+158.240338126" Apr 16 22:16:07.872791 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:07.872765 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-lr78b" podStartSLOduration=4.196861494 podStartE2EDuration="4.872755874s" podCreationTimestamp="2026-04-16 22:16:03 +0000 UTC" firstStartedPulling="2026-04-16 22:16:05.267833969 +0000 UTC m=+155.636240105" lastFinishedPulling="2026-04-16 22:16:05.943728353 +0000 UTC m=+156.312134485" observedRunningTime="2026-04-16 22:16:07.871568646 +0000 UTC m=+158.239974797" watchObservedRunningTime="2026-04-16 22:16:07.872755874 +0000 UTC m=+158.241162016" Apr 16 22:16:08.738192 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:08.738156 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-97f84ccf4-2cnfj"] Apr 16 22:16:10.004915 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.004875 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d146454c-862c-4665-b411-fd4c29e30335-cert\") pod \"ingress-canary-ffxpx\" (UID: \"d146454c-862c-4665-b411-fd4c29e30335\") " pod="openshift-ingress-canary/ingress-canary-ffxpx" Apr 16 22:16:10.004915 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.004924 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-metrics-tls\") pod \"dns-default-xmqcp\" (UID: \"cbfccc2d-f25a-4d8a-bd22-25a929f12d64\") " pod="openshift-dns/dns-default-xmqcp" Apr 16 22:16:10.007405 
ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.007375 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cbfccc2d-f25a-4d8a-bd22-25a929f12d64-metrics-tls\") pod \"dns-default-xmqcp\" (UID: \"cbfccc2d-f25a-4d8a-bd22-25a929f12d64\") " pod="openshift-dns/dns-default-xmqcp" Apr 16 22:16:10.007573 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.007528 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d146454c-862c-4665-b411-fd4c29e30335-cert\") pod \"ingress-canary-ffxpx\" (UID: \"d146454c-862c-4665-b411-fd4c29e30335\") " pod="openshift-ingress-canary/ingress-canary-ffxpx" Apr 16 22:16:10.289362 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.289285 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-skfrk\"" Apr 16 22:16:10.296500 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.296461 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ffxpx" Apr 16 22:16:10.302193 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.302168 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 22:16:10.307020 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.306994 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:10.312256 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.311729 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 22:16:10.312256 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.311840 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 22:16:10.312256 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.311967 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 22:16:10.312256 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.312097 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-cn5fns6cp64d8\"" Apr 16 22:16:10.312256 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.312101 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 22:16:10.312638 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.312385 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 22:16:10.313294 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.313100 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 22:16:10.313294 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.313146 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 22:16:10.313448 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.313309 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 22:16:10.313541 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.313501 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 22:16:10.313728 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.313611 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-k6bnx\"" Apr 16 22:16:10.313840 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.313772 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 22:16:10.314352 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.314322 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 22:16:10.316121 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.316103 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 22:16:10.319160 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.319121 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 22:16:10.329609 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.329584 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 22:16:10.408281 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.408175 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 
16 22:16:10.408281 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.408210 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:10.408281 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.408231 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-config\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:10.408281 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.408248 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:10.408505 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.408308 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-config-out\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:10.408505 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.408344 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: 
\"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:10.408505 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.408372 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:10.408505 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.408390 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:10.408505 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.408426 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:10.408505 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.408448 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:10.408505 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.408470 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.408505 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.408487 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.408505 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.408504 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.408778 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.408533 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.408778 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.408550 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.408778 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.408569 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.408778 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.408585 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tx97\" (UniqueName: \"kubernetes.io/projected/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-kube-api-access-5tx97\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.408778 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.408605 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-web-config\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.440934 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.440910 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ffxpx"]
Apr 16 22:16:10.443246 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:16:10.443220 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd146454c_862c_4665_b411_fd4c29e30335.slice/crio-6bf501f6ae05346cbf5130c141ff5e8e69706e361fc456fa1de7516fd412b245 WatchSource:0}: Error finding container 6bf501f6ae05346cbf5130c141ff5e8e69706e361fc456fa1de7516fd412b245: Status 404 returned error can't find the container with id 6bf501f6ae05346cbf5130c141ff5e8e69706e361fc456fa1de7516fd412b245
Apr 16 22:16:10.509522 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.509488 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.509694 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.509537 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.509694 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.509574 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.509694 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.509603 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.509694 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.509629 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.509954 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.509767 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.509954 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.509797 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.509954 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.509828 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.509954 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.509855 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5tx97\" (UniqueName: \"kubernetes.io/projected/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-kube-api-access-5tx97\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.509954 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.509890 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-web-config\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.509954 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.509933 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.510231 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.509961 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.510231 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.509990 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-config\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.510231 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.510014 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.510231 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.510055 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-config-out\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.510231 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.510092 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.510231 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.510133 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.510231 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.510162 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.511875 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.510874 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.511875 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.511354 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.511875 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.511520 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.513156 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.513128 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.513310 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.513286 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.515995 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.515964 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.517166 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.516548 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-config-out\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.517166 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.516616 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.517166 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.516633 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.517166 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.516616 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.517166 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.517051 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-config\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.517166 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.517122 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.517612 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.517585 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.517612 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.517599 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-web-config\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.518046 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.518017 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.518189 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.518166 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.518777 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.518731 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.525652 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.525629 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tx97\" (UniqueName: \"kubernetes.io/projected/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-kube-api-access-5tx97\") pod \"prometheus-k8s-0\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.629722 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.629644 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 22:16:10.763453 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.763427 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 22:16:10.765623 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:16:10.765596 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a702e4d_e5b1_4f01_87b9_89a3e10b70b3.slice/crio-d9458608d49f6ce669dc4fa3252d67dfa9d30d34c5c907ee61e369391f8513e8 WatchSource:0}: Error finding container d9458608d49f6ce669dc4fa3252d67dfa9d30d34c5c907ee61e369391f8513e8: Status 404 returned error can't find the container with id d9458608d49f6ce669dc4fa3252d67dfa9d30d34c5c907ee61e369391f8513e8
Apr 16 22:16:10.806458 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.806427 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ffxpx" event={"ID":"d146454c-862c-4665-b411-fd4c29e30335","Type":"ContainerStarted","Data":"6bf501f6ae05346cbf5130c141ff5e8e69706e361fc456fa1de7516fd412b245"}
Apr 16 22:16:10.807291 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:10.807269 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3","Type":"ContainerStarted","Data":"d9458608d49f6ce669dc4fa3252d67dfa9d30d34c5c907ee61e369391f8513e8"}
Apr 16 22:16:11.585768 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:11.585714 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-97f84ccf4-2cnfj"
Apr 16 22:16:11.923597 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:11.923568 2574 patch_prober.go:28] interesting pod/image-registry-5fb7ff9769-w4x49 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 22:16:11.923705 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:11.923615 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49" podUID="2c4c4eef-b842-4810-b506-7094264f295f" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 22:16:12.815762 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:12.815661 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ffxpx" event={"ID":"d146454c-862c-4665-b411-fd4c29e30335","Type":"ContainerStarted","Data":"17e938f008652d9bc12d89890eb164942fcf8fd02f71c3083f23c3fe41ea454f"}
Apr 16 22:16:12.817069 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:12.817036 2574 generic.go:358] "Generic (PLEG): container finished" podID="3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" containerID="2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694" exitCode=0
Apr 16 22:16:12.817149 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:12.817092 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3","Type":"ContainerDied","Data":"2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694"}
Apr 16 22:16:12.832141 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:12.832093 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-ffxpx" podStartSLOduration=129.760752848 podStartE2EDuration="2m11.832079346s" podCreationTimestamp="2026-04-16 22:14:01 +0000 UTC" firstStartedPulling="2026-04-16 22:16:10.445317483 +0000 UTC m=+160.813723626" lastFinishedPulling="2026-04-16 22:16:12.516643996 +0000 UTC m=+162.885050124" observedRunningTime="2026-04-16 22:16:12.831253875 +0000 UTC m=+163.199660029" watchObservedRunningTime="2026-04-16 22:16:12.832079346 +0000 UTC m=+163.200485497"
Apr 16 22:16:13.452371 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:13.452329 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-84d7d95f67-wn52j"
Apr 16 22:16:13.452546 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:13.452385 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-84d7d95f67-wn52j"
Apr 16 22:16:13.453950 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:13.453925 2574 patch_prober.go:28] interesting pod/console-84d7d95f67-wn52j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.133.0.16:8443/health\": dial tcp 10.133.0.16:8443: connect: connection refused" start-of-body=
Apr 16 22:16:13.454073 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:13.453966 2574 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-84d7d95f67-wn52j" podUID="abe770ce-fdb9-4d0e-a512-ee45b0c236d7" containerName="console" probeResult="failure" output="Get \"https://10.133.0.16:8443/health\": dial tcp 10.133.0.16:8443: connect: connection refused"
Apr 16 22:16:15.257874 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:15.257834 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xmqcp"
Apr 16 22:16:15.260974 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:15.260946 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-nlsdl\""
Apr 16 22:16:15.268766 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:15.268725 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xmqcp"
Apr 16 22:16:15.463906 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:15.463821 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xmqcp"]
Apr 16 22:16:15.826853 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:15.826772 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xmqcp" event={"ID":"cbfccc2d-f25a-4d8a-bd22-25a929f12d64","Type":"ContainerStarted","Data":"9d9d0d2fab2820fcbbcc61d3b39a00b69c772620b6942b5e69fabc55a3293346"}
Apr 16 22:16:15.828691 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:15.828666 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3","Type":"ContainerStarted","Data":"ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206"}
Apr 16 22:16:15.828821 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:15.828696 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3","Type":"ContainerStarted","Data":"8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578"}
Apr 16 22:16:16.937667 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:16.937628 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49" podUID="2c4c4eef-b842-4810-b506-7094264f295f" containerName="registry" containerID="cri-o://37dfb12a85401b78e7299f78bdddd655814e345e636e5096f8e95399f0b409d5" gracePeriod=30
Apr 16 22:16:17.669440 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.669102 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49"
Apr 16 22:16:17.778850 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.778599 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2c4c4eef-b842-4810-b506-7094264f295f-ca-trust-extracted\") pod \"2c4c4eef-b842-4810-b506-7094264f295f\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") "
Apr 16 22:16:17.778850 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.778666 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2f5j\" (UniqueName: \"kubernetes.io/projected/2c4c4eef-b842-4810-b506-7094264f295f-kube-api-access-r2f5j\") pod \"2c4c4eef-b842-4810-b506-7094264f295f\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") "
Apr 16 22:16:17.778850 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.778704 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2c4c4eef-b842-4810-b506-7094264f295f-image-registry-private-configuration\") pod \"2c4c4eef-b842-4810-b506-7094264f295f\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") "
Apr 16 22:16:17.778850 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.778773 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2c4c4eef-b842-4810-b506-7094264f295f-registry-certificates\") pod \"2c4c4eef-b842-4810-b506-7094264f295f\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") "
Apr 16 22:16:17.778850 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.778807 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c4c4eef-b842-4810-b506-7094264f295f-trusted-ca\") pod \"2c4c4eef-b842-4810-b506-7094264f295f\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") "
Apr 16 22:16:17.778850 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.778838 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c4c4eef-b842-4810-b506-7094264f295f-registry-tls\") pod \"2c4c4eef-b842-4810-b506-7094264f295f\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") "
Apr 16 22:16:17.779109 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.778879 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c4c4eef-b842-4810-b506-7094264f295f-bound-sa-token\") pod \"2c4c4eef-b842-4810-b506-7094264f295f\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") "
Apr 16 22:16:17.779109 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.778930 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2c4c4eef-b842-4810-b506-7094264f295f-installation-pull-secrets\") pod \"2c4c4eef-b842-4810-b506-7094264f295f\" (UID: \"2c4c4eef-b842-4810-b506-7094264f295f\") "
Apr 16 22:16:17.779262 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.779194 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c4c4eef-b842-4810-b506-7094264f295f-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "2c4c4eef-b842-4810-b506-7094264f295f" (UID: "2c4c4eef-b842-4810-b506-7094264f295f"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:16:17.780391 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.780352 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c4c4eef-b842-4810-b506-7094264f295f-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "2c4c4eef-b842-4810-b506-7094264f295f" (UID: "2c4c4eef-b842-4810-b506-7094264f295f"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 22:16:17.781916 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.781840 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c4c4eef-b842-4810-b506-7094264f295f-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "2c4c4eef-b842-4810-b506-7094264f295f" (UID: "2c4c4eef-b842-4810-b506-7094264f295f"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:16:17.782356 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.782316 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c4c4eef-b842-4810-b506-7094264f295f-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "2c4c4eef-b842-4810-b506-7094264f295f" (UID: "2c4c4eef-b842-4810-b506-7094264f295f"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:16:17.782356 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.782334 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c4c4eef-b842-4810-b506-7094264f295f-kube-api-access-r2f5j" (OuterVolumeSpecName: "kube-api-access-r2f5j") pod "2c4c4eef-b842-4810-b506-7094264f295f" (UID: "2c4c4eef-b842-4810-b506-7094264f295f"). InnerVolumeSpecName "kube-api-access-r2f5j". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:16:17.782557 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.782526 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c4c4eef-b842-4810-b506-7094264f295f-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "2c4c4eef-b842-4810-b506-7094264f295f" (UID: "2c4c4eef-b842-4810-b506-7094264f295f"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:16:17.782663 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.782558 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c4c4eef-b842-4810-b506-7094264f295f-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "2c4c4eef-b842-4810-b506-7094264f295f" (UID: "2c4c4eef-b842-4810-b506-7094264f295f"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:16:17.791109 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.791078 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c4c4eef-b842-4810-b506-7094264f295f-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "2c4c4eef-b842-4810-b506-7094264f295f" (UID: "2c4c4eef-b842-4810-b506-7094264f295f"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:16:17.839165 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.839126 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3","Type":"ContainerStarted","Data":"a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb"}
Apr 16 22:16:17.839259 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.839166 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3","Type":"ContainerStarted","Data":"b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8"}
Apr 16 22:16:17.839259 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.839181 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3","Type":"ContainerStarted","Data":"849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d"}
Apr 16 22:16:17.840345 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.840322 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xmqcp" event={"ID":"cbfccc2d-f25a-4d8a-bd22-25a929f12d64","Type":"ContainerStarted","Data":"43111e691b7f435f40b7f0edb28a89f97cfe2c307c41354b3ed62b55b6ac1eef"}
Apr 16 22:16:17.841522 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.841502 2574 generic.go:358] "Generic (PLEG): container finished" podID="2c4c4eef-b842-4810-b506-7094264f295f" containerID="37dfb12a85401b78e7299f78bdddd655814e345e636e5096f8e95399f0b409d5" exitCode=0
Apr 16 22:16:17.841594 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.841570 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49" event={"ID":"2c4c4eef-b842-4810-b506-7094264f295f","Type":"ContainerDied","Data":"37dfb12a85401b78e7299f78bdddd655814e345e636e5096f8e95399f0b409d5"}
Apr 16 22:16:17.841594 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.841589 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49" event={"ID":"2c4c4eef-b842-4810-b506-7094264f295f","Type":"ContainerDied","Data":"96a45c5abe9b0ac6959f56b09569a3fb550db57e71ee8fcae5eee4932da0edcd"}
Apr 16 22:16:17.841658 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.841602 2574 scope.go:117] "RemoveContainer" containerID="37dfb12a85401b78e7299f78bdddd655814e345e636e5096f8e95399f0b409d5"
Apr 16 22:16:17.841775 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.841729 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5fb7ff9769-w4x49"
Apr 16 22:16:17.851538 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.851464 2574 scope.go:117] "RemoveContainer" containerID="37dfb12a85401b78e7299f78bdddd655814e345e636e5096f8e95399f0b409d5"
Apr 16 22:16:17.852025 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:16:17.851950 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37dfb12a85401b78e7299f78bdddd655814e345e636e5096f8e95399f0b409d5\": container with ID starting with 37dfb12a85401b78e7299f78bdddd655814e345e636e5096f8e95399f0b409d5 not found: ID does not exist" containerID="37dfb12a85401b78e7299f78bdddd655814e345e636e5096f8e95399f0b409d5"
Apr 16 22:16:17.852140 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.852027 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37dfb12a85401b78e7299f78bdddd655814e345e636e5096f8e95399f0b409d5"} err="failed to get container status \"37dfb12a85401b78e7299f78bdddd655814e345e636e5096f8e95399f0b409d5\": rpc error: code = NotFound desc = could not find container \"37dfb12a85401b78e7299f78bdddd655814e345e636e5096f8e95399f0b409d5\": container with ID starting with 37dfb12a85401b78e7299f78bdddd655814e345e636e5096f8e95399f0b409d5 not found: ID does not exist"
Apr 16 22:16:17.865538 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.865509 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5fb7ff9769-w4x49"]
Apr 16 22:16:17.872292 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.872269 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5fb7ff9769-w4x49"]
Apr 16 22:16:17.880032 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.880007 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r2f5j\" (UniqueName: \"kubernetes.io/projected/2c4c4eef-b842-4810-b506-7094264f295f-kube-api-access-r2f5j\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:16:17.880032 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.880032 2574 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2c4c4eef-b842-4810-b506-7094264f295f-image-registry-private-configuration\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:16:17.880226 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.880043 2574 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2c4c4eef-b842-4810-b506-7094264f295f-registry-certificates\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:16:17.880226 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.880054 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c4c4eef-b842-4810-b506-7094264f295f-trusted-ca\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:16:17.880226 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.880064 2574 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2c4c4eef-b842-4810-b506-7094264f295f-registry-tls\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:16:17.880226 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.880072 2574 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2c4c4eef-b842-4810-b506-7094264f295f-bound-sa-token\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:16:17.880226 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.880082 2574 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2c4c4eef-b842-4810-b506-7094264f295f-installation-pull-secrets\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:16:17.880226 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:17.880090 2574 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2c4c4eef-b842-4810-b506-7094264f295f-ca-trust-extracted\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:16:18.262780 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:18.262728 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c4c4eef-b842-4810-b506-7094264f295f" path="/var/lib/kubelet/pods/2c4c4eef-b842-4810-b506-7094264f295f/volumes"
Apr 16 22:16:18.524632 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:18.524287 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-789c5c85c5-nmwzr" podUID="684d3477-17da-4111-9ac7-84233c26da51" containerName="acm-agent" probeResult="failure" output="Get \"http://10.133.0.9:8000/readyz\": dial tcp 10.133.0.9:8000: connect: connection refused"
Apr 16 22:16:18.847595 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:18.847474 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0"
event={"ID":"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3","Type":"ContainerStarted","Data":"88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634"} Apr 16 22:16:18.849044 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:18.849011 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xmqcp" event={"ID":"cbfccc2d-f25a-4d8a-bd22-25a929f12d64","Type":"ContainerStarted","Data":"4bf6bc4a5d48f00bc734355519f327ee486042a8ec7a0355ae81be9ec74efdae"} Apr 16 22:16:18.849150 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:18.849124 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-xmqcp" Apr 16 22:16:18.850225 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:18.850204 2574 generic.go:358] "Generic (PLEG): container finished" podID="684d3477-17da-4111-9ac7-84233c26da51" containerID="445a2e79c60c865fa187f508e24f6b5ae375bbb6c72b4735d2ed6b96ed73a4f7" exitCode=1 Apr 16 22:16:18.850307 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:18.850273 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-789c5c85c5-nmwzr" event={"ID":"684d3477-17da-4111-9ac7-84233c26da51","Type":"ContainerDied","Data":"445a2e79c60c865fa187f508e24f6b5ae375bbb6c72b4735d2ed6b96ed73a4f7"} Apr 16 22:16:18.850550 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:18.850531 2574 scope.go:117] "RemoveContainer" containerID="445a2e79c60c865fa187f508e24f6b5ae375bbb6c72b4735d2ed6b96ed73a4f7" Apr 16 22:16:18.852229 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:18.852211 2574 generic.go:358] "Generic (PLEG): container finished" podID="8c091a7d-1fb3-42d3-992d-581e854f4fd6" containerID="1907217c064553a77c288e54a46f57de636591a417b153f995101d7644fa9af4" exitCode=255 Apr 16 22:16:18.852291 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:18.852267 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-bfc757575-sd5j9" event={"ID":"8c091a7d-1fb3-42d3-992d-581e854f4fd6","Type":"ContainerDied","Data":"1907217c064553a77c288e54a46f57de636591a417b153f995101d7644fa9af4"} Apr 16 22:16:18.858346 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:18.858331 2574 scope.go:117] "RemoveContainer" containerID="1907217c064553a77c288e54a46f57de636591a417b153f995101d7644fa9af4" Apr 16 22:16:18.876835 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:18.876788 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.109606302 podStartE2EDuration="8.876773002s" podCreationTimestamp="2026-04-16 22:16:10 +0000 UTC" firstStartedPulling="2026-04-16 22:16:10.767532314 +0000 UTC m=+161.135938446" lastFinishedPulling="2026-04-16 22:16:17.534698953 +0000 UTC m=+167.903105146" observedRunningTime="2026-04-16 22:16:18.875873481 +0000 UTC m=+169.244279631" watchObservedRunningTime="2026-04-16 22:16:18.876773002 +0000 UTC m=+169.245179150" Apr 16 22:16:18.931012 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:18.930942 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-xmqcp" podStartSLOduration=135.866452246 podStartE2EDuration="2m17.93092294s" podCreationTimestamp="2026-04-16 22:14:01 +0000 UTC" firstStartedPulling="2026-04-16 22:16:15.46851981 +0000 UTC m=+165.836925941" lastFinishedPulling="2026-04-16 22:16:17.532990492 +0000 UTC m=+167.901396635" observedRunningTime="2026-04-16 22:16:18.929403285 +0000 UTC m=+169.297809441" watchObservedRunningTime="2026-04-16 22:16:18.93092294 +0000 UTC m=+169.299329096" Apr 16 22:16:19.857062 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:19.857013 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-789c5c85c5-nmwzr" 
event={"ID":"684d3477-17da-4111-9ac7-84233c26da51","Type":"ContainerStarted","Data":"143d3971bac12af1184c527c72d948a71f825b341c352e1f6fbeee8e6661eb42"} Apr 16 22:16:19.857475 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:19.857335 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-789c5c85c5-nmwzr" Apr 16 22:16:19.858005 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:19.857982 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-789c5c85c5-nmwzr" Apr 16 22:16:19.858708 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:19.858688 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-bfc757575-sd5j9" event={"ID":"8c091a7d-1fb3-42d3-992d-581e854f4fd6","Type":"ContainerStarted","Data":"8d55cef0547225953200ae8496ac7655c521d15bf98cb8784738e16186e22bc6"} Apr 16 22:16:20.630640 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:20.630603 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:23.452170 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:23.452129 2574 patch_prober.go:28] interesting pod/console-84d7d95f67-wn52j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.133.0.16:8443/health\": dial tcp 10.133.0.16:8443: connect: connection refused" start-of-body= Apr 16 22:16:23.452563 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:23.452206 2574 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-84d7d95f67-wn52j" podUID="abe770ce-fdb9-4d0e-a512-ee45b0c236d7" containerName="console" probeResult="failure" output="Get \"https://10.133.0.16:8443/health\": dial tcp 10.133.0.16:8443: connect: connection refused" Apr 16 22:16:28.860898 ip-10-0-133-72 
kubenswrapper[2574]: I0416 22:16:28.860861 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-xmqcp" Apr 16 22:16:33.455736 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:33.455703 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-84d7d95f67-wn52j" Apr 16 22:16:33.459387 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:33.459362 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-84d7d95f67-wn52j" Apr 16 22:16:33.757050 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:33.756999 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-97f84ccf4-2cnfj" podUID="12f32bdf-eadc-4da6-b3d3-55490357709e" containerName="console" containerID="cri-o://301b19d4c9376ac4c77b677928496c778c5fa38abd9b78c965d062db07ea6cad" gracePeriod=15 Apr 16 22:16:33.907288 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:33.907264 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-97f84ccf4-2cnfj_12f32bdf-eadc-4da6-b3d3-55490357709e/console/0.log" Apr 16 22:16:33.907414 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:33.907302 2574 generic.go:358] "Generic (PLEG): container finished" podID="12f32bdf-eadc-4da6-b3d3-55490357709e" containerID="301b19d4c9376ac4c77b677928496c778c5fa38abd9b78c965d062db07ea6cad" exitCode=2 Apr 16 22:16:33.907414 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:33.907388 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-97f84ccf4-2cnfj" event={"ID":"12f32bdf-eadc-4da6-b3d3-55490357709e","Type":"ContainerDied","Data":"301b19d4c9376ac4c77b677928496c778c5fa38abd9b78c965d062db07ea6cad"} Apr 16 22:16:33.986533 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:33.986514 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-97f84ccf4-2cnfj_12f32bdf-eadc-4da6-b3d3-55490357709e/console/0.log" Apr 16 22:16:33.986643 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:33.986581 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-97f84ccf4-2cnfj" Apr 16 22:16:34.122453 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:34.122383 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/12f32bdf-eadc-4da6-b3d3-55490357709e-service-ca\") pod \"12f32bdf-eadc-4da6-b3d3-55490357709e\" (UID: \"12f32bdf-eadc-4da6-b3d3-55490357709e\") " Apr 16 22:16:34.122453 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:34.122432 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/12f32bdf-eadc-4da6-b3d3-55490357709e-console-oauth-config\") pod \"12f32bdf-eadc-4da6-b3d3-55490357709e\" (UID: \"12f32bdf-eadc-4da6-b3d3-55490357709e\") " Apr 16 22:16:34.122632 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:34.122469 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhhbw\" (UniqueName: \"kubernetes.io/projected/12f32bdf-eadc-4da6-b3d3-55490357709e-kube-api-access-lhhbw\") pod \"12f32bdf-eadc-4da6-b3d3-55490357709e\" (UID: \"12f32bdf-eadc-4da6-b3d3-55490357709e\") " Apr 16 22:16:34.122689 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:34.122639 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/12f32bdf-eadc-4da6-b3d3-55490357709e-oauth-serving-cert\") pod \"12f32bdf-eadc-4da6-b3d3-55490357709e\" (UID: \"12f32bdf-eadc-4da6-b3d3-55490357709e\") " Apr 16 22:16:34.122791 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:34.122775 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"console-config\" (UniqueName: \"kubernetes.io/configmap/12f32bdf-eadc-4da6-b3d3-55490357709e-console-config\") pod \"12f32bdf-eadc-4da6-b3d3-55490357709e\" (UID: \"12f32bdf-eadc-4da6-b3d3-55490357709e\") " Apr 16 22:16:34.122869 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:34.122808 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/12f32bdf-eadc-4da6-b3d3-55490357709e-console-serving-cert\") pod \"12f32bdf-eadc-4da6-b3d3-55490357709e\" (UID: \"12f32bdf-eadc-4da6-b3d3-55490357709e\") " Apr 16 22:16:34.122914 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:34.122865 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12f32bdf-eadc-4da6-b3d3-55490357709e-service-ca" (OuterVolumeSpecName: "service-ca") pod "12f32bdf-eadc-4da6-b3d3-55490357709e" (UID: "12f32bdf-eadc-4da6-b3d3-55490357709e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:16:34.123101 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:34.123073 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12f32bdf-eadc-4da6-b3d3-55490357709e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "12f32bdf-eadc-4da6-b3d3-55490357709e" (UID: "12f32bdf-eadc-4da6-b3d3-55490357709e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:16:34.123183 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:34.123093 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12f32bdf-eadc-4da6-b3d3-55490357709e-console-config" (OuterVolumeSpecName: "console-config") pod "12f32bdf-eadc-4da6-b3d3-55490357709e" (UID: "12f32bdf-eadc-4da6-b3d3-55490357709e"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:16:34.123183 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:34.123172 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/12f32bdf-eadc-4da6-b3d3-55490357709e-console-config\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:16:34.123248 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:34.123189 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/12f32bdf-eadc-4da6-b3d3-55490357709e-service-ca\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:16:34.123248 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:34.123198 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/12f32bdf-eadc-4da6-b3d3-55490357709e-oauth-serving-cert\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:16:34.124911 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:34.124887 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12f32bdf-eadc-4da6-b3d3-55490357709e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "12f32bdf-eadc-4da6-b3d3-55490357709e" (UID: "12f32bdf-eadc-4da6-b3d3-55490357709e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:16:34.124911 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:34.124897 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12f32bdf-eadc-4da6-b3d3-55490357709e-kube-api-access-lhhbw" (OuterVolumeSpecName: "kube-api-access-lhhbw") pod "12f32bdf-eadc-4da6-b3d3-55490357709e" (UID: "12f32bdf-eadc-4da6-b3d3-55490357709e"). InnerVolumeSpecName "kube-api-access-lhhbw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:16:34.124911 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:34.124906 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12f32bdf-eadc-4da6-b3d3-55490357709e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "12f32bdf-eadc-4da6-b3d3-55490357709e" (UID: "12f32bdf-eadc-4da6-b3d3-55490357709e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:16:34.224421 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:34.224397 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/12f32bdf-eadc-4da6-b3d3-55490357709e-console-serving-cert\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:16:34.224421 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:34.224420 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/12f32bdf-eadc-4da6-b3d3-55490357709e-console-oauth-config\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:16:34.224555 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:34.224431 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lhhbw\" (UniqueName: \"kubernetes.io/projected/12f32bdf-eadc-4da6-b3d3-55490357709e-kube-api-access-lhhbw\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:16:34.912231 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:34.912204 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-97f84ccf4-2cnfj_12f32bdf-eadc-4da6-b3d3-55490357709e/console/0.log" Apr 16 22:16:34.912597 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:34.912309 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-97f84ccf4-2cnfj" Apr 16 22:16:34.912597 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:34.912306 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-97f84ccf4-2cnfj" event={"ID":"12f32bdf-eadc-4da6-b3d3-55490357709e","Type":"ContainerDied","Data":"9aab182ce3e9480cd50ecf3405a193c6fe680dc599abf3eaa43ff590c5934f4f"} Apr 16 22:16:34.912597 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:34.912415 2574 scope.go:117] "RemoveContainer" containerID="301b19d4c9376ac4c77b677928496c778c5fa38abd9b78c965d062db07ea6cad" Apr 16 22:16:34.927772 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:34.927731 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-97f84ccf4-2cnfj"] Apr 16 22:16:34.931733 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:34.931707 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-97f84ccf4-2cnfj"] Apr 16 22:16:36.261375 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:36.261342 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12f32bdf-eadc-4da6-b3d3-55490357709e" path="/var/lib/kubelet/pods/12f32bdf-eadc-4da6-b3d3-55490357709e/volumes" Apr 16 22:16:39.201334 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:39.201301 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-84d7d95f67-wn52j"] Apr 16 22:16:52.196969 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:16:52.196938 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-ffxpx_d146454c-862c-4665-b411-fd4c29e30335/serve-healthcheck-canary/0.log" Apr 16 22:17:04.220707 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:04.220636 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-84d7d95f67-wn52j" podUID="abe770ce-fdb9-4d0e-a512-ee45b0c236d7" containerName="console" 
containerID="cri-o://c9e2d781686ff3be4ff14eb1b971c0dab80f17909a28bd1246c828e7cfaf931a" gracePeriod=15 Apr 16 22:17:04.493872 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:04.493843 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-84d7d95f67-wn52j_abe770ce-fdb9-4d0e-a512-ee45b0c236d7/console/0.log" Apr 16 22:17:04.493989 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:04.493931 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84d7d95f67-wn52j" Apr 16 22:17:04.558203 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:04.558172 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-oauth-serving-cert\") pod \"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\" (UID: \"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\") " Apr 16 22:17:04.558333 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:04.558223 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-console-config\") pod \"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\" (UID: \"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\") " Apr 16 22:17:04.558333 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:04.558252 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-console-oauth-config\") pod \"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\" (UID: \"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\") " Apr 16 22:17:04.558333 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:04.558276 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-service-ca\") pod 
\"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\" (UID: \"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\") " Apr 16 22:17:04.558333 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:04.558317 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c25g7\" (UniqueName: \"kubernetes.io/projected/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-kube-api-access-c25g7\") pod \"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\" (UID: \"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\") " Apr 16 22:17:04.558475 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:04.558352 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-console-serving-cert\") pod \"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\" (UID: \"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\") " Apr 16 22:17:04.558475 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:04.558370 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-trusted-ca-bundle\") pod \"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\" (UID: \"abe770ce-fdb9-4d0e-a512-ee45b0c236d7\") " Apr 16 22:17:04.558884 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:04.558834 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "abe770ce-fdb9-4d0e-a512-ee45b0c236d7" (UID: "abe770ce-fdb9-4d0e-a512-ee45b0c236d7"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:04.559031 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:04.558987 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-service-ca" (OuterVolumeSpecName: "service-ca") pod "abe770ce-fdb9-4d0e-a512-ee45b0c236d7" (UID: "abe770ce-fdb9-4d0e-a512-ee45b0c236d7"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:04.559031 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:04.558996 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "abe770ce-fdb9-4d0e-a512-ee45b0c236d7" (UID: "abe770ce-fdb9-4d0e-a512-ee45b0c236d7"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:04.559157 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:04.559120 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-console-config" (OuterVolumeSpecName: "console-config") pod "abe770ce-fdb9-4d0e-a512-ee45b0c236d7" (UID: "abe770ce-fdb9-4d0e-a512-ee45b0c236d7"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:04.560624 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:04.560596 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "abe770ce-fdb9-4d0e-a512-ee45b0c236d7" (UID: "abe770ce-fdb9-4d0e-a512-ee45b0c236d7"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:04.560838 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:04.560734 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-kube-api-access-c25g7" (OuterVolumeSpecName: "kube-api-access-c25g7") pod "abe770ce-fdb9-4d0e-a512-ee45b0c236d7" (UID: "abe770ce-fdb9-4d0e-a512-ee45b0c236d7"). InnerVolumeSpecName "kube-api-access-c25g7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:17:04.560838 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:04.560815 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "abe770ce-fdb9-4d0e-a512-ee45b0c236d7" (UID: "abe770ce-fdb9-4d0e-a512-ee45b0c236d7"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:04.659535 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:04.659495 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-console-serving-cert\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:17:04.659535 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:04.659537 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-trusted-ca-bundle\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:17:04.659730 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:04.659552 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-oauth-serving-cert\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:17:04.659730 ip-10-0-133-72 
kubenswrapper[2574]: I0416 22:17:04.659567 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-console-config\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:17:04.659730 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:04.659582 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-console-oauth-config\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:17:04.659730 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:04.659596 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-service-ca\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:17:04.659730 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:04.659609 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c25g7\" (UniqueName: \"kubernetes.io/projected/abe770ce-fdb9-4d0e-a512-ee45b0c236d7-kube-api-access-c25g7\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:17:05.005520 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:05.005490 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-84d7d95f67-wn52j_abe770ce-fdb9-4d0e-a512-ee45b0c236d7/console/0.log" Apr 16 22:17:05.005798 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:05.005767 2574 generic.go:358] "Generic (PLEG): container finished" podID="abe770ce-fdb9-4d0e-a512-ee45b0c236d7" containerID="c9e2d781686ff3be4ff14eb1b971c0dab80f17909a28bd1246c828e7cfaf931a" exitCode=2 Apr 16 22:17:05.005986 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:05.005966 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84d7d95f67-wn52j" 
event={"ID":"abe770ce-fdb9-4d0e-a512-ee45b0c236d7","Type":"ContainerDied","Data":"c9e2d781686ff3be4ff14eb1b971c0dab80f17909a28bd1246c828e7cfaf931a"} Apr 16 22:17:05.006096 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:05.006082 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84d7d95f67-wn52j" event={"ID":"abe770ce-fdb9-4d0e-a512-ee45b0c236d7","Type":"ContainerDied","Data":"d3c6e875388c450b454ed97c7510023a4d89e2f0208747314afe5aadaab71f61"} Apr 16 22:17:05.006285 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:05.006271 2574 scope.go:117] "RemoveContainer" containerID="c9e2d781686ff3be4ff14eb1b971c0dab80f17909a28bd1246c828e7cfaf931a" Apr 16 22:17:05.006640 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:05.006533 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84d7d95f67-wn52j" Apr 16 22:17:05.019789 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:05.019768 2574 scope.go:117] "RemoveContainer" containerID="c9e2d781686ff3be4ff14eb1b971c0dab80f17909a28bd1246c828e7cfaf931a" Apr 16 22:17:05.020085 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:17:05.020057 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9e2d781686ff3be4ff14eb1b971c0dab80f17909a28bd1246c828e7cfaf931a\": container with ID starting with c9e2d781686ff3be4ff14eb1b971c0dab80f17909a28bd1246c828e7cfaf931a not found: ID does not exist" containerID="c9e2d781686ff3be4ff14eb1b971c0dab80f17909a28bd1246c828e7cfaf931a" Apr 16 22:17:05.020147 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:05.020095 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9e2d781686ff3be4ff14eb1b971c0dab80f17909a28bd1246c828e7cfaf931a"} err="failed to get container status \"c9e2d781686ff3be4ff14eb1b971c0dab80f17909a28bd1246c828e7cfaf931a\": rpc error: code = NotFound desc = could not find container 
\"c9e2d781686ff3be4ff14eb1b971c0dab80f17909a28bd1246c828e7cfaf931a\": container with ID starting with c9e2d781686ff3be4ff14eb1b971c0dab80f17909a28bd1246c828e7cfaf931a not found: ID does not exist" Apr 16 22:17:05.032372 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:05.032339 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-84d7d95f67-wn52j"] Apr 16 22:17:05.034780 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:05.034756 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-84d7d95f67-wn52j"] Apr 16 22:17:06.262007 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:06.261975 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abe770ce-fdb9-4d0e-a512-ee45b0c236d7" path="/var/lib/kubelet/pods/abe770ce-fdb9-4d0e-a512-ee45b0c236d7/volumes" Apr 16 22:17:10.630484 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:10.630435 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:10.649384 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:10.649353 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:11.039243 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:11.039218 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:28.674715 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:28.674669 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 22:17:28.675227 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:28.675153 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" containerName="kube-rbac-proxy" containerID="cri-o://a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb" gracePeriod=600 
Apr 16 22:17:28.675307 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:28.675191 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" containerName="config-reloader" containerID="cri-o://ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206" gracePeriod=600 Apr 16 22:17:28.675307 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:28.675284 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" containerName="kube-rbac-proxy-web" containerID="cri-o://b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8" gracePeriod=600 Apr 16 22:17:28.675405 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:28.675318 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" containerName="kube-rbac-proxy-thanos" containerID="cri-o://88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634" gracePeriod=600 Apr 16 22:17:28.675405 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:28.675140 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" containerName="prometheus" containerID="cri-o://8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578" gracePeriod=600 Apr 16 22:17:28.675592 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:28.675555 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" containerName="thanos-sidecar" containerID="cri-o://849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d" gracePeriod=600 Apr 16 22:17:28.940457 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:28.940427 2574 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.060753 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.060717 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-web-config\") pod \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " Apr 16 22:17:29.060908 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.060795 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-configmap-serving-certs-ca-bundle\") pod \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " Apr 16 22:17:29.060908 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.060816 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-config\") pod \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " Apr 16 22:17:29.060908 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.060835 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-prometheus-k8s-rulefiles-0\") pod \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " Apr 16 22:17:29.060908 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.060862 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-kube-rbac-proxy\") pod \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\" (UID: 
\"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " Apr 16 22:17:29.060908 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.060896 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-grpc-tls\") pod \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " Apr 16 22:17:29.061173 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.061077 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-configmap-kubelet-serving-ca-bundle\") pod \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " Apr 16 22:17:29.061173 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.061135 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-prometheus-k8s-db\") pod \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " Apr 16 22:17:29.061173 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.061170 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-thanos-prometheus-http-client-file\") pod \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " Apr 16 22:17:29.061333 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.061197 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\" (UID: 
\"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " Apr 16 22:17:29.061333 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.061236 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-configmap-metrics-client-ca\") pod \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " Apr 16 22:17:29.061333 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.061238 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" (UID: "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:29.061333 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.061300 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " Apr 16 22:17:29.061333 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.061328 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-prometheus-k8s-tls\") pod \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " Apr 16 22:17:29.061581 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.061356 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-metrics-client-certs\") pod \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " Apr 16 22:17:29.061581 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.061380 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tx97\" (UniqueName: \"kubernetes.io/projected/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-kube-api-access-5tx97\") pod \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " Apr 16 22:17:29.061581 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.061411 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-config-out\") pod \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " Apr 16 22:17:29.061581 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.061441 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-prometheus-trusted-ca-bundle\") pod \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " Apr 16 22:17:29.061581 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.061467 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-tls-assets\") pod \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\" (UID: \"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3\") " Apr 16 22:17:29.061869 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.061723 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-configmap-serving-certs-ca-bundle\") on node 
\"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:17:29.062017 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.061984 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" (UID: "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:29.062861 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.062298 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" (UID: "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:29.062861 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.062465 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" (UID: "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:17:29.062861 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.062773 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" (UID: "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3"). InnerVolumeSpecName "configmap-metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:29.063823 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.063795 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" (UID: "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:29.063927 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.063882 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-config" (OuterVolumeSpecName: "config") pod "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" (UID: "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:29.065220 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.065175 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" (UID: "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:29.065324 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.065218 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" (UID: "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3"). InnerVolumeSpecName "secret-grpc-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:29.065324 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.065230 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-kube-api-access-5tx97" (OuterVolumeSpecName: "kube-api-access-5tx97") pod "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" (UID: "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3"). InnerVolumeSpecName "kube-api-access-5tx97". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:17:29.065324 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.065295 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" (UID: "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:29.065482 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.065419 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" (UID: "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:29.065686 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.065661 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" (UID: "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:17:29.065810 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.065663 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" (UID: "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:29.065810 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.065756 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-config-out" (OuterVolumeSpecName: "config-out") pod "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" (UID: "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:17:29.066152 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.066137 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" (UID: "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:29.066515 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.066497 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" (UID: "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3"). 
InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:29.073648 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.073627 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-web-config" (OuterVolumeSpecName: "web-config") pod "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" (UID: "3a702e4d-e5b1-4f01-87b9-89a3e10b70b3"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:29.078417 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.078392 2574 generic.go:358] "Generic (PLEG): container finished" podID="3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" containerID="88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634" exitCode=0 Apr 16 22:17:29.078417 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.078418 2574 generic.go:358] "Generic (PLEG): container finished" podID="3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" containerID="a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb" exitCode=0 Apr 16 22:17:29.078566 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.078426 2574 generic.go:358] "Generic (PLEG): container finished" podID="3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" containerID="b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8" exitCode=0 Apr 16 22:17:29.078566 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.078433 2574 generic.go:358] "Generic (PLEG): container finished" podID="3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" containerID="849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d" exitCode=0 Apr 16 22:17:29.078566 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.078438 2574 generic.go:358] "Generic (PLEG): container finished" podID="3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" containerID="ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206" exitCode=0 Apr 16 22:17:29.078566 ip-10-0-133-72 kubenswrapper[2574]: I0416 
22:17:29.078443 2574 generic.go:358] "Generic (PLEG): container finished" podID="3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" containerID="8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578" exitCode=0 Apr 16 22:17:29.078566 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.078475 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3","Type":"ContainerDied","Data":"88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634"} Apr 16 22:17:29.078566 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.078524 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3","Type":"ContainerDied","Data":"a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb"} Apr 16 22:17:29.078566 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.078543 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3","Type":"ContainerDied","Data":"b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8"} Apr 16 22:17:29.078566 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.078557 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3","Type":"ContainerDied","Data":"849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d"} Apr 16 22:17:29.078566 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.078570 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3","Type":"ContainerDied","Data":"ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206"} Apr 16 22:17:29.078940 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.078525 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.078940 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.078584 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3","Type":"ContainerDied","Data":"8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578"} Apr 16 22:17:29.078940 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.078601 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3a702e4d-e5b1-4f01-87b9-89a3e10b70b3","Type":"ContainerDied","Data":"d9458608d49f6ce669dc4fa3252d67dfa9d30d34c5c907ee61e369391f8513e8"} Apr 16 22:17:29.078940 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.078579 2574 scope.go:117] "RemoveContainer" containerID="88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634" Apr 16 22:17:29.087348 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.087314 2574 scope.go:117] "RemoveContainer" containerID="a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb" Apr 16 22:17:29.095176 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.095155 2574 scope.go:117] "RemoveContainer" containerID="b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8" Apr 16 22:17:29.101056 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.101033 2574 scope.go:117] "RemoveContainer" containerID="849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d" Apr 16 22:17:29.111926 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.111909 2574 scope.go:117] "RemoveContainer" containerID="ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206" Apr 16 22:17:29.112000 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.111972 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 22:17:29.113156 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.113129 2574 kubelet.go:2547] 
"SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 22:17:29.118053 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.118038 2574 scope.go:117] "RemoveContainer" containerID="8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578" Apr 16 22:17:29.124308 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.124291 2574 scope.go:117] "RemoveContainer" containerID="2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694" Apr 16 22:17:29.130348 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.130333 2574 scope.go:117] "RemoveContainer" containerID="88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634" Apr 16 22:17:29.130581 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:17:29.130561 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634\": container with ID starting with 88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634 not found: ID does not exist" containerID="88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634" Apr 16 22:17:29.130639 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.130591 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634"} err="failed to get container status \"88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634\": rpc error: code = NotFound desc = could not find container \"88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634\": container with ID starting with 88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634 not found: ID does not exist" Apr 16 22:17:29.130639 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.130610 2574 scope.go:117] "RemoveContainer" containerID="a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb" Apr 16 22:17:29.130898 ip-10-0-133-72 
kubenswrapper[2574]: E0416 22:17:29.130881 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb\": container with ID starting with a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb not found: ID does not exist" containerID="a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb" Apr 16 22:17:29.130949 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.130904 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb"} err="failed to get container status \"a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb\": rpc error: code = NotFound desc = could not find container \"a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb\": container with ID starting with a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb not found: ID does not exist" Apr 16 22:17:29.130949 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.130920 2574 scope.go:117] "RemoveContainer" containerID="b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8" Apr 16 22:17:29.131148 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:17:29.131133 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8\": container with ID starting with b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8 not found: ID does not exist" containerID="b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8" Apr 16 22:17:29.131192 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.131151 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8"} err="failed to 
get container status \"b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8\": rpc error: code = NotFound desc = could not find container \"b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8\": container with ID starting with b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8 not found: ID does not exist" Apr 16 22:17:29.131192 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.131164 2574 scope.go:117] "RemoveContainer" containerID="849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d" Apr 16 22:17:29.131359 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:17:29.131345 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d\": container with ID starting with 849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d not found: ID does not exist" containerID="849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d" Apr 16 22:17:29.131406 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.131361 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d"} err="failed to get container status \"849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d\": rpc error: code = NotFound desc = could not find container \"849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d\": container with ID starting with 849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d not found: ID does not exist" Apr 16 22:17:29.131406 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.131373 2574 scope.go:117] "RemoveContainer" containerID="ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206" Apr 16 22:17:29.131593 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:17:29.131575 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206\": container with ID starting with ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206 not found: ID does not exist" containerID="ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206" Apr 16 22:17:29.131632 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.131598 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206"} err="failed to get container status \"ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206\": rpc error: code = NotFound desc = could not find container \"ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206\": container with ID starting with ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206 not found: ID does not exist" Apr 16 22:17:29.131632 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.131612 2574 scope.go:117] "RemoveContainer" containerID="8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578" Apr 16 22:17:29.131876 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:17:29.131855 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578\": container with ID starting with 8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578 not found: ID does not exist" containerID="8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578" Apr 16 22:17:29.131923 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.131882 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578"} err="failed to get container status \"8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578\": rpc error: code = NotFound desc = 
could not find container \"8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578\": container with ID starting with 8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578 not found: ID does not exist" Apr 16 22:17:29.131923 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.131897 2574 scope.go:117] "RemoveContainer" containerID="2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694" Apr 16 22:17:29.132101 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:17:29.132085 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694\": container with ID starting with 2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694 not found: ID does not exist" containerID="2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694" Apr 16 22:17:29.132147 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.132103 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694"} err="failed to get container status \"2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694\": rpc error: code = NotFound desc = could not find container \"2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694\": container with ID starting with 2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694 not found: ID does not exist" Apr 16 22:17:29.132147 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.132115 2574 scope.go:117] "RemoveContainer" containerID="88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634" Apr 16 22:17:29.132331 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.132313 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634"} err="failed to get container status 
\"88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634\": rpc error: code = NotFound desc = could not find container \"88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634\": container with ID starting with 88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634 not found: ID does not exist" Apr 16 22:17:29.132371 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.132332 2574 scope.go:117] "RemoveContainer" containerID="a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb" Apr 16 22:17:29.132528 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.132511 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb"} err="failed to get container status \"a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb\": rpc error: code = NotFound desc = could not find container \"a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb\": container with ID starting with a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb not found: ID does not exist" Apr 16 22:17:29.132528 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.132528 2574 scope.go:117] "RemoveContainer" containerID="b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8" Apr 16 22:17:29.132751 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.132723 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8"} err="failed to get container status \"b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8\": rpc error: code = NotFound desc = could not find container \"b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8\": container with ID starting with b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8 not found: ID does not exist" Apr 16 22:17:29.132796 ip-10-0-133-72 
kubenswrapper[2574]: I0416 22:17:29.132755 2574 scope.go:117] "RemoveContainer" containerID="849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d" Apr 16 22:17:29.132981 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.132966 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d"} err="failed to get container status \"849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d\": rpc error: code = NotFound desc = could not find container \"849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d\": container with ID starting with 849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d not found: ID does not exist" Apr 16 22:17:29.133030 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.132981 2574 scope.go:117] "RemoveContainer" containerID="ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206" Apr 16 22:17:29.133157 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.133140 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206"} err="failed to get container status \"ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206\": rpc error: code = NotFound desc = could not find container \"ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206\": container with ID starting with ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206 not found: ID does not exist" Apr 16 22:17:29.133203 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.133158 2574 scope.go:117] "RemoveContainer" containerID="8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578" Apr 16 22:17:29.133358 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.133344 2574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578"} err="failed to get container status \"8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578\": rpc error: code = NotFound desc = could not find container \"8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578\": container with ID starting with 8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578 not found: ID does not exist" Apr 16 22:17:29.133358 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.133357 2574 scope.go:117] "RemoveContainer" containerID="2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694" Apr 16 22:17:29.133551 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.133534 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694"} err="failed to get container status \"2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694\": rpc error: code = NotFound desc = could not find container \"2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694\": container with ID starting with 2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694 not found: ID does not exist" Apr 16 22:17:29.133594 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.133551 2574 scope.go:117] "RemoveContainer" containerID="88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634" Apr 16 22:17:29.133734 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.133721 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634"} err="failed to get container status \"88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634\": rpc error: code = NotFound desc = could not find container \"88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634\": container with ID starting with 
88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634 not found: ID does not exist" Apr 16 22:17:29.133734 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.133734 2574 scope.go:117] "RemoveContainer" containerID="a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb" Apr 16 22:17:29.133953 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.133935 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb"} err="failed to get container status \"a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb\": rpc error: code = NotFound desc = could not find container \"a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb\": container with ID starting with a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb not found: ID does not exist" Apr 16 22:17:29.134023 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.133955 2574 scope.go:117] "RemoveContainer" containerID="b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8" Apr 16 22:17:29.134173 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.134151 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8"} err="failed to get container status \"b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8\": rpc error: code = NotFound desc = could not find container \"b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8\": container with ID starting with b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8 not found: ID does not exist" Apr 16 22:17:29.134248 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.134171 2574 scope.go:117] "RemoveContainer" containerID="849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d" Apr 16 22:17:29.134451 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.134407 2574 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d"} err="failed to get container status \"849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d\": rpc error: code = NotFound desc = could not find container \"849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d\": container with ID starting with 849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d not found: ID does not exist" Apr 16 22:17:29.134451 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.134437 2574 scope.go:117] "RemoveContainer" containerID="ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206" Apr 16 22:17:29.135054 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.135027 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206"} err="failed to get container status \"ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206\": rpc error: code = NotFound desc = could not find container \"ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206\": container with ID starting with ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206 not found: ID does not exist" Apr 16 22:17:29.135054 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.135054 2574 scope.go:117] "RemoveContainer" containerID="8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578" Apr 16 22:17:29.135335 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.135314 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578"} err="failed to get container status \"8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578\": rpc error: code = NotFound desc = could not find container 
\"8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578\": container with ID starting with 8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578 not found: ID does not exist" Apr 16 22:17:29.135335 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.135334 2574 scope.go:117] "RemoveContainer" containerID="2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694" Apr 16 22:17:29.135597 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.135575 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694"} err="failed to get container status \"2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694\": rpc error: code = NotFound desc = could not find container \"2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694\": container with ID starting with 2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694 not found: ID does not exist" Apr 16 22:17:29.135655 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.135598 2574 scope.go:117] "RemoveContainer" containerID="88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634" Apr 16 22:17:29.135842 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.135825 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634"} err="failed to get container status \"88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634\": rpc error: code = NotFound desc = could not find container \"88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634\": container with ID starting with 88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634 not found: ID does not exist" Apr 16 22:17:29.135842 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.135842 2574 scope.go:117] "RemoveContainer" 
containerID="a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb" Apr 16 22:17:29.135984 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.135962 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 22:17:29.136106 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136087 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb"} err="failed to get container status \"a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb\": rpc error: code = NotFound desc = could not find container \"a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb\": container with ID starting with a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb not found: ID does not exist" Apr 16 22:17:29.136164 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136107 2574 scope.go:117] "RemoveContainer" containerID="b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8" Apr 16 22:17:29.136318 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136301 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c4c4eef-b842-4810-b506-7094264f295f" containerName="registry" Apr 16 22:17:29.136395 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136321 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c4c4eef-b842-4810-b506-7094264f295f" containerName="registry" Apr 16 22:17:29.136395 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136321 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8"} err="failed to get container status \"b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8\": rpc error: code = NotFound desc = could not find container \"b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8\": container with ID 
starting with b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8 not found: ID does not exist" Apr 16 22:17:29.136395 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136336 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" containerName="config-reloader" Apr 16 22:17:29.136395 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136346 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" containerName="config-reloader" Apr 16 22:17:29.136395 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136360 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" containerName="init-config-reloader" Apr 16 22:17:29.136395 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136370 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" containerName="init-config-reloader" Apr 16 22:17:29.136395 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136386 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12f32bdf-eadc-4da6-b3d3-55490357709e" containerName="console" Apr 16 22:17:29.136395 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136394 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="12f32bdf-eadc-4da6-b3d3-55490357709e" containerName="console" Apr 16 22:17:29.136886 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136403 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" containerName="prometheus" Apr 16 22:17:29.136886 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136413 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" containerName="prometheus" Apr 16 22:17:29.136886 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136338 2574 scope.go:117] "RemoveContainer" 
containerID="849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d" Apr 16 22:17:29.136886 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136427 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" containerName="kube-rbac-proxy-web" Apr 16 22:17:29.136886 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136436 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" containerName="kube-rbac-proxy-web" Apr 16 22:17:29.136886 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136444 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" containerName="kube-rbac-proxy-thanos" Apr 16 22:17:29.136886 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136453 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" containerName="kube-rbac-proxy-thanos" Apr 16 22:17:29.136886 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136465 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" containerName="thanos-sidecar" Apr 16 22:17:29.136886 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136473 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" containerName="thanos-sidecar" Apr 16 22:17:29.136886 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136484 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="abe770ce-fdb9-4d0e-a512-ee45b0c236d7" containerName="console" Apr 16 22:17:29.136886 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136492 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="abe770ce-fdb9-4d0e-a512-ee45b0c236d7" containerName="console" Apr 16 22:17:29.136886 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136507 2574 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" containerName="kube-rbac-proxy" Apr 16 22:17:29.136886 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136515 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" containerName="kube-rbac-proxy" Apr 16 22:17:29.136886 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136583 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" containerName="kube-rbac-proxy-thanos" Apr 16 22:17:29.136886 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136596 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="2c4c4eef-b842-4810-b506-7094264f295f" containerName="registry" Apr 16 22:17:29.136886 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136607 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" containerName="kube-rbac-proxy-web" Apr 16 22:17:29.136886 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136617 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" containerName="kube-rbac-proxy" Apr 16 22:17:29.136886 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136626 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="abe770ce-fdb9-4d0e-a512-ee45b0c236d7" containerName="console" Apr 16 22:17:29.136886 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136627 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d"} err="failed to get container status \"849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d\": rpc error: code = NotFound desc = could not find container \"849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d\": container with ID starting with 
849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d not found: ID does not exist" Apr 16 22:17:29.136886 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136644 2574 scope.go:117] "RemoveContainer" containerID="ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206" Apr 16 22:17:29.136886 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136636 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="12f32bdf-eadc-4da6-b3d3-55490357709e" containerName="console" Apr 16 22:17:29.136886 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136692 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" containerName="thanos-sidecar" Apr 16 22:17:29.136886 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136700 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" containerName="prometheus" Apr 16 22:17:29.136886 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136706 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" containerName="config-reloader" Apr 16 22:17:29.136886 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136881 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206"} err="failed to get container status \"ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206\": rpc error: code = NotFound desc = could not find container \"ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206\": container with ID starting with ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206 not found: ID does not exist" Apr 16 22:17:29.136886 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.136902 2574 scope.go:117] "RemoveContainer" containerID="8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578" Apr 16 22:17:29.137730 
ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.137149 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578"} err="failed to get container status \"8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578\": rpc error: code = NotFound desc = could not find container \"8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578\": container with ID starting with 8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578 not found: ID does not exist" Apr 16 22:17:29.137730 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.137171 2574 scope.go:117] "RemoveContainer" containerID="2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694" Apr 16 22:17:29.137730 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.137370 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694"} err="failed to get container status \"2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694\": rpc error: code = NotFound desc = could not find container \"2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694\": container with ID starting with 2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694 not found: ID does not exist" Apr 16 22:17:29.137730 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.137391 2574 scope.go:117] "RemoveContainer" containerID="88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634" Apr 16 22:17:29.137730 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.137611 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634"} err="failed to get container status \"88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634\": rpc error: code = NotFound desc = could not find 
container \"88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634\": container with ID starting with 88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634 not found: ID does not exist" Apr 16 22:17:29.137730 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.137628 2574 scope.go:117] "RemoveContainer" containerID="a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb" Apr 16 22:17:29.137972 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.137922 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb"} err="failed to get container status \"a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb\": rpc error: code = NotFound desc = could not find container \"a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb\": container with ID starting with a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb not found: ID does not exist" Apr 16 22:17:29.137972 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.137944 2574 scope.go:117] "RemoveContainer" containerID="b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8" Apr 16 22:17:29.138159 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.138139 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8"} err="failed to get container status \"b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8\": rpc error: code = NotFound desc = could not find container \"b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8\": container with ID starting with b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8 not found: ID does not exist" Apr 16 22:17:29.138196 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.138163 2574 scope.go:117] "RemoveContainer" 
containerID="849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d" Apr 16 22:17:29.138353 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.138338 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d"} err="failed to get container status \"849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d\": rpc error: code = NotFound desc = could not find container \"849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d\": container with ID starting with 849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d not found: ID does not exist" Apr 16 22:17:29.138393 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.138354 2574 scope.go:117] "RemoveContainer" containerID="ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206" Apr 16 22:17:29.138545 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.138528 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206"} err="failed to get container status \"ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206\": rpc error: code = NotFound desc = could not find container \"ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206\": container with ID starting with ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206 not found: ID does not exist" Apr 16 22:17:29.138545 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.138544 2574 scope.go:117] "RemoveContainer" containerID="8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578" Apr 16 22:17:29.138712 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.138692 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578"} err="failed to get container status 
\"8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578\": rpc error: code = NotFound desc = could not find container \"8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578\": container with ID starting with 8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578 not found: ID does not exist" Apr 16 22:17:29.138816 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.138713 2574 scope.go:117] "RemoveContainer" containerID="2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694" Apr 16 22:17:29.138931 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.138907 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694"} err="failed to get container status \"2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694\": rpc error: code = NotFound desc = could not find container \"2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694\": container with ID starting with 2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694 not found: ID does not exist" Apr 16 22:17:29.138982 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.138934 2574 scope.go:117] "RemoveContainer" containerID="88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634" Apr 16 22:17:29.139162 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.139140 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634"} err="failed to get container status \"88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634\": rpc error: code = NotFound desc = could not find container \"88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634\": container with ID starting with 88180290bf568b7e1a53cb825b7d33b07aafb0d45565aec893767e9ffead4634 not found: ID does not exist" Apr 16 22:17:29.139216 ip-10-0-133-72 
kubenswrapper[2574]: I0416 22:17:29.139163 2574 scope.go:117] "RemoveContainer" containerID="a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb" Apr 16 22:17:29.139357 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.139339 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb"} err="failed to get container status \"a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb\": rpc error: code = NotFound desc = could not find container \"a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb\": container with ID starting with a1a72f571e418ae4feb6e5b04f833bcbdfac6f4507417c8d1280b0f061fc04cb not found: ID does not exist" Apr 16 22:17:29.139397 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.139359 2574 scope.go:117] "RemoveContainer" containerID="b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8" Apr 16 22:17:29.139526 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.139510 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8"} err="failed to get container status \"b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8\": rpc error: code = NotFound desc = could not find container \"b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8\": container with ID starting with b37ef933e40ea21bcbcdbf48d9e0d286e910b9491e9650ee543e140df33447a8 not found: ID does not exist" Apr 16 22:17:29.139561 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.139527 2574 scope.go:117] "RemoveContainer" containerID="849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d" Apr 16 22:17:29.139692 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.139677 2574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d"} err="failed to get container status \"849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d\": rpc error: code = NotFound desc = could not find container \"849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d\": container with ID starting with 849e0e7fb8aac4a80ae0a2c2afe35d6773f345e2c07599174becc4259ef2a82d not found: ID does not exist" Apr 16 22:17:29.139733 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.139693 2574 scope.go:117] "RemoveContainer" containerID="ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206" Apr 16 22:17:29.139883 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.139869 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206"} err="failed to get container status \"ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206\": rpc error: code = NotFound desc = could not find container \"ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206\": container with ID starting with ddeaaa89162e79be43f5690a8169db25eb384e725fa17d09a0c7e9afc0fa1206 not found: ID does not exist" Apr 16 22:17:29.139942 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.139883 2574 scope.go:117] "RemoveContainer" containerID="8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578" Apr 16 22:17:29.140122 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.140098 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578"} err="failed to get container status \"8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578\": rpc error: code = NotFound desc = could not find container \"8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578\": container with ID starting with 
8c6cf62a5b52960dba30ad36a5924c668da6aa482c1f30932d5289db85276578 not found: ID does not exist" Apr 16 22:17:29.140122 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.140121 2574 scope.go:117] "RemoveContainer" containerID="2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694" Apr 16 22:17:29.140386 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.140358 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694"} err="failed to get container status \"2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694\": rpc error: code = NotFound desc = could not find container \"2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694\": container with ID starting with 2834f053a99fa4f5b941294086328deeaad41c64dceb306ccb7fdb6374749694 not found: ID does not exist" Apr 16 22:17:29.142077 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.142061 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.147590 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.147566 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 22:17:29.147666 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.147638 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 22:17:29.152849 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.152826 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 22:17:29.152849 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.152842 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-k6bnx\"" Apr 16 22:17:29.153067 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.152853 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-cn5fns6cp64d8\"" Apr 16 22:17:29.153067 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.152863 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 22:17:29.153067 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.152855 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 22:17:29.153067 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.152930 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 22:17:29.153067 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.152842 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 22:17:29.153221 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.153153 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 22:17:29.153221 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.153195 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 22:17:29.153221 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.153212 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 22:17:29.157206 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.157191 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 22:17:29.158213 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.158197 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 22:17:29.162014 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.161998 2574 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-grpc-tls\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:17:29.162117 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.162016 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:17:29.162117 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.162027 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-prometheus-k8s-db\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:17:29.162117 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.162036 2574 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-thanos-prometheus-http-client-file\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:17:29.162117 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.162046 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:17:29.162117 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.162055 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-configmap-metrics-client-ca\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:17:29.162117 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.162065 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:17:29.162117 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.162074 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-prometheus-k8s-tls\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:17:29.162117 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.162084 2574 reconciler_common.go:299] "Volume 
detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-metrics-client-certs\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:17:29.162117 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.162093 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5tx97\" (UniqueName: \"kubernetes.io/projected/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-kube-api-access-5tx97\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:17:29.162117 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.162101 2574 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-config-out\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:17:29.162117 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.162109 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-prometheus-trusted-ca-bundle\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:17:29.162117 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.162119 2574 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-tls-assets\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:17:29.162830 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.162131 2574 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-web-config\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:17:29.162830 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.162140 2574 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-config\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:17:29.162830 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.162148 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:17:29.162830 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.162156 2574 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3-secret-kube-rbac-proxy\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:17:29.164149 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.164128 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 22:17:29.164677 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.164660 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 22:17:29.262855 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.262828 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.263017 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.262860 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-web-config\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.263017 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.262877 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-config\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.263017 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.262896 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.263017 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.262915 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.263017 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.262969 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.263017 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.263008 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.263196 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.263038 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.263196 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.263069 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfbph\" (UniqueName: \"kubernetes.io/projected/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-kube-api-access-wfbph\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.263196 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.263102 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.263196 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.263136 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.263196 ip-10-0-133-72 kubenswrapper[2574]: I0416 
22:17:29.263166 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.263340 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.263208 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.263340 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.263232 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-config-out\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.263340 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.263249 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.263340 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.263271 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.263340 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.263295 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.263340 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.263315 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.363837 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.363811 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.363967 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.363853 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-config-out\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.363967 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.363886 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.363967 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.363911 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.363967 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.363941 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.364179 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.363974 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.364179 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.364004 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.364179 
ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.364025 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-web-config\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.364179 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.364048 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-config\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.364179 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.364091 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.364179 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.364134 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.364467 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.364205 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.364467 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.364233 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.364467 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.364274 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.364467 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.364297 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wfbph\" (UniqueName: \"kubernetes.io/projected/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-kube-api-access-wfbph\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.364467 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.364338 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.364467 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.364366 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-configmap-serving-certs-ca-bundle\") pod 
\"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.364467 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.364393 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.365065 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.365031 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.365180 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.365081 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.366867 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.366838 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-config-out\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.367121 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.367099 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.367208 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.367166 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.367277 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.367253 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.367414 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.367394 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.367701 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.367680 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.367806 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.367707 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tls-assets\" (UniqueName: \"kubernetes.io/projected/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.367806 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.367726 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.368397 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.368369 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.368503 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.368403 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.368669 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.368652 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.369417 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.369383 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.369417 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.369399 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-web-config\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.369692 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.369672 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-config\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.370482 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.370460 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.375804 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.375788 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfbph\" (UniqueName: \"kubernetes.io/projected/c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca-kube-api-access-wfbph\") pod \"prometheus-k8s-0\" (UID: \"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.450526 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.450491 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:29.574697 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:29.574672 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 22:17:29.577650 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:17:29.577624 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc79cb3bf_d26c_4ea4_bd87_b46ec18fbaca.slice/crio-cba33c409bb17d04660673ce5ff1bcb900030e041d4130298c6aecb9522ebe4a WatchSource:0}: Error finding container cba33c409bb17d04660673ce5ff1bcb900030e041d4130298c6aecb9522ebe4a: Status 404 returned error can't find the container with id cba33c409bb17d04660673ce5ff1bcb900030e041d4130298c6aecb9522ebe4a Apr 16 22:17:30.083029 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:30.082993 2574 generic.go:358] "Generic (PLEG): container finished" podID="c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca" containerID="70fe1f021dec25144116cb26fceff58d1e4174244883da578c81197978743818" exitCode=0 Apr 16 22:17:30.083400 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:30.083081 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca","Type":"ContainerDied","Data":"70fe1f021dec25144116cb26fceff58d1e4174244883da578c81197978743818"} Apr 16 22:17:30.083400 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:30.083110 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca","Type":"ContainerStarted","Data":"cba33c409bb17d04660673ce5ff1bcb900030e041d4130298c6aecb9522ebe4a"} Apr 16 22:17:30.263308 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:30.263261 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a702e4d-e5b1-4f01-87b9-89a3e10b70b3" 
path="/var/lib/kubelet/pods/3a702e4d-e5b1-4f01-87b9-89a3e10b70b3/volumes" Apr 16 22:17:31.091554 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:31.091518 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca","Type":"ContainerStarted","Data":"7ef4b82f0e7f664aaa6b1809f3decadcc5dcc9b6cf99e5c0b1abd2204d72badc"} Apr 16 22:17:31.091554 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:31.091552 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca","Type":"ContainerStarted","Data":"40ef3a2c79c4526e970dd68d1e5fc1132e1601231895cf4c993c2a4a63416f42"} Apr 16 22:17:31.092033 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:31.091565 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca","Type":"ContainerStarted","Data":"9f8d132dcb6c051cd37b3060af92a9de4aa40229237d19b5edbed13b319fcd62"} Apr 16 22:17:31.092033 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:31.091578 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca","Type":"ContainerStarted","Data":"0228f421c6b43aca893bbeea60641dc7c25afc1ec06406ac9f44d5c839f7833c"} Apr 16 22:17:31.092033 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:31.091589 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca","Type":"ContainerStarted","Data":"e21d4dab82ce9fe681d3719e6a6b5a7481954d28f0797cc84a838d1447c80630"} Apr 16 22:17:31.092033 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:31.091599 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca","Type":"ContainerStarted","Data":"67cd958acd3456d8a7caa02f9510af19689b9dd6fa651e854e7d960e4ca13e67"} Apr 16 22:17:31.120356 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:31.120312 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.120296369 podStartE2EDuration="2.120296369s" podCreationTimestamp="2026-04-16 22:17:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:17:31.118598049 +0000 UTC m=+241.487004222" watchObservedRunningTime="2026-04-16 22:17:31.120296369 +0000 UTC m=+241.488702521" Apr 16 22:17:34.451549 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:17:34.451514 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:29.450821 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:18:29.450782 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:29.466057 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:18:29.466033 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:30.135400 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:18:30.135367 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v4dxt_dba969ac-c53c-4354-ab6e-cc853b8c1449/console-operator/1.log" Apr 16 22:18:30.135684 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:18:30.135661 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v4dxt_dba969ac-c53c-4354-ab6e-cc853b8c1449/console-operator/1.log" Apr 16 22:18:30.142210 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:18:30.142192 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hm4t5_cdab8cce-9b55-478d-b1b5-740aa9746143/ovn-acl-logging/0.log" Apr 16 22:18:30.142320 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:18:30.142222 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hm4t5_cdab8cce-9b55-478d-b1b5-740aa9746143/ovn-acl-logging/0.log" Apr 16 22:18:30.145648 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:18:30.145631 2574 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 22:18:30.269514 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:18:30.269489 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:21:16.667206 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:16.667166 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-sxxqp"] Apr 16 22:21:16.670431 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:16.670409 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-sxxqp" Apr 16 22:21:16.672614 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:16.672592 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 22:21:16.672728 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:16.672640 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 22:21:16.672728 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:16.672672 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-6pf88\"" Apr 16 22:21:16.677167 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:16.676979 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-sxxqp"] Apr 16 22:21:16.804026 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:16.803994 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f4e881b9-4d44-4782-a722-7ad70977240a-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-sxxqp\" (UID: \"f4e881b9-4d44-4782-a722-7ad70977240a\") " pod="cert-manager/cert-manager-webhook-597b96b99b-sxxqp" Apr 16 22:21:16.804209 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:16.804064 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnqdp\" (UniqueName: \"kubernetes.io/projected/f4e881b9-4d44-4782-a722-7ad70977240a-kube-api-access-pnqdp\") pod \"cert-manager-webhook-597b96b99b-sxxqp\" (UID: \"f4e881b9-4d44-4782-a722-7ad70977240a\") " pod="cert-manager/cert-manager-webhook-597b96b99b-sxxqp" Apr 16 22:21:16.904543 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:16.904511 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/f4e881b9-4d44-4782-a722-7ad70977240a-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-sxxqp\" (UID: \"f4e881b9-4d44-4782-a722-7ad70977240a\") " pod="cert-manager/cert-manager-webhook-597b96b99b-sxxqp" Apr 16 22:21:16.904704 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:16.904580 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pnqdp\" (UniqueName: \"kubernetes.io/projected/f4e881b9-4d44-4782-a722-7ad70977240a-kube-api-access-pnqdp\") pod \"cert-manager-webhook-597b96b99b-sxxqp\" (UID: \"f4e881b9-4d44-4782-a722-7ad70977240a\") " pod="cert-manager/cert-manager-webhook-597b96b99b-sxxqp" Apr 16 22:21:16.913180 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:16.913156 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f4e881b9-4d44-4782-a722-7ad70977240a-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-sxxqp\" (UID: \"f4e881b9-4d44-4782-a722-7ad70977240a\") " pod="cert-manager/cert-manager-webhook-597b96b99b-sxxqp" Apr 16 22:21:16.913312 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:16.913284 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnqdp\" (UniqueName: \"kubernetes.io/projected/f4e881b9-4d44-4782-a722-7ad70977240a-kube-api-access-pnqdp\") pod \"cert-manager-webhook-597b96b99b-sxxqp\" (UID: \"f4e881b9-4d44-4782-a722-7ad70977240a\") " pod="cert-manager/cert-manager-webhook-597b96b99b-sxxqp" Apr 16 22:21:16.979988 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:16.979904 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-sxxqp" Apr 16 22:21:17.097909 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:17.097869 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-sxxqp"] Apr 16 22:21:17.100923 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:21:17.100897 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4e881b9_4d44_4782_a722_7ad70977240a.slice/crio-69e623b190af9ca8f28c84c7e620c35eace863ec4ea6be201123aeb61c131240 WatchSource:0}: Error finding container 69e623b190af9ca8f28c84c7e620c35eace863ec4ea6be201123aeb61c131240: Status 404 returned error can't find the container with id 69e623b190af9ca8f28c84c7e620c35eace863ec4ea6be201123aeb61c131240 Apr 16 22:21:17.103029 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:17.102908 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:21:17.701066 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:17.701032 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-sxxqp" event={"ID":"f4e881b9-4d44-4782-a722-7ad70977240a","Type":"ContainerStarted","Data":"69e623b190af9ca8f28c84c7e620c35eace863ec4ea6be201123aeb61c131240"} Apr 16 22:21:20.710730 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:20.710697 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-sxxqp" event={"ID":"f4e881b9-4d44-4782-a722-7ad70977240a","Type":"ContainerStarted","Data":"7013e98b65f0102cdf7789077870e2d0207ee1a821c82ec0f53a64a105b03989"} Apr 16 22:21:20.711097 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:20.710881 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-sxxqp" Apr 16 22:21:20.726162 ip-10-0-133-72 kubenswrapper[2574]: I0416 
22:21:20.726112 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-sxxqp" podStartSLOduration=1.232454915 podStartE2EDuration="4.726097383s" podCreationTimestamp="2026-04-16 22:21:16 +0000 UTC" firstStartedPulling="2026-04-16 22:21:17.103034034 +0000 UTC m=+467.471440167" lastFinishedPulling="2026-04-16 22:21:20.596676491 +0000 UTC m=+470.965082635" observedRunningTime="2026-04-16 22:21:20.725366654 +0000 UTC m=+471.093772880" watchObservedRunningTime="2026-04-16 22:21:20.726097383 +0000 UTC m=+471.094503534" Apr 16 22:21:26.716125 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:26.716095 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-sxxqp" Apr 16 22:21:27.508552 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:27.508514 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-v5fqv"] Apr 16 22:21:27.511768 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:27.511732 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-v5fqv" Apr 16 22:21:27.513887 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:27.513861 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-mgq2q\"" Apr 16 22:21:27.521150 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:27.521129 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-v5fqv"] Apr 16 22:21:27.591056 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:27.591010 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rdkq\" (UniqueName: \"kubernetes.io/projected/bb0c0b4b-5e62-4772-9210-58aad0a9b5ae-kube-api-access-5rdkq\") pod \"cert-manager-759f64656b-v5fqv\" (UID: \"bb0c0b4b-5e62-4772-9210-58aad0a9b5ae\") " pod="cert-manager/cert-manager-759f64656b-v5fqv" Apr 16 22:21:27.591223 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:27.591103 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb0c0b4b-5e62-4772-9210-58aad0a9b5ae-bound-sa-token\") pod \"cert-manager-759f64656b-v5fqv\" (UID: \"bb0c0b4b-5e62-4772-9210-58aad0a9b5ae\") " pod="cert-manager/cert-manager-759f64656b-v5fqv" Apr 16 22:21:27.691468 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:27.691437 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rdkq\" (UniqueName: \"kubernetes.io/projected/bb0c0b4b-5e62-4772-9210-58aad0a9b5ae-kube-api-access-5rdkq\") pod \"cert-manager-759f64656b-v5fqv\" (UID: \"bb0c0b4b-5e62-4772-9210-58aad0a9b5ae\") " pod="cert-manager/cert-manager-759f64656b-v5fqv" Apr 16 22:21:27.691584 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:27.691491 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/bb0c0b4b-5e62-4772-9210-58aad0a9b5ae-bound-sa-token\") pod \"cert-manager-759f64656b-v5fqv\" (UID: \"bb0c0b4b-5e62-4772-9210-58aad0a9b5ae\") " pod="cert-manager/cert-manager-759f64656b-v5fqv" Apr 16 22:21:27.699362 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:27.699330 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bb0c0b4b-5e62-4772-9210-58aad0a9b5ae-bound-sa-token\") pod \"cert-manager-759f64656b-v5fqv\" (UID: \"bb0c0b4b-5e62-4772-9210-58aad0a9b5ae\") " pod="cert-manager/cert-manager-759f64656b-v5fqv" Apr 16 22:21:27.699568 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:27.699546 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rdkq\" (UniqueName: \"kubernetes.io/projected/bb0c0b4b-5e62-4772-9210-58aad0a9b5ae-kube-api-access-5rdkq\") pod \"cert-manager-759f64656b-v5fqv\" (UID: \"bb0c0b4b-5e62-4772-9210-58aad0a9b5ae\") " pod="cert-manager/cert-manager-759f64656b-v5fqv" Apr 16 22:21:27.821379 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:27.821287 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-v5fqv" Apr 16 22:21:27.952752 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:27.952713 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-v5fqv"] Apr 16 22:21:27.955202 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:21:27.955170 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb0c0b4b_5e62_4772_9210_58aad0a9b5ae.slice/crio-401b96899091f22e9302247d07091c8bce654c323181367bd8fd2df98583bb35 WatchSource:0}: Error finding container 401b96899091f22e9302247d07091c8bce654c323181367bd8fd2df98583bb35: Status 404 returned error can't find the container with id 401b96899091f22e9302247d07091c8bce654c323181367bd8fd2df98583bb35 Apr 16 22:21:28.731932 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:28.731898 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-v5fqv" event={"ID":"bb0c0b4b-5e62-4772-9210-58aad0a9b5ae","Type":"ContainerStarted","Data":"864cf0f60340aebc5bf14bb9d1191add9ca0dbff289b2618f24bf8aabee560a6"} Apr 16 22:21:28.731932 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:28.731935 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-v5fqv" event={"ID":"bb0c0b4b-5e62-4772-9210-58aad0a9b5ae","Type":"ContainerStarted","Data":"401b96899091f22e9302247d07091c8bce654c323181367bd8fd2df98583bb35"} Apr 16 22:21:28.746812 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:28.746764 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-v5fqv" podStartSLOduration=1.746730256 podStartE2EDuration="1.746730256s" podCreationTimestamp="2026-04-16 22:21:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:21:28.746235891 +0000 UTC m=+479.114642042" 
watchObservedRunningTime="2026-04-16 22:21:28.746730256 +0000 UTC m=+479.115136406" Apr 16 22:21:57.309974 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:57.309934 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5cdcb589b5-sdjkx"] Apr 16 22:21:57.314483 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:57.314462 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-sdjkx" Apr 16 22:21:57.316920 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:57.316902 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 16 22:21:57.317674 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:57.317662 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 16 22:21:57.317994 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:57.317977 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:21:57.318093 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:57.317980 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-7nvn9\"" Apr 16 22:21:57.318466 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:57.318448 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 22:21:57.318466 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:57.318461 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 16 22:21:57.322883 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:57.322864 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-lws-operator/lws-controller-manager-5cdcb589b5-sdjkx"] Apr 16 22:21:57.426272 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:57.426238 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csxsm\" (UniqueName: \"kubernetes.io/projected/f45534a0-756e-4125-b237-74268df6b42d-kube-api-access-csxsm\") pod \"lws-controller-manager-5cdcb589b5-sdjkx\" (UID: \"f45534a0-756e-4125-b237-74268df6b42d\") " pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-sdjkx" Apr 16 22:21:57.426423 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:57.426280 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/f45534a0-756e-4125-b237-74268df6b42d-metrics-cert\") pod \"lws-controller-manager-5cdcb589b5-sdjkx\" (UID: \"f45534a0-756e-4125-b237-74268df6b42d\") " pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-sdjkx" Apr 16 22:21:57.426423 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:57.426314 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f45534a0-756e-4125-b237-74268df6b42d-cert\") pod \"lws-controller-manager-5cdcb589b5-sdjkx\" (UID: \"f45534a0-756e-4125-b237-74268df6b42d\") " pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-sdjkx" Apr 16 22:21:57.426423 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:57.426346 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/f45534a0-756e-4125-b237-74268df6b42d-manager-config\") pod \"lws-controller-manager-5cdcb589b5-sdjkx\" (UID: \"f45534a0-756e-4125-b237-74268df6b42d\") " pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-sdjkx" Apr 16 22:21:57.527147 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:57.527115 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-csxsm\" (UniqueName: \"kubernetes.io/projected/f45534a0-756e-4125-b237-74268df6b42d-kube-api-access-csxsm\") pod \"lws-controller-manager-5cdcb589b5-sdjkx\" (UID: \"f45534a0-756e-4125-b237-74268df6b42d\") " pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-sdjkx" Apr 16 22:21:57.527147 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:57.527152 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/f45534a0-756e-4125-b237-74268df6b42d-metrics-cert\") pod \"lws-controller-manager-5cdcb589b5-sdjkx\" (UID: \"f45534a0-756e-4125-b237-74268df6b42d\") " pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-sdjkx" Apr 16 22:21:57.527337 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:57.527179 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f45534a0-756e-4125-b237-74268df6b42d-cert\") pod \"lws-controller-manager-5cdcb589b5-sdjkx\" (UID: \"f45534a0-756e-4125-b237-74268df6b42d\") " pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-sdjkx" Apr 16 22:21:57.527337 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:57.527202 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/f45534a0-756e-4125-b237-74268df6b42d-manager-config\") pod \"lws-controller-manager-5cdcb589b5-sdjkx\" (UID: \"f45534a0-756e-4125-b237-74268df6b42d\") " pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-sdjkx" Apr 16 22:21:57.527851 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:57.527831 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/f45534a0-756e-4125-b237-74268df6b42d-manager-config\") pod \"lws-controller-manager-5cdcb589b5-sdjkx\" (UID: 
\"f45534a0-756e-4125-b237-74268df6b42d\") " pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-sdjkx" Apr 16 22:21:57.529562 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:57.529542 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/f45534a0-756e-4125-b237-74268df6b42d-metrics-cert\") pod \"lws-controller-manager-5cdcb589b5-sdjkx\" (UID: \"f45534a0-756e-4125-b237-74268df6b42d\") " pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-sdjkx" Apr 16 22:21:57.529641 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:57.529565 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f45534a0-756e-4125-b237-74268df6b42d-cert\") pod \"lws-controller-manager-5cdcb589b5-sdjkx\" (UID: \"f45534a0-756e-4125-b237-74268df6b42d\") " pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-sdjkx" Apr 16 22:21:57.536792 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:57.536773 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-csxsm\" (UniqueName: \"kubernetes.io/projected/f45534a0-756e-4125-b237-74268df6b42d-kube-api-access-csxsm\") pod \"lws-controller-manager-5cdcb589b5-sdjkx\" (UID: \"f45534a0-756e-4125-b237-74268df6b42d\") " pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-sdjkx" Apr 16 22:21:57.624144 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:57.624063 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-sdjkx" Apr 16 22:21:57.755102 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:57.755037 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5cdcb589b5-sdjkx"] Apr 16 22:21:57.757757 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:21:57.757718 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf45534a0_756e_4125_b237_74268df6b42d.slice/crio-d385e290e643edaedf4aecafe0c3cbfccadc57d03f45a5f872078aa0869aa26a WatchSource:0}: Error finding container d385e290e643edaedf4aecafe0c3cbfccadc57d03f45a5f872078aa0869aa26a: Status 404 returned error can't find the container with id d385e290e643edaedf4aecafe0c3cbfccadc57d03f45a5f872078aa0869aa26a Apr 16 22:21:57.814797 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:21:57.814758 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-sdjkx" event={"ID":"f45534a0-756e-4125-b237-74268df6b42d","Type":"ContainerStarted","Data":"d385e290e643edaedf4aecafe0c3cbfccadc57d03f45a5f872078aa0869aa26a"} Apr 16 22:22:00.827196 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:00.827161 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-sdjkx" event={"ID":"f45534a0-756e-4125-b237-74268df6b42d","Type":"ContainerStarted","Data":"aa1d09eb1ae5ca18f598ad89aed67305bbdd7901af7799a0fa09c6c22c11f31f"} Apr 16 22:22:00.827571 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:00.827295 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-sdjkx" Apr 16 22:22:00.841723 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:00.841679 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-sdjkx" podStartSLOduration=1.584338161 podStartE2EDuration="3.841666248s" podCreationTimestamp="2026-04-16 22:21:57 +0000 UTC" firstStartedPulling="2026-04-16 22:21:57.759563182 +0000 UTC m=+508.127969313" lastFinishedPulling="2026-04-16 22:22:00.016891269 +0000 UTC m=+510.385297400" observedRunningTime="2026-04-16 22:22:00.841563743 +0000 UTC m=+511.209969909" watchObservedRunningTime="2026-04-16 22:22:00.841666248 +0000 UTC m=+511.210072400" Apr 16 22:22:11.832913 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:11.832882 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5cdcb589b5-sdjkx" Apr 16 22:22:26.221113 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.221034 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj"] Apr 16 22:22:26.223441 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.223409 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" Apr 16 22:22:26.226381 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.226185 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 16 22:22:26.226381 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.226306 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-wbx98\"" Apr 16 22:22:26.226381 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.226189 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 22:22:26.226774 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.226754 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 22:22:26.235688 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.235663 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj"] Apr 16 22:22:26.255193 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.255164 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/cd3dff7e-3d30-4c96-8e98-be1bceff5295-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-t6fzj\" (UID: \"cd3dff7e-3d30-4c96-8e98-be1bceff5295\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" Apr 16 22:22:26.255193 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.255201 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/cd3dff7e-3d30-4c96-8e98-be1bceff5295-workload-certs\") pod 
\"openshift-ai-inference-openshift-default-7c5447bb76-t6fzj\" (UID: \"cd3dff7e-3d30-4c96-8e98-be1bceff5295\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" Apr 16 22:22:26.255405 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.255219 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/cd3dff7e-3d30-4c96-8e98-be1bceff5295-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-t6fzj\" (UID: \"cd3dff7e-3d30-4c96-8e98-be1bceff5295\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" Apr 16 22:22:26.255405 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.255269 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/cd3dff7e-3d30-4c96-8e98-be1bceff5295-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-t6fzj\" (UID: \"cd3dff7e-3d30-4c96-8e98-be1bceff5295\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" Apr 16 22:22:26.255405 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.255284 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/cd3dff7e-3d30-4c96-8e98-be1bceff5295-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-t6fzj\" (UID: \"cd3dff7e-3d30-4c96-8e98-be1bceff5295\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" Apr 16 22:22:26.255405 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.255302 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdhhw\" (UniqueName: \"kubernetes.io/projected/cd3dff7e-3d30-4c96-8e98-be1bceff5295-kube-api-access-bdhhw\") pod 
\"openshift-ai-inference-openshift-default-7c5447bb76-t6fzj\" (UID: \"cd3dff7e-3d30-4c96-8e98-be1bceff5295\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" Apr 16 22:22:26.255527 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.255415 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cd3dff7e-3d30-4c96-8e98-be1bceff5295-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-t6fzj\" (UID: \"cd3dff7e-3d30-4c96-8e98-be1bceff5295\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" Apr 16 22:22:26.255527 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.255452 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/cd3dff7e-3d30-4c96-8e98-be1bceff5295-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-t6fzj\" (UID: \"cd3dff7e-3d30-4c96-8e98-be1bceff5295\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" Apr 16 22:22:26.255588 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.255526 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/cd3dff7e-3d30-4c96-8e98-be1bceff5295-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-t6fzj\" (UID: \"cd3dff7e-3d30-4c96-8e98-be1bceff5295\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" Apr 16 22:22:26.356177 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.356135 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/cd3dff7e-3d30-4c96-8e98-be1bceff5295-istio-envoy\") pod 
\"openshift-ai-inference-openshift-default-7c5447bb76-t6fzj\" (UID: \"cd3dff7e-3d30-4c96-8e98-be1bceff5295\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" Apr 16 22:22:26.356177 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.356175 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/cd3dff7e-3d30-4c96-8e98-be1bceff5295-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-t6fzj\" (UID: \"cd3dff7e-3d30-4c96-8e98-be1bceff5295\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" Apr 16 22:22:26.356434 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.356196 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bdhhw\" (UniqueName: \"kubernetes.io/projected/cd3dff7e-3d30-4c96-8e98-be1bceff5295-kube-api-access-bdhhw\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-t6fzj\" (UID: \"cd3dff7e-3d30-4c96-8e98-be1bceff5295\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" Apr 16 22:22:26.356434 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.356260 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cd3dff7e-3d30-4c96-8e98-be1bceff5295-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-t6fzj\" (UID: \"cd3dff7e-3d30-4c96-8e98-be1bceff5295\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" Apr 16 22:22:26.356434 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.356287 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/cd3dff7e-3d30-4c96-8e98-be1bceff5295-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-t6fzj\" (UID: 
\"cd3dff7e-3d30-4c96-8e98-be1bceff5295\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" Apr 16 22:22:26.356434 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.356330 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/cd3dff7e-3d30-4c96-8e98-be1bceff5295-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-t6fzj\" (UID: \"cd3dff7e-3d30-4c96-8e98-be1bceff5295\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" Apr 16 22:22:26.356434 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.356357 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/cd3dff7e-3d30-4c96-8e98-be1bceff5295-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-t6fzj\" (UID: \"cd3dff7e-3d30-4c96-8e98-be1bceff5295\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" Apr 16 22:22:26.356434 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.356385 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/cd3dff7e-3d30-4c96-8e98-be1bceff5295-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-t6fzj\" (UID: \"cd3dff7e-3d30-4c96-8e98-be1bceff5295\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" Apr 16 22:22:26.356434 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.356407 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/cd3dff7e-3d30-4c96-8e98-be1bceff5295-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-t6fzj\" (UID: \"cd3dff7e-3d30-4c96-8e98-be1bceff5295\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" Apr 16 22:22:26.356832 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.356800 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/cd3dff7e-3d30-4c96-8e98-be1bceff5295-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-t6fzj\" (UID: \"cd3dff7e-3d30-4c96-8e98-be1bceff5295\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" Apr 16 22:22:26.356951 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.356894 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/cd3dff7e-3d30-4c96-8e98-be1bceff5295-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-t6fzj\" (UID: \"cd3dff7e-3d30-4c96-8e98-be1bceff5295\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" Apr 16 22:22:26.356951 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.356900 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/cd3dff7e-3d30-4c96-8e98-be1bceff5295-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-t6fzj\" (UID: \"cd3dff7e-3d30-4c96-8e98-be1bceff5295\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" Apr 16 22:22:26.356951 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.356946 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/cd3dff7e-3d30-4c96-8e98-be1bceff5295-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-t6fzj\" (UID: \"cd3dff7e-3d30-4c96-8e98-be1bceff5295\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" Apr 16 22:22:26.357080 ip-10-0-133-72 
kubenswrapper[2574]: I0416 22:22:26.357059 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/cd3dff7e-3d30-4c96-8e98-be1bceff5295-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-t6fzj\" (UID: \"cd3dff7e-3d30-4c96-8e98-be1bceff5295\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" Apr 16 22:22:26.358617 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.358589 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/cd3dff7e-3d30-4c96-8e98-be1bceff5295-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-t6fzj\" (UID: \"cd3dff7e-3d30-4c96-8e98-be1bceff5295\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" Apr 16 22:22:26.358735 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.358719 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cd3dff7e-3d30-4c96-8e98-be1bceff5295-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-t6fzj\" (UID: \"cd3dff7e-3d30-4c96-8e98-be1bceff5295\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" Apr 16 22:22:26.363323 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.363300 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/cd3dff7e-3d30-4c96-8e98-be1bceff5295-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-t6fzj\" (UID: \"cd3dff7e-3d30-4c96-8e98-be1bceff5295\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" Apr 16 22:22:26.364003 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.363981 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdhhw\" 
(UniqueName: \"kubernetes.io/projected/cd3dff7e-3d30-4c96-8e98-be1bceff5295-kube-api-access-bdhhw\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-t6fzj\" (UID: \"cd3dff7e-3d30-4c96-8e98-be1bceff5295\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" Apr 16 22:22:26.536889 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.536841 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" Apr 16 22:22:26.655266 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.655229 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj"] Apr 16 22:22:26.658317 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:22:26.658290 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd3dff7e_3d30_4c96_8e98_be1bceff5295.slice/crio-8f748b06ee2562f5747f77c87a57ededc80054ba6d63b7cb99eaddfeecca2b9a WatchSource:0}: Error finding container 8f748b06ee2562f5747f77c87a57ededc80054ba6d63b7cb99eaddfeecca2b9a: Status 404 returned error can't find the container with id 8f748b06ee2562f5747f77c87a57ededc80054ba6d63b7cb99eaddfeecca2b9a Apr 16 22:22:26.899591 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:26.899504 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" event={"ID":"cd3dff7e-3d30-4c96-8e98-be1bceff5295","Type":"ContainerStarted","Data":"8f748b06ee2562f5747f77c87a57ededc80054ba6d63b7cb99eaddfeecca2b9a"} Apr 16 22:22:30.044020 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:30.043977 2574 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 22:22:30.044264 ip-10-0-133-72 
kubenswrapper[2574]: I0416 22:22:30.044060 2574 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 22:22:30.044264 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:30.044088 2574 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 22:22:30.911897 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:30.911861 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" event={"ID":"cd3dff7e-3d30-4c96-8e98-be1bceff5295","Type":"ContainerStarted","Data":"5e9f6967bd33be1c1c6148f4edda8fccad6ad07c976a5ae8d861c3e6d3882788"} Apr 16 22:22:30.932055 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:30.932007 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" podStartSLOduration=1.548508829 podStartE2EDuration="4.93198897s" podCreationTimestamp="2026-04-16 22:22:26 +0000 UTC" firstStartedPulling="2026-04-16 22:22:26.660245415 +0000 UTC m=+537.028651562" lastFinishedPulling="2026-04-16 22:22:30.043725564 +0000 UTC m=+540.412131703" observedRunningTime="2026-04-16 22:22:30.930594455 +0000 UTC m=+541.299000600" watchObservedRunningTime="2026-04-16 22:22:30.93198897 +0000 UTC m=+541.300395122" Apr 16 22:22:31.537721 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:31.537678 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" Apr 16 22:22:31.542391 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:31.542368 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" Apr 16 22:22:31.914844 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:31.914767 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" Apr 16 22:22:31.915712 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:31.915689 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-t6fzj" Apr 16 22:22:41.893394 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:41.893347 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7dfd6686cc-mrxlk"] Apr 16 22:22:41.899925 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:41.899902 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7dfd6686cc-mrxlk" Apr 16 22:22:41.903332 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:41.903304 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 22:22:41.903587 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:41.903566 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 22:22:41.903636 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:41.903574 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 22:22:41.903878 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:41.903863 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-kf4z8\"" Apr 16 22:22:41.903946 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:41.903913 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 
22:22:41.904575 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:41.904555 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 22:22:41.905319 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:41.905298 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 22:22:41.905724 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:41.905703 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 22:22:41.909444 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:41.909425 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 22:22:41.911859 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:41.911840 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7dfd6686cc-mrxlk"] Apr 16 22:22:41.983460 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:41.983420 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202-console-serving-cert\") pod \"console-7dfd6686cc-mrxlk\" (UID: \"aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202\") " pod="openshift-console/console-7dfd6686cc-mrxlk" Apr 16 22:22:41.983642 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:41.983466 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202-oauth-serving-cert\") pod \"console-7dfd6686cc-mrxlk\" (UID: \"aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202\") " pod="openshift-console/console-7dfd6686cc-mrxlk" Apr 16 22:22:41.983642 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:41.983538 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202-trusted-ca-bundle\") pod \"console-7dfd6686cc-mrxlk\" (UID: \"aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202\") " pod="openshift-console/console-7dfd6686cc-mrxlk" Apr 16 22:22:41.983642 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:41.983595 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202-console-config\") pod \"console-7dfd6686cc-mrxlk\" (UID: \"aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202\") " pod="openshift-console/console-7dfd6686cc-mrxlk" Apr 16 22:22:41.983826 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:41.983656 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw47w\" (UniqueName: \"kubernetes.io/projected/aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202-kube-api-access-qw47w\") pod \"console-7dfd6686cc-mrxlk\" (UID: \"aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202\") " pod="openshift-console/console-7dfd6686cc-mrxlk" Apr 16 22:22:41.983826 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:41.983772 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202-service-ca\") pod \"console-7dfd6686cc-mrxlk\" (UID: \"aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202\") " pod="openshift-console/console-7dfd6686cc-mrxlk" Apr 16 22:22:41.983924 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:41.983826 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202-console-oauth-config\") pod \"console-7dfd6686cc-mrxlk\" (UID: \"aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202\") " 
pod="openshift-console/console-7dfd6686cc-mrxlk" Apr 16 22:22:42.084875 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:42.084833 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202-trusted-ca-bundle\") pod \"console-7dfd6686cc-mrxlk\" (UID: \"aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202\") " pod="openshift-console/console-7dfd6686cc-mrxlk" Apr 16 22:22:42.085097 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:42.084904 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202-console-config\") pod \"console-7dfd6686cc-mrxlk\" (UID: \"aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202\") " pod="openshift-console/console-7dfd6686cc-mrxlk" Apr 16 22:22:42.085097 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:42.084936 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qw47w\" (UniqueName: \"kubernetes.io/projected/aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202-kube-api-access-qw47w\") pod \"console-7dfd6686cc-mrxlk\" (UID: \"aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202\") " pod="openshift-console/console-7dfd6686cc-mrxlk" Apr 16 22:22:42.085097 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:42.084984 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202-service-ca\") pod \"console-7dfd6686cc-mrxlk\" (UID: \"aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202\") " pod="openshift-console/console-7dfd6686cc-mrxlk" Apr 16 22:22:42.085097 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:42.085042 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202-console-oauth-config\") pod 
\"console-7dfd6686cc-mrxlk\" (UID: \"aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202\") " pod="openshift-console/console-7dfd6686cc-mrxlk" Apr 16 22:22:42.085097 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:42.085068 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202-console-serving-cert\") pod \"console-7dfd6686cc-mrxlk\" (UID: \"aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202\") " pod="openshift-console/console-7dfd6686cc-mrxlk" Apr 16 22:22:42.085097 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:42.085086 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202-oauth-serving-cert\") pod \"console-7dfd6686cc-mrxlk\" (UID: \"aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202\") " pod="openshift-console/console-7dfd6686cc-mrxlk" Apr 16 22:22:42.085820 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:42.085789 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202-console-config\") pod \"console-7dfd6686cc-mrxlk\" (UID: \"aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202\") " pod="openshift-console/console-7dfd6686cc-mrxlk" Apr 16 22:22:42.085820 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:42.085813 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202-service-ca\") pod \"console-7dfd6686cc-mrxlk\" (UID: \"aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202\") " pod="openshift-console/console-7dfd6686cc-mrxlk" Apr 16 22:22:42.085966 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:42.085791 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202-oauth-serving-cert\") pod \"console-7dfd6686cc-mrxlk\" (UID: \"aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202\") " pod="openshift-console/console-7dfd6686cc-mrxlk" Apr 16 22:22:42.085966 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:42.085827 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202-trusted-ca-bundle\") pod \"console-7dfd6686cc-mrxlk\" (UID: \"aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202\") " pod="openshift-console/console-7dfd6686cc-mrxlk" Apr 16 22:22:42.087439 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:42.087415 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202-console-oauth-config\") pod \"console-7dfd6686cc-mrxlk\" (UID: \"aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202\") " pod="openshift-console/console-7dfd6686cc-mrxlk" Apr 16 22:22:42.087645 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:42.087623 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202-console-serving-cert\") pod \"console-7dfd6686cc-mrxlk\" (UID: \"aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202\") " pod="openshift-console/console-7dfd6686cc-mrxlk" Apr 16 22:22:42.092902 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:42.092881 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw47w\" (UniqueName: \"kubernetes.io/projected/aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202-kube-api-access-qw47w\") pod \"console-7dfd6686cc-mrxlk\" (UID: \"aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202\") " pod="openshift-console/console-7dfd6686cc-mrxlk" Apr 16 22:22:42.208883 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:42.208806 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7dfd6686cc-mrxlk" Apr 16 22:22:42.338529 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:42.338471 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7dfd6686cc-mrxlk"] Apr 16 22:22:42.341239 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:22:42.341210 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaadb8510_0cd8_45e8_b0a4_9f6bf4c3a202.slice/crio-e170e59ceade083c9d26e7cb9de61f973f07bb3421922d8314d537b662f919eb WatchSource:0}: Error finding container e170e59ceade083c9d26e7cb9de61f973f07bb3421922d8314d537b662f919eb: Status 404 returned error can't find the container with id e170e59ceade083c9d26e7cb9de61f973f07bb3421922d8314d537b662f919eb Apr 16 22:22:42.946281 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:42.946247 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7dfd6686cc-mrxlk" event={"ID":"aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202","Type":"ContainerStarted","Data":"b849b51a9a4a260e93db97da62a29123074b4d9bde903227ed529e061948470e"} Apr 16 22:22:42.946281 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:42.946280 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7dfd6686cc-mrxlk" event={"ID":"aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202","Type":"ContainerStarted","Data":"e170e59ceade083c9d26e7cb9de61f973f07bb3421922d8314d537b662f919eb"} Apr 16 22:22:42.967983 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:42.967935 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7dfd6686cc-mrxlk" podStartSLOduration=1.967921478 podStartE2EDuration="1.967921478s" podCreationTimestamp="2026-04-16 22:22:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:22:42.966291619 +0000 UTC m=+553.334697811" 
watchObservedRunningTime="2026-04-16 22:22:42.967921478 +0000 UTC m=+553.336327626" Apr 16 22:22:50.795234 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:50.795198 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-59knw"] Apr 16 22:22:50.798344 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:50.798327 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-59knw" Apr 16 22:22:50.800878 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:50.800851 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 22:22:50.801019 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:50.800891 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-w2qlr\"" Apr 16 22:22:50.801019 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:50.800913 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 22:22:50.801019 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:50.800970 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 16 22:22:50.809582 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:50.809562 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-59knw"] Apr 16 22:22:50.864916 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:50.864879 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28d55\" (UniqueName: \"kubernetes.io/projected/11810e52-ac90-4e12-ad4b-5be0969566a6-kube-api-access-28d55\") pod \"dns-operator-controller-manager-844548ff4c-59knw\" (UID: 
\"11810e52-ac90-4e12-ad4b-5be0969566a6\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-59knw" Apr 16 22:22:50.965624 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:50.965589 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-28d55\" (UniqueName: \"kubernetes.io/projected/11810e52-ac90-4e12-ad4b-5be0969566a6-kube-api-access-28d55\") pod \"dns-operator-controller-manager-844548ff4c-59knw\" (UID: \"11810e52-ac90-4e12-ad4b-5be0969566a6\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-59knw" Apr 16 22:22:50.974899 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:50.974871 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-28d55\" (UniqueName: \"kubernetes.io/projected/11810e52-ac90-4e12-ad4b-5be0969566a6-kube-api-access-28d55\") pod \"dns-operator-controller-manager-844548ff4c-59knw\" (UID: \"11810e52-ac90-4e12-ad4b-5be0969566a6\") " pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-59knw" Apr 16 22:22:51.111286 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:51.111183 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-59knw" Apr 16 22:22:51.234541 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:51.234518 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-844548ff4c-59knw"] Apr 16 22:22:51.237304 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:22:51.237275 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11810e52_ac90_4e12_ad4b_5be0969566a6.slice/crio-54aabf914382ef6f8b55f6dab534124461fffe1c00f67b128cf16dedcd17a013 WatchSource:0}: Error finding container 54aabf914382ef6f8b55f6dab534124461fffe1c00f67b128cf16dedcd17a013: Status 404 returned error can't find the container with id 54aabf914382ef6f8b55f6dab534124461fffe1c00f67b128cf16dedcd17a013 Apr 16 22:22:51.973937 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:51.973895 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-59knw" event={"ID":"11810e52-ac90-4e12-ad4b-5be0969566a6","Type":"ContainerStarted","Data":"54aabf914382ef6f8b55f6dab534124461fffe1c00f67b128cf16dedcd17a013"} Apr 16 22:22:52.209336 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:52.209079 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7dfd6686cc-mrxlk" Apr 16 22:22:52.209336 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:52.209129 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7dfd6686cc-mrxlk" Apr 16 22:22:52.215646 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:52.215609 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7dfd6686cc-mrxlk" Apr 16 22:22:52.981475 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:52.981444 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console/console-7dfd6686cc-mrxlk" Apr 16 22:22:57.993510 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:57.993472 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-59knw" event={"ID":"11810e52-ac90-4e12-ad4b-5be0969566a6","Type":"ContainerStarted","Data":"4e2412d52cf71d164d3c5dbff4b28c554ab3843bc0b86485fe983d863e3d069d"} Apr 16 22:22:57.993919 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:57.993675 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-59knw" Apr 16 22:22:58.012574 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:22:58.012531 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-59knw" podStartSLOduration=2.028969555 podStartE2EDuration="8.012517165s" podCreationTimestamp="2026-04-16 22:22:50 +0000 UTC" firstStartedPulling="2026-04-16 22:22:51.239293399 +0000 UTC m=+561.607699531" lastFinishedPulling="2026-04-16 22:22:57.222840999 +0000 UTC m=+567.591247141" observedRunningTime="2026-04-16 22:22:58.010566676 +0000 UTC m=+568.378972830" watchObservedRunningTime="2026-04-16 22:22:58.012517165 +0000 UTC m=+568.380923315" Apr 16 22:23:08.999339 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:08.999307 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-844548ff4c-59knw" Apr 16 22:23:30.163006 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:30.162965 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v4dxt_dba969ac-c53c-4354-ab6e-cc853b8c1449/console-operator/1.log" Apr 16 22:23:30.163452 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:30.163413 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v4dxt_dba969ac-c53c-4354-ab6e-cc853b8c1449/console-operator/1.log" Apr 16 22:23:30.168827 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:30.168806 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hm4t5_cdab8cce-9b55-478d-b1b5-740aa9746143/ovn-acl-logging/0.log" Apr 16 22:23:30.169190 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:30.169175 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hm4t5_cdab8cce-9b55-478d-b1b5-740aa9746143/ovn-acl-logging/0.log" Apr 16 22:23:39.482875 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:39.482840 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-vdjbn"] Apr 16 22:23:39.486132 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:39.486116 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-vdjbn" Apr 16 22:23:39.488573 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:39.488554 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-qpntl\"" Apr 16 22:23:39.488663 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:39.488567 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 16 22:23:39.493860 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:39.493839 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-vdjbn"] Apr 16 22:23:39.576185 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:39.576148 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-vdjbn"] Apr 16 22:23:39.585330 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:39.585302 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jhpp\" (UniqueName: \"kubernetes.io/projected/ced3cf34-e607-4b55-b81f-5d88e204c279-kube-api-access-4jhpp\") pod \"limitador-limitador-64c8f475fb-vdjbn\" (UID: \"ced3cf34-e607-4b55-b81f-5d88e204c279\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-vdjbn" Apr 16 22:23:39.585475 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:39.585336 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ced3cf34-e607-4b55-b81f-5d88e204c279-config-file\") pod \"limitador-limitador-64c8f475fb-vdjbn\" (UID: \"ced3cf34-e607-4b55-b81f-5d88e204c279\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-vdjbn" Apr 16 22:23:39.686155 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:39.686117 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4jhpp\" (UniqueName: \"kubernetes.io/projected/ced3cf34-e607-4b55-b81f-5d88e204c279-kube-api-access-4jhpp\") pod \"limitador-limitador-64c8f475fb-vdjbn\" (UID: \"ced3cf34-e607-4b55-b81f-5d88e204c279\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-vdjbn" Apr 16 22:23:39.686155 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:39.686157 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ced3cf34-e607-4b55-b81f-5d88e204c279-config-file\") pod \"limitador-limitador-64c8f475fb-vdjbn\" (UID: \"ced3cf34-e607-4b55-b81f-5d88e204c279\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-vdjbn" Apr 16 22:23:39.686792 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:39.686772 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ced3cf34-e607-4b55-b81f-5d88e204c279-config-file\") pod \"limitador-limitador-64c8f475fb-vdjbn\" (UID: 
\"ced3cf34-e607-4b55-b81f-5d88e204c279\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-vdjbn" Apr 16 22:23:39.694246 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:39.694214 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jhpp\" (UniqueName: \"kubernetes.io/projected/ced3cf34-e607-4b55-b81f-5d88e204c279-kube-api-access-4jhpp\") pod \"limitador-limitador-64c8f475fb-vdjbn\" (UID: \"ced3cf34-e607-4b55-b81f-5d88e204c279\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-vdjbn" Apr 16 22:23:39.796557 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:39.796520 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-vdjbn" Apr 16 22:23:39.924976 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:39.924823 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-vdjbn"] Apr 16 22:23:39.927595 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:23:39.927567 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podced3cf34_e607_4b55_b81f_5d88e204c279.slice/crio-3c68ae76299fc636605d8f7771aeaeb01c209ab4c6a7868fbc4c3d9182bce8ce WatchSource:0}: Error finding container 3c68ae76299fc636605d8f7771aeaeb01c209ab4c6a7868fbc4c3d9182bce8ce: Status 404 returned error can't find the container with id 3c68ae76299fc636605d8f7771aeaeb01c209ab4c6a7868fbc4c3d9182bce8ce Apr 16 22:23:40.123974 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:40.123889 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-vdjbn" event={"ID":"ced3cf34-e607-4b55-b81f-5d88e204c279","Type":"ContainerStarted","Data":"3c68ae76299fc636605d8f7771aeaeb01c209ab4c6a7868fbc4c3d9182bce8ce"} Apr 16 22:23:40.519758 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:40.519719 2574 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kuadrant-system/authorino-79cbc94b89-8d4j8"] Apr 16 22:23:40.524257 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:40.524239 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-8d4j8" Apr 16 22:23:40.526273 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:40.526251 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-qn88w\"" Apr 16 22:23:40.529293 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:40.529270 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-8d4j8"] Apr 16 22:23:40.593512 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:40.593476 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2hwc\" (UniqueName: \"kubernetes.io/projected/8e8f697f-2c0f-41f5-8adb-f129507410ef-kube-api-access-f2hwc\") pod \"authorino-79cbc94b89-8d4j8\" (UID: \"8e8f697f-2c0f-41f5-8adb-f129507410ef\") " pod="kuadrant-system/authorino-79cbc94b89-8d4j8" Apr 16 22:23:40.694232 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:40.694193 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f2hwc\" (UniqueName: \"kubernetes.io/projected/8e8f697f-2c0f-41f5-8adb-f129507410ef-kube-api-access-f2hwc\") pod \"authorino-79cbc94b89-8d4j8\" (UID: \"8e8f697f-2c0f-41f5-8adb-f129507410ef\") " pod="kuadrant-system/authorino-79cbc94b89-8d4j8" Apr 16 22:23:40.701928 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:40.701904 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2hwc\" (UniqueName: \"kubernetes.io/projected/8e8f697f-2c0f-41f5-8adb-f129507410ef-kube-api-access-f2hwc\") pod \"authorino-79cbc94b89-8d4j8\" (UID: \"8e8f697f-2c0f-41f5-8adb-f129507410ef\") " pod="kuadrant-system/authorino-79cbc94b89-8d4j8" Apr 16 22:23:40.834861 ip-10-0-133-72 
kubenswrapper[2574]: I0416 22:23:40.834767 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-8d4j8" Apr 16 22:23:40.987389 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:40.987357 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-8d4j8"] Apr 16 22:23:40.989922 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:23:40.989893 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e8f697f_2c0f_41f5_8adb_f129507410ef.slice/crio-2e8f3d77c17977c14a914a4b67bc7d05e8e750d599887e82d8f6fd650835748f WatchSource:0}: Error finding container 2e8f3d77c17977c14a914a4b67bc7d05e8e750d599887e82d8f6fd650835748f: Status 404 returned error can't find the container with id 2e8f3d77c17977c14a914a4b67bc7d05e8e750d599887e82d8f6fd650835748f Apr 16 22:23:41.128719 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:41.128636 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-8d4j8" event={"ID":"8e8f697f-2c0f-41f5-8adb-f129507410ef","Type":"ContainerStarted","Data":"2e8f3d77c17977c14a914a4b67bc7d05e8e750d599887e82d8f6fd650835748f"} Apr 16 22:23:46.151391 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:46.151343 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-8d4j8" event={"ID":"8e8f697f-2c0f-41f5-8adb-f129507410ef","Type":"ContainerStarted","Data":"f6b653f21cbfa5d155e1932d8e03b1ac5707c5a44e4ac2f8ca05e818ad5145bb"} Apr 16 22:23:46.152627 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:46.152603 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-vdjbn" event={"ID":"ced3cf34-e607-4b55-b81f-5d88e204c279","Type":"ContainerStarted","Data":"4ed91a40d674d03a3486347937490126f7496d5cc5a4f976b28a75020f883172"} Apr 16 22:23:46.152773 ip-10-0-133-72 kubenswrapper[2574]: I0416 
22:23:46.152757 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-vdjbn" Apr 16 22:23:46.166060 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:46.166003 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79cbc94b89-8d4j8" podStartSLOduration=2.075007416 podStartE2EDuration="6.165990894s" podCreationTimestamp="2026-04-16 22:23:40 +0000 UTC" firstStartedPulling="2026-04-16 22:23:40.991457295 +0000 UTC m=+611.359863440" lastFinishedPulling="2026-04-16 22:23:45.082440788 +0000 UTC m=+615.450846918" observedRunningTime="2026-04-16 22:23:46.164101917 +0000 UTC m=+616.532508093" watchObservedRunningTime="2026-04-16 22:23:46.165990894 +0000 UTC m=+616.534397044" Apr 16 22:23:46.180206 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:46.180162 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-vdjbn" podStartSLOduration=2.075631331 podStartE2EDuration="7.180149716s" podCreationTimestamp="2026-04-16 22:23:39 +0000 UTC" firstStartedPulling="2026-04-16 22:23:39.929485563 +0000 UTC m=+610.297891698" lastFinishedPulling="2026-04-16 22:23:45.034003943 +0000 UTC m=+615.402410083" observedRunningTime="2026-04-16 22:23:46.178763446 +0000 UTC m=+616.547169594" watchObservedRunningTime="2026-04-16 22:23:46.180149716 +0000 UTC m=+616.548555866" Apr 16 22:23:52.501194 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:52.501109 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-vdjbn"] Apr 16 22:23:52.501577 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:52.501344 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-64c8f475fb-vdjbn" podUID="ced3cf34-e607-4b55-b81f-5d88e204c279" containerName="limitador" 
containerID="cri-o://4ed91a40d674d03a3486347937490126f7496d5cc5a4f976b28a75020f883172" gracePeriod=30 Apr 16 22:23:52.503642 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:52.503616 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-vdjbn" Apr 16 22:23:53.045721 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:53.045696 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-vdjbn" Apr 16 22:23:53.177227 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:53.177134 2574 generic.go:358] "Generic (PLEG): container finished" podID="ced3cf34-e607-4b55-b81f-5d88e204c279" containerID="4ed91a40d674d03a3486347937490126f7496d5cc5a4f976b28a75020f883172" exitCode=0 Apr 16 22:23:53.177227 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:53.177192 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-vdjbn" Apr 16 22:23:53.177227 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:53.177219 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-vdjbn" event={"ID":"ced3cf34-e607-4b55-b81f-5d88e204c279","Type":"ContainerDied","Data":"4ed91a40d674d03a3486347937490126f7496d5cc5a4f976b28a75020f883172"} Apr 16 22:23:53.177466 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:53.177251 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-vdjbn" event={"ID":"ced3cf34-e607-4b55-b81f-5d88e204c279","Type":"ContainerDied","Data":"3c68ae76299fc636605d8f7771aeaeb01c209ab4c6a7868fbc4c3d9182bce8ce"} Apr 16 22:23:53.177466 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:53.177265 2574 scope.go:117] "RemoveContainer" containerID="4ed91a40d674d03a3486347937490126f7496d5cc5a4f976b28a75020f883172" Apr 16 22:23:53.184874 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:53.184857 
2574 scope.go:117] "RemoveContainer" containerID="4ed91a40d674d03a3486347937490126f7496d5cc5a4f976b28a75020f883172" Apr 16 22:23:53.185142 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:23:53.185125 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ed91a40d674d03a3486347937490126f7496d5cc5a4f976b28a75020f883172\": container with ID starting with 4ed91a40d674d03a3486347937490126f7496d5cc5a4f976b28a75020f883172 not found: ID does not exist" containerID="4ed91a40d674d03a3486347937490126f7496d5cc5a4f976b28a75020f883172" Apr 16 22:23:53.185200 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:53.185152 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ed91a40d674d03a3486347937490126f7496d5cc5a4f976b28a75020f883172"} err="failed to get container status \"4ed91a40d674d03a3486347937490126f7496d5cc5a4f976b28a75020f883172\": rpc error: code = NotFound desc = could not find container \"4ed91a40d674d03a3486347937490126f7496d5cc5a4f976b28a75020f883172\": container with ID starting with 4ed91a40d674d03a3486347937490126f7496d5cc5a4f976b28a75020f883172 not found: ID does not exist" Apr 16 22:23:53.204471 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:53.204448 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jhpp\" (UniqueName: \"kubernetes.io/projected/ced3cf34-e607-4b55-b81f-5d88e204c279-kube-api-access-4jhpp\") pod \"ced3cf34-e607-4b55-b81f-5d88e204c279\" (UID: \"ced3cf34-e607-4b55-b81f-5d88e204c279\") " Apr 16 22:23:53.204547 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:53.204486 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ced3cf34-e607-4b55-b81f-5d88e204c279-config-file\") pod \"ced3cf34-e607-4b55-b81f-5d88e204c279\" (UID: \"ced3cf34-e607-4b55-b81f-5d88e204c279\") " Apr 16 22:23:53.204857 ip-10-0-133-72 
kubenswrapper[2574]: I0416 22:23:53.204836 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ced3cf34-e607-4b55-b81f-5d88e204c279-config-file" (OuterVolumeSpecName: "config-file") pod "ced3cf34-e607-4b55-b81f-5d88e204c279" (UID: "ced3cf34-e607-4b55-b81f-5d88e204c279"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:23:53.206588 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:53.206570 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ced3cf34-e607-4b55-b81f-5d88e204c279-kube-api-access-4jhpp" (OuterVolumeSpecName: "kube-api-access-4jhpp") pod "ced3cf34-e607-4b55-b81f-5d88e204c279" (UID: "ced3cf34-e607-4b55-b81f-5d88e204c279"). InnerVolumeSpecName "kube-api-access-4jhpp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:23:53.306036 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:53.305998 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4jhpp\" (UniqueName: \"kubernetes.io/projected/ced3cf34-e607-4b55-b81f-5d88e204c279-kube-api-access-4jhpp\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:23:53.306036 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:53.306031 2574 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ced3cf34-e607-4b55-b81f-5d88e204c279-config-file\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:23:53.497694 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:53.497664 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-vdjbn"] Apr 16 22:23:53.500511 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:53.500486 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-vdjbn"] Apr 16 22:23:54.265796 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:23:54.265764 2574 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ced3cf34-e607-4b55-b81f-5d88e204c279" path="/var/lib/kubelet/pods/ced3cf34-e607-4b55-b81f-5d88e204c279/volumes" Apr 16 22:24:01.712171 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:01.712130 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-68bd676465-rmwzh"] Apr 16 22:24:01.712644 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:01.712607 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ced3cf34-e607-4b55-b81f-5d88e204c279" containerName="limitador" Apr 16 22:24:01.712644 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:01.712627 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="ced3cf34-e607-4b55-b81f-5d88e204c279" containerName="limitador" Apr 16 22:24:01.712781 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:01.712707 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="ced3cf34-e607-4b55-b81f-5d88e204c279" containerName="limitador" Apr 16 22:24:01.717183 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:01.717160 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68bd676465-rmwzh" Apr 16 22:24:01.719373 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:01.719350 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 16 22:24:01.722774 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:01.722729 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-rmwzh"] Apr 16 22:24:01.878075 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:01.878035 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tvgq\" (UniqueName: \"kubernetes.io/projected/ef6ec432-e618-4ffe-b27f-93fdba577807-kube-api-access-7tvgq\") pod \"authorino-68bd676465-rmwzh\" (UID: \"ef6ec432-e618-4ffe-b27f-93fdba577807\") " pod="kuadrant-system/authorino-68bd676465-rmwzh" Apr 16 22:24:01.878263 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:01.878120 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/ef6ec432-e618-4ffe-b27f-93fdba577807-tls-cert\") pod \"authorino-68bd676465-rmwzh\" (UID: \"ef6ec432-e618-4ffe-b27f-93fdba577807\") " pod="kuadrant-system/authorino-68bd676465-rmwzh" Apr 16 22:24:01.979167 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:01.979069 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/ef6ec432-e618-4ffe-b27f-93fdba577807-tls-cert\") pod \"authorino-68bd676465-rmwzh\" (UID: \"ef6ec432-e618-4ffe-b27f-93fdba577807\") " pod="kuadrant-system/authorino-68bd676465-rmwzh" Apr 16 22:24:01.979167 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:01.979158 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7tvgq\" (UniqueName: 
\"kubernetes.io/projected/ef6ec432-e618-4ffe-b27f-93fdba577807-kube-api-access-7tvgq\") pod \"authorino-68bd676465-rmwzh\" (UID: \"ef6ec432-e618-4ffe-b27f-93fdba577807\") " pod="kuadrant-system/authorino-68bd676465-rmwzh" Apr 16 22:24:01.981543 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:01.981515 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/ef6ec432-e618-4ffe-b27f-93fdba577807-tls-cert\") pod \"authorino-68bd676465-rmwzh\" (UID: \"ef6ec432-e618-4ffe-b27f-93fdba577807\") " pod="kuadrant-system/authorino-68bd676465-rmwzh" Apr 16 22:24:01.986920 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:01.986896 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tvgq\" (UniqueName: \"kubernetes.io/projected/ef6ec432-e618-4ffe-b27f-93fdba577807-kube-api-access-7tvgq\") pod \"authorino-68bd676465-rmwzh\" (UID: \"ef6ec432-e618-4ffe-b27f-93fdba577807\") " pod="kuadrant-system/authorino-68bd676465-rmwzh" Apr 16 22:24:02.026453 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:02.026422 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68bd676465-rmwzh" Apr 16 22:24:02.145162 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:02.144934 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-rmwzh"] Apr 16 22:24:02.147501 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:24:02.147467 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef6ec432_e618_4ffe_b27f_93fdba577807.slice/crio-b2e996778eba79526e7729ed5495cf596675732b58b88c63260911956baaecde WatchSource:0}: Error finding container b2e996778eba79526e7729ed5495cf596675732b58b88c63260911956baaecde: Status 404 returned error can't find the container with id b2e996778eba79526e7729ed5495cf596675732b58b88c63260911956baaecde Apr 16 22:24:02.210887 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:02.210850 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-rmwzh" event={"ID":"ef6ec432-e618-4ffe-b27f-93fdba577807","Type":"ContainerStarted","Data":"b2e996778eba79526e7729ed5495cf596675732b58b88c63260911956baaecde"} Apr 16 22:24:03.215936 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:03.215897 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-rmwzh" event={"ID":"ef6ec432-e618-4ffe-b27f-93fdba577807","Type":"ContainerStarted","Data":"3ba12d5bdc10456ea88c181520a8624efa0f2f849f379c7ef9089ca60d396cbd"} Apr 16 22:24:03.232794 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:03.232729 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-68bd676465-rmwzh" podStartSLOduration=1.891434118 podStartE2EDuration="2.232716777s" podCreationTimestamp="2026-04-16 22:24:01 +0000 UTC" firstStartedPulling="2026-04-16 22:24:02.148835524 +0000 UTC m=+632.517241660" lastFinishedPulling="2026-04-16 22:24:02.490118186 +0000 UTC m=+632.858524319" 
observedRunningTime="2026-04-16 22:24:03.231217268 +0000 UTC m=+633.599623418" watchObservedRunningTime="2026-04-16 22:24:03.232716777 +0000 UTC m=+633.601122930" Apr 16 22:24:03.261512 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:03.261471 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-8d4j8"] Apr 16 22:24:03.261727 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:03.261704 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-79cbc94b89-8d4j8" podUID="8e8f697f-2c0f-41f5-8adb-f129507410ef" containerName="authorino" containerID="cri-o://f6b653f21cbfa5d155e1932d8e03b1ac5707c5a44e4ac2f8ca05e818ad5145bb" gracePeriod=30 Apr 16 22:24:03.499419 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:03.499396 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-8d4j8" Apr 16 22:24:03.692271 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:03.692236 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2hwc\" (UniqueName: \"kubernetes.io/projected/8e8f697f-2c0f-41f5-8adb-f129507410ef-kube-api-access-f2hwc\") pod \"8e8f697f-2c0f-41f5-8adb-f129507410ef\" (UID: \"8e8f697f-2c0f-41f5-8adb-f129507410ef\") " Apr 16 22:24:03.694323 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:03.694291 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e8f697f-2c0f-41f5-8adb-f129507410ef-kube-api-access-f2hwc" (OuterVolumeSpecName: "kube-api-access-f2hwc") pod "8e8f697f-2c0f-41f5-8adb-f129507410ef" (UID: "8e8f697f-2c0f-41f5-8adb-f129507410ef"). InnerVolumeSpecName "kube-api-access-f2hwc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:24:03.792877 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:03.792843 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f2hwc\" (UniqueName: \"kubernetes.io/projected/8e8f697f-2c0f-41f5-8adb-f129507410ef-kube-api-access-f2hwc\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:24:04.219870 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:04.219770 2574 generic.go:358] "Generic (PLEG): container finished" podID="8e8f697f-2c0f-41f5-8adb-f129507410ef" containerID="f6b653f21cbfa5d155e1932d8e03b1ac5707c5a44e4ac2f8ca05e818ad5145bb" exitCode=0 Apr 16 22:24:04.219870 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:04.219819 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-8d4j8" Apr 16 22:24:04.220344 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:04.219818 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-8d4j8" event={"ID":"8e8f697f-2c0f-41f5-8adb-f129507410ef","Type":"ContainerDied","Data":"f6b653f21cbfa5d155e1932d8e03b1ac5707c5a44e4ac2f8ca05e818ad5145bb"} Apr 16 22:24:04.220344 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:04.219923 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-8d4j8" event={"ID":"8e8f697f-2c0f-41f5-8adb-f129507410ef","Type":"ContainerDied","Data":"2e8f3d77c17977c14a914a4b67bc7d05e8e750d599887e82d8f6fd650835748f"} Apr 16 22:24:04.220344 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:04.219948 2574 scope.go:117] "RemoveContainer" containerID="f6b653f21cbfa5d155e1932d8e03b1ac5707c5a44e4ac2f8ca05e818ad5145bb" Apr 16 22:24:04.227782 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:04.227733 2574 scope.go:117] "RemoveContainer" containerID="f6b653f21cbfa5d155e1932d8e03b1ac5707c5a44e4ac2f8ca05e818ad5145bb" Apr 16 22:24:04.228007 ip-10-0-133-72 kubenswrapper[2574]: E0416 
22:24:04.227990 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6b653f21cbfa5d155e1932d8e03b1ac5707c5a44e4ac2f8ca05e818ad5145bb\": container with ID starting with f6b653f21cbfa5d155e1932d8e03b1ac5707c5a44e4ac2f8ca05e818ad5145bb not found: ID does not exist" containerID="f6b653f21cbfa5d155e1932d8e03b1ac5707c5a44e4ac2f8ca05e818ad5145bb" Apr 16 22:24:04.228070 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:04.228016 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6b653f21cbfa5d155e1932d8e03b1ac5707c5a44e4ac2f8ca05e818ad5145bb"} err="failed to get container status \"f6b653f21cbfa5d155e1932d8e03b1ac5707c5a44e4ac2f8ca05e818ad5145bb\": rpc error: code = NotFound desc = could not find container \"f6b653f21cbfa5d155e1932d8e03b1ac5707c5a44e4ac2f8ca05e818ad5145bb\": container with ID starting with f6b653f21cbfa5d155e1932d8e03b1ac5707c5a44e4ac2f8ca05e818ad5145bb not found: ID does not exist" Apr 16 22:24:04.238975 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:04.238952 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-8d4j8"] Apr 16 22:24:04.242400 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:04.242377 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-8d4j8"] Apr 16 22:24:04.262285 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:04.262255 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e8f697f-2c0f-41f5-8adb-f129507410ef" path="/var/lib/kubelet/pods/8e8f697f-2c0f-41f5-8adb-f129507410ef/volumes" Apr 16 22:24:20.867073 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:20.867036 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-5pbdl"] Apr 16 22:24:20.867480 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:20.867357 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="8e8f697f-2c0f-41f5-8adb-f129507410ef" containerName="authorino" Apr 16 22:24:20.867480 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:20.867368 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e8f697f-2c0f-41f5-8adb-f129507410ef" containerName="authorino" Apr 16 22:24:20.867480 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:20.867426 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e8f697f-2c0f-41f5-8adb-f129507410ef" containerName="authorino" Apr 16 22:24:20.870146 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:20.870128 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84d7d5cfc6-5pbdl" Apr 16 22:24:20.872408 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:20.872391 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 22:24:20.873007 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:20.872984 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 16 22:24:20.873007 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:20.872992 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 22:24:20.873164 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:20.872993 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-djw8b\"" Apr 16 22:24:20.880788 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:20.880768 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-5pbdl"] Apr 16 22:24:20.884481 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:20.884461 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-5f8dc4bc7b-ndnb8"] Apr 16 22:24:20.887702 ip-10-0-133-72 kubenswrapper[2574]: I0416 
22:24:20.887683 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-5f8dc4bc7b-ndnb8" Apr 16 22:24:20.890346 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:20.890329 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-zbj7l\"" Apr 16 22:24:20.890491 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:20.890471 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 22:24:20.901079 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:20.901056 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-5f8dc4bc7b-ndnb8"] Apr 16 22:24:20.936380 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:20.936349 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd8bm\" (UniqueName: \"kubernetes.io/projected/e2b06a9b-2118-46da-9af6-4a99383d9443-kube-api-access-jd8bm\") pod \"llmisvc-controller-manager-5f8dc4bc7b-ndnb8\" (UID: \"e2b06a9b-2118-46da-9af6-4a99383d9443\") " pod="kserve/llmisvc-controller-manager-5f8dc4bc7b-ndnb8" Apr 16 22:24:20.936540 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:20.936393 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5b07bf6a-0c60-4165-bb57-9163532b5ef6-cert\") pod \"kserve-controller-manager-84d7d5cfc6-5pbdl\" (UID: \"5b07bf6a-0c60-4165-bb57-9163532b5ef6\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-5pbdl" Apr 16 22:24:20.936540 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:20.936439 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e2b06a9b-2118-46da-9af6-4a99383d9443-cert\") pod \"llmisvc-controller-manager-5f8dc4bc7b-ndnb8\" (UID: 
\"e2b06a9b-2118-46da-9af6-4a99383d9443\") " pod="kserve/llmisvc-controller-manager-5f8dc4bc7b-ndnb8" Apr 16 22:24:20.936540 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:20.936479 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mb9d\" (UniqueName: \"kubernetes.io/projected/5b07bf6a-0c60-4165-bb57-9163532b5ef6-kube-api-access-2mb9d\") pod \"kserve-controller-manager-84d7d5cfc6-5pbdl\" (UID: \"5b07bf6a-0c60-4165-bb57-9163532b5ef6\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-5pbdl" Apr 16 22:24:21.037853 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:21.037803 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mb9d\" (UniqueName: \"kubernetes.io/projected/5b07bf6a-0c60-4165-bb57-9163532b5ef6-kube-api-access-2mb9d\") pod \"kserve-controller-manager-84d7d5cfc6-5pbdl\" (UID: \"5b07bf6a-0c60-4165-bb57-9163532b5ef6\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-5pbdl" Apr 16 22:24:21.038027 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:21.037923 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jd8bm\" (UniqueName: \"kubernetes.io/projected/e2b06a9b-2118-46da-9af6-4a99383d9443-kube-api-access-jd8bm\") pod \"llmisvc-controller-manager-5f8dc4bc7b-ndnb8\" (UID: \"e2b06a9b-2118-46da-9af6-4a99383d9443\") " pod="kserve/llmisvc-controller-manager-5f8dc4bc7b-ndnb8" Apr 16 22:24:21.038027 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:21.037955 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5b07bf6a-0c60-4165-bb57-9163532b5ef6-cert\") pod \"kserve-controller-manager-84d7d5cfc6-5pbdl\" (UID: \"5b07bf6a-0c60-4165-bb57-9163532b5ef6\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-5pbdl" Apr 16 22:24:21.038109 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:24:21.038079 2574 secret.go:189] Couldn't get secret 
kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 16 22:24:21.038109 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:21.038097 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e2b06a9b-2118-46da-9af6-4a99383d9443-cert\") pod \"llmisvc-controller-manager-5f8dc4bc7b-ndnb8\" (UID: \"e2b06a9b-2118-46da-9af6-4a99383d9443\") " pod="kserve/llmisvc-controller-manager-5f8dc4bc7b-ndnb8" Apr 16 22:24:21.038228 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:24:21.038138 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b07bf6a-0c60-4165-bb57-9163532b5ef6-cert podName:5b07bf6a-0c60-4165-bb57-9163532b5ef6 nodeName:}" failed. No retries permitted until 2026-04-16 22:24:21.538117121 +0000 UTC m=+651.906523264 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5b07bf6a-0c60-4165-bb57-9163532b5ef6-cert") pod "kserve-controller-manager-84d7d5cfc6-5pbdl" (UID: "5b07bf6a-0c60-4165-bb57-9163532b5ef6") : secret "kserve-webhook-server-cert" not found Apr 16 22:24:21.040478 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:21.040455 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e2b06a9b-2118-46da-9af6-4a99383d9443-cert\") pod \"llmisvc-controller-manager-5f8dc4bc7b-ndnb8\" (UID: \"e2b06a9b-2118-46da-9af6-4a99383d9443\") " pod="kserve/llmisvc-controller-manager-5f8dc4bc7b-ndnb8" Apr 16 22:24:21.047278 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:21.047251 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd8bm\" (UniqueName: \"kubernetes.io/projected/e2b06a9b-2118-46da-9af6-4a99383d9443-kube-api-access-jd8bm\") pod \"llmisvc-controller-manager-5f8dc4bc7b-ndnb8\" (UID: \"e2b06a9b-2118-46da-9af6-4a99383d9443\") " pod="kserve/llmisvc-controller-manager-5f8dc4bc7b-ndnb8" Apr 16 
22:24:21.047602 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:21.047586 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mb9d\" (UniqueName: \"kubernetes.io/projected/5b07bf6a-0c60-4165-bb57-9163532b5ef6-kube-api-access-2mb9d\") pod \"kserve-controller-manager-84d7d5cfc6-5pbdl\" (UID: \"5b07bf6a-0c60-4165-bb57-9163532b5ef6\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-5pbdl" Apr 16 22:24:21.197831 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:21.197721 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-5f8dc4bc7b-ndnb8" Apr 16 22:24:21.336345 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:21.336320 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-5f8dc4bc7b-ndnb8"] Apr 16 22:24:21.338373 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:24:21.338346 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode2b06a9b_2118_46da_9af6_4a99383d9443.slice/crio-7f0f4b641b6131e01327c4a531c2a2ebd73af542a2a767d26010f1fbe00a9d03 WatchSource:0}: Error finding container 7f0f4b641b6131e01327c4a531c2a2ebd73af542a2a767d26010f1fbe00a9d03: Status 404 returned error can't find the container with id 7f0f4b641b6131e01327c4a531c2a2ebd73af542a2a767d26010f1fbe00a9d03 Apr 16 22:24:21.542930 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:21.542889 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5b07bf6a-0c60-4165-bb57-9163532b5ef6-cert\") pod \"kserve-controller-manager-84d7d5cfc6-5pbdl\" (UID: \"5b07bf6a-0c60-4165-bb57-9163532b5ef6\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-5pbdl" Apr 16 22:24:21.545323 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:21.545295 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/5b07bf6a-0c60-4165-bb57-9163532b5ef6-cert\") pod \"kserve-controller-manager-84d7d5cfc6-5pbdl\" (UID: \"5b07bf6a-0c60-4165-bb57-9163532b5ef6\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-5pbdl" Apr 16 22:24:21.781254 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:21.781218 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-84d7d5cfc6-5pbdl" Apr 16 22:24:21.899270 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:21.899227 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-5pbdl"] Apr 16 22:24:21.901261 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:24:21.901234 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b07bf6a_0c60_4165_bb57_9163532b5ef6.slice/crio-1bbf77c7fdd55138abac704ec0d5f4a77c7c45be8ca9b3cf29c9e184a5b68e39 WatchSource:0}: Error finding container 1bbf77c7fdd55138abac704ec0d5f4a77c7c45be8ca9b3cf29c9e184a5b68e39: Status 404 returned error can't find the container with id 1bbf77c7fdd55138abac704ec0d5f4a77c7c45be8ca9b3cf29c9e184a5b68e39 Apr 16 22:24:22.283721 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:22.283679 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5f8dc4bc7b-ndnb8" event={"ID":"e2b06a9b-2118-46da-9af6-4a99383d9443","Type":"ContainerStarted","Data":"7f0f4b641b6131e01327c4a531c2a2ebd73af542a2a767d26010f1fbe00a9d03"} Apr 16 22:24:22.284762 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:22.284723 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84d7d5cfc6-5pbdl" event={"ID":"5b07bf6a-0c60-4165-bb57-9163532b5ef6","Type":"ContainerStarted","Data":"1bbf77c7fdd55138abac704ec0d5f4a77c7c45be8ca9b3cf29c9e184a5b68e39"} Apr 16 22:24:25.296760 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:25.296699 2574 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve/kserve-controller-manager-84d7d5cfc6-5pbdl" event={"ID":"5b07bf6a-0c60-4165-bb57-9163532b5ef6","Type":"ContainerStarted","Data":"28777f0e64db9992789a44f8855c94e52ec55a979acd6356f5e391ede312fa38"} Apr 16 22:24:25.297132 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:25.296864 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-84d7d5cfc6-5pbdl" Apr 16 22:24:25.312703 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:25.312649 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-84d7d5cfc6-5pbdl" podStartSLOduration=2.045468213 podStartE2EDuration="5.31263185s" podCreationTimestamp="2026-04-16 22:24:20 +0000 UTC" firstStartedPulling="2026-04-16 22:24:21.902551637 +0000 UTC m=+652.270957773" lastFinishedPulling="2026-04-16 22:24:25.169715277 +0000 UTC m=+655.538121410" observedRunningTime="2026-04-16 22:24:25.312199442 +0000 UTC m=+655.680605618" watchObservedRunningTime="2026-04-16 22:24:25.31263185 +0000 UTC m=+655.681038000" Apr 16 22:24:26.301040 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:26.300998 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5f8dc4bc7b-ndnb8" event={"ID":"e2b06a9b-2118-46da-9af6-4a99383d9443","Type":"ContainerStarted","Data":"89c119d71f2c67a962ffa8dc9457b318b3e545ee04768423effa264bd6830d72"} Apr 16 22:24:26.301528 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:26.301076 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-5f8dc4bc7b-ndnb8" Apr 16 22:24:26.315718 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:26.315552 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-5f8dc4bc7b-ndnb8" podStartSLOduration=2.485105106 podStartE2EDuration="6.315534289s" podCreationTimestamp="2026-04-16 22:24:20 +0000 UTC" 
firstStartedPulling="2026-04-16 22:24:21.339596288 +0000 UTC m=+651.708002420" lastFinishedPulling="2026-04-16 22:24:25.170025473 +0000 UTC m=+655.538431603" observedRunningTime="2026-04-16 22:24:26.315441945 +0000 UTC m=+656.683848099" watchObservedRunningTime="2026-04-16 22:24:26.315534289 +0000 UTC m=+656.683940441" Apr 16 22:24:56.306540 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:56.306507 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-84d7d5cfc6-5pbdl" Apr 16 22:24:57.306841 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:57.306806 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-5f8dc4bc7b-ndnb8" Apr 16 22:24:58.616705 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:58.616667 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-5pbdl"] Apr 16 22:24:58.617163 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:58.616904 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-84d7d5cfc6-5pbdl" podUID="5b07bf6a-0c60-4165-bb57-9163532b5ef6" containerName="manager" containerID="cri-o://28777f0e64db9992789a44f8855c94e52ec55a979acd6356f5e391ede312fa38" gracePeriod=10 Apr 16 22:24:58.639792 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:58.639768 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-4xdxd"] Apr 16 22:24:58.645423 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:58.645391 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-84d7d5cfc6-4xdxd" Apr 16 22:24:58.652752 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:58.652710 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-4xdxd"] Apr 16 22:24:58.759344 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:58.759307 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwlcb\" (UniqueName: \"kubernetes.io/projected/3a52ec15-1ac7-4dad-b763-9ce2039d5f3c-kube-api-access-dwlcb\") pod \"kserve-controller-manager-84d7d5cfc6-4xdxd\" (UID: \"3a52ec15-1ac7-4dad-b763-9ce2039d5f3c\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-4xdxd" Apr 16 22:24:58.759344 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:58.759350 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3a52ec15-1ac7-4dad-b763-9ce2039d5f3c-cert\") pod \"kserve-controller-manager-84d7d5cfc6-4xdxd\" (UID: \"3a52ec15-1ac7-4dad-b763-9ce2039d5f3c\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-4xdxd" Apr 16 22:24:58.852171 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:58.852149 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-84d7d5cfc6-5pbdl" Apr 16 22:24:58.860168 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:58.860140 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dwlcb\" (UniqueName: \"kubernetes.io/projected/3a52ec15-1ac7-4dad-b763-9ce2039d5f3c-kube-api-access-dwlcb\") pod \"kserve-controller-manager-84d7d5cfc6-4xdxd\" (UID: \"3a52ec15-1ac7-4dad-b763-9ce2039d5f3c\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-4xdxd" Apr 16 22:24:58.860331 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:58.860181 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3a52ec15-1ac7-4dad-b763-9ce2039d5f3c-cert\") pod \"kserve-controller-manager-84d7d5cfc6-4xdxd\" (UID: \"3a52ec15-1ac7-4dad-b763-9ce2039d5f3c\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-4xdxd" Apr 16 22:24:58.862502 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:58.862479 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3a52ec15-1ac7-4dad-b763-9ce2039d5f3c-cert\") pod \"kserve-controller-manager-84d7d5cfc6-4xdxd\" (UID: \"3a52ec15-1ac7-4dad-b763-9ce2039d5f3c\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-4xdxd" Apr 16 22:24:58.867559 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:58.867504 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwlcb\" (UniqueName: \"kubernetes.io/projected/3a52ec15-1ac7-4dad-b763-9ce2039d5f3c-kube-api-access-dwlcb\") pod \"kserve-controller-manager-84d7d5cfc6-4xdxd\" (UID: \"3a52ec15-1ac7-4dad-b763-9ce2039d5f3c\") " pod="kserve/kserve-controller-manager-84d7d5cfc6-4xdxd" Apr 16 22:24:58.960784 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:58.960727 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mb9d\" (UniqueName: 
\"kubernetes.io/projected/5b07bf6a-0c60-4165-bb57-9163532b5ef6-kube-api-access-2mb9d\") pod \"5b07bf6a-0c60-4165-bb57-9163532b5ef6\" (UID: \"5b07bf6a-0c60-4165-bb57-9163532b5ef6\") " Apr 16 22:24:58.960966 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:58.960829 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5b07bf6a-0c60-4165-bb57-9163532b5ef6-cert\") pod \"5b07bf6a-0c60-4165-bb57-9163532b5ef6\" (UID: \"5b07bf6a-0c60-4165-bb57-9163532b5ef6\") " Apr 16 22:24:58.962878 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:58.962849 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b07bf6a-0c60-4165-bb57-9163532b5ef6-kube-api-access-2mb9d" (OuterVolumeSpecName: "kube-api-access-2mb9d") pod "5b07bf6a-0c60-4165-bb57-9163532b5ef6" (UID: "5b07bf6a-0c60-4165-bb57-9163532b5ef6"). InnerVolumeSpecName "kube-api-access-2mb9d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:24:58.962878 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:58.962859 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b07bf6a-0c60-4165-bb57-9163532b5ef6-cert" (OuterVolumeSpecName: "cert") pod "5b07bf6a-0c60-4165-bb57-9163532b5ef6" (UID: "5b07bf6a-0c60-4165-bb57-9163532b5ef6"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:24:59.001944 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:59.001915 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-84d7d5cfc6-4xdxd" Apr 16 22:24:59.061947 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:59.061920 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2mb9d\" (UniqueName: \"kubernetes.io/projected/5b07bf6a-0c60-4165-bb57-9163532b5ef6-kube-api-access-2mb9d\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:24:59.061947 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:59.061945 2574 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5b07bf6a-0c60-4165-bb57-9163532b5ef6-cert\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:24:59.122312 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:59.122239 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-4xdxd"] Apr 16 22:24:59.125640 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:24:59.125604 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a52ec15_1ac7_4dad_b763_9ce2039d5f3c.slice/crio-69a320a9de61c24757d374834170680ed59b227d5b8fe517b44a3fafc513a312 WatchSource:0}: Error finding container 69a320a9de61c24757d374834170680ed59b227d5b8fe517b44a3fafc513a312: Status 404 returned error can't find the container with id 69a320a9de61c24757d374834170680ed59b227d5b8fe517b44a3fafc513a312 Apr 16 22:24:59.404461 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:59.404369 2574 generic.go:358] "Generic (PLEG): container finished" podID="5b07bf6a-0c60-4165-bb57-9163532b5ef6" containerID="28777f0e64db9992789a44f8855c94e52ec55a979acd6356f5e391ede312fa38" exitCode=0 Apr 16 22:24:59.404461 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:59.404428 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-84d7d5cfc6-5pbdl" Apr 16 22:24:59.404461 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:59.404437 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84d7d5cfc6-5pbdl" event={"ID":"5b07bf6a-0c60-4165-bb57-9163532b5ef6","Type":"ContainerDied","Data":"28777f0e64db9992789a44f8855c94e52ec55a979acd6356f5e391ede312fa38"} Apr 16 22:24:59.404461 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:59.404462 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84d7d5cfc6-5pbdl" event={"ID":"5b07bf6a-0c60-4165-bb57-9163532b5ef6","Type":"ContainerDied","Data":"1bbf77c7fdd55138abac704ec0d5f4a77c7c45be8ca9b3cf29c9e184a5b68e39"} Apr 16 22:24:59.404795 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:59.404477 2574 scope.go:117] "RemoveContainer" containerID="28777f0e64db9992789a44f8855c94e52ec55a979acd6356f5e391ede312fa38" Apr 16 22:24:59.405488 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:59.405457 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84d7d5cfc6-4xdxd" event={"ID":"3a52ec15-1ac7-4dad-b763-9ce2039d5f3c","Type":"ContainerStarted","Data":"69a320a9de61c24757d374834170680ed59b227d5b8fe517b44a3fafc513a312"} Apr 16 22:24:59.412768 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:59.412728 2574 scope.go:117] "RemoveContainer" containerID="28777f0e64db9992789a44f8855c94e52ec55a979acd6356f5e391ede312fa38" Apr 16 22:24:59.413060 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:24:59.413033 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28777f0e64db9992789a44f8855c94e52ec55a979acd6356f5e391ede312fa38\": container with ID starting with 28777f0e64db9992789a44f8855c94e52ec55a979acd6356f5e391ede312fa38 not found: ID does not exist" containerID="28777f0e64db9992789a44f8855c94e52ec55a979acd6356f5e391ede312fa38" Apr 16 
22:24:59.413144 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:59.413069 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28777f0e64db9992789a44f8855c94e52ec55a979acd6356f5e391ede312fa38"} err="failed to get container status \"28777f0e64db9992789a44f8855c94e52ec55a979acd6356f5e391ede312fa38\": rpc error: code = NotFound desc = could not find container \"28777f0e64db9992789a44f8855c94e52ec55a979acd6356f5e391ede312fa38\": container with ID starting with 28777f0e64db9992789a44f8855c94e52ec55a979acd6356f5e391ede312fa38 not found: ID does not exist" Apr 16 22:24:59.425105 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:59.425082 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-5pbdl"] Apr 16 22:24:59.429362 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:24:59.429340 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-84d7d5cfc6-5pbdl"] Apr 16 22:25:00.262142 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:25:00.262102 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b07bf6a-0c60-4165-bb57-9163532b5ef6" path="/var/lib/kubelet/pods/5b07bf6a-0c60-4165-bb57-9163532b5ef6/volumes" Apr 16 22:25:00.411256 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:25:00.411212 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-84d7d5cfc6-4xdxd" event={"ID":"3a52ec15-1ac7-4dad-b763-9ce2039d5f3c","Type":"ContainerStarted","Data":"521c26a1366c4794266e556cd8c4f9a6cc843bb94716c96b552189f3f6a03fde"} Apr 16 22:25:00.411446 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:25:00.411326 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-84d7d5cfc6-4xdxd" Apr 16 22:25:00.427219 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:25:00.427171 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve/kserve-controller-manager-84d7d5cfc6-4xdxd" podStartSLOduration=2.110739846 podStartE2EDuration="2.427159177s" podCreationTimestamp="2026-04-16 22:24:58 +0000 UTC" firstStartedPulling="2026-04-16 22:24:59.126880661 +0000 UTC m=+689.495286793" lastFinishedPulling="2026-04-16 22:24:59.443299986 +0000 UTC m=+689.811706124" observedRunningTime="2026-04-16 22:25:00.425595795 +0000 UTC m=+690.794001946" watchObservedRunningTime="2026-04-16 22:25:00.427159177 +0000 UTC m=+690.795565327" Apr 16 22:25:31.418939 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:25:31.418908 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-84d7d5cfc6-4xdxd" Apr 16 22:26:09.152873 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.152841 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h"] Apr 16 22:26:09.153354 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.153184 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b07bf6a-0c60-4165-bb57-9163532b5ef6" containerName="manager" Apr 16 22:26:09.153354 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.153196 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b07bf6a-0c60-4165-bb57-9163532b5ef6" containerName="manager" Apr 16 22:26:09.153354 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.153256 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b07bf6a-0c60-4165-bb57-9163532b5ef6" containerName="manager" Apr 16 22:26:09.156599 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.156577 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" Apr 16 22:26:09.159373 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.159351 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\"" Apr 16 22:26:09.159496 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.159371 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 22:26:09.160011 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.159993 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 22:26:09.160100 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.159997 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-1-openshift-default-dockercfg-gwx87\"" Apr 16 22:26:09.171805 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.171779 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h"] Apr 16 22:26:09.258231 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.258188 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/bc78cd9a-99b3-4352-b1e0-5e88a17d0c32-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cft8h\" (UID: \"bc78cd9a-99b3-4352-b1e0-5e88a17d0c32\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" Apr 16 22:26:09.258419 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.258248 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/bc78cd9a-99b3-4352-b1e0-5e88a17d0c32-workload-socket\") pod 
\"router-gateway-1-openshift-default-6c59fbf55c-cft8h\" (UID: \"bc78cd9a-99b3-4352-b1e0-5e88a17d0c32\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" Apr 16 22:26:09.258419 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.258384 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/bc78cd9a-99b3-4352-b1e0-5e88a17d0c32-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cft8h\" (UID: \"bc78cd9a-99b3-4352-b1e0-5e88a17d0c32\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" Apr 16 22:26:09.258509 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.258464 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/bc78cd9a-99b3-4352-b1e0-5e88a17d0c32-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cft8h\" (UID: \"bc78cd9a-99b3-4352-b1e0-5e88a17d0c32\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" Apr 16 22:26:09.258620 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.258605 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/bc78cd9a-99b3-4352-b1e0-5e88a17d0c32-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cft8h\" (UID: \"bc78cd9a-99b3-4352-b1e0-5e88a17d0c32\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" Apr 16 22:26:09.258657 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.258631 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/bc78cd9a-99b3-4352-b1e0-5e88a17d0c32-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cft8h\" (UID: 
\"bc78cd9a-99b3-4352-b1e0-5e88a17d0c32\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" Apr 16 22:26:09.258657 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.258653 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/bc78cd9a-99b3-4352-b1e0-5e88a17d0c32-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cft8h\" (UID: \"bc78cd9a-99b3-4352-b1e0-5e88a17d0c32\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" Apr 16 22:26:09.258780 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.258669 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/bc78cd9a-99b3-4352-b1e0-5e88a17d0c32-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cft8h\" (UID: \"bc78cd9a-99b3-4352-b1e0-5e88a17d0c32\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" Apr 16 22:26:09.258780 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.258715 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2wkp\" (UniqueName: \"kubernetes.io/projected/bc78cd9a-99b3-4352-b1e0-5e88a17d0c32-kube-api-access-j2wkp\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cft8h\" (UID: \"bc78cd9a-99b3-4352-b1e0-5e88a17d0c32\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" Apr 16 22:26:09.360175 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.360138 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/bc78cd9a-99b3-4352-b1e0-5e88a17d0c32-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cft8h\" (UID: \"bc78cd9a-99b3-4352-b1e0-5e88a17d0c32\") " 
pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" Apr 16 22:26:09.360360 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.360201 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/bc78cd9a-99b3-4352-b1e0-5e88a17d0c32-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cft8h\" (UID: \"bc78cd9a-99b3-4352-b1e0-5e88a17d0c32\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" Apr 16 22:26:09.360360 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.360227 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/bc78cd9a-99b3-4352-b1e0-5e88a17d0c32-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cft8h\" (UID: \"bc78cd9a-99b3-4352-b1e0-5e88a17d0c32\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" Apr 16 22:26:09.360360 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.360251 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/bc78cd9a-99b3-4352-b1e0-5e88a17d0c32-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cft8h\" (UID: \"bc78cd9a-99b3-4352-b1e0-5e88a17d0c32\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" Apr 16 22:26:09.360360 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.360288 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/bc78cd9a-99b3-4352-b1e0-5e88a17d0c32-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cft8h\" (UID: \"bc78cd9a-99b3-4352-b1e0-5e88a17d0c32\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" Apr 16 22:26:09.360360 ip-10-0-133-72 kubenswrapper[2574]: I0416 
22:26:09.360316 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/bc78cd9a-99b3-4352-b1e0-5e88a17d0c32-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cft8h\" (UID: \"bc78cd9a-99b3-4352-b1e0-5e88a17d0c32\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" Apr 16 22:26:09.360360 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.360350 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/bc78cd9a-99b3-4352-b1e0-5e88a17d0c32-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cft8h\" (UID: \"bc78cd9a-99b3-4352-b1e0-5e88a17d0c32\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" Apr 16 22:26:09.360661 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.360374 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/bc78cd9a-99b3-4352-b1e0-5e88a17d0c32-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cft8h\" (UID: \"bc78cd9a-99b3-4352-b1e0-5e88a17d0c32\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" Apr 16 22:26:09.360661 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.360404 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j2wkp\" (UniqueName: \"kubernetes.io/projected/bc78cd9a-99b3-4352-b1e0-5e88a17d0c32-kube-api-access-j2wkp\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cft8h\" (UID: \"bc78cd9a-99b3-4352-b1e0-5e88a17d0c32\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" Apr 16 22:26:09.360812 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.360778 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/bc78cd9a-99b3-4352-b1e0-5e88a17d0c32-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cft8h\" (UID: \"bc78cd9a-99b3-4352-b1e0-5e88a17d0c32\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" Apr 16 22:26:09.361044 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.361014 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/bc78cd9a-99b3-4352-b1e0-5e88a17d0c32-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cft8h\" (UID: \"bc78cd9a-99b3-4352-b1e0-5e88a17d0c32\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" Apr 16 22:26:09.361128 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.361109 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/bc78cd9a-99b3-4352-b1e0-5e88a17d0c32-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cft8h\" (UID: \"bc78cd9a-99b3-4352-b1e0-5e88a17d0c32\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" Apr 16 22:26:09.361183 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.361136 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/bc78cd9a-99b3-4352-b1e0-5e88a17d0c32-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cft8h\" (UID: \"bc78cd9a-99b3-4352-b1e0-5e88a17d0c32\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" Apr 16 22:26:09.361241 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.361175 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/bc78cd9a-99b3-4352-b1e0-5e88a17d0c32-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cft8h\" (UID: 
\"bc78cd9a-99b3-4352-b1e0-5e88a17d0c32\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" Apr 16 22:26:09.362564 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.362543 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/bc78cd9a-99b3-4352-b1e0-5e88a17d0c32-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cft8h\" (UID: \"bc78cd9a-99b3-4352-b1e0-5e88a17d0c32\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" Apr 16 22:26:09.363402 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.363380 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/bc78cd9a-99b3-4352-b1e0-5e88a17d0c32-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cft8h\" (UID: \"bc78cd9a-99b3-4352-b1e0-5e88a17d0c32\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" Apr 16 22:26:09.368955 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.368936 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/bc78cd9a-99b3-4352-b1e0-5e88a17d0c32-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cft8h\" (UID: \"bc78cd9a-99b3-4352-b1e0-5e88a17d0c32\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" Apr 16 22:26:09.369144 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.369126 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2wkp\" (UniqueName: \"kubernetes.io/projected/bc78cd9a-99b3-4352-b1e0-5e88a17d0c32-kube-api-access-j2wkp\") pod \"router-gateway-1-openshift-default-6c59fbf55c-cft8h\" (UID: \"bc78cd9a-99b3-4352-b1e0-5e88a17d0c32\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" Apr 16 22:26:09.470512 ip-10-0-133-72 kubenswrapper[2574]: 
I0416 22:26:09.470435 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" Apr 16 22:26:09.593558 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.593534 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h"] Apr 16 22:26:09.596336 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:26:09.596306 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc78cd9a_99b3_4352_b1e0_5e88a17d0c32.slice/crio-5f867354fb02df9eb5c5f9a28398d2b067bbd0311a6c7143e17160c5dab1bd2f WatchSource:0}: Error finding container 5f867354fb02df9eb5c5f9a28398d2b067bbd0311a6c7143e17160c5dab1bd2f: Status 404 returned error can't find the container with id 5f867354fb02df9eb5c5f9a28398d2b067bbd0311a6c7143e17160c5dab1bd2f Apr 16 22:26:09.598475 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.598441 2574 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 22:26:09.598581 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.598523 2574 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 22:26:09.598581 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.598565 2574 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 22:26:09.651534 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:09.651506 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" 
event={"ID":"bc78cd9a-99b3-4352-b1e0-5e88a17d0c32","Type":"ContainerStarted","Data":"5f867354fb02df9eb5c5f9a28398d2b067bbd0311a6c7143e17160c5dab1bd2f"} Apr 16 22:26:10.655838 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:10.655795 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" event={"ID":"bc78cd9a-99b3-4352-b1e0-5e88a17d0c32","Type":"ContainerStarted","Data":"3472fea88fa1e0b8c5067b08d231577b48af9e863bcfe6410db5109799d4d9a1"} Apr 16 22:26:10.675506 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:10.675455 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" podStartSLOduration=1.675441728 podStartE2EDuration="1.675441728s" podCreationTimestamp="2026-04-16 22:26:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:26:10.673687631 +0000 UTC m=+761.042093810" watchObservedRunningTime="2026-04-16 22:26:10.675441728 +0000 UTC m=+761.043847946" Apr 16 22:26:11.470942 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:11.470903 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" Apr 16 22:26:11.475923 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:11.475890 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" Apr 16 22:26:11.660128 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:11.660093 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" Apr 16 22:26:11.661151 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:11.661133 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-cft8h" Apr 16 22:26:19.609827 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:19.609791 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb"] Apr 16 22:26:19.613705 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:19.613682 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" Apr 16 22:26:19.616251 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:19.616229 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-pt4qs\"" Apr 16 22:26:19.616837 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:19.616816 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 16 22:26:19.616837 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:19.616828 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-epp-sa-dockercfg-tvdrb\"" Apr 16 22:26:19.625283 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:19.625261 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb"] Apr 16 22:26:19.645495 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:19.645466 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6cvt\" (UniqueName: \"kubernetes.io/projected/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-kube-api-access-x6cvt\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb\" (UID: \"12af897f-b7b7-4205-ad3f-fd5e3fa30d68\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" Apr 16 
22:26:19.645665 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:19.645506 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb\" (UID: \"12af897f-b7b7-4205-ad3f-fd5e3fa30d68\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" Apr 16 22:26:19.645665 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:19.645585 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb\" (UID: \"12af897f-b7b7-4205-ad3f-fd5e3fa30d68\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" Apr 16 22:26:19.645665 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:19.645623 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb\" (UID: \"12af897f-b7b7-4205-ad3f-fd5e3fa30d68\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" Apr 16 22:26:19.645853 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:19.645736 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb\" (UID: \"12af897f-b7b7-4205-ad3f-fd5e3fa30d68\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" Apr 16 22:26:19.645853 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:19.645821 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb\" (UID: \"12af897f-b7b7-4205-ad3f-fd5e3fa30d68\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" Apr 16 22:26:19.746675 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:19.746635 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb\" (UID: \"12af897f-b7b7-4205-ad3f-fd5e3fa30d68\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" Apr 16 22:26:19.746675 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:19.746677 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb\" (UID: \"12af897f-b7b7-4205-ad3f-fd5e3fa30d68\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" Apr 16 22:26:19.746932 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:19.746720 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x6cvt\" (UniqueName: \"kubernetes.io/projected/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-kube-api-access-x6cvt\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb\" (UID: \"12af897f-b7b7-4205-ad3f-fd5e3fa30d68\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" Apr 16 22:26:19.746932 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:19.746763 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb\" (UID: \"12af897f-b7b7-4205-ad3f-fd5e3fa30d68\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" Apr 16 22:26:19.746932 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:19.746788 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb\" (UID: \"12af897f-b7b7-4205-ad3f-fd5e3fa30d68\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" Apr 16 22:26:19.746932 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:19.746806 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb\" (UID: \"12af897f-b7b7-4205-ad3f-fd5e3fa30d68\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" Apr 16 22:26:19.747145 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:19.747065 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb\" (UID: \"12af897f-b7b7-4205-ad3f-fd5e3fa30d68\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" Apr 16 22:26:19.747212 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:19.747138 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb\" (UID: \"12af897f-b7b7-4205-ad3f-fd5e3fa30d68\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" Apr 16 22:26:19.747212 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:19.747169 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb\" (UID: \"12af897f-b7b7-4205-ad3f-fd5e3fa30d68\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" Apr 16 22:26:19.747301 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:19.747227 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb\" (UID: \"12af897f-b7b7-4205-ad3f-fd5e3fa30d68\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" Apr 16 22:26:19.749165 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:19.749141 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb\" (UID: \"12af897f-b7b7-4205-ad3f-fd5e3fa30d68\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" Apr 16 22:26:19.755953 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:19.755936 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6cvt\" (UniqueName: \"kubernetes.io/projected/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-kube-api-access-x6cvt\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb\" (UID: \"12af897f-b7b7-4205-ad3f-fd5e3fa30d68\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" Apr 16 22:26:19.926371 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:19.926285 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" Apr 16 22:26:20.049271 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:20.049206 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb"] Apr 16 22:26:20.051594 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:26:20.051563 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12af897f_b7b7_4205_ad3f_fd5e3fa30d68.slice/crio-ed6f7407c65902bf0a14de88c0136b86de4385382830640c93e65b80f3ac9133 WatchSource:0}: Error finding container ed6f7407c65902bf0a14de88c0136b86de4385382830640c93e65b80f3ac9133: Status 404 returned error can't find the container with id ed6f7407c65902bf0a14de88c0136b86de4385382830640c93e65b80f3ac9133 Apr 16 22:26:20.053508 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:20.053488 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:26:20.691614 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:20.691576 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" event={"ID":"12af897f-b7b7-4205-ad3f-fd5e3fa30d68","Type":"ContainerStarted","Data":"ed6f7407c65902bf0a14de88c0136b86de4385382830640c93e65b80f3ac9133"} Apr 16 22:26:23.706156 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:23.706068 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" event={"ID":"12af897f-b7b7-4205-ad3f-fd5e3fa30d68","Type":"ContainerStarted","Data":"83c7f4a54914322be93db96105fe400aa70afe7488765ac8603853f8f0ed5dcb"} Apr 16 22:26:24.710866 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:24.710825 2574 generic.go:358] "Generic (PLEG): container finished" podID="12af897f-b7b7-4205-ad3f-fd5e3fa30d68" containerID="83c7f4a54914322be93db96105fe400aa70afe7488765ac8603853f8f0ed5dcb" exitCode=0 Apr 16 22:26:24.711346 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:24.710869 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" event={"ID":"12af897f-b7b7-4205-ad3f-fd5e3fa30d68","Type":"ContainerDied","Data":"83c7f4a54914322be93db96105fe400aa70afe7488765ac8603853f8f0ed5dcb"} Apr 16 22:26:26.719318 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:26.719282 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" event={"ID":"12af897f-b7b7-4205-ad3f-fd5e3fa30d68","Type":"ContainerStarted","Data":"8e002f95aface966dff335ebf7ab6adc29ad7665f851fde2b95f9a8aaabbf231"} Apr 16 22:26:55.830755 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:55.830709 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" 
event={"ID":"12af897f-b7b7-4205-ad3f-fd5e3fa30d68","Type":"ContainerStarted","Data":"0e043c11124e1e3dc9c0e614ebc057b01d46c99d28c027a17a6fca27e3105f9f"} Apr 16 22:26:55.831169 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:55.830918 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" Apr 16 22:26:55.833381 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:55.833359 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" Apr 16 22:26:55.851629 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:55.851562 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" podStartSLOduration=1.148983216 podStartE2EDuration="36.851543478s" podCreationTimestamp="2026-04-16 22:26:19 +0000 UTC" firstStartedPulling="2026-04-16 22:26:20.053677695 +0000 UTC m=+770.422083831" lastFinishedPulling="2026-04-16 22:26:55.756237947 +0000 UTC m=+806.124644093" observedRunningTime="2026-04-16 22:26:55.848952582 +0000 UTC m=+806.217358736" watchObservedRunningTime="2026-04-16 22:26:55.851543478 +0000 UTC m=+806.219949631" Apr 16 22:26:59.926441 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:59.926399 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" Apr 16 22:26:59.926441 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:26:59.926446 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" Apr 16 22:27:09.928627 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:09.928595 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" Apr 16 22:27:09.929816 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:09.929792 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" Apr 16 22:27:19.759205 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:19.759164 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb"] Apr 16 22:27:19.962950 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:19.962908 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb"] Apr 16 22:27:19.963107 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:19.963061 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb" Apr 16 22:27:19.966025 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:19.965998 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-f6x7q\"" Apr 16 22:27:19.966437 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:19.966415 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 16 22:27:20.132210 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:20.132121 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e46fb856-45bc-4f5b-b210-e8781693a76f-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb\" (UID: \"e46fb856-45bc-4f5b-b210-e8781693a76f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb" Apr 
16 22:27:20.132210 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:20.132175 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e46fb856-45bc-4f5b-b210-e8781693a76f-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb\" (UID: \"e46fb856-45bc-4f5b-b210-e8781693a76f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb"
Apr 16 22:27:20.132210 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:20.132201 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e46fb856-45bc-4f5b-b210-e8781693a76f-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb\" (UID: \"e46fb856-45bc-4f5b-b210-e8781693a76f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb"
Apr 16 22:27:20.132420 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:20.132246 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pdtf\" (UniqueName: \"kubernetes.io/projected/e46fb856-45bc-4f5b-b210-e8781693a76f-kube-api-access-2pdtf\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb\" (UID: \"e46fb856-45bc-4f5b-b210-e8781693a76f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb"
Apr 16 22:27:20.132420 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:20.132267 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e46fb856-45bc-4f5b-b210-e8781693a76f-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb\" (UID: \"e46fb856-45bc-4f5b-b210-e8781693a76f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb"
Apr 16 22:27:20.132420 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:20.132317 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e46fb856-45bc-4f5b-b210-e8781693a76f-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb\" (UID: \"e46fb856-45bc-4f5b-b210-e8781693a76f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb"
Apr 16 22:27:20.233080 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:20.233043 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e46fb856-45bc-4f5b-b210-e8781693a76f-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb\" (UID: \"e46fb856-45bc-4f5b-b210-e8781693a76f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb"
Apr 16 22:27:20.233259 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:20.233091 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e46fb856-45bc-4f5b-b210-e8781693a76f-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb\" (UID: \"e46fb856-45bc-4f5b-b210-e8781693a76f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb"
Apr 16 22:27:20.233259 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:20.233121 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e46fb856-45bc-4f5b-b210-e8781693a76f-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb\" (UID: \"e46fb856-45bc-4f5b-b210-e8781693a76f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb"
Apr 16 22:27:20.233259 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:20.233162 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2pdtf\" (UniqueName: \"kubernetes.io/projected/e46fb856-45bc-4f5b-b210-e8781693a76f-kube-api-access-2pdtf\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb\" (UID: \"e46fb856-45bc-4f5b-b210-e8781693a76f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb"
Apr 16 22:27:20.233259 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:20.233190 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e46fb856-45bc-4f5b-b210-e8781693a76f-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb\" (UID: \"e46fb856-45bc-4f5b-b210-e8781693a76f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb"
Apr 16 22:27:20.233259 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:20.233246 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e46fb856-45bc-4f5b-b210-e8781693a76f-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb\" (UID: \"e46fb856-45bc-4f5b-b210-e8781693a76f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb"
Apr 16 22:27:20.233573 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:20.233539 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e46fb856-45bc-4f5b-b210-e8781693a76f-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb\" (UID: \"e46fb856-45bc-4f5b-b210-e8781693a76f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb"
Apr 16 22:27:20.233688 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:20.233590 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e46fb856-45bc-4f5b-b210-e8781693a76f-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb\" (UID: \"e46fb856-45bc-4f5b-b210-e8781693a76f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb"
Apr 16 22:27:20.233688 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:20.233620 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e46fb856-45bc-4f5b-b210-e8781693a76f-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb\" (UID: \"e46fb856-45bc-4f5b-b210-e8781693a76f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb"
Apr 16 22:27:20.233793 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:20.233715 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e46fb856-45bc-4f5b-b210-e8781693a76f-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb\" (UID: \"e46fb856-45bc-4f5b-b210-e8781693a76f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb"
Apr 16 22:27:20.235859 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:20.235836 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e46fb856-45bc-4f5b-b210-e8781693a76f-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb\" (UID: \"e46fb856-45bc-4f5b-b210-e8781693a76f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb"
Apr 16 22:27:20.241941 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:20.241912 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pdtf\" (UniqueName: \"kubernetes.io/projected/e46fb856-45bc-4f5b-b210-e8781693a76f-kube-api-access-2pdtf\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb\" (UID: \"e46fb856-45bc-4f5b-b210-e8781693a76f\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb"
Apr 16 22:27:20.274660 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:20.274626 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb"
Apr 16 22:27:20.413585 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:20.413539 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb"]
Apr 16 22:27:20.415832 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:27:20.415803 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode46fb856_45bc_4f5b_b210_e8781693a76f.slice/crio-db2a14843dbb1920b8055d77616b259de2ce8e105166907788f1f79e655ea4aa WatchSource:0}: Error finding container db2a14843dbb1920b8055d77616b259de2ce8e105166907788f1f79e655ea4aa: Status 404 returned error can't find the container with id db2a14843dbb1920b8055d77616b259de2ce8e105166907788f1f79e655ea4aa
Apr 16 22:27:20.920100 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:20.920062 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb" event={"ID":"e46fb856-45bc-4f5b-b210-e8781693a76f","Type":"ContainerStarted","Data":"6f50c5ba6fc95982213e4685da5380ee62c0e094a26c04ef7525cd07634bef34"}
Apr 16 22:27:20.920634 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:20.920109 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb" event={"ID":"e46fb856-45bc-4f5b-b210-e8781693a76f","Type":"ContainerStarted","Data":"db2a14843dbb1920b8055d77616b259de2ce8e105166907788f1f79e655ea4aa"}
Apr 16 22:27:21.401094 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:21.401061 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb"]
Apr 16 22:27:21.401432 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:21.401404 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" podUID="12af897f-b7b7-4205-ad3f-fd5e3fa30d68" containerName="main" containerID="cri-o://8e002f95aface966dff335ebf7ab6adc29ad7665f851fde2b95f9a8aaabbf231" gracePeriod=30
Apr 16 22:27:21.401547 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:21.401435 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" podUID="12af897f-b7b7-4205-ad3f-fd5e3fa30d68" containerName="tokenizer" containerID="cri-o://0e043c11124e1e3dc9c0e614ebc057b01d46c99d28c027a17a6fca27e3105f9f" gracePeriod=30
Apr 16 22:27:21.925529 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:21.925492 2574 generic.go:358] "Generic (PLEG): container finished" podID="e46fb856-45bc-4f5b-b210-e8781693a76f" containerID="6f50c5ba6fc95982213e4685da5380ee62c0e094a26c04ef7525cd07634bef34" exitCode=0
Apr 16 22:27:21.926015 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:21.925583 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb" event={"ID":"e46fb856-45bc-4f5b-b210-e8781693a76f","Type":"ContainerDied","Data":"6f50c5ba6fc95982213e4685da5380ee62c0e094a26c04ef7525cd07634bef34"}
Apr 16 22:27:21.927626 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:21.927601 2574 generic.go:358] "Generic (PLEG): container finished" podID="12af897f-b7b7-4205-ad3f-fd5e3fa30d68" containerID="8e002f95aface966dff335ebf7ab6adc29ad7665f851fde2b95f9a8aaabbf231" exitCode=0
Apr 16 22:27:21.927714 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:21.927646 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" event={"ID":"12af897f-b7b7-4205-ad3f-fd5e3fa30d68","Type":"ContainerDied","Data":"8e002f95aface966dff335ebf7ab6adc29ad7665f851fde2b95f9a8aaabbf231"}
Apr 16 22:27:22.763828 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.763802 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb"
Apr 16 22:27:22.857332 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.857240 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6cvt\" (UniqueName: \"kubernetes.io/projected/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-kube-api-access-x6cvt\") pod \"12af897f-b7b7-4205-ad3f-fd5e3fa30d68\" (UID: \"12af897f-b7b7-4205-ad3f-fd5e3fa30d68\") "
Apr 16 22:27:22.857332 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.857309 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-tokenizer-tmp\") pod \"12af897f-b7b7-4205-ad3f-fd5e3fa30d68\" (UID: \"12af897f-b7b7-4205-ad3f-fd5e3fa30d68\") "
Apr 16 22:27:22.857564 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.857351 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-tokenizer-cache\") pod \"12af897f-b7b7-4205-ad3f-fd5e3fa30d68\" (UID: \"12af897f-b7b7-4205-ad3f-fd5e3fa30d68\") "
Apr 16 22:27:22.857564 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.857378 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-kserve-provision-location\") pod \"12af897f-b7b7-4205-ad3f-fd5e3fa30d68\" (UID: \"12af897f-b7b7-4205-ad3f-fd5e3fa30d68\") "
Apr 16 22:27:22.857564 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.857458 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-tokenizer-uds\") pod \"12af897f-b7b7-4205-ad3f-fd5e3fa30d68\" (UID: \"12af897f-b7b7-4205-ad3f-fd5e3fa30d68\") "
Apr 16 22:27:22.857564 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.857503 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-tls-certs\") pod \"12af897f-b7b7-4205-ad3f-fd5e3fa30d68\" (UID: \"12af897f-b7b7-4205-ad3f-fd5e3fa30d68\") "
Apr 16 22:27:22.857784 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.857620 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "12af897f-b7b7-4205-ad3f-fd5e3fa30d68" (UID: "12af897f-b7b7-4205-ad3f-fd5e3fa30d68"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:27:22.857784 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.857667 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "12af897f-b7b7-4205-ad3f-fd5e3fa30d68" (UID: "12af897f-b7b7-4205-ad3f-fd5e3fa30d68"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:27:22.857867 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.857829 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "12af897f-b7b7-4205-ad3f-fd5e3fa30d68" (UID: "12af897f-b7b7-4205-ad3f-fd5e3fa30d68"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:27:22.857908 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.857895 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-tokenizer-tmp\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:27:22.857942 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.857916 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-tokenizer-cache\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:27:22.858206 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.858183 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "12af897f-b7b7-4205-ad3f-fd5e3fa30d68" (UID: "12af897f-b7b7-4205-ad3f-fd5e3fa30d68"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:27:22.859790 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.859759 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "12af897f-b7b7-4205-ad3f-fd5e3fa30d68" (UID: "12af897f-b7b7-4205-ad3f-fd5e3fa30d68"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:27:22.859975 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.859953 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-kube-api-access-x6cvt" (OuterVolumeSpecName: "kube-api-access-x6cvt") pod "12af897f-b7b7-4205-ad3f-fd5e3fa30d68" (UID: "12af897f-b7b7-4205-ad3f-fd5e3fa30d68"). InnerVolumeSpecName "kube-api-access-x6cvt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:27:22.933289 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.933252 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb" event={"ID":"e46fb856-45bc-4f5b-b210-e8781693a76f","Type":"ContainerStarted","Data":"507365b45649440227733e499580cc48c347f8ce23b746c00676f41c6acdf0f9"}
Apr 16 22:27:22.933289 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.933288 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb" event={"ID":"e46fb856-45bc-4f5b-b210-e8781693a76f","Type":"ContainerStarted","Data":"bfc33dde096c7141c09ef60381f54d109d16dc77803b92aac05c8187e132417e"}
Apr 16 22:27:22.933815 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.933411 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb"
Apr 16 22:27:22.934884 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.934856 2574 generic.go:358] "Generic (PLEG): container finished" podID="12af897f-b7b7-4205-ad3f-fd5e3fa30d68" containerID="0e043c11124e1e3dc9c0e614ebc057b01d46c99d28c027a17a6fca27e3105f9f" exitCode=0
Apr 16 22:27:22.935008 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.934891 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" event={"ID":"12af897f-b7b7-4205-ad3f-fd5e3fa30d68","Type":"ContainerDied","Data":"0e043c11124e1e3dc9c0e614ebc057b01d46c99d28c027a17a6fca27e3105f9f"}
Apr 16 22:27:22.935008 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.934914 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb" event={"ID":"12af897f-b7b7-4205-ad3f-fd5e3fa30d68","Type":"ContainerDied","Data":"ed6f7407c65902bf0a14de88c0136b86de4385382830640c93e65b80f3ac9133"}
Apr 16 22:27:22.935008 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.934934 2574 scope.go:117] "RemoveContainer" containerID="0e043c11124e1e3dc9c0e614ebc057b01d46c99d28c027a17a6fca27e3105f9f"
Apr 16 22:27:22.935008 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.934936 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb"
Apr 16 22:27:22.943178 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.943161 2574 scope.go:117] "RemoveContainer" containerID="8e002f95aface966dff335ebf7ab6adc29ad7665f851fde2b95f9a8aaabbf231"
Apr 16 22:27:22.951645 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.951626 2574 scope.go:117] "RemoveContainer" containerID="83c7f4a54914322be93db96105fe400aa70afe7488765ac8603853f8f0ed5dcb"
Apr 16 22:27:22.958963 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.958942 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-tokenizer-uds\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:27:22.959056 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.958967 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-tls-certs\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:27:22.959056 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.958981 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x6cvt\" (UniqueName: \"kubernetes.io/projected/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-kube-api-access-x6cvt\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:27:22.959056 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.958995 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12af897f-b7b7-4205-ad3f-fd5e3fa30d68-kserve-provision-location\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:27:22.960879 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.960855 2574 scope.go:117] "RemoveContainer" containerID="0e043c11124e1e3dc9c0e614ebc057b01d46c99d28c027a17a6fca27e3105f9f"
Apr 16 22:27:22.961246 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:27:22.961225 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e043c11124e1e3dc9c0e614ebc057b01d46c99d28c027a17a6fca27e3105f9f\": container with ID starting with 0e043c11124e1e3dc9c0e614ebc057b01d46c99d28c027a17a6fca27e3105f9f not found: ID does not exist" containerID="0e043c11124e1e3dc9c0e614ebc057b01d46c99d28c027a17a6fca27e3105f9f"
Apr 16 22:27:22.961302 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.961257 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e043c11124e1e3dc9c0e614ebc057b01d46c99d28c027a17a6fca27e3105f9f"} err="failed to get container status \"0e043c11124e1e3dc9c0e614ebc057b01d46c99d28c027a17a6fca27e3105f9f\": rpc error: code = NotFound desc = could not find container \"0e043c11124e1e3dc9c0e614ebc057b01d46c99d28c027a17a6fca27e3105f9f\": container with ID starting with 0e043c11124e1e3dc9c0e614ebc057b01d46c99d28c027a17a6fca27e3105f9f not found: ID does not exist"
Apr 16 22:27:22.961302 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.961278 2574 scope.go:117] "RemoveContainer" containerID="8e002f95aface966dff335ebf7ab6adc29ad7665f851fde2b95f9a8aaabbf231"
Apr 16 22:27:22.961519 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:27:22.961493 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e002f95aface966dff335ebf7ab6adc29ad7665f851fde2b95f9a8aaabbf231\": container with ID starting with 8e002f95aface966dff335ebf7ab6adc29ad7665f851fde2b95f9a8aaabbf231 not found: ID does not exist" containerID="8e002f95aface966dff335ebf7ab6adc29ad7665f851fde2b95f9a8aaabbf231"
Apr 16 22:27:22.961562 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.961529 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e002f95aface966dff335ebf7ab6adc29ad7665f851fde2b95f9a8aaabbf231"} err="failed to get container status \"8e002f95aface966dff335ebf7ab6adc29ad7665f851fde2b95f9a8aaabbf231\": rpc error: code = NotFound desc = could not find container \"8e002f95aface966dff335ebf7ab6adc29ad7665f851fde2b95f9a8aaabbf231\": container with ID starting with 8e002f95aface966dff335ebf7ab6adc29ad7665f851fde2b95f9a8aaabbf231 not found: ID does not exist"
Apr 16 22:27:22.961562 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.961553 2574 scope.go:117] "RemoveContainer" containerID="83c7f4a54914322be93db96105fe400aa70afe7488765ac8603853f8f0ed5dcb"
Apr 16 22:27:22.961793 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:27:22.961774 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83c7f4a54914322be93db96105fe400aa70afe7488765ac8603853f8f0ed5dcb\": container with ID starting with 83c7f4a54914322be93db96105fe400aa70afe7488765ac8603853f8f0ed5dcb not found: ID does not exist" containerID="83c7f4a54914322be93db96105fe400aa70afe7488765ac8603853f8f0ed5dcb"
Apr 16 22:27:22.961837 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.961798 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83c7f4a54914322be93db96105fe400aa70afe7488765ac8603853f8f0ed5dcb"} err="failed to get container status \"83c7f4a54914322be93db96105fe400aa70afe7488765ac8603853f8f0ed5dcb\": rpc error: code = NotFound desc = could not find container \"83c7f4a54914322be93db96105fe400aa70afe7488765ac8603853f8f0ed5dcb\": container with ID starting with 83c7f4a54914322be93db96105fe400aa70afe7488765ac8603853f8f0ed5dcb not found: ID does not exist"
Apr 16 22:27:22.968482 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.968435 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb" podStartSLOduration=3.96842389 podStartE2EDuration="3.96842389s" podCreationTimestamp="2026-04-16 22:27:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:27:22.966699229 +0000 UTC m=+833.335105379" watchObservedRunningTime="2026-04-16 22:27:22.96842389 +0000 UTC m=+833.336830055"
Apr 16 22:27:22.985965 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.985938 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb"]
Apr 16 22:27:22.989862 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:22.989841 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5bc6dlgwdb"]
Apr 16 22:27:24.262516 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:24.262482 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12af897f-b7b7-4205-ad3f-fd5e3fa30d68" path="/var/lib/kubelet/pods/12af897f-b7b7-4205-ad3f-fd5e3fa30d68/volumes"
Apr 16 22:27:30.275634 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:30.275602 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb"
Apr 16 22:27:30.275634 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:30.275638 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb"
Apr 16 22:27:30.278468 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:30.278444 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb"
Apr 16 22:27:30.962325 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:30.962285 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb"
Apr 16 22:27:39.219190 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.219151 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78"]
Apr 16 22:27:39.219631 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.219481 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12af897f-b7b7-4205-ad3f-fd5e3fa30d68" containerName="main"
Apr 16 22:27:39.219631 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.219492 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="12af897f-b7b7-4205-ad3f-fd5e3fa30d68" containerName="main"
Apr 16 22:27:39.219631 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.219513 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12af897f-b7b7-4205-ad3f-fd5e3fa30d68" containerName="tokenizer"
Apr 16 22:27:39.219631 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.219518 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="12af897f-b7b7-4205-ad3f-fd5e3fa30d68" containerName="tokenizer"
Apr 16 22:27:39.219631 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.219527 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12af897f-b7b7-4205-ad3f-fd5e3fa30d68" containerName="storage-initializer"
Apr 16 22:27:39.219631 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.219532 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="12af897f-b7b7-4205-ad3f-fd5e3fa30d68" containerName="storage-initializer"
Apr 16 22:27:39.219631 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.219583 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="12af897f-b7b7-4205-ad3f-fd5e3fa30d68" containerName="tokenizer"
Apr 16 22:27:39.219631 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.219594 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="12af897f-b7b7-4205-ad3f-fd5e3fa30d68" containerName="main"
Apr 16 22:27:39.225089 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.225064 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78"
Apr 16 22:27:39.227469 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.227443 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-55f7ae4a-epp-sa-dockercfg-x6dgw\""
Apr 16 22:27:39.227469 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.227455 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\""
Apr 16 22:27:39.235850 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.235827 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78"]
Apr 16 22:27:39.304913 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.304874 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb49e6c1-56da-4374-91a6-902efe5f8464-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78\" (UID: \"fb49e6c1-56da-4374-91a6-902efe5f8464\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78"
Apr 16 22:27:39.305107 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.304927 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fb49e6c1-56da-4374-91a6-902efe5f8464-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78\" (UID: \"fb49e6c1-56da-4374-91a6-902efe5f8464\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78"
Apr 16 22:27:39.305107 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.304957 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fb49e6c1-56da-4374-91a6-902efe5f8464-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78\" (UID: \"fb49e6c1-56da-4374-91a6-902efe5f8464\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78"
Apr 16 22:27:39.305107 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.305073 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fb49e6c1-56da-4374-91a6-902efe5f8464-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78\" (UID: \"fb49e6c1-56da-4374-91a6-902efe5f8464\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78"
Apr 16 22:27:39.305341 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.305107 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fb49e6c1-56da-4374-91a6-902efe5f8464-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78\" (UID: \"fb49e6c1-56da-4374-91a6-902efe5f8464\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78"
Apr 16 22:27:39.305341 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.305215 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f2k9\" (UniqueName: \"kubernetes.io/projected/fb49e6c1-56da-4374-91a6-902efe5f8464-kube-api-access-8f2k9\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78\" (UID: \"fb49e6c1-56da-4374-91a6-902efe5f8464\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78"
Apr 16 22:27:39.405749 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.405717 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8f2k9\" (UniqueName: \"kubernetes.io/projected/fb49e6c1-56da-4374-91a6-902efe5f8464-kube-api-access-8f2k9\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78\" (UID: \"fb49e6c1-56da-4374-91a6-902efe5f8464\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78"
Apr 16 22:27:39.405953 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.405829 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb49e6c1-56da-4374-91a6-902efe5f8464-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78\" (UID: \"fb49e6c1-56da-4374-91a6-902efe5f8464\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78"
Apr 16 22:27:39.405953 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.405862 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fb49e6c1-56da-4374-91a6-902efe5f8464-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78\" (UID: \"fb49e6c1-56da-4374-91a6-902efe5f8464\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78"
Apr 16 22:27:39.405953 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.405882 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fb49e6c1-56da-4374-91a6-902efe5f8464-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78\" (UID: \"fb49e6c1-56da-4374-91a6-902efe5f8464\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78"
Apr 16 22:27:39.405953 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.405910 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fb49e6c1-56da-4374-91a6-902efe5f8464-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78\" (UID: \"fb49e6c1-56da-4374-91a6-902efe5f8464\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78"
Apr 16 22:27:39.405953 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.405939 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fb49e6c1-56da-4374-91a6-902efe5f8464-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78\" (UID: \"fb49e6c1-56da-4374-91a6-902efe5f8464\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78"
Apr 16 22:27:39.406278 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.406255 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb49e6c1-56da-4374-91a6-902efe5f8464-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78\" (UID: \"fb49e6c1-56da-4374-91a6-902efe5f8464\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78"
Apr 16 22:27:39.406324 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.406298 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fb49e6c1-56da-4374-91a6-902efe5f8464-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78\" (UID: \"fb49e6c1-56da-4374-91a6-902efe5f8464\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78"
Apr 16 22:27:39.406374 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.406352 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fb49e6c1-56da-4374-91a6-902efe5f8464-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78\" (UID: \"fb49e6c1-56da-4374-91a6-902efe5f8464\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78"
Apr 16 22:27:39.406418 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.406382 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fb49e6c1-56da-4374-91a6-902efe5f8464-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78\" (UID: \"fb49e6c1-56da-4374-91a6-902efe5f8464\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78"
Apr 16 22:27:39.408514 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.408482 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fb49e6c1-56da-4374-91a6-902efe5f8464-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78\" (UID: \"fb49e6c1-56da-4374-91a6-902efe5f8464\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78"
Apr 16 22:27:39.414146 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.414118 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f2k9\" (UniqueName: \"kubernetes.io/projected/fb49e6c1-56da-4374-91a6-902efe5f8464-kube-api-access-8f2k9\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78\" (UID: \"fb49e6c1-56da-4374-91a6-902efe5f8464\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78"
Apr 16 22:27:39.536060 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.536027 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78" Apr 16 22:27:39.667692 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.667661 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78"] Apr 16 22:27:39.670561 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:27:39.670526 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb49e6c1_56da_4374_91a6_902efe5f8464.slice/crio-ce9bda80b52fdbc3dd601b8457d45cddb26d18738ab8ef9150408b8017e92fcc WatchSource:0}: Error finding container ce9bda80b52fdbc3dd601b8457d45cddb26d18738ab8ef9150408b8017e92fcc: Status 404 returned error can't find the container with id ce9bda80b52fdbc3dd601b8457d45cddb26d18738ab8ef9150408b8017e92fcc Apr 16 22:27:39.990520 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.990482 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78" event={"ID":"fb49e6c1-56da-4374-91a6-902efe5f8464","Type":"ContainerStarted","Data":"f1bee2cd9e47fd20dd7bdda028a813355948a283d6f7b8ff0860e289ec8375de"} Apr 16 22:27:39.990712 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:39.990525 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78" event={"ID":"fb49e6c1-56da-4374-91a6-902efe5f8464","Type":"ContainerStarted","Data":"ce9bda80b52fdbc3dd601b8457d45cddb26d18738ab8ef9150408b8017e92fcc"} Apr 16 22:27:40.994989 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:40.994950 2574 generic.go:358] "Generic (PLEG): container finished" podID="fb49e6c1-56da-4374-91a6-902efe5f8464" containerID="f1bee2cd9e47fd20dd7bdda028a813355948a283d6f7b8ff0860e289ec8375de" exitCode=0 Apr 16 22:27:40.995398 ip-10-0-133-72 kubenswrapper[2574]: I0416 
22:27:40.995035 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78" event={"ID":"fb49e6c1-56da-4374-91a6-902efe5f8464","Type":"ContainerDied","Data":"f1bee2cd9e47fd20dd7bdda028a813355948a283d6f7b8ff0860e289ec8375de"} Apr 16 22:27:42.000430 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:42.000396 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78" event={"ID":"fb49e6c1-56da-4374-91a6-902efe5f8464","Type":"ContainerStarted","Data":"9936d4a2e8d1f513e3bbb936426ae6738f6b2e9ab9b5af4ea58998adab1964af"} Apr 16 22:27:42.000430 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:42.000430 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78" event={"ID":"fb49e6c1-56da-4374-91a6-902efe5f8464","Type":"ContainerStarted","Data":"fc81122a6e3f2b7ce3b5f766fc924a21d612a9b794e02c0b24c55ffdc90ff020"} Apr 16 22:27:42.000890 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:42.000523 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78" Apr 16 22:27:42.022009 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:42.021960 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78" podStartSLOduration=3.021945856 podStartE2EDuration="3.021945856s" podCreationTimestamp="2026-04-16 22:27:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:27:42.019487839 +0000 UTC m=+852.387893992" watchObservedRunningTime="2026-04-16 22:27:42.021945856 +0000 UTC m=+852.390352007" Apr 16 22:27:49.536930 ip-10-0-133-72 kubenswrapper[2574]: 
I0416 22:27:49.536888 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78" Apr 16 22:27:49.537398 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:49.537046 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78" Apr 16 22:27:49.539787 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:49.539765 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78" Apr 16 22:27:50.030072 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:27:50.030044 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78" Apr 16 22:28:01.966011 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:01.965982 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb" Apr 16 22:28:02.946019 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:02.945976 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb"] Apr 16 22:28:02.946415 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:02.946383 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb" podUID="e46fb856-45bc-4f5b-b210-e8781693a76f" containerName="main" containerID="cri-o://bfc33dde096c7141c09ef60381f54d109d16dc77803b92aac05c8187e132417e" gracePeriod=30 Apr 16 22:28:02.946490 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:02.946436 2574 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb" podUID="e46fb856-45bc-4f5b-b210-e8781693a76f" containerName="tokenizer" containerID="cri-o://507365b45649440227733e499580cc48c347f8ce23b746c00676f41c6acdf0f9" gracePeriod=30 Apr 16 22:28:03.073057 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:03.073026 2574 generic.go:358] "Generic (PLEG): container finished" podID="e46fb856-45bc-4f5b-b210-e8781693a76f" containerID="bfc33dde096c7141c09ef60381f54d109d16dc77803b92aac05c8187e132417e" exitCode=0 Apr 16 22:28:03.073408 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:03.073090 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb" event={"ID":"e46fb856-45bc-4f5b-b210-e8781693a76f","Type":"ContainerDied","Data":"bfc33dde096c7141c09ef60381f54d109d16dc77803b92aac05c8187e132417e"} Apr 16 22:28:04.078461 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:04.078426 2574 generic.go:358] "Generic (PLEG): container finished" podID="e46fb856-45bc-4f5b-b210-e8781693a76f" containerID="507365b45649440227733e499580cc48c347f8ce23b746c00676f41c6acdf0f9" exitCode=0 Apr 16 22:28:04.078840 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:04.078497 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb" event={"ID":"e46fb856-45bc-4f5b-b210-e8781693a76f","Type":"ContainerDied","Data":"507365b45649440227733e499580cc48c347f8ce23b746c00676f41c6acdf0f9"} Apr 16 22:28:04.201052 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:04.201029 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb" Apr 16 22:28:04.327647 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:04.327566 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e46fb856-45bc-4f5b-b210-e8781693a76f-tokenizer-uds\") pod \"e46fb856-45bc-4f5b-b210-e8781693a76f\" (UID: \"e46fb856-45bc-4f5b-b210-e8781693a76f\") " Apr 16 22:28:04.327647 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:04.327602 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e46fb856-45bc-4f5b-b210-e8781693a76f-tokenizer-cache\") pod \"e46fb856-45bc-4f5b-b210-e8781693a76f\" (UID: \"e46fb856-45bc-4f5b-b210-e8781693a76f\") " Apr 16 22:28:04.327871 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:04.327650 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pdtf\" (UniqueName: \"kubernetes.io/projected/e46fb856-45bc-4f5b-b210-e8781693a76f-kube-api-access-2pdtf\") pod \"e46fb856-45bc-4f5b-b210-e8781693a76f\" (UID: \"e46fb856-45bc-4f5b-b210-e8781693a76f\") " Apr 16 22:28:04.327871 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:04.327758 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e46fb856-45bc-4f5b-b210-e8781693a76f-kserve-provision-location\") pod \"e46fb856-45bc-4f5b-b210-e8781693a76f\" (UID: \"e46fb856-45bc-4f5b-b210-e8781693a76f\") " Apr 16 22:28:04.327871 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:04.327828 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e46fb856-45bc-4f5b-b210-e8781693a76f-tokenizer-tmp\") pod \"e46fb856-45bc-4f5b-b210-e8781693a76f\" (UID: 
\"e46fb856-45bc-4f5b-b210-e8781693a76f\") " Apr 16 22:28:04.327871 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:04.327859 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e46fb856-45bc-4f5b-b210-e8781693a76f-tls-certs\") pod \"e46fb856-45bc-4f5b-b210-e8781693a76f\" (UID: \"e46fb856-45bc-4f5b-b210-e8781693a76f\") " Apr 16 22:28:04.328041 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:04.327938 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e46fb856-45bc-4f5b-b210-e8781693a76f-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "e46fb856-45bc-4f5b-b210-e8781693a76f" (UID: "e46fb856-45bc-4f5b-b210-e8781693a76f"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:28:04.328041 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:04.327950 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e46fb856-45bc-4f5b-b210-e8781693a76f-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "e46fb856-45bc-4f5b-b210-e8781693a76f" (UID: "e46fb856-45bc-4f5b-b210-e8781693a76f"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:28:04.328154 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:04.328120 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e46fb856-45bc-4f5b-b210-e8781693a76f-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "e46fb856-45bc-4f5b-b210-e8781693a76f" (UID: "e46fb856-45bc-4f5b-b210-e8781693a76f"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:28:04.328849 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:04.328690 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e46fb856-45bc-4f5b-b210-e8781693a76f-tokenizer-tmp\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:28:04.328849 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:04.328717 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e46fb856-45bc-4f5b-b210-e8781693a76f-tokenizer-uds\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:28:04.328849 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:04.328732 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e46fb856-45bc-4f5b-b210-e8781693a76f-tokenizer-cache\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:28:04.328849 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:04.328734 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e46fb856-45bc-4f5b-b210-e8781693a76f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e46fb856-45bc-4f5b-b210-e8781693a76f" (UID: "e46fb856-45bc-4f5b-b210-e8781693a76f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:28:04.330112 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:04.330082 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e46fb856-45bc-4f5b-b210-e8781693a76f-kube-api-access-2pdtf" (OuterVolumeSpecName: "kube-api-access-2pdtf") pod "e46fb856-45bc-4f5b-b210-e8781693a76f" (UID: "e46fb856-45bc-4f5b-b210-e8781693a76f"). InnerVolumeSpecName "kube-api-access-2pdtf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:28:04.330202 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:04.330174 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e46fb856-45bc-4f5b-b210-e8781693a76f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e46fb856-45bc-4f5b-b210-e8781693a76f" (UID: "e46fb856-45bc-4f5b-b210-e8781693a76f"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:28:04.429977 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:04.429942 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2pdtf\" (UniqueName: \"kubernetes.io/projected/e46fb856-45bc-4f5b-b210-e8781693a76f-kube-api-access-2pdtf\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:28:04.429977 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:04.429971 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e46fb856-45bc-4f5b-b210-e8781693a76f-kserve-provision-location\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:28:04.429977 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:04.429983 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e46fb856-45bc-4f5b-b210-e8781693a76f-tls-certs\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:28:05.084613 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:05.084576 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb" event={"ID":"e46fb856-45bc-4f5b-b210-e8781693a76f","Type":"ContainerDied","Data":"db2a14843dbb1920b8055d77616b259de2ce8e105166907788f1f79e655ea4aa"} Apr 16 22:28:05.084613 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:05.084605 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb" Apr 16 22:28:05.085147 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:05.084627 2574 scope.go:117] "RemoveContainer" containerID="507365b45649440227733e499580cc48c347f8ce23b746c00676f41c6acdf0f9" Apr 16 22:28:05.093312 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:05.093294 2574 scope.go:117] "RemoveContainer" containerID="bfc33dde096c7141c09ef60381f54d109d16dc77803b92aac05c8187e132417e" Apr 16 22:28:05.100538 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:05.100517 2574 scope.go:117] "RemoveContainer" containerID="6f50c5ba6fc95982213e4685da5380ee62c0e094a26c04ef7525cd07634bef34" Apr 16 22:28:05.106030 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:05.106007 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb"] Apr 16 22:28:05.109949 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:05.109924 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7d9875f2d5cb"] Apr 16 22:28:06.263398 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:06.263363 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e46fb856-45bc-4f5b-b210-e8781693a76f" path="/var/lib/kubelet/pods/e46fb856-45bc-4f5b-b210-e8781693a76f/volumes" Apr 16 22:28:12.036973 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:12.036940 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78" Apr 16 22:28:30.185222 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:30.185133 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v4dxt_dba969ac-c53c-4354-ab6e-cc853b8c1449/console-operator/1.log" Apr 16 22:28:30.187939 ip-10-0-133-72 kubenswrapper[2574]: I0416 
22:28:30.187914 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v4dxt_dba969ac-c53c-4354-ab6e-cc853b8c1449/console-operator/1.log" Apr 16 22:28:30.190894 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:30.190870 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hm4t5_cdab8cce-9b55-478d-b1b5-740aa9746143/ovn-acl-logging/0.log" Apr 16 22:28:30.193840 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:28:30.193822 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hm4t5_cdab8cce-9b55-478d-b1b5-740aa9746143/ovn-acl-logging/0.log" Apr 16 22:30:28.827144 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:28.827111 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78"] Apr 16 22:30:28.827688 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:28.827511 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78" podUID="fb49e6c1-56da-4374-91a6-902efe5f8464" containerName="main" containerID="cri-o://fc81122a6e3f2b7ce3b5f766fc924a21d612a9b794e02c0b24c55ffdc90ff020" gracePeriod=30 Apr 16 22:30:28.827688 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:28.827608 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78" podUID="fb49e6c1-56da-4374-91a6-902efe5f8464" containerName="tokenizer" containerID="cri-o://9936d4a2e8d1f513e3bbb936426ae6738f6b2e9ab9b5af4ea58998adab1964af" gracePeriod=30 Apr 16 22:30:29.554458 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:29.554426 2574 generic.go:358] "Generic (PLEG): container finished" podID="fb49e6c1-56da-4374-91a6-902efe5f8464" 
containerID="fc81122a6e3f2b7ce3b5f766fc924a21d612a9b794e02c0b24c55ffdc90ff020" exitCode=0 Apr 16 22:30:29.554628 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:29.554510 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78" event={"ID":"fb49e6c1-56da-4374-91a6-902efe5f8464","Type":"ContainerDied","Data":"fc81122a6e3f2b7ce3b5f766fc924a21d612a9b794e02c0b24c55ffdc90ff020"} Apr 16 22:30:30.082568 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:30.082502 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78" Apr 16 22:30:30.162388 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:30.162358 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8f2k9\" (UniqueName: \"kubernetes.io/projected/fb49e6c1-56da-4374-91a6-902efe5f8464-kube-api-access-8f2k9\") pod \"fb49e6c1-56da-4374-91a6-902efe5f8464\" (UID: \"fb49e6c1-56da-4374-91a6-902efe5f8464\") " Apr 16 22:30:30.162556 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:30.162400 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fb49e6c1-56da-4374-91a6-902efe5f8464-tokenizer-cache\") pod \"fb49e6c1-56da-4374-91a6-902efe5f8464\" (UID: \"fb49e6c1-56da-4374-91a6-902efe5f8464\") " Apr 16 22:30:30.162556 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:30.162471 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fb49e6c1-56da-4374-91a6-902efe5f8464-tokenizer-tmp\") pod \"fb49e6c1-56da-4374-91a6-902efe5f8464\" (UID: \"fb49e6c1-56da-4374-91a6-902efe5f8464\") " Apr 16 22:30:30.162556 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:30.162499 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started 
for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fb49e6c1-56da-4374-91a6-902efe5f8464-tls-certs\") pod \"fb49e6c1-56da-4374-91a6-902efe5f8464\" (UID: \"fb49e6c1-56da-4374-91a6-902efe5f8464\") " Apr 16 22:30:30.162556 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:30.162542 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb49e6c1-56da-4374-91a6-902efe5f8464-kserve-provision-location\") pod \"fb49e6c1-56da-4374-91a6-902efe5f8464\" (UID: \"fb49e6c1-56da-4374-91a6-902efe5f8464\") " Apr 16 22:30:30.162789 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:30.162571 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fb49e6c1-56da-4374-91a6-902efe5f8464-tokenizer-uds\") pod \"fb49e6c1-56da-4374-91a6-902efe5f8464\" (UID: \"fb49e6c1-56da-4374-91a6-902efe5f8464\") " Apr 16 22:30:30.162789 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:30.162731 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb49e6c1-56da-4374-91a6-902efe5f8464-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "fb49e6c1-56da-4374-91a6-902efe5f8464" (UID: "fb49e6c1-56da-4374-91a6-902efe5f8464"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:30:30.162934 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:30.162838 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb49e6c1-56da-4374-91a6-902efe5f8464-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "fb49e6c1-56da-4374-91a6-902efe5f8464" (UID: "fb49e6c1-56da-4374-91a6-902efe5f8464"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:30:30.162934 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:30.162885 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fb49e6c1-56da-4374-91a6-902efe5f8464-tokenizer-cache\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:30:30.162934 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:30.162916 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb49e6c1-56da-4374-91a6-902efe5f8464-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "fb49e6c1-56da-4374-91a6-902efe5f8464" (UID: "fb49e6c1-56da-4374-91a6-902efe5f8464"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:30:30.163239 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:30.163218 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb49e6c1-56da-4374-91a6-902efe5f8464-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fb49e6c1-56da-4374-91a6-902efe5f8464" (UID: "fb49e6c1-56da-4374-91a6-902efe5f8464"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:30:30.164503 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:30.164472 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb49e6c1-56da-4374-91a6-902efe5f8464-kube-api-access-8f2k9" (OuterVolumeSpecName: "kube-api-access-8f2k9") pod "fb49e6c1-56da-4374-91a6-902efe5f8464" (UID: "fb49e6c1-56da-4374-91a6-902efe5f8464"). InnerVolumeSpecName "kube-api-access-8f2k9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:30:30.164603 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:30.164511 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb49e6c1-56da-4374-91a6-902efe5f8464-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "fb49e6c1-56da-4374-91a6-902efe5f8464" (UID: "fb49e6c1-56da-4374-91a6-902efe5f8464"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:30:30.263685 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:30.263658 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fb49e6c1-56da-4374-91a6-902efe5f8464-tokenizer-tmp\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:30:30.263863 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:30.263781 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fb49e6c1-56da-4374-91a6-902efe5f8464-tls-certs\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:30:30.263863 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:30.263805 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb49e6c1-56da-4374-91a6-902efe5f8464-kserve-provision-location\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:30:30.263863 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:30.263819 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fb49e6c1-56da-4374-91a6-902efe5f8464-tokenizer-uds\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:30:30.263863 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:30.263834 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8f2k9\" (UniqueName: 
\"kubernetes.io/projected/fb49e6c1-56da-4374-91a6-902efe5f8464-kube-api-access-8f2k9\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:30:30.559754 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:30.559707 2574 generic.go:358] "Generic (PLEG): container finished" podID="fb49e6c1-56da-4374-91a6-902efe5f8464" containerID="9936d4a2e8d1f513e3bbb936426ae6738f6b2e9ab9b5af4ea58998adab1964af" exitCode=0 Apr 16 22:30:30.559938 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:30.559775 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78" event={"ID":"fb49e6c1-56da-4374-91a6-902efe5f8464","Type":"ContainerDied","Data":"9936d4a2e8d1f513e3bbb936426ae6738f6b2e9ab9b5af4ea58998adab1964af"} Apr 16 22:30:30.559938 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:30.559807 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78" event={"ID":"fb49e6c1-56da-4374-91a6-902efe5f8464","Type":"ContainerDied","Data":"ce9bda80b52fdbc3dd601b8457d45cddb26d18738ab8ef9150408b8017e92fcc"} Apr 16 22:30:30.559938 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:30.559817 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78" Apr 16 22:30:30.559938 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:30.559826 2574 scope.go:117] "RemoveContainer" containerID="9936d4a2e8d1f513e3bbb936426ae6738f6b2e9ab9b5af4ea58998adab1964af" Apr 16 22:30:30.567781 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:30.567759 2574 scope.go:117] "RemoveContainer" containerID="fc81122a6e3f2b7ce3b5f766fc924a21d612a9b794e02c0b24c55ffdc90ff020" Apr 16 22:30:30.575383 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:30.575367 2574 scope.go:117] "RemoveContainer" containerID="f1bee2cd9e47fd20dd7bdda028a813355948a283d6f7b8ff0860e289ec8375de" Apr 16 22:30:30.576454 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:30.576435 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78"] Apr 16 22:30:30.581764 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:30.581730 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78"] Apr 16 22:30:30.583381 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:30.583365 2574 scope.go:117] "RemoveContainer" containerID="9936d4a2e8d1f513e3bbb936426ae6738f6b2e9ab9b5af4ea58998adab1964af" Apr 16 22:30:30.583604 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:30:30.583586 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9936d4a2e8d1f513e3bbb936426ae6738f6b2e9ab9b5af4ea58998adab1964af\": container with ID starting with 9936d4a2e8d1f513e3bbb936426ae6738f6b2e9ab9b5af4ea58998adab1964af not found: ID does not exist" containerID="9936d4a2e8d1f513e3bbb936426ae6738f6b2e9ab9b5af4ea58998adab1964af" Apr 16 22:30:30.583672 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:30.583617 2574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9936d4a2e8d1f513e3bbb936426ae6738f6b2e9ab9b5af4ea58998adab1964af"} err="failed to get container status \"9936d4a2e8d1f513e3bbb936426ae6738f6b2e9ab9b5af4ea58998adab1964af\": rpc error: code = NotFound desc = could not find container \"9936d4a2e8d1f513e3bbb936426ae6738f6b2e9ab9b5af4ea58998adab1964af\": container with ID starting with 9936d4a2e8d1f513e3bbb936426ae6738f6b2e9ab9b5af4ea58998adab1964af not found: ID does not exist" Apr 16 22:30:30.583672 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:30.583639 2574 scope.go:117] "RemoveContainer" containerID="fc81122a6e3f2b7ce3b5f766fc924a21d612a9b794e02c0b24c55ffdc90ff020" Apr 16 22:30:30.583877 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:30:30.583861 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc81122a6e3f2b7ce3b5f766fc924a21d612a9b794e02c0b24c55ffdc90ff020\": container with ID starting with fc81122a6e3f2b7ce3b5f766fc924a21d612a9b794e02c0b24c55ffdc90ff020 not found: ID does not exist" containerID="fc81122a6e3f2b7ce3b5f766fc924a21d612a9b794e02c0b24c55ffdc90ff020" Apr 16 22:30:30.583927 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:30.583882 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc81122a6e3f2b7ce3b5f766fc924a21d612a9b794e02c0b24c55ffdc90ff020"} err="failed to get container status \"fc81122a6e3f2b7ce3b5f766fc924a21d612a9b794e02c0b24c55ffdc90ff020\": rpc error: code = NotFound desc = could not find container \"fc81122a6e3f2b7ce3b5f766fc924a21d612a9b794e02c0b24c55ffdc90ff020\": container with ID starting with fc81122a6e3f2b7ce3b5f766fc924a21d612a9b794e02c0b24c55ffdc90ff020 not found: ID does not exist" Apr 16 22:30:30.583927 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:30.583897 2574 scope.go:117] "RemoveContainer" containerID="f1bee2cd9e47fd20dd7bdda028a813355948a283d6f7b8ff0860e289ec8375de" Apr 16 22:30:30.584092 ip-10-0-133-72 
kubenswrapper[2574]: E0416 22:30:30.584073 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1bee2cd9e47fd20dd7bdda028a813355948a283d6f7b8ff0860e289ec8375de\": container with ID starting with f1bee2cd9e47fd20dd7bdda028a813355948a283d6f7b8ff0860e289ec8375de not found: ID does not exist" containerID="f1bee2cd9e47fd20dd7bdda028a813355948a283d6f7b8ff0860e289ec8375de" Apr 16 22:30:30.584144 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:30.584103 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1bee2cd9e47fd20dd7bdda028a813355948a283d6f7b8ff0860e289ec8375de"} err="failed to get container status \"f1bee2cd9e47fd20dd7bdda028a813355948a283d6f7b8ff0860e289ec8375de\": rpc error: code = NotFound desc = could not find container \"f1bee2cd9e47fd20dd7bdda028a813355948a283d6f7b8ff0860e289ec8375de\": container with ID starting with f1bee2cd9e47fd20dd7bdda028a813355948a283d6f7b8ff0860e289ec8375de not found: ID does not exist" Apr 16 22:30:32.263061 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:32.263028 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb49e6c1-56da-4374-91a6-902efe5f8464" path="/var/lib/kubelet/pods/fb49e6c1-56da-4374-91a6-902efe5f8464/volumes" Apr 16 22:30:35.030007 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:35.029958 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-scheghz78" podUID="fb49e6c1-56da-4374-91a6-902efe5f8464" containerName="tokenizer" probeResult="failure" output="Get \"http://10.133.0.35:8082/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 16 22:30:37.611932 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.611901 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg"] Apr 16 
22:30:37.612299 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.612244 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb49e6c1-56da-4374-91a6-902efe5f8464" containerName="storage-initializer" Apr 16 22:30:37.612299 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.612257 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb49e6c1-56da-4374-91a6-902efe5f8464" containerName="storage-initializer" Apr 16 22:30:37.612299 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.612270 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e46fb856-45bc-4f5b-b210-e8781693a76f" containerName="storage-initializer" Apr 16 22:30:37.612299 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.612275 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e46fb856-45bc-4f5b-b210-e8781693a76f" containerName="storage-initializer" Apr 16 22:30:37.612299 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.612287 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb49e6c1-56da-4374-91a6-902efe5f8464" containerName="tokenizer" Apr 16 22:30:37.612299 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.612293 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb49e6c1-56da-4374-91a6-902efe5f8464" containerName="tokenizer" Apr 16 22:30:37.612299 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.612300 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb49e6c1-56da-4374-91a6-902efe5f8464" containerName="main" Apr 16 22:30:37.612584 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.612304 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb49e6c1-56da-4374-91a6-902efe5f8464" containerName="main" Apr 16 22:30:37.612584 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.612320 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e46fb856-45bc-4f5b-b210-e8781693a76f" containerName="main" Apr 16 
22:30:37.612584 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.612326 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e46fb856-45bc-4f5b-b210-e8781693a76f" containerName="main" Apr 16 22:30:37.612584 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.612332 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e46fb856-45bc-4f5b-b210-e8781693a76f" containerName="tokenizer" Apr 16 22:30:37.612584 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.612337 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e46fb856-45bc-4f5b-b210-e8781693a76f" containerName="tokenizer" Apr 16 22:30:37.612584 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.612381 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb49e6c1-56da-4374-91a6-902efe5f8464" containerName="main" Apr 16 22:30:37.612584 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.612412 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="e46fb856-45bc-4f5b-b210-e8781693a76f" containerName="tokenizer" Apr 16 22:30:37.612584 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.612421 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="e46fb856-45bc-4f5b-b210-e8781693a76f" containerName="main" Apr 16 22:30:37.612584 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.612428 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb49e6c1-56da-4374-91a6-902efe5f8464" containerName="tokenizer" Apr 16 22:30:37.617252 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.617230 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" Apr 16 22:30:37.619796 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.619724 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-pt4qs\"" Apr 16 22:30:37.620506 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.620488 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 16 22:30:37.620585 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.620489 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-epp-sa-dockercfg-sr5xf\"" Apr 16 22:30:37.626363 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.626340 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg"] Apr 16 22:30:37.713756 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.713702 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/bc187f15-4d3a-4811-8a50-4f4be47cae9f-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg\" (UID: \"bc187f15-4d3a-4811-8a50-4f4be47cae9f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" Apr 16 22:30:37.713949 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.713802 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc187f15-4d3a-4811-8a50-4f4be47cae9f-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg\" (UID: \"bc187f15-4d3a-4811-8a50-4f4be47cae9f\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" Apr 16 22:30:37.713949 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.713834 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bc187f15-4d3a-4811-8a50-4f4be47cae9f-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg\" (UID: \"bc187f15-4d3a-4811-8a50-4f4be47cae9f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" Apr 16 22:30:37.713949 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.713878 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/bc187f15-4d3a-4811-8a50-4f4be47cae9f-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg\" (UID: \"bc187f15-4d3a-4811-8a50-4f4be47cae9f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" Apr 16 22:30:37.713949 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.713899 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/bc187f15-4d3a-4811-8a50-4f4be47cae9f-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg\" (UID: \"bc187f15-4d3a-4811-8a50-4f4be47cae9f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" Apr 16 22:30:37.713949 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.713914 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfpxf\" (UniqueName: \"kubernetes.io/projected/bc187f15-4d3a-4811-8a50-4f4be47cae9f-kube-api-access-tfpxf\") pod \"custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg\" (UID: 
\"bc187f15-4d3a-4811-8a50-4f4be47cae9f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" Apr 16 22:30:37.814881 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.814845 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/bc187f15-4d3a-4811-8a50-4f4be47cae9f-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg\" (UID: \"bc187f15-4d3a-4811-8a50-4f4be47cae9f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" Apr 16 22:30:37.815030 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.814890 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc187f15-4d3a-4811-8a50-4f4be47cae9f-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg\" (UID: \"bc187f15-4d3a-4811-8a50-4f4be47cae9f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" Apr 16 22:30:37.815030 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.815007 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bc187f15-4d3a-4811-8a50-4f4be47cae9f-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg\" (UID: \"bc187f15-4d3a-4811-8a50-4f4be47cae9f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" Apr 16 22:30:37.815116 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.815093 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/bc187f15-4d3a-4811-8a50-4f4be47cae9f-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg\" (UID: \"bc187f15-4d3a-4811-8a50-4f4be47cae9f\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" Apr 16 22:30:37.815154 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.815128 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/bc187f15-4d3a-4811-8a50-4f4be47cae9f-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg\" (UID: \"bc187f15-4d3a-4811-8a50-4f4be47cae9f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" Apr 16 22:30:37.815193 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.815153 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tfpxf\" (UniqueName: \"kubernetes.io/projected/bc187f15-4d3a-4811-8a50-4f4be47cae9f-kube-api-access-tfpxf\") pod \"custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg\" (UID: \"bc187f15-4d3a-4811-8a50-4f4be47cae9f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" Apr 16 22:30:37.815245 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.815211 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc187f15-4d3a-4811-8a50-4f4be47cae9f-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg\" (UID: \"bc187f15-4d3a-4811-8a50-4f4be47cae9f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" Apr 16 22:30:37.815298 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.815258 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/bc187f15-4d3a-4811-8a50-4f4be47cae9f-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg\" (UID: \"bc187f15-4d3a-4811-8a50-4f4be47cae9f\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" Apr 16 22:30:37.815392 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.815373 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/bc187f15-4d3a-4811-8a50-4f4be47cae9f-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg\" (UID: \"bc187f15-4d3a-4811-8a50-4f4be47cae9f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" Apr 16 22:30:37.815454 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.815412 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/bc187f15-4d3a-4811-8a50-4f4be47cae9f-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg\" (UID: \"bc187f15-4d3a-4811-8a50-4f4be47cae9f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" Apr 16 22:30:37.817647 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.817622 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bc187f15-4d3a-4811-8a50-4f4be47cae9f-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg\" (UID: \"bc187f15-4d3a-4811-8a50-4f4be47cae9f\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" Apr 16 22:30:37.823226 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.823201 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfpxf\" (UniqueName: \"kubernetes.io/projected/bc187f15-4d3a-4811-8a50-4f4be47cae9f-kube-api-access-tfpxf\") pod \"custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg\" (UID: \"bc187f15-4d3a-4811-8a50-4f4be47cae9f\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" Apr 16 22:30:37.927729 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:37.927651 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" Apr 16 22:30:38.058526 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:38.058502 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg"] Apr 16 22:30:38.061230 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:30:38.061198 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc187f15_4d3a_4811_8a50_4f4be47cae9f.slice/crio-1303884be6c13eea943cab9ad349ef89bb0feee99a979c758ee9ea877161c3e3 WatchSource:0}: Error finding container 1303884be6c13eea943cab9ad349ef89bb0feee99a979c758ee9ea877161c3e3: Status 404 returned error can't find the container with id 1303884be6c13eea943cab9ad349ef89bb0feee99a979c758ee9ea877161c3e3 Apr 16 22:30:38.591168 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:38.591134 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" event={"ID":"bc187f15-4d3a-4811-8a50-4f4be47cae9f","Type":"ContainerStarted","Data":"16fbe57ea80a1b63b2d5aa32eb3455264dcdaab52bab2024ab026d16d00f997b"} Apr 16 22:30:38.591168 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:38.591171 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" event={"ID":"bc187f15-4d3a-4811-8a50-4f4be47cae9f","Type":"ContainerStarted","Data":"1303884be6c13eea943cab9ad349ef89bb0feee99a979c758ee9ea877161c3e3"} Apr 16 22:30:39.596475 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:39.596444 2574 generic.go:358] "Generic (PLEG): 
container finished" podID="bc187f15-4d3a-4811-8a50-4f4be47cae9f" containerID="16fbe57ea80a1b63b2d5aa32eb3455264dcdaab52bab2024ab026d16d00f997b" exitCode=0 Apr 16 22:30:39.596884 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:39.596514 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" event={"ID":"bc187f15-4d3a-4811-8a50-4f4be47cae9f","Type":"ContainerDied","Data":"16fbe57ea80a1b63b2d5aa32eb3455264dcdaab52bab2024ab026d16d00f997b"} Apr 16 22:30:40.602423 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:40.602390 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" event={"ID":"bc187f15-4d3a-4811-8a50-4f4be47cae9f","Type":"ContainerStarted","Data":"945f2923797e9f8c7a3383f9a73bbcc1192cf233b6053b78e30e4a43a635c6c1"} Apr 16 22:30:40.602423 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:40.602427 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" event={"ID":"bc187f15-4d3a-4811-8a50-4f4be47cae9f","Type":"ContainerStarted","Data":"b1d500b9efe59633b2b51a41135013ebacd96378fbec8517bd5358a8a21100ab"} Apr 16 22:30:40.602877 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:40.602508 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" Apr 16 22:30:40.621602 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:40.621544 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" podStartSLOduration=3.621528344 podStartE2EDuration="3.621528344s" podCreationTimestamp="2026-04-16 22:30:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-16 22:30:40.620581168 +0000 UTC m=+1030.988987339" watchObservedRunningTime="2026-04-16 22:30:40.621528344 +0000 UTC m=+1030.989934495" Apr 16 22:30:47.928605 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:47.928563 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" Apr 16 22:30:47.928605 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:47.928607 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" Apr 16 22:30:47.931186 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:47.931162 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" Apr 16 22:30:48.632265 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:30:48.632236 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" Apr 16 22:31:09.635307 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:31:09.635278 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" Apr 16 22:32:23.693145 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:23.693109 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg"] Apr 16 22:32:23.693833 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:23.693442 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" podUID="bc187f15-4d3a-4811-8a50-4f4be47cae9f" containerName="tokenizer" 
containerID="cri-o://945f2923797e9f8c7a3383f9a73bbcc1192cf233b6053b78e30e4a43a635c6c1" gracePeriod=30 Apr 16 22:32:23.694428 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:23.693892 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" podUID="bc187f15-4d3a-4811-8a50-4f4be47cae9f" containerName="main" containerID="cri-o://b1d500b9efe59633b2b51a41135013ebacd96378fbec8517bd5358a8a21100ab" gracePeriod=30 Apr 16 22:32:23.954079 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:23.953988 2574 generic.go:358] "Generic (PLEG): container finished" podID="bc187f15-4d3a-4811-8a50-4f4be47cae9f" containerID="b1d500b9efe59633b2b51a41135013ebacd96378fbec8517bd5358a8a21100ab" exitCode=0 Apr 16 22:32:23.954079 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:23.954057 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" event={"ID":"bc187f15-4d3a-4811-8a50-4f4be47cae9f","Type":"ContainerDied","Data":"b1d500b9efe59633b2b51a41135013ebacd96378fbec8517bd5358a8a21100ab"} Apr 16 22:32:24.951735 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:24.951716 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" Apr 16 22:32:24.958639 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:24.958611 2574 generic.go:358] "Generic (PLEG): container finished" podID="bc187f15-4d3a-4811-8a50-4f4be47cae9f" containerID="945f2923797e9f8c7a3383f9a73bbcc1192cf233b6053b78e30e4a43a635c6c1" exitCode=0 Apr 16 22:32:24.958762 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:24.958685 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" Apr 16 22:32:24.958762 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:24.958696 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" event={"ID":"bc187f15-4d3a-4811-8a50-4f4be47cae9f","Type":"ContainerDied","Data":"945f2923797e9f8c7a3383f9a73bbcc1192cf233b6053b78e30e4a43a635c6c1"} Apr 16 22:32:24.958762 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:24.958736 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg" event={"ID":"bc187f15-4d3a-4811-8a50-4f4be47cae9f","Type":"ContainerDied","Data":"1303884be6c13eea943cab9ad349ef89bb0feee99a979c758ee9ea877161c3e3"} Apr 16 22:32:24.958867 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:24.958764 2574 scope.go:117] "RemoveContainer" containerID="945f2923797e9f8c7a3383f9a73bbcc1192cf233b6053b78e30e4a43a635c6c1" Apr 16 22:32:24.966306 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:24.966285 2574 scope.go:117] "RemoveContainer" containerID="b1d500b9efe59633b2b51a41135013ebacd96378fbec8517bd5358a8a21100ab" Apr 16 22:32:24.975104 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:24.975086 2574 scope.go:117] "RemoveContainer" containerID="16fbe57ea80a1b63b2d5aa32eb3455264dcdaab52bab2024ab026d16d00f997b" Apr 16 22:32:24.983293 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:24.983280 2574 scope.go:117] "RemoveContainer" containerID="945f2923797e9f8c7a3383f9a73bbcc1192cf233b6053b78e30e4a43a635c6c1" Apr 16 22:32:24.983553 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:32:24.983534 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"945f2923797e9f8c7a3383f9a73bbcc1192cf233b6053b78e30e4a43a635c6c1\": container with ID starting with 
945f2923797e9f8c7a3383f9a73bbcc1192cf233b6053b78e30e4a43a635c6c1 not found: ID does not exist" containerID="945f2923797e9f8c7a3383f9a73bbcc1192cf233b6053b78e30e4a43a635c6c1" Apr 16 22:32:24.983632 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:24.983562 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"945f2923797e9f8c7a3383f9a73bbcc1192cf233b6053b78e30e4a43a635c6c1"} err="failed to get container status \"945f2923797e9f8c7a3383f9a73bbcc1192cf233b6053b78e30e4a43a635c6c1\": rpc error: code = NotFound desc = could not find container \"945f2923797e9f8c7a3383f9a73bbcc1192cf233b6053b78e30e4a43a635c6c1\": container with ID starting with 945f2923797e9f8c7a3383f9a73bbcc1192cf233b6053b78e30e4a43a635c6c1 not found: ID does not exist" Apr 16 22:32:24.983632 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:24.983581 2574 scope.go:117] "RemoveContainer" containerID="b1d500b9efe59633b2b51a41135013ebacd96378fbec8517bd5358a8a21100ab" Apr 16 22:32:24.983836 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:32:24.983813 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1d500b9efe59633b2b51a41135013ebacd96378fbec8517bd5358a8a21100ab\": container with ID starting with b1d500b9efe59633b2b51a41135013ebacd96378fbec8517bd5358a8a21100ab not found: ID does not exist" containerID="b1d500b9efe59633b2b51a41135013ebacd96378fbec8517bd5358a8a21100ab" Apr 16 22:32:24.983932 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:24.983841 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1d500b9efe59633b2b51a41135013ebacd96378fbec8517bd5358a8a21100ab"} err="failed to get container status \"b1d500b9efe59633b2b51a41135013ebacd96378fbec8517bd5358a8a21100ab\": rpc error: code = NotFound desc = could not find container \"b1d500b9efe59633b2b51a41135013ebacd96378fbec8517bd5358a8a21100ab\": container with ID starting with 
b1d500b9efe59633b2b51a41135013ebacd96378fbec8517bd5358a8a21100ab not found: ID does not exist" Apr 16 22:32:24.983932 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:24.983855 2574 scope.go:117] "RemoveContainer" containerID="16fbe57ea80a1b63b2d5aa32eb3455264dcdaab52bab2024ab026d16d00f997b" Apr 16 22:32:24.984102 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:32:24.984077 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16fbe57ea80a1b63b2d5aa32eb3455264dcdaab52bab2024ab026d16d00f997b\": container with ID starting with 16fbe57ea80a1b63b2d5aa32eb3455264dcdaab52bab2024ab026d16d00f997b not found: ID does not exist" containerID="16fbe57ea80a1b63b2d5aa32eb3455264dcdaab52bab2024ab026d16d00f997b" Apr 16 22:32:24.984167 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:24.984106 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16fbe57ea80a1b63b2d5aa32eb3455264dcdaab52bab2024ab026d16d00f997b"} err="failed to get container status \"16fbe57ea80a1b63b2d5aa32eb3455264dcdaab52bab2024ab026d16d00f997b\": rpc error: code = NotFound desc = could not find container \"16fbe57ea80a1b63b2d5aa32eb3455264dcdaab52bab2024ab026d16d00f997b\": container with ID starting with 16fbe57ea80a1b63b2d5aa32eb3455264dcdaab52bab2024ab026d16d00f997b not found: ID does not exist" Apr 16 22:32:25.018629 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:25.018603 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfpxf\" (UniqueName: \"kubernetes.io/projected/bc187f15-4d3a-4811-8a50-4f4be47cae9f-kube-api-access-tfpxf\") pod \"bc187f15-4d3a-4811-8a50-4f4be47cae9f\" (UID: \"bc187f15-4d3a-4811-8a50-4f4be47cae9f\") " Apr 16 22:32:25.018717 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:25.018658 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/bc187f15-4d3a-4811-8a50-4f4be47cae9f-tokenizer-uds\") pod \"bc187f15-4d3a-4811-8a50-4f4be47cae9f\" (UID: \"bc187f15-4d3a-4811-8a50-4f4be47cae9f\") " Apr 16 22:32:25.018717 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:25.018681 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/bc187f15-4d3a-4811-8a50-4f4be47cae9f-tokenizer-tmp\") pod \"bc187f15-4d3a-4811-8a50-4f4be47cae9f\" (UID: \"bc187f15-4d3a-4811-8a50-4f4be47cae9f\") " Apr 16 22:32:25.018847 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:25.018729 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc187f15-4d3a-4811-8a50-4f4be47cae9f-kserve-provision-location\") pod \"bc187f15-4d3a-4811-8a50-4f4be47cae9f\" (UID: \"bc187f15-4d3a-4811-8a50-4f4be47cae9f\") " Apr 16 22:32:25.018847 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:25.018813 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bc187f15-4d3a-4811-8a50-4f4be47cae9f-tls-certs\") pod \"bc187f15-4d3a-4811-8a50-4f4be47cae9f\" (UID: \"bc187f15-4d3a-4811-8a50-4f4be47cae9f\") " Apr 16 22:32:25.018961 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:25.018841 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/bc187f15-4d3a-4811-8a50-4f4be47cae9f-tokenizer-cache\") pod \"bc187f15-4d3a-4811-8a50-4f4be47cae9f\" (UID: \"bc187f15-4d3a-4811-8a50-4f4be47cae9f\") " Apr 16 22:32:25.019047 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:25.019003 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc187f15-4d3a-4811-8a50-4f4be47cae9f-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "bc187f15-4d3a-4811-8a50-4f4be47cae9f" (UID: 
"bc187f15-4d3a-4811-8a50-4f4be47cae9f"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:32:25.019159 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:25.019107 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc187f15-4d3a-4811-8a50-4f4be47cae9f-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "bc187f15-4d3a-4811-8a50-4f4be47cae9f" (UID: "bc187f15-4d3a-4811-8a50-4f4be47cae9f"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:32:25.019208 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:25.019158 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc187f15-4d3a-4811-8a50-4f4be47cae9f-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "bc187f15-4d3a-4811-8a50-4f4be47cae9f" (UID: "bc187f15-4d3a-4811-8a50-4f4be47cae9f"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:32:25.019514 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:25.019494 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc187f15-4d3a-4811-8a50-4f4be47cae9f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bc187f15-4d3a-4811-8a50-4f4be47cae9f" (UID: "bc187f15-4d3a-4811-8a50-4f4be47cae9f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:32:25.020700 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:25.020674 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc187f15-4d3a-4811-8a50-4f4be47cae9f-kube-api-access-tfpxf" (OuterVolumeSpecName: "kube-api-access-tfpxf") pod "bc187f15-4d3a-4811-8a50-4f4be47cae9f" (UID: "bc187f15-4d3a-4811-8a50-4f4be47cae9f"). InnerVolumeSpecName "kube-api-access-tfpxf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:32:25.020807 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:25.020781 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc187f15-4d3a-4811-8a50-4f4be47cae9f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "bc187f15-4d3a-4811-8a50-4f4be47cae9f" (UID: "bc187f15-4d3a-4811-8a50-4f4be47cae9f"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:32:25.119868 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:25.119835 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/bc187f15-4d3a-4811-8a50-4f4be47cae9f-tokenizer-cache\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:32:25.119868 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:25.119864 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tfpxf\" (UniqueName: \"kubernetes.io/projected/bc187f15-4d3a-4811-8a50-4f4be47cae9f-kube-api-access-tfpxf\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:32:25.119868 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:25.119876 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/bc187f15-4d3a-4811-8a50-4f4be47cae9f-tokenizer-uds\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:32:25.120064 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:25.119885 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/bc187f15-4d3a-4811-8a50-4f4be47cae9f-tokenizer-tmp\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:32:25.120064 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:25.119894 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/bc187f15-4d3a-4811-8a50-4f4be47cae9f-kserve-provision-location\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:32:25.120064 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:25.119904 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bc187f15-4d3a-4811-8a50-4f4be47cae9f-tls-certs\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:32:25.280129 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:25.280101 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg"] Apr 16 22:32:25.284302 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:25.284279 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-56bddd49bxnwg"] Apr 16 22:32:26.262841 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:26.262811 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc187f15-4d3a-4811-8a50-4f4be47cae9f" path="/var/lib/kubelet/pods/bc187f15-4d3a-4811-8a50-4f4be47cae9f/volumes" Apr 16 22:32:35.954971 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:35.954934 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t"] Apr 16 22:32:35.955480 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:35.955436 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc187f15-4d3a-4811-8a50-4f4be47cae9f" containerName="storage-initializer" Apr 16 22:32:35.955480 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:35.955463 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc187f15-4d3a-4811-8a50-4f4be47cae9f" containerName="storage-initializer" Apr 16 22:32:35.955600 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:35.955502 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="bc187f15-4d3a-4811-8a50-4f4be47cae9f" containerName="tokenizer" Apr 16 22:32:35.955600 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:35.955512 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc187f15-4d3a-4811-8a50-4f4be47cae9f" containerName="tokenizer" Apr 16 22:32:35.955600 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:35.955528 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc187f15-4d3a-4811-8a50-4f4be47cae9f" containerName="main" Apr 16 22:32:35.955600 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:35.955540 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc187f15-4d3a-4811-8a50-4f4be47cae9f" containerName="main" Apr 16 22:32:35.955814 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:35.955605 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc187f15-4d3a-4811-8a50-4f4be47cae9f" containerName="tokenizer" Apr 16 22:32:35.955814 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:35.955620 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc187f15-4d3a-4811-8a50-4f4be47cae9f" containerName="main" Apr 16 22:32:35.960349 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:35.960329 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" Apr 16 22:32:35.964876 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:35.963173 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-pt4qs\"" Apr 16 22:32:35.964876 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:35.963839 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 16 22:32:35.964876 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:35.964088 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-epp-sa-dockercfg-569nd\"" Apr 16 22:32:35.969245 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:35.969223 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t"] Apr 16 22:32:36.115457 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:36.115418 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4c305339-79ab-48ca-af59-55e04605fb9d-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t\" (UID: \"4c305339-79ab-48ca-af59-55e04605fb9d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" Apr 16 22:32:36.115457 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:36.115461 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4c305339-79ab-48ca-af59-55e04605fb9d-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t\" (UID: \"4c305339-79ab-48ca-af59-55e04605fb9d\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" Apr 16 22:32:36.115719 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:36.115571 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4c305339-79ab-48ca-af59-55e04605fb9d-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t\" (UID: \"4c305339-79ab-48ca-af59-55e04605fb9d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" Apr 16 22:32:36.115719 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:36.115616 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf42g\" (UniqueName: \"kubernetes.io/projected/4c305339-79ab-48ca-af59-55e04605fb9d-kube-api-access-hf42g\") pod \"router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t\" (UID: \"4c305339-79ab-48ca-af59-55e04605fb9d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" Apr 16 22:32:36.115719 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:36.115666 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4c305339-79ab-48ca-af59-55e04605fb9d-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t\" (UID: \"4c305339-79ab-48ca-af59-55e04605fb9d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" Apr 16 22:32:36.115719 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:36.115715 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4c305339-79ab-48ca-af59-55e04605fb9d-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t\" (UID: 
\"4c305339-79ab-48ca-af59-55e04605fb9d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" Apr 16 22:32:36.216418 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:36.216338 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4c305339-79ab-48ca-af59-55e04605fb9d-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t\" (UID: \"4c305339-79ab-48ca-af59-55e04605fb9d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" Apr 16 22:32:36.216418 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:36.216403 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4c305339-79ab-48ca-af59-55e04605fb9d-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t\" (UID: \"4c305339-79ab-48ca-af59-55e04605fb9d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" Apr 16 22:32:36.216646 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:36.216433 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4c305339-79ab-48ca-af59-55e04605fb9d-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t\" (UID: \"4c305339-79ab-48ca-af59-55e04605fb9d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" Apr 16 22:32:36.216646 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:36.216496 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4c305339-79ab-48ca-af59-55e04605fb9d-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t\" (UID: \"4c305339-79ab-48ca-af59-55e04605fb9d\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" Apr 16 22:32:36.216646 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:36.216526 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hf42g\" (UniqueName: \"kubernetes.io/projected/4c305339-79ab-48ca-af59-55e04605fb9d-kube-api-access-hf42g\") pod \"router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t\" (UID: \"4c305339-79ab-48ca-af59-55e04605fb9d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" Apr 16 22:32:36.216646 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:36.216571 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4c305339-79ab-48ca-af59-55e04605fb9d-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t\" (UID: \"4c305339-79ab-48ca-af59-55e04605fb9d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" Apr 16 22:32:36.216879 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:36.216847 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4c305339-79ab-48ca-af59-55e04605fb9d-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t\" (UID: \"4c305339-79ab-48ca-af59-55e04605fb9d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" Apr 16 22:32:36.216937 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:36.216878 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4c305339-79ab-48ca-af59-55e04605fb9d-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t\" (UID: \"4c305339-79ab-48ca-af59-55e04605fb9d\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" Apr 16 22:32:36.216937 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:36.216917 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4c305339-79ab-48ca-af59-55e04605fb9d-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t\" (UID: \"4c305339-79ab-48ca-af59-55e04605fb9d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" Apr 16 22:32:36.217016 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:36.216938 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4c305339-79ab-48ca-af59-55e04605fb9d-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t\" (UID: \"4c305339-79ab-48ca-af59-55e04605fb9d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" Apr 16 22:32:36.219049 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:36.219030 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4c305339-79ab-48ca-af59-55e04605fb9d-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t\" (UID: \"4c305339-79ab-48ca-af59-55e04605fb9d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" Apr 16 22:32:36.225904 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:36.225885 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf42g\" (UniqueName: \"kubernetes.io/projected/4c305339-79ab-48ca-af59-55e04605fb9d-kube-api-access-hf42g\") pod \"router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t\" (UID: \"4c305339-79ab-48ca-af59-55e04605fb9d\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" Apr 16 22:32:36.275980 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:36.275954 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" Apr 16 22:32:36.398367 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:36.398065 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t"] Apr 16 22:32:36.400425 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:32:36.400389 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c305339_79ab_48ca_af59_55e04605fb9d.slice/crio-f844224095ed1e2197a9617750f1f94bf6577423c474dbb092c1cb9fd0ea01f1 WatchSource:0}: Error finding container f844224095ed1e2197a9617750f1f94bf6577423c474dbb092c1cb9fd0ea01f1: Status 404 returned error can't find the container with id f844224095ed1e2197a9617750f1f94bf6577423c474dbb092c1cb9fd0ea01f1 Apr 16 22:32:36.402381 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:36.402365 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:32:37.002219 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:37.002179 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" event={"ID":"4c305339-79ab-48ca-af59-55e04605fb9d","Type":"ContainerStarted","Data":"f426895a71aebccda012c0f80f50ef8ce5912e4c8fec74e4bdfaca692403b625"} Apr 16 22:32:37.002219 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:37.002218 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" 
event={"ID":"4c305339-79ab-48ca-af59-55e04605fb9d","Type":"ContainerStarted","Data":"f844224095ed1e2197a9617750f1f94bf6577423c474dbb092c1cb9fd0ea01f1"} Apr 16 22:32:38.006952 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:38.006918 2574 generic.go:358] "Generic (PLEG): container finished" podID="4c305339-79ab-48ca-af59-55e04605fb9d" containerID="f426895a71aebccda012c0f80f50ef8ce5912e4c8fec74e4bdfaca692403b625" exitCode=0 Apr 16 22:32:38.007306 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:38.006996 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" event={"ID":"4c305339-79ab-48ca-af59-55e04605fb9d","Type":"ContainerDied","Data":"f426895a71aebccda012c0f80f50ef8ce5912e4c8fec74e4bdfaca692403b625"} Apr 16 22:32:39.011843 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:39.011806 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" event={"ID":"4c305339-79ab-48ca-af59-55e04605fb9d","Type":"ContainerStarted","Data":"a9a0a5955dc0fa2544af97c2254e2c2cf6c309a3f9ff2f466ff8e63f065e5b39"} Apr 16 22:32:39.011843 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:39.011848 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" event={"ID":"4c305339-79ab-48ca-af59-55e04605fb9d","Type":"ContainerStarted","Data":"d8c99133e45e5588d0d297873135c1d83b5f624a12c75d60a0af90b10df973e6"} Apr 16 22:32:39.012250 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:39.011946 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" Apr 16 22:32:39.031456 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:39.031410 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" podStartSLOduration=4.031396638 podStartE2EDuration="4.031396638s" podCreationTimestamp="2026-04-16 22:32:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:32:39.029921112 +0000 UTC m=+1149.398327264" watchObservedRunningTime="2026-04-16 22:32:39.031396638 +0000 UTC m=+1149.399802782" Apr 16 22:32:46.276731 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:46.276694 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" Apr 16 22:32:46.277309 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:46.276849 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" Apr 16 22:32:46.279466 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:46.279446 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" Apr 16 22:32:47.038632 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:32:47.038598 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" Apr 16 22:33:09.045701 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:33:09.045620 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" Apr 16 22:33:30.207918 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:33:30.207885 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v4dxt_dba969ac-c53c-4354-ab6e-cc853b8c1449/console-operator/1.log" Apr 16 22:33:30.211422 
ip-10-0-133-72 kubenswrapper[2574]: I0416 22:33:30.211397 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v4dxt_dba969ac-c53c-4354-ab6e-cc853b8c1449/console-operator/1.log" Apr 16 22:33:30.213494 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:33:30.213474 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hm4t5_cdab8cce-9b55-478d-b1b5-740aa9746143/ovn-acl-logging/0.log" Apr 16 22:33:30.217429 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:33:30.217412 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hm4t5_cdab8cce-9b55-478d-b1b5-740aa9746143/ovn-acl-logging/0.log" Apr 16 22:33:50.630010 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:33:50.629972 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-5f8dc4bc7b-ndnb8"] Apr 16 22:33:50.630421 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:33:50.630230 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/llmisvc-controller-manager-5f8dc4bc7b-ndnb8" podUID="e2b06a9b-2118-46da-9af6-4a99383d9443" containerName="manager" containerID="cri-o://89c119d71f2c67a962ffa8dc9457b318b3e545ee04768423effa264bd6830d72" gracePeriod=30 Apr 16 22:33:50.875991 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:33:50.875969 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-5f8dc4bc7b-ndnb8" Apr 16 22:33:50.951907 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:33:50.951836 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jd8bm\" (UniqueName: \"kubernetes.io/projected/e2b06a9b-2118-46da-9af6-4a99383d9443-kube-api-access-jd8bm\") pod \"e2b06a9b-2118-46da-9af6-4a99383d9443\" (UID: \"e2b06a9b-2118-46da-9af6-4a99383d9443\") " Apr 16 22:33:50.952034 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:33:50.951908 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e2b06a9b-2118-46da-9af6-4a99383d9443-cert\") pod \"e2b06a9b-2118-46da-9af6-4a99383d9443\" (UID: \"e2b06a9b-2118-46da-9af6-4a99383d9443\") " Apr 16 22:33:50.953975 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:33:50.953938 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2b06a9b-2118-46da-9af6-4a99383d9443-cert" (OuterVolumeSpecName: "cert") pod "e2b06a9b-2118-46da-9af6-4a99383d9443" (UID: "e2b06a9b-2118-46da-9af6-4a99383d9443"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:33:50.954117 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:33:50.954038 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2b06a9b-2118-46da-9af6-4a99383d9443-kube-api-access-jd8bm" (OuterVolumeSpecName: "kube-api-access-jd8bm") pod "e2b06a9b-2118-46da-9af6-4a99383d9443" (UID: "e2b06a9b-2118-46da-9af6-4a99383d9443"). InnerVolumeSpecName "kube-api-access-jd8bm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:33:51.053306 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:33:51.053274 2574 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e2b06a9b-2118-46da-9af6-4a99383d9443-cert\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:33:51.053306 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:33:51.053302 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jd8bm\" (UniqueName: \"kubernetes.io/projected/e2b06a9b-2118-46da-9af6-4a99383d9443-kube-api-access-jd8bm\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:33:51.255481 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:33:51.255451 2574 generic.go:358] "Generic (PLEG): container finished" podID="e2b06a9b-2118-46da-9af6-4a99383d9443" containerID="89c119d71f2c67a962ffa8dc9457b318b3e545ee04768423effa264bd6830d72" exitCode=0 Apr 16 22:33:51.255646 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:33:51.255506 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5f8dc4bc7b-ndnb8" event={"ID":"e2b06a9b-2118-46da-9af6-4a99383d9443","Type":"ContainerDied","Data":"89c119d71f2c67a962ffa8dc9457b318b3e545ee04768423effa264bd6830d72"} Apr 16 22:33:51.255646 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:33:51.255528 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-5f8dc4bc7b-ndnb8" event={"ID":"e2b06a9b-2118-46da-9af6-4a99383d9443","Type":"ContainerDied","Data":"7f0f4b641b6131e01327c4a531c2a2ebd73af542a2a767d26010f1fbe00a9d03"} Apr 16 22:33:51.255646 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:33:51.255545 2574 scope.go:117] "RemoveContainer" containerID="89c119d71f2c67a962ffa8dc9457b318b3e545ee04768423effa264bd6830d72" Apr 16 22:33:51.255646 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:33:51.255508 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-5f8dc4bc7b-ndnb8" Apr 16 22:33:51.264165 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:33:51.264146 2574 scope.go:117] "RemoveContainer" containerID="89c119d71f2c67a962ffa8dc9457b318b3e545ee04768423effa264bd6830d72" Apr 16 22:33:51.264416 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:33:51.264398 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89c119d71f2c67a962ffa8dc9457b318b3e545ee04768423effa264bd6830d72\": container with ID starting with 89c119d71f2c67a962ffa8dc9457b318b3e545ee04768423effa264bd6830d72 not found: ID does not exist" containerID="89c119d71f2c67a962ffa8dc9457b318b3e545ee04768423effa264bd6830d72" Apr 16 22:33:51.264481 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:33:51.264423 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89c119d71f2c67a962ffa8dc9457b318b3e545ee04768423effa264bd6830d72"} err="failed to get container status \"89c119d71f2c67a962ffa8dc9457b318b3e545ee04768423effa264bd6830d72\": rpc error: code = NotFound desc = could not find container \"89c119d71f2c67a962ffa8dc9457b318b3e545ee04768423effa264bd6830d72\": container with ID starting with 89c119d71f2c67a962ffa8dc9457b318b3e545ee04768423effa264bd6830d72 not found: ID does not exist" Apr 16 22:33:51.275014 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:33:51.274992 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-5f8dc4bc7b-ndnb8"] Apr 16 22:33:51.278585 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:33:51.278562 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/llmisvc-controller-manager-5f8dc4bc7b-ndnb8"] Apr 16 22:33:52.263526 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:33:52.263491 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2b06a9b-2118-46da-9af6-4a99383d9443" 
path="/var/lib/kubelet/pods/e2b06a9b-2118-46da-9af6-4a99383d9443/volumes"
Apr 16 22:34:22.209850 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:22.209769 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t"]
Apr 16 22:34:22.212263 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:22.210195 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" podUID="4c305339-79ab-48ca-af59-55e04605fb9d" containerName="main" containerID="cri-o://d8c99133e45e5588d0d297873135c1d83b5f624a12c75d60a0af90b10df973e6" gracePeriod=30
Apr 16 22:34:22.212263 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:22.210307 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" podUID="4c305339-79ab-48ca-af59-55e04605fb9d" containerName="tokenizer" containerID="cri-o://a9a0a5955dc0fa2544af97c2254e2c2cf6c309a3f9ff2f466ff8e63f065e5b39" gracePeriod=30
Apr 16 22:34:22.357517 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:22.357475 2574 generic.go:358] "Generic (PLEG): container finished" podID="4c305339-79ab-48ca-af59-55e04605fb9d" containerID="d8c99133e45e5588d0d297873135c1d83b5f624a12c75d60a0af90b10df973e6" exitCode=0
Apr 16 22:34:22.357669 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:22.357537 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" event={"ID":"4c305339-79ab-48ca-af59-55e04605fb9d","Type":"ContainerDied","Data":"d8c99133e45e5588d0d297873135c1d83b5f624a12c75d60a0af90b10df973e6"}
Apr 16 22:34:23.362541 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:23.362509 2574 generic.go:358] "Generic (PLEG): container finished" podID="4c305339-79ab-48ca-af59-55e04605fb9d" containerID="a9a0a5955dc0fa2544af97c2254e2c2cf6c309a3f9ff2f466ff8e63f065e5b39" exitCode=0
Apr 16 22:34:23.362864 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:23.362577 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" event={"ID":"4c305339-79ab-48ca-af59-55e04605fb9d","Type":"ContainerDied","Data":"a9a0a5955dc0fa2544af97c2254e2c2cf6c309a3f9ff2f466ff8e63f065e5b39"}
Apr 16 22:34:23.362864 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:23.362611 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t" event={"ID":"4c305339-79ab-48ca-af59-55e04605fb9d","Type":"ContainerDied","Data":"f844224095ed1e2197a9617750f1f94bf6577423c474dbb092c1cb9fd0ea01f1"}
Apr 16 22:34:23.362864 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:23.362629 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f844224095ed1e2197a9617750f1f94bf6577423c474dbb092c1cb9fd0ea01f1"
Apr 16 22:34:23.372466 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:23.372449 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t"
Apr 16 22:34:23.531236 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:23.531209 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4c305339-79ab-48ca-af59-55e04605fb9d-tls-certs\") pod \"4c305339-79ab-48ca-af59-55e04605fb9d\" (UID: \"4c305339-79ab-48ca-af59-55e04605fb9d\") "
Apr 16 22:34:23.531395 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:23.531267 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4c305339-79ab-48ca-af59-55e04605fb9d-kserve-provision-location\") pod \"4c305339-79ab-48ca-af59-55e04605fb9d\" (UID: \"4c305339-79ab-48ca-af59-55e04605fb9d\") "
Apr 16 22:34:23.531395 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:23.531294 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4c305339-79ab-48ca-af59-55e04605fb9d-tokenizer-tmp\") pod \"4c305339-79ab-48ca-af59-55e04605fb9d\" (UID: \"4c305339-79ab-48ca-af59-55e04605fb9d\") "
Apr 16 22:34:23.531395 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:23.531335 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hf42g\" (UniqueName: \"kubernetes.io/projected/4c305339-79ab-48ca-af59-55e04605fb9d-kube-api-access-hf42g\") pod \"4c305339-79ab-48ca-af59-55e04605fb9d\" (UID: \"4c305339-79ab-48ca-af59-55e04605fb9d\") "
Apr 16 22:34:23.531395 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:23.531362 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4c305339-79ab-48ca-af59-55e04605fb9d-tokenizer-cache\") pod \"4c305339-79ab-48ca-af59-55e04605fb9d\" (UID: \"4c305339-79ab-48ca-af59-55e04605fb9d\") "
Apr 16 22:34:23.531395 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:23.531383 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4c305339-79ab-48ca-af59-55e04605fb9d-tokenizer-uds\") pod \"4c305339-79ab-48ca-af59-55e04605fb9d\" (UID: \"4c305339-79ab-48ca-af59-55e04605fb9d\") "
Apr 16 22:34:23.531765 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:23.531681 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c305339-79ab-48ca-af59-55e04605fb9d-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "4c305339-79ab-48ca-af59-55e04605fb9d" (UID: "4c305339-79ab-48ca-af59-55e04605fb9d"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:34:23.531837 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:23.531790 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c305339-79ab-48ca-af59-55e04605fb9d-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "4c305339-79ab-48ca-af59-55e04605fb9d" (UID: "4c305339-79ab-48ca-af59-55e04605fb9d"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:34:23.531894 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:23.531864 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c305339-79ab-48ca-af59-55e04605fb9d-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "4c305339-79ab-48ca-af59-55e04605fb9d" (UID: "4c305339-79ab-48ca-af59-55e04605fb9d"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:34:23.532118 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:23.532098 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c305339-79ab-48ca-af59-55e04605fb9d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4c305339-79ab-48ca-af59-55e04605fb9d" (UID: "4c305339-79ab-48ca-af59-55e04605fb9d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:34:23.533241 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:23.533221 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c305339-79ab-48ca-af59-55e04605fb9d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "4c305339-79ab-48ca-af59-55e04605fb9d" (UID: "4c305339-79ab-48ca-af59-55e04605fb9d"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:34:23.533323 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:23.533304 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c305339-79ab-48ca-af59-55e04605fb9d-kube-api-access-hf42g" (OuterVolumeSpecName: "kube-api-access-hf42g") pod "4c305339-79ab-48ca-af59-55e04605fb9d" (UID: "4c305339-79ab-48ca-af59-55e04605fb9d"). InnerVolumeSpecName "kube-api-access-hf42g". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:34:23.632791 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:23.632767 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4c305339-79ab-48ca-af59-55e04605fb9d-kserve-provision-location\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:34:23.632791 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:23.632791 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4c305339-79ab-48ca-af59-55e04605fb9d-tokenizer-tmp\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:34:23.632953 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:23.632801 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hf42g\" (UniqueName: \"kubernetes.io/projected/4c305339-79ab-48ca-af59-55e04605fb9d-kube-api-access-hf42g\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:34:23.632953 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:23.632811 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4c305339-79ab-48ca-af59-55e04605fb9d-tokenizer-cache\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:34:23.632953 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:23.632821 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4c305339-79ab-48ca-af59-55e04605fb9d-tokenizer-uds\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:34:23.632953 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:23.632830 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4c305339-79ab-48ca-af59-55e04605fb9d-tls-certs\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:34:24.368396 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:24.368358 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t"
Apr 16 22:34:24.385727 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:24.385693 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t"]
Apr 16 22:34:24.388788 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:24.388765 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-57c9c88596-67d8t"]
Apr 16 22:34:26.263022 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:26.262987 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c305339-79ab-48ca-af59-55e04605fb9d" path="/var/lib/kubelet/pods/4c305339-79ab-48ca-af59-55e04605fb9d/volumes"
Apr 16 22:34:41.329018 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.328982 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns"]
Apr 16 22:34:41.329410 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.329286 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c305339-79ab-48ca-af59-55e04605fb9d" containerName="storage-initializer"
Apr 16 22:34:41.329410 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.329306 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c305339-79ab-48ca-af59-55e04605fb9d" containerName="storage-initializer"
Apr 16 22:34:41.329410 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.329316 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e2b06a9b-2118-46da-9af6-4a99383d9443" containerName="manager"
Apr 16 22:34:41.329410 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.329321 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2b06a9b-2118-46da-9af6-4a99383d9443" containerName="manager"
Apr 16 22:34:41.329410 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.329327 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c305339-79ab-48ca-af59-55e04605fb9d" containerName="tokenizer"
Apr 16 22:34:41.329410 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.329334 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c305339-79ab-48ca-af59-55e04605fb9d" containerName="tokenizer"
Apr 16 22:34:41.329410 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.329353 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4c305339-79ab-48ca-af59-55e04605fb9d" containerName="main"
Apr 16 22:34:41.329410 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.329358 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c305339-79ab-48ca-af59-55e04605fb9d" containerName="main"
Apr 16 22:34:41.329410 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.329410 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c305339-79ab-48ca-af59-55e04605fb9d" containerName="main"
Apr 16 22:34:41.329410 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.329417 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="4c305339-79ab-48ca-af59-55e04605fb9d" containerName="tokenizer"
Apr 16 22:34:41.329850 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.329425 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="e2b06a9b-2118-46da-9af6-4a99383d9443" containerName="manager"
Apr 16 22:34:41.333903 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.333886 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns"
Apr 16 22:34:41.336236 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.336216 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-epp-sa-dockercfg-qrwpj\""
Apr 16 22:34:41.336929 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.336913 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-pt4qs\""
Apr 16 22:34:41.336985 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.336913 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\""
Apr 16 22:34:41.342963 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.342944 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns"]
Apr 16 22:34:41.378547 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.378522 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/426b76f8-d382-4389-a2cc-0dab37deeb3f-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns\" (UID: \"426b76f8-d382-4389-a2cc-0dab37deeb3f\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns"
Apr 16 22:34:41.378657 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.378573 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/426b76f8-d382-4389-a2cc-0dab37deeb3f-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns\" (UID: \"426b76f8-d382-4389-a2cc-0dab37deeb3f\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns"
Apr 16 22:34:41.378705 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.378649 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq5g6\" (UniqueName: \"kubernetes.io/projected/426b76f8-d382-4389-a2cc-0dab37deeb3f-kube-api-access-sq5g6\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns\" (UID: \"426b76f8-d382-4389-a2cc-0dab37deeb3f\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns"
Apr 16 22:34:41.378705 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.378682 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/426b76f8-d382-4389-a2cc-0dab37deeb3f-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns\" (UID: \"426b76f8-d382-4389-a2cc-0dab37deeb3f\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns"
Apr 16 22:34:41.378817 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.378731 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/426b76f8-d382-4389-a2cc-0dab37deeb3f-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns\" (UID: \"426b76f8-d382-4389-a2cc-0dab37deeb3f\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns"
Apr 16 22:34:41.378817 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.378803 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/426b76f8-d382-4389-a2cc-0dab37deeb3f-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns\" (UID: \"426b76f8-d382-4389-a2cc-0dab37deeb3f\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns"
Apr 16 22:34:41.479295 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.479257 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/426b76f8-d382-4389-a2cc-0dab37deeb3f-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns\" (UID: \"426b76f8-d382-4389-a2cc-0dab37deeb3f\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns"
Apr 16 22:34:41.479487 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.479300 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/426b76f8-d382-4389-a2cc-0dab37deeb3f-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns\" (UID: \"426b76f8-d382-4389-a2cc-0dab37deeb3f\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns"
Apr 16 22:34:41.479487 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.479339 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sq5g6\" (UniqueName: \"kubernetes.io/projected/426b76f8-d382-4389-a2cc-0dab37deeb3f-kube-api-access-sq5g6\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns\" (UID: \"426b76f8-d382-4389-a2cc-0dab37deeb3f\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns"
Apr 16 22:34:41.479487 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.479360 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/426b76f8-d382-4389-a2cc-0dab37deeb3f-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns\" (UID: \"426b76f8-d382-4389-a2cc-0dab37deeb3f\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns"
Apr 16 22:34:41.479487 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.479390 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/426b76f8-d382-4389-a2cc-0dab37deeb3f-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns\" (UID: \"426b76f8-d382-4389-a2cc-0dab37deeb3f\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns"
Apr 16 22:34:41.479487 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.479419 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/426b76f8-d382-4389-a2cc-0dab37deeb3f-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns\" (UID: \"426b76f8-d382-4389-a2cc-0dab37deeb3f\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns"
Apr 16 22:34:41.479824 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.479797 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/426b76f8-d382-4389-a2cc-0dab37deeb3f-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns\" (UID: \"426b76f8-d382-4389-a2cc-0dab37deeb3f\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns"
Apr 16 22:34:41.479917 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.479851 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/426b76f8-d382-4389-a2cc-0dab37deeb3f-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns\" (UID: \"426b76f8-d382-4389-a2cc-0dab37deeb3f\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns"
Apr 16 22:34:41.479917 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.479865 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/426b76f8-d382-4389-a2cc-0dab37deeb3f-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns\" (UID: \"426b76f8-d382-4389-a2cc-0dab37deeb3f\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns"
Apr 16 22:34:41.480008 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.479920 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/426b76f8-d382-4389-a2cc-0dab37deeb3f-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns\" (UID: \"426b76f8-d382-4389-a2cc-0dab37deeb3f\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns"
Apr 16 22:34:41.481749 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.481721 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/426b76f8-d382-4389-a2cc-0dab37deeb3f-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns\" (UID: \"426b76f8-d382-4389-a2cc-0dab37deeb3f\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns"
Apr 16 22:34:41.487248 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.487218 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq5g6\" (UniqueName: \"kubernetes.io/projected/426b76f8-d382-4389-a2cc-0dab37deeb3f-kube-api-access-sq5g6\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns\" (UID: \"426b76f8-d382-4389-a2cc-0dab37deeb3f\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns"
Apr 16 22:34:41.643763 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.643652 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns"
Apr 16 22:34:41.763690 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:41.763634 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns"]
Apr 16 22:34:41.766426 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:34:41.766398 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod426b76f8_d382_4389_a2cc_0dab37deeb3f.slice/crio-bbd79b87a2a4909dcb0edbaaed5aed5747e7ce5475e10f2b44d0475d6c6e9a5c WatchSource:0}: Error finding container bbd79b87a2a4909dcb0edbaaed5aed5747e7ce5475e10f2b44d0475d6c6e9a5c: Status 404 returned error can't find the container with id bbd79b87a2a4909dcb0edbaaed5aed5747e7ce5475e10f2b44d0475d6c6e9a5c
Apr 16 22:34:42.435462 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:42.435427 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns" event={"ID":"426b76f8-d382-4389-a2cc-0dab37deeb3f","Type":"ContainerStarted","Data":"d250a36e49447f685ffefe2e3a91762ae9ba8663ec1dd8780f67daf6e0444950"}
Apr 16 22:34:42.435462 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:42.435466 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns" event={"ID":"426b76f8-d382-4389-a2cc-0dab37deeb3f","Type":"ContainerStarted","Data":"bbd79b87a2a4909dcb0edbaaed5aed5747e7ce5475e10f2b44d0475d6c6e9a5c"}
Apr 16 22:34:43.441560 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:43.441524 2574 generic.go:358] "Generic (PLEG): container finished" podID="426b76f8-d382-4389-a2cc-0dab37deeb3f" containerID="d250a36e49447f685ffefe2e3a91762ae9ba8663ec1dd8780f67daf6e0444950" exitCode=0
Apr 16 22:34:43.442057 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:43.441604 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns" event={"ID":"426b76f8-d382-4389-a2cc-0dab37deeb3f","Type":"ContainerDied","Data":"d250a36e49447f685ffefe2e3a91762ae9ba8663ec1dd8780f67daf6e0444950"}
Apr 16 22:34:44.446766 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:44.446704 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns" event={"ID":"426b76f8-d382-4389-a2cc-0dab37deeb3f","Type":"ContainerStarted","Data":"9627dc0e759496add92acf650c6b0815185e42d709f89ed3e7ad6f0d3f42bb59"}
Apr 16 22:34:44.446766 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:44.446768 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns" event={"ID":"426b76f8-d382-4389-a2cc-0dab37deeb3f","Type":"ContainerStarted","Data":"a2ef818c75f9588d75632f510eda3df843e8095b33c1833d1ea147b9a9168780"}
Apr 16 22:34:44.447353 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:44.446815 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns"
Apr 16 22:34:44.465444 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:44.465385 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns" podStartSLOduration=3.465358165 podStartE2EDuration="3.465358165s" podCreationTimestamp="2026-04-16 22:34:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:34:44.464837543 +0000 UTC m=+1274.833243694" watchObservedRunningTime="2026-04-16 22:34:44.465358165 +0000 UTC m=+1274.833764317"
Apr 16 22:34:51.644162 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:51.644121 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns"
Apr 16 22:34:51.644162 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:51.644172 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns"
Apr 16 22:34:51.647005 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:51.646978 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns"
Apr 16 22:34:52.475363 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:34:52.475333 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns"
Apr 16 22:35:13.479453 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:35:13.479423 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns"
Apr 16 22:37:53.303536 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:53.303460 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns"]
Apr 16 22:37:53.304108 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:53.303842 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns" podUID="426b76f8-d382-4389-a2cc-0dab37deeb3f" containerName="main" containerID="cri-o://a2ef818c75f9588d75632f510eda3df843e8095b33c1833d1ea147b9a9168780" gracePeriod=30
Apr 16 22:37:53.304108 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:53.303869 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns" podUID="426b76f8-d382-4389-a2cc-0dab37deeb3f" containerName="tokenizer" containerID="cri-o://9627dc0e759496add92acf650c6b0815185e42d709f89ed3e7ad6f0d3f42bb59" gracePeriod=30
Apr 16 22:37:53.478145 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:37:53.478106 2574 logging.go:55] [core] [Channel #353 SubChannel #354]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.38:9003", ServerName: "10.133.0.38:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.38:9003: connect: connection refused"
Apr 16 22:37:54.078159 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:54.078124 2574 generic.go:358] "Generic (PLEG): container finished" podID="426b76f8-d382-4389-a2cc-0dab37deeb3f" containerID="a2ef818c75f9588d75632f510eda3df843e8095b33c1833d1ea147b9a9168780" exitCode=0
Apr 16 22:37:54.078159 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:54.078164 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns" event={"ID":"426b76f8-d382-4389-a2cc-0dab37deeb3f","Type":"ContainerDied","Data":"a2ef818c75f9588d75632f510eda3df843e8095b33c1833d1ea147b9a9168780"}
Apr 16 22:37:54.452427 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:54.452398 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns"
Apr 16 22:37:54.478662 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:54.478620 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns" podUID="426b76f8-d382-4389-a2cc-0dab37deeb3f" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.38:9003\" within 1s: context deadline exceeded"
Apr 16 22:37:54.571296 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:54.571266 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/426b76f8-d382-4389-a2cc-0dab37deeb3f-tokenizer-uds\") pod \"426b76f8-d382-4389-a2cc-0dab37deeb3f\" (UID: \"426b76f8-d382-4389-a2cc-0dab37deeb3f\") "
Apr 16 22:37:54.571296 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:54.571299 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sq5g6\" (UniqueName: \"kubernetes.io/projected/426b76f8-d382-4389-a2cc-0dab37deeb3f-kube-api-access-sq5g6\") pod \"426b76f8-d382-4389-a2cc-0dab37deeb3f\" (UID: \"426b76f8-d382-4389-a2cc-0dab37deeb3f\") "
Apr 16 22:37:54.571541 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:54.571317 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/426b76f8-d382-4389-a2cc-0dab37deeb3f-kserve-provision-location\") pod \"426b76f8-d382-4389-a2cc-0dab37deeb3f\" (UID: \"426b76f8-d382-4389-a2cc-0dab37deeb3f\") "
Apr 16 22:37:54.571541 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:54.571368 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/426b76f8-d382-4389-a2cc-0dab37deeb3f-tokenizer-tmp\") pod \"426b76f8-d382-4389-a2cc-0dab37deeb3f\" (UID: \"426b76f8-d382-4389-a2cc-0dab37deeb3f\") "
Apr 16 22:37:54.571541 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:54.571409 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/426b76f8-d382-4389-a2cc-0dab37deeb3f-tls-certs\") pod \"426b76f8-d382-4389-a2cc-0dab37deeb3f\" (UID: \"426b76f8-d382-4389-a2cc-0dab37deeb3f\") "
Apr 16 22:37:54.571541 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:54.571467 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/426b76f8-d382-4389-a2cc-0dab37deeb3f-tokenizer-cache\") pod \"426b76f8-d382-4389-a2cc-0dab37deeb3f\" (UID: \"426b76f8-d382-4389-a2cc-0dab37deeb3f\") "
Apr 16 22:37:54.571771 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:54.571611 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/426b76f8-d382-4389-a2cc-0dab37deeb3f-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "426b76f8-d382-4389-a2cc-0dab37deeb3f" (UID: "426b76f8-d382-4389-a2cc-0dab37deeb3f"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:37:54.571771 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:54.571730 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/426b76f8-d382-4389-a2cc-0dab37deeb3f-tokenizer-uds\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:37:54.571771 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:54.571735 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/426b76f8-d382-4389-a2cc-0dab37deeb3f-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "426b76f8-d382-4389-a2cc-0dab37deeb3f" (UID: "426b76f8-d382-4389-a2cc-0dab37deeb3f"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:37:54.571924 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:54.571904 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/426b76f8-d382-4389-a2cc-0dab37deeb3f-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "426b76f8-d382-4389-a2cc-0dab37deeb3f" (UID: "426b76f8-d382-4389-a2cc-0dab37deeb3f"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:37:54.572348 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:54.572322 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/426b76f8-d382-4389-a2cc-0dab37deeb3f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "426b76f8-d382-4389-a2cc-0dab37deeb3f" (UID: "426b76f8-d382-4389-a2cc-0dab37deeb3f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:37:54.573530 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:54.573500 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/426b76f8-d382-4389-a2cc-0dab37deeb3f-kube-api-access-sq5g6" (OuterVolumeSpecName: "kube-api-access-sq5g6") pod "426b76f8-d382-4389-a2cc-0dab37deeb3f" (UID: "426b76f8-d382-4389-a2cc-0dab37deeb3f"). InnerVolumeSpecName "kube-api-access-sq5g6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:37:54.573530 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:54.573512 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/426b76f8-d382-4389-a2cc-0dab37deeb3f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "426b76f8-d382-4389-a2cc-0dab37deeb3f" (UID: "426b76f8-d382-4389-a2cc-0dab37deeb3f"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:37:54.672260 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:54.672170 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/426b76f8-d382-4389-a2cc-0dab37deeb3f-tokenizer-cache\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:37:54.672260 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:54.672203 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sq5g6\" (UniqueName: \"kubernetes.io/projected/426b76f8-d382-4389-a2cc-0dab37deeb3f-kube-api-access-sq5g6\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:37:54.672260 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:54.672214 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/426b76f8-d382-4389-a2cc-0dab37deeb3f-kserve-provision-location\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:37:54.672260 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:54.672224 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/426b76f8-d382-4389-a2cc-0dab37deeb3f-tokenizer-tmp\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:37:54.672260 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:54.672233 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/426b76f8-d382-4389-a2cc-0dab37deeb3f-tls-certs\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:37:55.083189 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:55.083149 2574 generic.go:358] "Generic (PLEG): container finished" podID="426b76f8-d382-4389-a2cc-0dab37deeb3f" containerID="9627dc0e759496add92acf650c6b0815185e42d709f89ed3e7ad6f0d3f42bb59" exitCode=0
Apr 16 22:37:55.083374 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:55.083225 2574 util.go:48]
"No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns" Apr 16 22:37:55.083374 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:55.083233 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns" event={"ID":"426b76f8-d382-4389-a2cc-0dab37deeb3f","Type":"ContainerDied","Data":"9627dc0e759496add92acf650c6b0815185e42d709f89ed3e7ad6f0d3f42bb59"} Apr 16 22:37:55.083374 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:55.083276 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns" event={"ID":"426b76f8-d382-4389-a2cc-0dab37deeb3f","Type":"ContainerDied","Data":"bbd79b87a2a4909dcb0edbaaed5aed5747e7ce5475e10f2b44d0475d6c6e9a5c"} Apr 16 22:37:55.083374 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:55.083293 2574 scope.go:117] "RemoveContainer" containerID="9627dc0e759496add92acf650c6b0815185e42d709f89ed3e7ad6f0d3f42bb59" Apr 16 22:37:55.091843 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:55.091828 2574 scope.go:117] "RemoveContainer" containerID="a2ef818c75f9588d75632f510eda3df843e8095b33c1833d1ea147b9a9168780" Apr 16 22:37:55.099254 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:55.099236 2574 scope.go:117] "RemoveContainer" containerID="d250a36e49447f685ffefe2e3a91762ae9ba8663ec1dd8780f67daf6e0444950" Apr 16 22:37:55.105188 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:55.105152 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns"] Apr 16 22:37:55.106605 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:55.106585 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schehwnns"] Apr 16 22:37:55.108377 ip-10-0-133-72 
kubenswrapper[2574]: I0416 22:37:55.108364 2574 scope.go:117] "RemoveContainer" containerID="9627dc0e759496add92acf650c6b0815185e42d709f89ed3e7ad6f0d3f42bb59" Apr 16 22:37:55.108664 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:37:55.108650 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9627dc0e759496add92acf650c6b0815185e42d709f89ed3e7ad6f0d3f42bb59\": container with ID starting with 9627dc0e759496add92acf650c6b0815185e42d709f89ed3e7ad6f0d3f42bb59 not found: ID does not exist" containerID="9627dc0e759496add92acf650c6b0815185e42d709f89ed3e7ad6f0d3f42bb59" Apr 16 22:37:55.108704 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:55.108671 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9627dc0e759496add92acf650c6b0815185e42d709f89ed3e7ad6f0d3f42bb59"} err="failed to get container status \"9627dc0e759496add92acf650c6b0815185e42d709f89ed3e7ad6f0d3f42bb59\": rpc error: code = NotFound desc = could not find container \"9627dc0e759496add92acf650c6b0815185e42d709f89ed3e7ad6f0d3f42bb59\": container with ID starting with 9627dc0e759496add92acf650c6b0815185e42d709f89ed3e7ad6f0d3f42bb59 not found: ID does not exist" Apr 16 22:37:55.108704 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:55.108691 2574 scope.go:117] "RemoveContainer" containerID="a2ef818c75f9588d75632f510eda3df843e8095b33c1833d1ea147b9a9168780" Apr 16 22:37:55.108942 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:37:55.108921 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2ef818c75f9588d75632f510eda3df843e8095b33c1833d1ea147b9a9168780\": container with ID starting with a2ef818c75f9588d75632f510eda3df843e8095b33c1833d1ea147b9a9168780 not found: ID does not exist" containerID="a2ef818c75f9588d75632f510eda3df843e8095b33c1833d1ea147b9a9168780" Apr 16 22:37:55.108997 ip-10-0-133-72 kubenswrapper[2574]: I0416 
22:37:55.108950 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2ef818c75f9588d75632f510eda3df843e8095b33c1833d1ea147b9a9168780"} err="failed to get container status \"a2ef818c75f9588d75632f510eda3df843e8095b33c1833d1ea147b9a9168780\": rpc error: code = NotFound desc = could not find container \"a2ef818c75f9588d75632f510eda3df843e8095b33c1833d1ea147b9a9168780\": container with ID starting with a2ef818c75f9588d75632f510eda3df843e8095b33c1833d1ea147b9a9168780 not found: ID does not exist" Apr 16 22:37:55.108997 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:55.108968 2574 scope.go:117] "RemoveContainer" containerID="d250a36e49447f685ffefe2e3a91762ae9ba8663ec1dd8780f67daf6e0444950" Apr 16 22:37:55.109191 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:37:55.109176 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d250a36e49447f685ffefe2e3a91762ae9ba8663ec1dd8780f67daf6e0444950\": container with ID starting with d250a36e49447f685ffefe2e3a91762ae9ba8663ec1dd8780f67daf6e0444950 not found: ID does not exist" containerID="d250a36e49447f685ffefe2e3a91762ae9ba8663ec1dd8780f67daf6e0444950" Apr 16 22:37:55.109236 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:55.109193 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d250a36e49447f685ffefe2e3a91762ae9ba8663ec1dd8780f67daf6e0444950"} err="failed to get container status \"d250a36e49447f685ffefe2e3a91762ae9ba8663ec1dd8780f67daf6e0444950\": rpc error: code = NotFound desc = could not find container \"d250a36e49447f685ffefe2e3a91762ae9ba8663ec1dd8780f67daf6e0444950\": container with ID starting with d250a36e49447f685ffefe2e3a91762ae9ba8663ec1dd8780f67daf6e0444950 not found: ID does not exist" Apr 16 22:37:56.262580 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:37:56.262547 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="426b76f8-d382-4389-a2cc-0dab37deeb3f" path="/var/lib/kubelet/pods/426b76f8-d382-4389-a2cc-0dab37deeb3f/volumes" Apr 16 22:38:09.208281 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.208249 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d"] Apr 16 22:38:09.208634 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.208565 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="426b76f8-d382-4389-a2cc-0dab37deeb3f" containerName="main" Apr 16 22:38:09.208634 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.208577 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="426b76f8-d382-4389-a2cc-0dab37deeb3f" containerName="main" Apr 16 22:38:09.208634 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.208600 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="426b76f8-d382-4389-a2cc-0dab37deeb3f" containerName="storage-initializer" Apr 16 22:38:09.208634 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.208606 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="426b76f8-d382-4389-a2cc-0dab37deeb3f" containerName="storage-initializer" Apr 16 22:38:09.208634 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.208613 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="426b76f8-d382-4389-a2cc-0dab37deeb3f" containerName="tokenizer" Apr 16 22:38:09.208634 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.208619 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="426b76f8-d382-4389-a2cc-0dab37deeb3f" containerName="tokenizer" Apr 16 22:38:09.208943 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.208665 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="426b76f8-d382-4389-a2cc-0dab37deeb3f" containerName="tokenizer" Apr 16 22:38:09.208943 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.208674 2574 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="426b76f8-d382-4389-a2cc-0dab37deeb3f" containerName="main" Apr 16 22:38:09.212047 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.212027 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" Apr 16 22:38:09.214327 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.214309 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-epp-sa-dockercfg-zlw9j\"" Apr 16 22:38:09.215011 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.214996 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-pt4qs\"" Apr 16 22:38:09.215072 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.215021 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 16 22:38:09.223198 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.223179 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d"] Apr 16 22:38:09.296400 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.296365 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/34a56726-7522-4dc6-ba50-7237450efa6e-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d\" (UID: \"34a56726-7522-4dc6-ba50-7237450efa6e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" Apr 16 22:38:09.296591 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.296423 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/34a56726-7522-4dc6-ba50-7237450efa6e-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d\" (UID: \"34a56726-7522-4dc6-ba50-7237450efa6e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" Apr 16 22:38:09.296591 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.296543 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/34a56726-7522-4dc6-ba50-7237450efa6e-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d\" (UID: \"34a56726-7522-4dc6-ba50-7237450efa6e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" Apr 16 22:38:09.296591 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.296583 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/34a56726-7522-4dc6-ba50-7237450efa6e-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d\" (UID: \"34a56726-7522-4dc6-ba50-7237450efa6e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" Apr 16 22:38:09.296800 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.296631 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/34a56726-7522-4dc6-ba50-7237450efa6e-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d\" (UID: \"34a56726-7522-4dc6-ba50-7237450efa6e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" Apr 16 22:38:09.296800 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.296688 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-h97mp\" (UniqueName: \"kubernetes.io/projected/34a56726-7522-4dc6-ba50-7237450efa6e-kube-api-access-h97mp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d\" (UID: \"34a56726-7522-4dc6-ba50-7237450efa6e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" Apr 16 22:38:09.397333 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.397298 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/34a56726-7522-4dc6-ba50-7237450efa6e-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d\" (UID: \"34a56726-7522-4dc6-ba50-7237450efa6e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" Apr 16 22:38:09.397333 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.397336 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/34a56726-7522-4dc6-ba50-7237450efa6e-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d\" (UID: \"34a56726-7522-4dc6-ba50-7237450efa6e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" Apr 16 22:38:09.397569 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.397362 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/34a56726-7522-4dc6-ba50-7237450efa6e-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d\" (UID: \"34a56726-7522-4dc6-ba50-7237450efa6e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" Apr 16 22:38:09.397569 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.397414 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h97mp\" (UniqueName: 
\"kubernetes.io/projected/34a56726-7522-4dc6-ba50-7237450efa6e-kube-api-access-h97mp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d\" (UID: \"34a56726-7522-4dc6-ba50-7237450efa6e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" Apr 16 22:38:09.397569 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.397447 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/34a56726-7522-4dc6-ba50-7237450efa6e-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d\" (UID: \"34a56726-7522-4dc6-ba50-7237450efa6e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" Apr 16 22:38:09.397569 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.397508 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34a56726-7522-4dc6-ba50-7237450efa6e-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d\" (UID: \"34a56726-7522-4dc6-ba50-7237450efa6e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" Apr 16 22:38:09.397825 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.397798 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/34a56726-7522-4dc6-ba50-7237450efa6e-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d\" (UID: \"34a56726-7522-4dc6-ba50-7237450efa6e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" Apr 16 22:38:09.397897 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.397818 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/34a56726-7522-4dc6-ba50-7237450efa6e-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d\" (UID: \"34a56726-7522-4dc6-ba50-7237450efa6e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" Apr 16 22:38:09.397897 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.397850 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34a56726-7522-4dc6-ba50-7237450efa6e-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d\" (UID: \"34a56726-7522-4dc6-ba50-7237450efa6e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" Apr 16 22:38:09.398000 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.397940 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/34a56726-7522-4dc6-ba50-7237450efa6e-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d\" (UID: \"34a56726-7522-4dc6-ba50-7237450efa6e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" Apr 16 22:38:09.399829 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.399810 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/34a56726-7522-4dc6-ba50-7237450efa6e-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d\" (UID: \"34a56726-7522-4dc6-ba50-7237450efa6e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" Apr 16 22:38:09.404929 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.404903 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h97mp\" (UniqueName: 
\"kubernetes.io/projected/34a56726-7522-4dc6-ba50-7237450efa6e-kube-api-access-h97mp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d\" (UID: \"34a56726-7522-4dc6-ba50-7237450efa6e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" Apr 16 22:38:09.521633 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.521605 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" Apr 16 22:38:09.643423 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.643395 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d"] Apr 16 22:38:09.645937 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:38:09.645902 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34a56726_7522_4dc6_ba50_7237450efa6e.slice/crio-f9a8b96dcabf981e9bf5a0aa6f92fca7364240b27e684fde3c6454421eeaa54a WatchSource:0}: Error finding container f9a8b96dcabf981e9bf5a0aa6f92fca7364240b27e684fde3c6454421eeaa54a: Status 404 returned error can't find the container with id f9a8b96dcabf981e9bf5a0aa6f92fca7364240b27e684fde3c6454421eeaa54a Apr 16 22:38:09.647662 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:09.647645 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:38:10.133386 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:10.133345 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" event={"ID":"34a56726-7522-4dc6-ba50-7237450efa6e","Type":"ContainerStarted","Data":"9566f7127df35dd8f459889bf034e5c04e9dfe8d1cfa03c4f11c7b9c098e7cb8"} Apr 16 22:38:10.133386 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:10.133389 2574 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" event={"ID":"34a56726-7522-4dc6-ba50-7237450efa6e","Type":"ContainerStarted","Data":"f9a8b96dcabf981e9bf5a0aa6f92fca7364240b27e684fde3c6454421eeaa54a"} Apr 16 22:38:11.137548 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:11.137511 2574 generic.go:358] "Generic (PLEG): container finished" podID="34a56726-7522-4dc6-ba50-7237450efa6e" containerID="9566f7127df35dd8f459889bf034e5c04e9dfe8d1cfa03c4f11c7b9c098e7cb8" exitCode=0 Apr 16 22:38:11.137941 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:11.137581 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" event={"ID":"34a56726-7522-4dc6-ba50-7237450efa6e","Type":"ContainerDied","Data":"9566f7127df35dd8f459889bf034e5c04e9dfe8d1cfa03c4f11c7b9c098e7cb8"} Apr 16 22:38:12.142905 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:12.142871 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" event={"ID":"34a56726-7522-4dc6-ba50-7237450efa6e","Type":"ContainerStarted","Data":"f45e58b6ad07bb916b342098c36088bd31d263ce78503c6f044c28e119de315b"} Apr 16 22:38:12.142905 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:12.142907 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" event={"ID":"34a56726-7522-4dc6-ba50-7237450efa6e","Type":"ContainerStarted","Data":"9ecef53c5fdd74165c3c8355f090f1646b30a0b6920a9f7aaaa338e461501f7b"} Apr 16 22:38:12.143422 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:12.142985 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" Apr 16 22:38:12.164652 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:12.164599 2574 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" podStartSLOduration=3.164582767 podStartE2EDuration="3.164582767s" podCreationTimestamp="2026-04-16 22:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:38:12.162388277 +0000 UTC m=+1482.530794453" watchObservedRunningTime="2026-04-16 22:38:12.164582767 +0000 UTC m=+1482.532988922" Apr 16 22:38:19.521977 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:19.521937 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" Apr 16 22:38:19.521977 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:19.521983 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" Apr 16 22:38:19.524633 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:19.524609 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" Apr 16 22:38:20.168888 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:20.168852 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" Apr 16 22:38:30.230322 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:30.230297 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v4dxt_dba969ac-c53c-4354-ab6e-cc853b8c1449/console-operator/1.log" Apr 16 22:38:30.234842 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:30.234821 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v4dxt_dba969ac-c53c-4354-ab6e-cc853b8c1449/console-operator/1.log" Apr 16 22:38:30.236115 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:30.236098 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hm4t5_cdab8cce-9b55-478d-b1b5-740aa9746143/ovn-acl-logging/0.log" Apr 16 22:38:30.240638 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:30.240620 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hm4t5_cdab8cce-9b55-478d-b1b5-740aa9746143/ovn-acl-logging/0.log" Apr 16 22:38:41.173010 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:38:41.172978 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" Apr 16 22:39:23.795636 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:23.795600 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq"] Apr 16 22:39:23.798990 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:23.798974 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq"
Apr 16 22:39:23.801364 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:23.801339 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-epp-sa-dockercfg-pmrvg\""
Apr 16 22:39:23.801489 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:23.801429 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\""
Apr 16 22:39:23.809919 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:23.809890 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq"]
Apr 16 22:39:23.933040 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:23.932997 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5q2g\" (UniqueName: \"kubernetes.io/projected/72514fde-37e3-456b-8b26-40b5ea5d5f21-kube-api-access-w5q2g\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq\" (UID: \"72514fde-37e3-456b-8b26-40b5ea5d5f21\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq"
Apr 16 22:39:23.933210 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:23.933112 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72514fde-37e3-456b-8b26-40b5ea5d5f21-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq\" (UID: \"72514fde-37e3-456b-8b26-40b5ea5d5f21\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq"
Apr 16 22:39:23.933210 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:23.933155 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/72514fde-37e3-456b-8b26-40b5ea5d5f21-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq\" (UID: \"72514fde-37e3-456b-8b26-40b5ea5d5f21\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq"
Apr 16 22:39:23.933210 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:23.933188 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/72514fde-37e3-456b-8b26-40b5ea5d5f21-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq\" (UID: \"72514fde-37e3-456b-8b26-40b5ea5d5f21\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq"
Apr 16 22:39:23.933318 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:23.933272 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/72514fde-37e3-456b-8b26-40b5ea5d5f21-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq\" (UID: \"72514fde-37e3-456b-8b26-40b5ea5d5f21\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq"
Apr 16 22:39:23.933318 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:23.933296 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/72514fde-37e3-456b-8b26-40b5ea5d5f21-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq\" (UID: \"72514fde-37e3-456b-8b26-40b5ea5d5f21\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq"
Apr 16 22:39:24.034416 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:24.034360 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72514fde-37e3-456b-8b26-40b5ea5d5f21-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq\" (UID: \"72514fde-37e3-456b-8b26-40b5ea5d5f21\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq"
Apr 16 22:39:24.034585 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:24.034441 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/72514fde-37e3-456b-8b26-40b5ea5d5f21-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq\" (UID: \"72514fde-37e3-456b-8b26-40b5ea5d5f21\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq"
Apr 16 22:39:24.034585 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:24.034482 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/72514fde-37e3-456b-8b26-40b5ea5d5f21-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq\" (UID: \"72514fde-37e3-456b-8b26-40b5ea5d5f21\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq"
Apr 16 22:39:24.034585 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:24.034527 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/72514fde-37e3-456b-8b26-40b5ea5d5f21-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq\" (UID: \"72514fde-37e3-456b-8b26-40b5ea5d5f21\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq"
Apr 16 22:39:24.034585 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:24.034558 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/72514fde-37e3-456b-8b26-40b5ea5d5f21-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq\" (UID: \"72514fde-37e3-456b-8b26-40b5ea5d5f21\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq"
Apr 16 22:39:24.034732 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:24.034686 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w5q2g\" (UniqueName: \"kubernetes.io/projected/72514fde-37e3-456b-8b26-40b5ea5d5f21-kube-api-access-w5q2g\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq\" (UID: \"72514fde-37e3-456b-8b26-40b5ea5d5f21\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq"
Apr 16 22:39:24.034921 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:24.034895 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/72514fde-37e3-456b-8b26-40b5ea5d5f21-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq\" (UID: \"72514fde-37e3-456b-8b26-40b5ea5d5f21\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq"
Apr 16 22:39:24.034999 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:24.034928 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72514fde-37e3-456b-8b26-40b5ea5d5f21-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq\" (UID: \"72514fde-37e3-456b-8b26-40b5ea5d5f21\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq"
Apr 16 22:39:24.034999 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:24.034945 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/72514fde-37e3-456b-8b26-40b5ea5d5f21-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq\" (UID: \"72514fde-37e3-456b-8b26-40b5ea5d5f21\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq"
Apr 16 22:39:24.035105 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:24.035028 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/72514fde-37e3-456b-8b26-40b5ea5d5f21-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq\" (UID: \"72514fde-37e3-456b-8b26-40b5ea5d5f21\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq"
Apr 16 22:39:24.036969 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:24.036952 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/72514fde-37e3-456b-8b26-40b5ea5d5f21-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq\" (UID: \"72514fde-37e3-456b-8b26-40b5ea5d5f21\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq"
Apr 16 22:39:24.042243 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:24.042221 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5q2g\" (UniqueName: \"kubernetes.io/projected/72514fde-37e3-456b-8b26-40b5ea5d5f21-kube-api-access-w5q2g\") pod \"scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq\" (UID: \"72514fde-37e3-456b-8b26-40b5ea5d5f21\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq"
Apr 16 22:39:24.110680 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:24.110596 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq"
Apr 16 22:39:24.237191 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:24.237118 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq"]
Apr 16 22:39:24.239912 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:39:24.239877 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72514fde_37e3_456b_8b26_40b5ea5d5f21.slice/crio-3b163161c9abb2f13cb3fb1e3153275bea5353ddb5fe126548fc76e4d3994a9d WatchSource:0}: Error finding container 3b163161c9abb2f13cb3fb1e3153275bea5353ddb5fe126548fc76e4d3994a9d: Status 404 returned error can't find the container with id 3b163161c9abb2f13cb3fb1e3153275bea5353ddb5fe126548fc76e4d3994a9d
Apr 16 22:39:24.382707 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:24.382622 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq" event={"ID":"72514fde-37e3-456b-8b26-40b5ea5d5f21","Type":"ContainerStarted","Data":"909dec30f8b8e3ebdaf2f8f1bece6faa1360fd99f9302fbb3c9b72ef42d33c15"}
Apr 16 22:39:24.382707 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:24.382659 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq" event={"ID":"72514fde-37e3-456b-8b26-40b5ea5d5f21","Type":"ContainerStarted","Data":"3b163161c9abb2f13cb3fb1e3153275bea5353ddb5fe126548fc76e4d3994a9d"}
Apr 16 22:39:25.387643 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:25.387562 2574 generic.go:358] "Generic (PLEG): container finished" podID="72514fde-37e3-456b-8b26-40b5ea5d5f21" containerID="909dec30f8b8e3ebdaf2f8f1bece6faa1360fd99f9302fbb3c9b72ef42d33c15" exitCode=0
Apr 16 22:39:25.387643 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:25.387609 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq" event={"ID":"72514fde-37e3-456b-8b26-40b5ea5d5f21","Type":"ContainerDied","Data":"909dec30f8b8e3ebdaf2f8f1bece6faa1360fd99f9302fbb3c9b72ef42d33c15"}
Apr 16 22:39:26.393495 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:26.393457 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq" event={"ID":"72514fde-37e3-456b-8b26-40b5ea5d5f21","Type":"ContainerStarted","Data":"28eb3085126bbbf928d775100e5def90b3b667ae54c3379b782593c28b7a51f0"}
Apr 16 22:39:26.393495 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:26.393493 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq" event={"ID":"72514fde-37e3-456b-8b26-40b5ea5d5f21","Type":"ContainerStarted","Data":"6a88ae5f6732fa4369e026903d829e9541efd9409bf2f3d9dcd70609092f8fef"}
Apr 16 22:39:26.393917 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:26.393643 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq"
Apr 16 22:39:26.413773 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:26.413703 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq" podStartSLOduration=3.413689064 podStartE2EDuration="3.413689064s" podCreationTimestamp="2026-04-16 22:39:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:39:26.412838683 +0000 UTC m=+1556.781244834" watchObservedRunningTime="2026-04-16 22:39:26.413689064 +0000 UTC m=+1556.782095206"
Apr 16 22:39:30.287536 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:30.287494 2574 scope.go:117] "RemoveContainer" containerID="d8c99133e45e5588d0d297873135c1d83b5f624a12c75d60a0af90b10df973e6"
Apr 16 22:39:30.295325 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:30.295305 2574 scope.go:117] "RemoveContainer" containerID="a9a0a5955dc0fa2544af97c2254e2c2cf6c309a3f9ff2f466ff8e63f065e5b39"
Apr 16 22:39:30.302358 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:30.302337 2574 scope.go:117] "RemoveContainer" containerID="f426895a71aebccda012c0f80f50ef8ce5912e4c8fec74e4bdfaca692403b625"
Apr 16 22:39:34.111328 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:34.111288 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq"
Apr 16 22:39:34.111784 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:34.111343 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq"
Apr 16 22:39:34.114299 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:34.114271 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq"
Apr 16 22:39:34.421456 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:34.421377 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq"
Apr 16 22:39:55.425414 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:55.425383 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq"
Apr 16 22:39:56.949500 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:56.949450 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq"]
Apr 16 22:39:56.949955 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:56.949772 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq" podUID="72514fde-37e3-456b-8b26-40b5ea5d5f21" containerName="main" containerID="cri-o://6a88ae5f6732fa4369e026903d829e9541efd9409bf2f3d9dcd70609092f8fef" gracePeriod=30
Apr 16 22:39:56.950551 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:56.950095 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq" podUID="72514fde-37e3-456b-8b26-40b5ea5d5f21" containerName="tokenizer" containerID="cri-o://28eb3085126bbbf928d775100e5def90b3b667ae54c3379b782593c28b7a51f0" gracePeriod=30
Apr 16 22:39:57.505205 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:57.505172 2574 generic.go:358] "Generic (PLEG): container finished" podID="72514fde-37e3-456b-8b26-40b5ea5d5f21" containerID="6a88ae5f6732fa4369e026903d829e9541efd9409bf2f3d9dcd70609092f8fef" exitCode=0
Apr 16 22:39:57.505205 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:57.505213 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq" event={"ID":"72514fde-37e3-456b-8b26-40b5ea5d5f21","Type":"ContainerDied","Data":"6a88ae5f6732fa4369e026903d829e9541efd9409bf2f3d9dcd70609092f8fef"}
Apr 16 22:39:58.188595 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:58.188569 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq"
Apr 16 22:39:58.359082 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:58.359048 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72514fde-37e3-456b-8b26-40b5ea5d5f21-kserve-provision-location\") pod \"72514fde-37e3-456b-8b26-40b5ea5d5f21\" (UID: \"72514fde-37e3-456b-8b26-40b5ea5d5f21\") "
Apr 16 22:39:58.359250 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:58.359111 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/72514fde-37e3-456b-8b26-40b5ea5d5f21-tokenizer-uds\") pod \"72514fde-37e3-456b-8b26-40b5ea5d5f21\" (UID: \"72514fde-37e3-456b-8b26-40b5ea5d5f21\") "
Apr 16 22:39:58.359250 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:58.359163 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5q2g\" (UniqueName: \"kubernetes.io/projected/72514fde-37e3-456b-8b26-40b5ea5d5f21-kube-api-access-w5q2g\") pod \"72514fde-37e3-456b-8b26-40b5ea5d5f21\" (UID: \"72514fde-37e3-456b-8b26-40b5ea5d5f21\") "
Apr 16 22:39:58.359250 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:58.359209 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/72514fde-37e3-456b-8b26-40b5ea5d5f21-tls-certs\") pod \"72514fde-37e3-456b-8b26-40b5ea5d5f21\" (UID: \"72514fde-37e3-456b-8b26-40b5ea5d5f21\") "
Apr 16 22:39:58.359250 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:58.359231 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/72514fde-37e3-456b-8b26-40b5ea5d5f21-tokenizer-tmp\") pod \"72514fde-37e3-456b-8b26-40b5ea5d5f21\" (UID: \"72514fde-37e3-456b-8b26-40b5ea5d5f21\") "
Apr 16 22:39:58.359467 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:58.359290 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/72514fde-37e3-456b-8b26-40b5ea5d5f21-tokenizer-cache\") pod \"72514fde-37e3-456b-8b26-40b5ea5d5f21\" (UID: \"72514fde-37e3-456b-8b26-40b5ea5d5f21\") "
Apr 16 22:39:58.359467 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:58.359408 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72514fde-37e3-456b-8b26-40b5ea5d5f21-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "72514fde-37e3-456b-8b26-40b5ea5d5f21" (UID: "72514fde-37e3-456b-8b26-40b5ea5d5f21"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:39:58.359909 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:58.359611 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72514fde-37e3-456b-8b26-40b5ea5d5f21-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "72514fde-37e3-456b-8b26-40b5ea5d5f21" (UID: "72514fde-37e3-456b-8b26-40b5ea5d5f21"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:39:58.359909 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:58.359696 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72514fde-37e3-456b-8b26-40b5ea5d5f21-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "72514fde-37e3-456b-8b26-40b5ea5d5f21" (UID: "72514fde-37e3-456b-8b26-40b5ea5d5f21"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:39:58.359909 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:58.359790 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/72514fde-37e3-456b-8b26-40b5ea5d5f21-tokenizer-tmp\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:39:58.359909 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:58.359810 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/72514fde-37e3-456b-8b26-40b5ea5d5f21-tokenizer-cache\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:39:58.359909 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:58.359829 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/72514fde-37e3-456b-8b26-40b5ea5d5f21-tokenizer-uds\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:39:58.360170 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:58.359971 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72514fde-37e3-456b-8b26-40b5ea5d5f21-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "72514fde-37e3-456b-8b26-40b5ea5d5f21" (UID: "72514fde-37e3-456b-8b26-40b5ea5d5f21"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:39:58.361419 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:58.361397 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72514fde-37e3-456b-8b26-40b5ea5d5f21-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "72514fde-37e3-456b-8b26-40b5ea5d5f21" (UID: "72514fde-37e3-456b-8b26-40b5ea5d5f21"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:39:58.361499 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:58.361480 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72514fde-37e3-456b-8b26-40b5ea5d5f21-kube-api-access-w5q2g" (OuterVolumeSpecName: "kube-api-access-w5q2g") pod "72514fde-37e3-456b-8b26-40b5ea5d5f21" (UID: "72514fde-37e3-456b-8b26-40b5ea5d5f21"). InnerVolumeSpecName "kube-api-access-w5q2g". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:39:58.460797 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:58.460763 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w5q2g\" (UniqueName: \"kubernetes.io/projected/72514fde-37e3-456b-8b26-40b5ea5d5f21-kube-api-access-w5q2g\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:39:58.460797 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:58.460792 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/72514fde-37e3-456b-8b26-40b5ea5d5f21-tls-certs\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:39:58.460797 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:58.460804 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72514fde-37e3-456b-8b26-40b5ea5d5f21-kserve-provision-location\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:39:58.510039 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:58.510006 2574 generic.go:358] "Generic (PLEG): container finished" podID="72514fde-37e3-456b-8b26-40b5ea5d5f21" containerID="28eb3085126bbbf928d775100e5def90b3b667ae54c3379b782593c28b7a51f0" exitCode=0
Apr 16 22:39:58.510205 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:58.510066 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq" event={"ID":"72514fde-37e3-456b-8b26-40b5ea5d5f21","Type":"ContainerDied","Data":"28eb3085126bbbf928d775100e5def90b3b667ae54c3379b782593c28b7a51f0"}
Apr 16 22:39:58.510205 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:58.510090 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq" event={"ID":"72514fde-37e3-456b-8b26-40b5ea5d5f21","Type":"ContainerDied","Data":"3b163161c9abb2f13cb3fb1e3153275bea5353ddb5fe126548fc76e4d3994a9d"}
Apr 16 22:39:58.510205 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:58.510089 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq"
Apr 16 22:39:58.510205 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:58.510105 2574 scope.go:117] "RemoveContainer" containerID="28eb3085126bbbf928d775100e5def90b3b667ae54c3379b782593c28b7a51f0"
Apr 16 22:39:58.519356 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:58.519337 2574 scope.go:117] "RemoveContainer" containerID="6a88ae5f6732fa4369e026903d829e9541efd9409bf2f3d9dcd70609092f8fef"
Apr 16 22:39:58.526622 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:58.526599 2574 scope.go:117] "RemoveContainer" containerID="909dec30f8b8e3ebdaf2f8f1bece6faa1360fd99f9302fbb3c9b72ef42d33c15"
Apr 16 22:39:58.532423 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:58.532401 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq"]
Apr 16 22:39:58.535057 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:58.535036 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-5944f7v7nq"]
Apr 16 22:39:58.535683 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:58.535666 2574 scope.go:117] "RemoveContainer" containerID="28eb3085126bbbf928d775100e5def90b3b667ae54c3379b782593c28b7a51f0"
Apr 16 22:39:58.535961 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:39:58.535940 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28eb3085126bbbf928d775100e5def90b3b667ae54c3379b782593c28b7a51f0\": container with ID starting with 28eb3085126bbbf928d775100e5def90b3b667ae54c3379b782593c28b7a51f0 not found: ID does not exist" containerID="28eb3085126bbbf928d775100e5def90b3b667ae54c3379b782593c28b7a51f0"
Apr 16 22:39:58.536008 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:58.535970 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28eb3085126bbbf928d775100e5def90b3b667ae54c3379b782593c28b7a51f0"} err="failed to get container status \"28eb3085126bbbf928d775100e5def90b3b667ae54c3379b782593c28b7a51f0\": rpc error: code = NotFound desc = could not find container \"28eb3085126bbbf928d775100e5def90b3b667ae54c3379b782593c28b7a51f0\": container with ID starting with 28eb3085126bbbf928d775100e5def90b3b667ae54c3379b782593c28b7a51f0 not found: ID does not exist"
Apr 16 22:39:58.536008 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:58.535988 2574 scope.go:117] "RemoveContainer" containerID="6a88ae5f6732fa4369e026903d829e9541efd9409bf2f3d9dcd70609092f8fef"
Apr 16 22:39:58.536224 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:39:58.536205 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a88ae5f6732fa4369e026903d829e9541efd9409bf2f3d9dcd70609092f8fef\": container with ID starting with 6a88ae5f6732fa4369e026903d829e9541efd9409bf2f3d9dcd70609092f8fef not found: ID does not exist" containerID="6a88ae5f6732fa4369e026903d829e9541efd9409bf2f3d9dcd70609092f8fef"
Apr 16 22:39:58.536265 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:58.536231 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a88ae5f6732fa4369e026903d829e9541efd9409bf2f3d9dcd70609092f8fef"} err="failed to get container status \"6a88ae5f6732fa4369e026903d829e9541efd9409bf2f3d9dcd70609092f8fef\": rpc error: code = NotFound desc = could not find container \"6a88ae5f6732fa4369e026903d829e9541efd9409bf2f3d9dcd70609092f8fef\": container with ID starting with 6a88ae5f6732fa4369e026903d829e9541efd9409bf2f3d9dcd70609092f8fef not found: ID does not exist"
Apr 16 22:39:58.536265 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:58.536247 2574 scope.go:117] "RemoveContainer" containerID="909dec30f8b8e3ebdaf2f8f1bece6faa1360fd99f9302fbb3c9b72ef42d33c15"
Apr 16 22:39:58.536451 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:39:58.536435 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"909dec30f8b8e3ebdaf2f8f1bece6faa1360fd99f9302fbb3c9b72ef42d33c15\": container with ID starting with 909dec30f8b8e3ebdaf2f8f1bece6faa1360fd99f9302fbb3c9b72ef42d33c15 not found: ID does not exist" containerID="909dec30f8b8e3ebdaf2f8f1bece6faa1360fd99f9302fbb3c9b72ef42d33c15"
Apr 16 22:39:58.536497 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:39:58.536455 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"909dec30f8b8e3ebdaf2f8f1bece6faa1360fd99f9302fbb3c9b72ef42d33c15"} err="failed to get container status \"909dec30f8b8e3ebdaf2f8f1bece6faa1360fd99f9302fbb3c9b72ef42d33c15\": rpc error: code = NotFound desc = could not find container \"909dec30f8b8e3ebdaf2f8f1bece6faa1360fd99f9302fbb3c9b72ef42d33c15\": container with ID starting with 909dec30f8b8e3ebdaf2f8f1bece6faa1360fd99f9302fbb3c9b72ef42d33c15 not found: ID does not exist"
Apr 16 22:40:00.264192 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:00.264158 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72514fde-37e3-456b-8b26-40b5ea5d5f21" path="/var/lib/kubelet/pods/72514fde-37e3-456b-8b26-40b5ea5d5f21/volumes"
Apr 16 22:40:11.014102 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:11.014063 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d"]
Apr 16 22:40:11.014481 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:11.014378 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" podUID="34a56726-7522-4dc6-ba50-7237450efa6e" containerName="main" containerID="cri-o://9ecef53c5fdd74165c3c8355f090f1646b30a0b6920a9f7aaaa338e461501f7b" gracePeriod=30
Apr 16 22:40:11.014551 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:11.014454 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" podUID="34a56726-7522-4dc6-ba50-7237450efa6e" containerName="tokenizer" containerID="cri-o://f45e58b6ad07bb916b342098c36088bd31d263ce78503c6f044c28e119de315b" gracePeriod=30
Apr 16 22:40:11.172465 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:40:11.172438 2574 logging.go:55] [core] [Channel #430 SubChannel #431]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.39:9003", ServerName: "10.133.0.39:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.39:9003: connect: connection refused"
Apr 16 22:40:11.555812 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:11.555770 2574 generic.go:358] "Generic (PLEG): container finished" podID="34a56726-7522-4dc6-ba50-7237450efa6e" containerID="9ecef53c5fdd74165c3c8355f090f1646b30a0b6920a9f7aaaa338e461501f7b" exitCode=0
Apr 16 22:40:11.555996 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:11.555834 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" event={"ID":"34a56726-7522-4dc6-ba50-7237450efa6e","Type":"ContainerDied","Data":"9ecef53c5fdd74165c3c8355f090f1646b30a0b6920a9f7aaaa338e461501f7b"}
Apr 16 22:40:12.172364 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.172323 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" podUID="34a56726-7522-4dc6-ba50-7237450efa6e" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.39:9003\" within 1s: context deadline exceeded"
Apr 16 22:40:12.265967 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.265947 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d"
Apr 16 22:40:12.356908 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.356828 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/34a56726-7522-4dc6-ba50-7237450efa6e-tokenizer-uds\") pod \"34a56726-7522-4dc6-ba50-7237450efa6e\" (UID: \"34a56726-7522-4dc6-ba50-7237450efa6e\") "
Apr 16 22:40:12.356908 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.356861 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34a56726-7522-4dc6-ba50-7237450efa6e-kserve-provision-location\") pod \"34a56726-7522-4dc6-ba50-7237450efa6e\" (UID: \"34a56726-7522-4dc6-ba50-7237450efa6e\") "
Apr 16 22:40:12.356908 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.356895 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/34a56726-7522-4dc6-ba50-7237450efa6e-tls-certs\") pod \"34a56726-7522-4dc6-ba50-7237450efa6e\" (UID: \"34a56726-7522-4dc6-ba50-7237450efa6e\") "
Apr 16 22:40:12.357133 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.356948 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/34a56726-7522-4dc6-ba50-7237450efa6e-tokenizer-tmp\") pod \"34a56726-7522-4dc6-ba50-7237450efa6e\" (UID: \"34a56726-7522-4dc6-ba50-7237450efa6e\") "
Apr 16 22:40:12.357133 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.356987 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/34a56726-7522-4dc6-ba50-7237450efa6e-tokenizer-cache\") pod \"34a56726-7522-4dc6-ba50-7237450efa6e\" (UID: \"34a56726-7522-4dc6-ba50-7237450efa6e\") "
Apr 16 22:40:12.357133 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.357014 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h97mp\" (UniqueName: \"kubernetes.io/projected/34a56726-7522-4dc6-ba50-7237450efa6e-kube-api-access-h97mp\") pod \"34a56726-7522-4dc6-ba50-7237450efa6e\" (UID: \"34a56726-7522-4dc6-ba50-7237450efa6e\") "
Apr 16 22:40:12.357296 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.357133 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34a56726-7522-4dc6-ba50-7237450efa6e-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "34a56726-7522-4dc6-ba50-7237450efa6e" (UID: "34a56726-7522-4dc6-ba50-7237450efa6e"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:40:12.357296 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.357279 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34a56726-7522-4dc6-ba50-7237450efa6e-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "34a56726-7522-4dc6-ba50-7237450efa6e" (UID: "34a56726-7522-4dc6-ba50-7237450efa6e"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:40:12.357407 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.357313 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/34a56726-7522-4dc6-ba50-7237450efa6e-tokenizer-uds\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:40:12.357407 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.357362 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34a56726-7522-4dc6-ba50-7237450efa6e-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "34a56726-7522-4dc6-ba50-7237450efa6e" (UID: "34a56726-7522-4dc6-ba50-7237450efa6e"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:40:12.357731 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.357707 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34a56726-7522-4dc6-ba50-7237450efa6e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "34a56726-7522-4dc6-ba50-7237450efa6e" (UID: "34a56726-7522-4dc6-ba50-7237450efa6e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:40:12.359001 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.358973 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34a56726-7522-4dc6-ba50-7237450efa6e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "34a56726-7522-4dc6-ba50-7237450efa6e" (UID: "34a56726-7522-4dc6-ba50-7237450efa6e"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 22:40:12.359177 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.359154 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34a56726-7522-4dc6-ba50-7237450efa6e-kube-api-access-h97mp" (OuterVolumeSpecName: "kube-api-access-h97mp") pod "34a56726-7522-4dc6-ba50-7237450efa6e" (UID: "34a56726-7522-4dc6-ba50-7237450efa6e"). InnerVolumeSpecName "kube-api-access-h97mp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:40:12.457687 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.457647 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/34a56726-7522-4dc6-ba50-7237450efa6e-tokenizer-tmp\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:40:12.457687 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.457682 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/34a56726-7522-4dc6-ba50-7237450efa6e-tokenizer-cache\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:40:12.457687 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.457691 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h97mp\" (UniqueName: \"kubernetes.io/projected/34a56726-7522-4dc6-ba50-7237450efa6e-kube-api-access-h97mp\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:40:12.457950 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.457701 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34a56726-7522-4dc6-ba50-7237450efa6e-kserve-provision-location\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:40:12.457950 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.457711 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/34a56726-7522-4dc6-ba50-7237450efa6e-tls-certs\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\""
Apr 16 22:40:12.561963 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.561929 2574 generic.go:358] "Generic (PLEG): container finished" podID="34a56726-7522-4dc6-ba50-7237450efa6e" containerID="f45e58b6ad07bb916b342098c36088bd31d263ce78503c6f044c28e119de315b" exitCode=0
Apr 16 22:40:12.562126 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.561998 2574
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" Apr 16 22:40:12.562126 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.562009 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" event={"ID":"34a56726-7522-4dc6-ba50-7237450efa6e","Type":"ContainerDied","Data":"f45e58b6ad07bb916b342098c36088bd31d263ce78503c6f044c28e119de315b"} Apr 16 22:40:12.562126 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.562047 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d" event={"ID":"34a56726-7522-4dc6-ba50-7237450efa6e","Type":"ContainerDied","Data":"f9a8b96dcabf981e9bf5a0aa6f92fca7364240b27e684fde3c6454421eeaa54a"} Apr 16 22:40:12.562126 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.562065 2574 scope.go:117] "RemoveContainer" containerID="f45e58b6ad07bb916b342098c36088bd31d263ce78503c6f044c28e119de315b" Apr 16 22:40:12.571590 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.571575 2574 scope.go:117] "RemoveContainer" containerID="9ecef53c5fdd74165c3c8355f090f1646b30a0b6920a9f7aaaa338e461501f7b" Apr 16 22:40:12.578446 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.578431 2574 scope.go:117] "RemoveContainer" containerID="9566f7127df35dd8f459889bf034e5c04e9dfe8d1cfa03c4f11c7b9c098e7cb8" Apr 16 22:40:12.585327 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.585305 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d"] Apr 16 22:40:12.586161 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.586144 2574 scope.go:117] "RemoveContainer" containerID="f45e58b6ad07bb916b342098c36088bd31d263ce78503c6f044c28e119de315b" Apr 16 22:40:12.586396 ip-10-0-133-72 kubenswrapper[2574]: E0416 
22:40:12.586376 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f45e58b6ad07bb916b342098c36088bd31d263ce78503c6f044c28e119de315b\": container with ID starting with f45e58b6ad07bb916b342098c36088bd31d263ce78503c6f044c28e119de315b not found: ID does not exist" containerID="f45e58b6ad07bb916b342098c36088bd31d263ce78503c6f044c28e119de315b" Apr 16 22:40:12.586447 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.586404 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f45e58b6ad07bb916b342098c36088bd31d263ce78503c6f044c28e119de315b"} err="failed to get container status \"f45e58b6ad07bb916b342098c36088bd31d263ce78503c6f044c28e119de315b\": rpc error: code = NotFound desc = could not find container \"f45e58b6ad07bb916b342098c36088bd31d263ce78503c6f044c28e119de315b\": container with ID starting with f45e58b6ad07bb916b342098c36088bd31d263ce78503c6f044c28e119de315b not found: ID does not exist" Apr 16 22:40:12.586447 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.586425 2574 scope.go:117] "RemoveContainer" containerID="9ecef53c5fdd74165c3c8355f090f1646b30a0b6920a9f7aaaa338e461501f7b" Apr 16 22:40:12.586644 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:40:12.586629 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ecef53c5fdd74165c3c8355f090f1646b30a0b6920a9f7aaaa338e461501f7b\": container with ID starting with 9ecef53c5fdd74165c3c8355f090f1646b30a0b6920a9f7aaaa338e461501f7b not found: ID does not exist" containerID="9ecef53c5fdd74165c3c8355f090f1646b30a0b6920a9f7aaaa338e461501f7b" Apr 16 22:40:12.586684 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.586648 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ecef53c5fdd74165c3c8355f090f1646b30a0b6920a9f7aaaa338e461501f7b"} err="failed to get container status 
\"9ecef53c5fdd74165c3c8355f090f1646b30a0b6920a9f7aaaa338e461501f7b\": rpc error: code = NotFound desc = could not find container \"9ecef53c5fdd74165c3c8355f090f1646b30a0b6920a9f7aaaa338e461501f7b\": container with ID starting with 9ecef53c5fdd74165c3c8355f090f1646b30a0b6920a9f7aaaa338e461501f7b not found: ID does not exist" Apr 16 22:40:12.586684 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.586660 2574 scope.go:117] "RemoveContainer" containerID="9566f7127df35dd8f459889bf034e5c04e9dfe8d1cfa03c4f11c7b9c098e7cb8" Apr 16 22:40:12.586922 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:40:12.586904 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9566f7127df35dd8f459889bf034e5c04e9dfe8d1cfa03c4f11c7b9c098e7cb8\": container with ID starting with 9566f7127df35dd8f459889bf034e5c04e9dfe8d1cfa03c4f11c7b9c098e7cb8 not found: ID does not exist" containerID="9566f7127df35dd8f459889bf034e5c04e9dfe8d1cfa03c4f11c7b9c098e7cb8" Apr 16 22:40:12.586974 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.586927 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9566f7127df35dd8f459889bf034e5c04e9dfe8d1cfa03c4f11c7b9c098e7cb8"} err="failed to get container status \"9566f7127df35dd8f459889bf034e5c04e9dfe8d1cfa03c4f11c7b9c098e7cb8\": rpc error: code = NotFound desc = could not find container \"9566f7127df35dd8f459889bf034e5c04e9dfe8d1cfa03c4f11c7b9c098e7cb8\": container with ID starting with 9566f7127df35dd8f459889bf034e5c04e9dfe8d1cfa03c4f11c7b9c098e7cb8 not found: ID does not exist" Apr 16 22:40:12.591438 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:12.591420 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-698bcb566d"] Apr 16 22:40:14.263418 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:14.263388 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="34a56726-7522-4dc6-ba50-7237450efa6e" path="/var/lib/kubelet/pods/34a56726-7522-4dc6-ba50-7237450efa6e/volumes" Apr 16 22:40:25.268336 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.268256 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6"] Apr 16 22:40:25.268779 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.268768 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34a56726-7522-4dc6-ba50-7237450efa6e" containerName="storage-initializer" Apr 16 22:40:25.268854 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.268786 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="34a56726-7522-4dc6-ba50-7237450efa6e" containerName="storage-initializer" Apr 16 22:40:25.268854 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.268801 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34a56726-7522-4dc6-ba50-7237450efa6e" containerName="main" Apr 16 22:40:25.268854 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.268809 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="34a56726-7522-4dc6-ba50-7237450efa6e" containerName="main" Apr 16 22:40:25.268854 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.268819 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72514fde-37e3-456b-8b26-40b5ea5d5f21" containerName="tokenizer" Apr 16 22:40:25.268854 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.268828 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="72514fde-37e3-456b-8b26-40b5ea5d5f21" containerName="tokenizer" Apr 16 22:40:25.268854 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.268845 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72514fde-37e3-456b-8b26-40b5ea5d5f21" containerName="main" Apr 16 22:40:25.268854 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.268852 2574 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="72514fde-37e3-456b-8b26-40b5ea5d5f21" containerName="main" Apr 16 22:40:25.269508 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.268868 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34a56726-7522-4dc6-ba50-7237450efa6e" containerName="tokenizer" Apr 16 22:40:25.269508 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.268877 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="34a56726-7522-4dc6-ba50-7237450efa6e" containerName="tokenizer" Apr 16 22:40:25.269508 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.268885 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72514fde-37e3-456b-8b26-40b5ea5d5f21" containerName="storage-initializer" Apr 16 22:40:25.269508 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.268893 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="72514fde-37e3-456b-8b26-40b5ea5d5f21" containerName="storage-initializer" Apr 16 22:40:25.269508 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.268994 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="72514fde-37e3-456b-8b26-40b5ea5d5f21" containerName="tokenizer" Apr 16 22:40:25.269508 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.269006 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="72514fde-37e3-456b-8b26-40b5ea5d5f21" containerName="main" Apr 16 22:40:25.269508 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.269027 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="34a56726-7522-4dc6-ba50-7237450efa6e" containerName="tokenizer" Apr 16 22:40:25.269508 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.269043 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="34a56726-7522-4dc6-ba50-7237450efa6e" containerName="main" Apr 16 22:40:25.276500 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.276475 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" Apr 16 22:40:25.280192 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.280163 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-epp-sa-dockercfg-qhxz2\"" Apr 16 22:40:25.280315 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.280198 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 16 22:40:25.280480 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.280428 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-pt4qs\"" Apr 16 22:40:25.280784 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.280762 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6"] Apr 16 22:40:25.363615 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.363586 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/090e554a-8691-4d88-a1c8-c706f23deee4-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6\" (UID: \"090e554a-8691-4d88-a1c8-c706f23deee4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" Apr 16 22:40:25.363765 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.363621 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/090e554a-8691-4d88-a1c8-c706f23deee4-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6\" (UID: \"090e554a-8691-4d88-a1c8-c706f23deee4\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" Apr 16 22:40:25.363765 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.363648 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rlrj\" (UniqueName: \"kubernetes.io/projected/090e554a-8691-4d88-a1c8-c706f23deee4-kube-api-access-5rlrj\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6\" (UID: \"090e554a-8691-4d88-a1c8-c706f23deee4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" Apr 16 22:40:25.363765 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.363723 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/090e554a-8691-4d88-a1c8-c706f23deee4-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6\" (UID: \"090e554a-8691-4d88-a1c8-c706f23deee4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" Apr 16 22:40:25.363937 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.363817 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/090e554a-8691-4d88-a1c8-c706f23deee4-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6\" (UID: \"090e554a-8691-4d88-a1c8-c706f23deee4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" Apr 16 22:40:25.363937 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.363853 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/090e554a-8691-4d88-a1c8-c706f23deee4-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6\" (UID: 
\"090e554a-8691-4d88-a1c8-c706f23deee4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" Apr 16 22:40:25.464626 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.464594 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/090e554a-8691-4d88-a1c8-c706f23deee4-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6\" (UID: \"090e554a-8691-4d88-a1c8-c706f23deee4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" Apr 16 22:40:25.464827 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.464651 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/090e554a-8691-4d88-a1c8-c706f23deee4-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6\" (UID: \"090e554a-8691-4d88-a1c8-c706f23deee4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" Apr 16 22:40:25.464827 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.464690 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/090e554a-8691-4d88-a1c8-c706f23deee4-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6\" (UID: \"090e554a-8691-4d88-a1c8-c706f23deee4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" Apr 16 22:40:25.464827 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.464725 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/090e554a-8691-4d88-a1c8-c706f23deee4-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6\" (UID: \"090e554a-8691-4d88-a1c8-c706f23deee4\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" Apr 16 22:40:25.464952 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.464908 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rlrj\" (UniqueName: \"kubernetes.io/projected/090e554a-8691-4d88-a1c8-c706f23deee4-kube-api-access-5rlrj\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6\" (UID: \"090e554a-8691-4d88-a1c8-c706f23deee4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" Apr 16 22:40:25.465005 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.464960 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/090e554a-8691-4d88-a1c8-c706f23deee4-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6\" (UID: \"090e554a-8691-4d88-a1c8-c706f23deee4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" Apr 16 22:40:25.465072 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.465054 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/090e554a-8691-4d88-a1c8-c706f23deee4-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6\" (UID: \"090e554a-8691-4d88-a1c8-c706f23deee4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" Apr 16 22:40:25.465121 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.465102 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/090e554a-8691-4d88-a1c8-c706f23deee4-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6\" (UID: \"090e554a-8691-4d88-a1c8-c706f23deee4\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" Apr 16 22:40:25.465175 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.465155 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/090e554a-8691-4d88-a1c8-c706f23deee4-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6\" (UID: \"090e554a-8691-4d88-a1c8-c706f23deee4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" Apr 16 22:40:25.465304 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.465289 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/090e554a-8691-4d88-a1c8-c706f23deee4-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6\" (UID: \"090e554a-8691-4d88-a1c8-c706f23deee4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" Apr 16 22:40:25.467009 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.466987 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/090e554a-8691-4d88-a1c8-c706f23deee4-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6\" (UID: \"090e554a-8691-4d88-a1c8-c706f23deee4\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" Apr 16 22:40:25.473301 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.473277 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rlrj\" (UniqueName: \"kubernetes.io/projected/090e554a-8691-4d88-a1c8-c706f23deee4-kube-api-access-5rlrj\") pod \"router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6\" (UID: \"090e554a-8691-4d88-a1c8-c706f23deee4\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" Apr 16 22:40:25.588269 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.588190 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" Apr 16 22:40:25.713787 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:25.713760 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6"] Apr 16 22:40:25.716361 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:40:25.716333 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod090e554a_8691_4d88_a1c8_c706f23deee4.slice/crio-1b146030fe63d992b488adeaeb7a468f7a860c0985180872db583fc9af7415eb WatchSource:0}: Error finding container 1b146030fe63d992b488adeaeb7a468f7a860c0985180872db583fc9af7415eb: Status 404 returned error can't find the container with id 1b146030fe63d992b488adeaeb7a468f7a860c0985180872db583fc9af7415eb Apr 16 22:40:26.609932 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:26.609899 2574 generic.go:358] "Generic (PLEG): container finished" podID="090e554a-8691-4d88-a1c8-c706f23deee4" containerID="118fe8bef9fc0d730f41415780cd088727a3b73561e1ed4e7c6f514e6f5b8601" exitCode=0 Apr 16 22:40:26.610304 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:26.609961 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" event={"ID":"090e554a-8691-4d88-a1c8-c706f23deee4","Type":"ContainerDied","Data":"118fe8bef9fc0d730f41415780cd088727a3b73561e1ed4e7c6f514e6f5b8601"} Apr 16 22:40:26.610304 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:26.609982 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" 
event={"ID":"090e554a-8691-4d88-a1c8-c706f23deee4","Type":"ContainerStarted","Data":"1b146030fe63d992b488adeaeb7a468f7a860c0985180872db583fc9af7415eb"} Apr 16 22:40:27.614636 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:27.614601 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" event={"ID":"090e554a-8691-4d88-a1c8-c706f23deee4","Type":"ContainerStarted","Data":"5073a99af1e6c53b48e670c38929b4c694c63ebdfc74bb54b0fdf5b3effade14"} Apr 16 22:40:27.614636 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:27.614635 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" event={"ID":"090e554a-8691-4d88-a1c8-c706f23deee4","Type":"ContainerStarted","Data":"eb4adbf2361f09ea48b9e438d7a1ed7328ee7ccf44e569df20e2ef6bc88ff847"} Apr 16 22:40:27.615065 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:27.614724 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" Apr 16 22:40:27.633312 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:27.633271 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" podStartSLOduration=2.6332593859999998 podStartE2EDuration="2.633259386s" podCreationTimestamp="2026-04-16 22:40:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:40:27.631884326 +0000 UTC m=+1618.000290512" watchObservedRunningTime="2026-04-16 22:40:27.633259386 +0000 UTC m=+1618.001665536" Apr 16 22:40:35.588819 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:35.588774 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" Apr 16 22:40:35.589250 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:35.588913 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" Apr 16 22:40:35.591671 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:35.591649 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" Apr 16 22:40:35.643655 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:35.643625 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" Apr 16 22:40:57.650680 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:40:57.650651 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" Apr 16 22:42:15.590225 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:15.590191 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" podUID="090e554a-8691-4d88-a1c8-c706f23deee4" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 22:42:15.621012 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:15.620981 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6"] Apr 16 22:42:15.621364 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:15.621335 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" podUID="090e554a-8691-4d88-a1c8-c706f23deee4" containerName="main" 
containerID="cri-o://eb4adbf2361f09ea48b9e438d7a1ed7328ee7ccf44e569df20e2ef6bc88ff847" gracePeriod=30 Apr 16 22:42:15.621471 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:15.621352 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" podUID="090e554a-8691-4d88-a1c8-c706f23deee4" containerName="tokenizer" containerID="cri-o://5073a99af1e6c53b48e670c38929b4c694c63ebdfc74bb54b0fdf5b3effade14" gracePeriod=30 Apr 16 22:42:15.643404 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:15.643362 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" podUID="090e554a-8691-4d88-a1c8-c706f23deee4" containerName="tokenizer" probeResult="failure" output="Get \"http://10.133.0.41:8082/healthz\": dial tcp 10.133.0.41:8082: connect: connection refused" Apr 16 22:42:15.973839 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:15.973751 2574 generic.go:358] "Generic (PLEG): container finished" podID="090e554a-8691-4d88-a1c8-c706f23deee4" containerID="eb4adbf2361f09ea48b9e438d7a1ed7328ee7ccf44e569df20e2ef6bc88ff847" exitCode=0 Apr 16 22:42:15.973839 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:15.973811 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" event={"ID":"090e554a-8691-4d88-a1c8-c706f23deee4","Type":"ContainerDied","Data":"eb4adbf2361f09ea48b9e438d7a1ed7328ee7ccf44e569df20e2ef6bc88ff847"} Apr 16 22:42:16.775994 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:16.775972 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" Apr 16 22:42:16.845435 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:16.845358 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/090e554a-8691-4d88-a1c8-c706f23deee4-tokenizer-uds\") pod \"090e554a-8691-4d88-a1c8-c706f23deee4\" (UID: \"090e554a-8691-4d88-a1c8-c706f23deee4\") " Apr 16 22:42:16.845435 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:16.845400 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/090e554a-8691-4d88-a1c8-c706f23deee4-tokenizer-cache\") pod \"090e554a-8691-4d88-a1c8-c706f23deee4\" (UID: \"090e554a-8691-4d88-a1c8-c706f23deee4\") " Apr 16 22:42:16.845435 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:16.845431 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rlrj\" (UniqueName: \"kubernetes.io/projected/090e554a-8691-4d88-a1c8-c706f23deee4-kube-api-access-5rlrj\") pod \"090e554a-8691-4d88-a1c8-c706f23deee4\" (UID: \"090e554a-8691-4d88-a1c8-c706f23deee4\") " Apr 16 22:42:16.845666 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:16.845457 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/090e554a-8691-4d88-a1c8-c706f23deee4-tokenizer-tmp\") pod \"090e554a-8691-4d88-a1c8-c706f23deee4\" (UID: \"090e554a-8691-4d88-a1c8-c706f23deee4\") " Apr 16 22:42:16.845666 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:16.845509 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/090e554a-8691-4d88-a1c8-c706f23deee4-tls-certs\") pod \"090e554a-8691-4d88-a1c8-c706f23deee4\" (UID: \"090e554a-8691-4d88-a1c8-c706f23deee4\") " Apr 16 22:42:16.845666 
ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:16.845530 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/090e554a-8691-4d88-a1c8-c706f23deee4-kserve-provision-location\") pod \"090e554a-8691-4d88-a1c8-c706f23deee4\" (UID: \"090e554a-8691-4d88-a1c8-c706f23deee4\") " Apr 16 22:42:16.845666 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:16.845632 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/090e554a-8691-4d88-a1c8-c706f23deee4-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "090e554a-8691-4d88-a1c8-c706f23deee4" (UID: "090e554a-8691-4d88-a1c8-c706f23deee4"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:42:16.845911 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:16.845667 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/090e554a-8691-4d88-a1c8-c706f23deee4-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "090e554a-8691-4d88-a1c8-c706f23deee4" (UID: "090e554a-8691-4d88-a1c8-c706f23deee4"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:42:16.845911 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:16.845841 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/090e554a-8691-4d88-a1c8-c706f23deee4-tokenizer-uds\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:42:16.845911 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:16.845848 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/090e554a-8691-4d88-a1c8-c706f23deee4-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "090e554a-8691-4d88-a1c8-c706f23deee4" (UID: "090e554a-8691-4d88-a1c8-c706f23deee4"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:42:16.845911 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:16.845860 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/090e554a-8691-4d88-a1c8-c706f23deee4-tokenizer-cache\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:42:16.846399 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:16.846373 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/090e554a-8691-4d88-a1c8-c706f23deee4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "090e554a-8691-4d88-a1c8-c706f23deee4" (UID: "090e554a-8691-4d88-a1c8-c706f23deee4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:42:16.847581 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:16.847556 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/090e554a-8691-4d88-a1c8-c706f23deee4-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "090e554a-8691-4d88-a1c8-c706f23deee4" (UID: "090e554a-8691-4d88-a1c8-c706f23deee4"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:42:16.847581 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:16.847562 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/090e554a-8691-4d88-a1c8-c706f23deee4-kube-api-access-5rlrj" (OuterVolumeSpecName: "kube-api-access-5rlrj") pod "090e554a-8691-4d88-a1c8-c706f23deee4" (UID: "090e554a-8691-4d88-a1c8-c706f23deee4"). InnerVolumeSpecName "kube-api-access-5rlrj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:42:16.947114 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:16.947084 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5rlrj\" (UniqueName: \"kubernetes.io/projected/090e554a-8691-4d88-a1c8-c706f23deee4-kube-api-access-5rlrj\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:42:16.947114 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:16.947109 2574 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/090e554a-8691-4d88-a1c8-c706f23deee4-tokenizer-tmp\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:42:16.947114 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:16.947119 2574 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/090e554a-8691-4d88-a1c8-c706f23deee4-tls-certs\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:42:16.947335 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:16.947129 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/090e554a-8691-4d88-a1c8-c706f23deee4-kserve-provision-location\") on node \"ip-10-0-133-72.ec2.internal\" DevicePath \"\"" Apr 16 22:42:16.978445 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:16.978416 2574 generic.go:358] "Generic (PLEG): container finished" podID="090e554a-8691-4d88-a1c8-c706f23deee4" containerID="5073a99af1e6c53b48e670c38929b4c694c63ebdfc74bb54b0fdf5b3effade14" exitCode=0 Apr 16 22:42:16.978585 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:16.978480 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" Apr 16 22:42:16.978585 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:16.978503 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" event={"ID":"090e554a-8691-4d88-a1c8-c706f23deee4","Type":"ContainerDied","Data":"5073a99af1e6c53b48e670c38929b4c694c63ebdfc74bb54b0fdf5b3effade14"} Apr 16 22:42:16.978585 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:16.978551 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6" event={"ID":"090e554a-8691-4d88-a1c8-c706f23deee4","Type":"ContainerDied","Data":"1b146030fe63d992b488adeaeb7a468f7a860c0985180872db583fc9af7415eb"} Apr 16 22:42:16.978585 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:16.978571 2574 scope.go:117] "RemoveContainer" containerID="5073a99af1e6c53b48e670c38929b4c694c63ebdfc74bb54b0fdf5b3effade14" Apr 16 22:42:16.989817 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:16.988327 2574 scope.go:117] "RemoveContainer" containerID="eb4adbf2361f09ea48b9e438d7a1ed7328ee7ccf44e569df20e2ef6bc88ff847" Apr 16 22:42:16.996538 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:16.996519 2574 scope.go:117] "RemoveContainer" containerID="118fe8bef9fc0d730f41415780cd088727a3b73561e1ed4e7c6f514e6f5b8601" Apr 16 22:42:17.003459 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:17.003443 2574 scope.go:117] "RemoveContainer" containerID="5073a99af1e6c53b48e670c38929b4c694c63ebdfc74bb54b0fdf5b3effade14" Apr 16 22:42:17.003726 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:42:17.003702 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5073a99af1e6c53b48e670c38929b4c694c63ebdfc74bb54b0fdf5b3effade14\": container with ID starting with 
5073a99af1e6c53b48e670c38929b4c694c63ebdfc74bb54b0fdf5b3effade14 not found: ID does not exist" containerID="5073a99af1e6c53b48e670c38929b4c694c63ebdfc74bb54b0fdf5b3effade14" Apr 16 22:42:17.003874 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:17.003734 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5073a99af1e6c53b48e670c38929b4c694c63ebdfc74bb54b0fdf5b3effade14"} err="failed to get container status \"5073a99af1e6c53b48e670c38929b4c694c63ebdfc74bb54b0fdf5b3effade14\": rpc error: code = NotFound desc = could not find container \"5073a99af1e6c53b48e670c38929b4c694c63ebdfc74bb54b0fdf5b3effade14\": container with ID starting with 5073a99af1e6c53b48e670c38929b4c694c63ebdfc74bb54b0fdf5b3effade14 not found: ID does not exist" Apr 16 22:42:17.003874 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:17.003794 2574 scope.go:117] "RemoveContainer" containerID="eb4adbf2361f09ea48b9e438d7a1ed7328ee7ccf44e569df20e2ef6bc88ff847" Apr 16 22:42:17.004153 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:42:17.004136 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb4adbf2361f09ea48b9e438d7a1ed7328ee7ccf44e569df20e2ef6bc88ff847\": container with ID starting with eb4adbf2361f09ea48b9e438d7a1ed7328ee7ccf44e569df20e2ef6bc88ff847 not found: ID does not exist" containerID="eb4adbf2361f09ea48b9e438d7a1ed7328ee7ccf44e569df20e2ef6bc88ff847" Apr 16 22:42:17.004202 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:17.004158 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb4adbf2361f09ea48b9e438d7a1ed7328ee7ccf44e569df20e2ef6bc88ff847"} err="failed to get container status \"eb4adbf2361f09ea48b9e438d7a1ed7328ee7ccf44e569df20e2ef6bc88ff847\": rpc error: code = NotFound desc = could not find container \"eb4adbf2361f09ea48b9e438d7a1ed7328ee7ccf44e569df20e2ef6bc88ff847\": container with ID starting with 
eb4adbf2361f09ea48b9e438d7a1ed7328ee7ccf44e569df20e2ef6bc88ff847 not found: ID does not exist" Apr 16 22:42:17.004202 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:17.004173 2574 scope.go:117] "RemoveContainer" containerID="118fe8bef9fc0d730f41415780cd088727a3b73561e1ed4e7c6f514e6f5b8601" Apr 16 22:42:17.004401 ip-10-0-133-72 kubenswrapper[2574]: E0416 22:42:17.004383 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"118fe8bef9fc0d730f41415780cd088727a3b73561e1ed4e7c6f514e6f5b8601\": container with ID starting with 118fe8bef9fc0d730f41415780cd088727a3b73561e1ed4e7c6f514e6f5b8601 not found: ID does not exist" containerID="118fe8bef9fc0d730f41415780cd088727a3b73561e1ed4e7c6f514e6f5b8601" Apr 16 22:42:17.004441 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:17.004407 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"118fe8bef9fc0d730f41415780cd088727a3b73561e1ed4e7c6f514e6f5b8601"} err="failed to get container status \"118fe8bef9fc0d730f41415780cd088727a3b73561e1ed4e7c6f514e6f5b8601\": rpc error: code = NotFound desc = could not find container \"118fe8bef9fc0d730f41415780cd088727a3b73561e1ed4e7c6f514e6f5b8601\": container with ID starting with 118fe8bef9fc0d730f41415780cd088727a3b73561e1ed4e7c6f514e6f5b8601 not found: ID does not exist" Apr 16 22:42:17.010177 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:17.010154 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6"] Apr 16 22:42:17.015141 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:17.015122 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-5c954fdf6fxsh6"] Apr 16 22:42:18.262361 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:18.262328 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="090e554a-8691-4d88-a1c8-c706f23deee4" path="/var/lib/kubelet/pods/090e554a-8691-4d88-a1c8-c706f23deee4/volumes" Apr 16 22:42:30.895597 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:30.895564 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-cft8h_bc78cd9a-99b3-4352-b1e0-5e88a17d0c32/istio-proxy/0.log" Apr 16 22:42:31.950062 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:31.950032 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-cft8h_bc78cd9a-99b3-4352-b1e0-5e88a17d0c32/istio-proxy/0.log" Apr 16 22:42:32.965930 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:32.965902 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-cft8h_bc78cd9a-99b3-4352-b1e0-5e88a17d0c32/istio-proxy/0.log" Apr 16 22:42:33.964898 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:33.964861 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-cft8h_bc78cd9a-99b3-4352-b1e0-5e88a17d0c32/istio-proxy/0.log" Apr 16 22:42:34.956763 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:34.956709 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-cft8h_bc78cd9a-99b3-4352-b1e0-5e88a17d0c32/istio-proxy/0.log" Apr 16 22:42:35.922724 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:35.922682 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-cft8h_bc78cd9a-99b3-4352-b1e0-5e88a17d0c32/istio-proxy/0.log" Apr 16 22:42:36.904387 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:36.904352 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-cft8h_bc78cd9a-99b3-4352-b1e0-5e88a17d0c32/istio-proxy/0.log" Apr 16 22:42:37.914916 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:37.914880 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-cft8h_bc78cd9a-99b3-4352-b1e0-5e88a17d0c32/istio-proxy/0.log" Apr 16 22:42:38.951930 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:38.951904 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-cft8h_bc78cd9a-99b3-4352-b1e0-5e88a17d0c32/istio-proxy/0.log" Apr 16 22:42:39.951358 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:39.951330 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-cft8h_bc78cd9a-99b3-4352-b1e0-5e88a17d0c32/istio-proxy/0.log" Apr 16 22:42:40.981255 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:40.981230 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-cft8h_bc78cd9a-99b3-4352-b1e0-5e88a17d0c32/istio-proxy/0.log" Apr 16 22:42:42.009760 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:42.009712 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-cft8h_bc78cd9a-99b3-4352-b1e0-5e88a17d0c32/istio-proxy/0.log" Apr 16 22:42:42.990338 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:42.990310 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-cft8h_bc78cd9a-99b3-4352-b1e0-5e88a17d0c32/istio-proxy/0.log" Apr 16 22:42:43.974012 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:43.973980 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-cft8h_bc78cd9a-99b3-4352-b1e0-5e88a17d0c32/istio-proxy/0.log" Apr 16 22:42:45.000939 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:45.000909 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-t6fzj_cd3dff7e-3d30-4c96-8e98-be1bceff5295/istio-proxy/0.log" Apr 16 22:42:45.836953 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:45.836923 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-t6fzj_cd3dff7e-3d30-4c96-8e98-be1bceff5295/istio-proxy/0.log" Apr 16 22:42:46.624694 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:46.624664 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-rmwzh_ef6ec432-e618-4ffe-b27f-93fdba577807/authorino/0.log" Apr 16 22:42:46.648204 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:46.648160 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-59knw_11810e52-ac90-4e12-ad4b-5be0969566a6/manager/0.log" Apr 16 22:42:49.035729 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:49.035689 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6fvlv/must-gather-tkfbf"] Apr 16 22:42:49.036228 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:49.036206 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="090e554a-8691-4d88-a1c8-c706f23deee4" containerName="storage-initializer" Apr 16 22:42:49.036319 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:49.036231 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="090e554a-8691-4d88-a1c8-c706f23deee4" containerName="storage-initializer" Apr 16 22:42:49.036319 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:49.036246 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="090e554a-8691-4d88-a1c8-c706f23deee4" containerName="tokenizer" Apr 16 22:42:49.036319 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:49.036254 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="090e554a-8691-4d88-a1c8-c706f23deee4" containerName="tokenizer" Apr 16 22:42:49.036319 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:49.036273 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="090e554a-8691-4d88-a1c8-c706f23deee4" containerName="main" Apr 16 22:42:49.036319 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:49.036283 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="090e554a-8691-4d88-a1c8-c706f23deee4" containerName="main" Apr 16 22:42:49.036567 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:49.036375 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="090e554a-8691-4d88-a1c8-c706f23deee4" containerName="tokenizer" Apr 16 22:42:49.036567 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:49.036391 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="090e554a-8691-4d88-a1c8-c706f23deee4" containerName="main" Apr 16 22:42:49.040678 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:49.040658 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6fvlv/must-gather-tkfbf" Apr 16 22:42:49.043217 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:49.043193 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6fvlv\"/\"kube-root-ca.crt\"" Apr 16 22:42:49.043903 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:49.043885 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-6fvlv\"/\"default-dockercfg-t4h5p\"" Apr 16 22:42:49.043903 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:49.043896 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6fvlv\"/\"openshift-service-ca.crt\"" Apr 16 22:42:49.050641 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:49.050617 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6fvlv/must-gather-tkfbf"] Apr 16 22:42:49.121965 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:49.121931 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/80f7fb3d-1ea3-4640-8898-b4cf67a13337-must-gather-output\") pod \"must-gather-tkfbf\" (UID: \"80f7fb3d-1ea3-4640-8898-b4cf67a13337\") " pod="openshift-must-gather-6fvlv/must-gather-tkfbf" Apr 16 22:42:49.122131 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:49.121975 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4vtb\" (UniqueName: \"kubernetes.io/projected/80f7fb3d-1ea3-4640-8898-b4cf67a13337-kube-api-access-k4vtb\") pod \"must-gather-tkfbf\" (UID: \"80f7fb3d-1ea3-4640-8898-b4cf67a13337\") " pod="openshift-must-gather-6fvlv/must-gather-tkfbf" Apr 16 22:42:49.222777 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:49.222722 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/80f7fb3d-1ea3-4640-8898-b4cf67a13337-must-gather-output\") pod \"must-gather-tkfbf\" (UID: \"80f7fb3d-1ea3-4640-8898-b4cf67a13337\") " pod="openshift-must-gather-6fvlv/must-gather-tkfbf" Apr 16 22:42:49.222777 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:49.222782 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4vtb\" (UniqueName: \"kubernetes.io/projected/80f7fb3d-1ea3-4640-8898-b4cf67a13337-kube-api-access-k4vtb\") pod \"must-gather-tkfbf\" (UID: \"80f7fb3d-1ea3-4640-8898-b4cf67a13337\") " pod="openshift-must-gather-6fvlv/must-gather-tkfbf" Apr 16 22:42:49.223110 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:49.223088 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/80f7fb3d-1ea3-4640-8898-b4cf67a13337-must-gather-output\") pod \"must-gather-tkfbf\" (UID: \"80f7fb3d-1ea3-4640-8898-b4cf67a13337\") " pod="openshift-must-gather-6fvlv/must-gather-tkfbf" Apr 16 22:42:49.233635 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:49.233608 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4vtb\" (UniqueName: \"kubernetes.io/projected/80f7fb3d-1ea3-4640-8898-b4cf67a13337-kube-api-access-k4vtb\") pod \"must-gather-tkfbf\" (UID: \"80f7fb3d-1ea3-4640-8898-b4cf67a13337\") " pod="openshift-must-gather-6fvlv/must-gather-tkfbf" Apr 16 22:42:49.350045 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:49.349958 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6fvlv/must-gather-tkfbf" Apr 16 22:42:49.468633 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:49.468609 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6fvlv/must-gather-tkfbf"] Apr 16 22:42:49.470973 ip-10-0-133-72 kubenswrapper[2574]: W0416 22:42:49.470944 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80f7fb3d_1ea3_4640_8898_b4cf67a13337.slice/crio-c26cc9e50feb382e3e8d8bd007f4c61abdda2ebc1688e946c9faa110ea83e41a WatchSource:0}: Error finding container c26cc9e50feb382e3e8d8bd007f4c61abdda2ebc1688e946c9faa110ea83e41a: Status 404 returned error can't find the container with id c26cc9e50feb382e3e8d8bd007f4c61abdda2ebc1688e946c9faa110ea83e41a Apr 16 22:42:50.083637 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:50.083602 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6fvlv/must-gather-tkfbf" event={"ID":"80f7fb3d-1ea3-4640-8898-b4cf67a13337","Type":"ContainerStarted","Data":"c26cc9e50feb382e3e8d8bd007f4c61abdda2ebc1688e946c9faa110ea83e41a"} Apr 16 22:42:51.090547 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:51.090512 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6fvlv/must-gather-tkfbf" event={"ID":"80f7fb3d-1ea3-4640-8898-b4cf67a13337","Type":"ContainerStarted","Data":"1d7700c2829f8712483233719a5ace2473d7c12bd437b78548e11179497882bb"} Apr 16 22:42:51.090547 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:51.090551 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6fvlv/must-gather-tkfbf" event={"ID":"80f7fb3d-1ea3-4640-8898-b4cf67a13337","Type":"ContainerStarted","Data":"a18f778cb8794d0e5c9af3e8563792a563bbec2525eccce41270f8ce33d88862"} Apr 16 22:42:51.106538 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:51.106492 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-must-gather-6fvlv/must-gather-tkfbf" podStartSLOduration=1.346645099 podStartE2EDuration="2.106476554s" podCreationTimestamp="2026-04-16 22:42:49 +0000 UTC" firstStartedPulling="2026-04-16 22:42:49.472831908 +0000 UTC m=+1759.841238041" lastFinishedPulling="2026-04-16 22:42:50.232663368 +0000 UTC m=+1760.601069496" observedRunningTime="2026-04-16 22:42:51.104865303 +0000 UTC m=+1761.473271454" watchObservedRunningTime="2026-04-16 22:42:51.106476554 +0000 UTC m=+1761.474882739" Apr 16 22:42:51.706086 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:51.706055 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-7m5s7_f60e0c7e-0c9f-4696-ba69-04969deb255d/global-pull-secret-syncer/0.log" Apr 16 22:42:51.816638 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:51.816612 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-9hrlh_e515cbfe-da1c-405f-8e8d-a7ddc73de30a/konnectivity-agent/0.log" Apr 16 22:42:51.888413 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:51.888358 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-72.ec2.internal_b707a3c20ff0ea56ea13d746a8edc26b/haproxy/0.log" Apr 16 22:42:55.668299 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:55.668232 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-rmwzh_ef6ec432-e618-4ffe-b27f-93fdba577807/authorino/0.log" Apr 16 22:42:55.749399 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:55.749345 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-844548ff4c-59knw_11810e52-ac90-4e12-ad4b-5be0969566a6/manager/0.log" Apr 16 22:42:57.078419 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:57.078392 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-d7wxq_ead46a40-67d9-4343-a6dd-14f5815b264c/kube-state-metrics/0.log" Apr 16 22:42:57.098280 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:57.098249 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-d7wxq_ead46a40-67d9-4343-a6dd-14f5815b264c/kube-rbac-proxy-main/0.log" Apr 16 22:42:57.119706 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:57.119668 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-d7wxq_ead46a40-67d9-4343-a6dd-14f5815b264c/kube-rbac-proxy-self/0.log" Apr 16 22:42:57.204239 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:57.204128 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lr78b_430b1bf1-900c-4030-8524-0be782a10fc1/node-exporter/0.log" Apr 16 22:42:57.226907 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:57.226843 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lr78b_430b1bf1-900c-4030-8524-0be782a10fc1/kube-rbac-proxy/0.log" Apr 16 22:42:57.249627 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:57.249593 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lr78b_430b1bf1-900c-4030-8524-0be782a10fc1/init-textfile/0.log" Apr 16 22:42:57.524449 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:57.524418 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca/prometheus/0.log" Apr 16 22:42:57.549814 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:57.549789 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca/config-reloader/0.log" Apr 16 22:42:57.576071 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:57.576045 2574 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca/thanos-sidecar/0.log" Apr 16 22:42:57.602143 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:57.602116 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca/kube-rbac-proxy-web/0.log" Apr 16 22:42:57.622965 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:57.622940 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca/kube-rbac-proxy/0.log" Apr 16 22:42:57.644357 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:57.644326 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca/kube-rbac-proxy-thanos/0.log" Apr 16 22:42:57.664564 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:57.664535 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c79cb3bf-d26c-4ea4-bd87-b46ec18fbaca/init-config-reloader/0.log" Apr 16 22:42:57.703198 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:57.703036 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-hw9tm_89f42b6c-a082-4cb0-8a46-a52958adcbac/prometheus-operator/0.log" Apr 16 22:42:57.726766 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:57.726698 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-hw9tm_89f42b6c-a082-4cb0-8a46-a52958adcbac/kube-rbac-proxy/0.log" Apr 16 22:42:57.752686 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:57.752629 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-h4l4q_82c95629-2e19-4874-bb54-906974bfced4/prometheus-operator-admission-webhook/0.log" Apr 16 22:42:59.885014 ip-10-0-133-72 
kubenswrapper[2574]: I0416 22:42:59.884975 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v4dxt_dba969ac-c53c-4354-ab6e-cc853b8c1449/console-operator/1.log"
Apr 16 22:42:59.894298 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:42:59.894265 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v4dxt_dba969ac-c53c-4354-ab6e-cc853b8c1449/console-operator/2.log"
Apr 16 22:43:00.370921 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:00.370896 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7dfd6686cc-mrxlk_aadb8510-0cd8-45e8-b0a4-9f6bf4c3a202/console/0.log"
Apr 16 22:43:00.824316 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:00.824278 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6fvlv/perf-node-gather-daemonset-bghm4"]
Apr 16 22:43:00.831167 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:00.831141 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-bghm4"
Apr 16 22:43:00.833928 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:00.833901 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6fvlv/perf-node-gather-daemonset-bghm4"]
Apr 16 22:43:00.938144 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:00.938107 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24tzf\" (UniqueName: \"kubernetes.io/projected/48428053-0190-4f2a-b5bb-d293a2f4968e-kube-api-access-24tzf\") pod \"perf-node-gather-daemonset-bghm4\" (UID: \"48428053-0190-4f2a-b5bb-d293a2f4968e\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-bghm4"
Apr 16 22:43:00.938615 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:00.938591 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/48428053-0190-4f2a-b5bb-d293a2f4968e-podres\") pod \"perf-node-gather-daemonset-bghm4\" (UID: \"48428053-0190-4f2a-b5bb-d293a2f4968e\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-bghm4"
Apr 16 22:43:00.938716 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:00.938700 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/48428053-0190-4f2a-b5bb-d293a2f4968e-lib-modules\") pod \"perf-node-gather-daemonset-bghm4\" (UID: \"48428053-0190-4f2a-b5bb-d293a2f4968e\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-bghm4"
Apr 16 22:43:00.938808 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:00.938760 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/48428053-0190-4f2a-b5bb-d293a2f4968e-sys\") pod \"perf-node-gather-daemonset-bghm4\" (UID: \"48428053-0190-4f2a-b5bb-d293a2f4968e\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-bghm4"
Apr 16 22:43:00.938874 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:00.938835 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/48428053-0190-4f2a-b5bb-d293a2f4968e-proc\") pod \"perf-node-gather-daemonset-bghm4\" (UID: \"48428053-0190-4f2a-b5bb-d293a2f4968e\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-bghm4"
Apr 16 22:43:01.040011 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:01.039971 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-24tzf\" (UniqueName: \"kubernetes.io/projected/48428053-0190-4f2a-b5bb-d293a2f4968e-kube-api-access-24tzf\") pod \"perf-node-gather-daemonset-bghm4\" (UID: \"48428053-0190-4f2a-b5bb-d293a2f4968e\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-bghm4"
Apr 16 22:43:01.040186 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:01.040044 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/48428053-0190-4f2a-b5bb-d293a2f4968e-podres\") pod \"perf-node-gather-daemonset-bghm4\" (UID: \"48428053-0190-4f2a-b5bb-d293a2f4968e\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-bghm4"
Apr 16 22:43:01.040186 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:01.040129 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/48428053-0190-4f2a-b5bb-d293a2f4968e-lib-modules\") pod \"perf-node-gather-daemonset-bghm4\" (UID: \"48428053-0190-4f2a-b5bb-d293a2f4968e\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-bghm4"
Apr 16 22:43:01.040186 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:01.040160 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/48428053-0190-4f2a-b5bb-d293a2f4968e-sys\") pod \"perf-node-gather-daemonset-bghm4\" (UID: \"48428053-0190-4f2a-b5bb-d293a2f4968e\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-bghm4"
Apr 16 22:43:01.040350 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:01.040206 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/48428053-0190-4f2a-b5bb-d293a2f4968e-proc\") pod \"perf-node-gather-daemonset-bghm4\" (UID: \"48428053-0190-4f2a-b5bb-d293a2f4968e\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-bghm4"
Apr 16 22:43:01.040350 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:01.040235 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/48428053-0190-4f2a-b5bb-d293a2f4968e-podres\") pod \"perf-node-gather-daemonset-bghm4\" (UID: \"48428053-0190-4f2a-b5bb-d293a2f4968e\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-bghm4"
Apr 16 22:43:01.040350 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:01.040277 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/48428053-0190-4f2a-b5bb-d293a2f4968e-lib-modules\") pod \"perf-node-gather-daemonset-bghm4\" (UID: \"48428053-0190-4f2a-b5bb-d293a2f4968e\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-bghm4"
Apr 16 22:43:01.040350 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:01.040301 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/48428053-0190-4f2a-b5bb-d293a2f4968e-proc\") pod \"perf-node-gather-daemonset-bghm4\" (UID: \"48428053-0190-4f2a-b5bb-d293a2f4968e\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-bghm4"
Apr 16 22:43:01.040350 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:01.040295 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/48428053-0190-4f2a-b5bb-d293a2f4968e-sys\") pod \"perf-node-gather-daemonset-bghm4\" (UID: \"48428053-0190-4f2a-b5bb-d293a2f4968e\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-bghm4"
Apr 16 22:43:01.048047 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:01.048019 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-24tzf\" (UniqueName: \"kubernetes.io/projected/48428053-0190-4f2a-b5bb-d293a2f4968e-kube-api-access-24tzf\") pod \"perf-node-gather-daemonset-bghm4\" (UID: \"48428053-0190-4f2a-b5bb-d293a2f4968e\") " pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-bghm4"
Apr 16 22:43:01.147470 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:01.147374 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-bghm4"
Apr 16 22:43:01.301022 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:01.300789 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6fvlv/perf-node-gather-daemonset-bghm4"]
Apr 16 22:43:01.638845 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:01.638820 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xmqcp_cbfccc2d-f25a-4d8a-bd22-25a929f12d64/dns/0.log"
Apr 16 22:43:01.657731 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:01.657700 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xmqcp_cbfccc2d-f25a-4d8a-bd22-25a929f12d64/kube-rbac-proxy/0.log"
Apr 16 22:43:01.724326 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:01.724294 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9cdfs_76fa2914-d925-4555-87be-d9837e6295d8/dns-node-resolver/0.log"
Apr 16 22:43:02.140288 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:02.140256 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-bghm4" event={"ID":"48428053-0190-4f2a-b5bb-d293a2f4968e","Type":"ContainerStarted","Data":"06ad56e544edc27a03d454be0d2b4751924f4cef8267a977c652c39bb831ce0b"}
Apr 16 22:43:02.140288 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:02.140292 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-bghm4" event={"ID":"48428053-0190-4f2a-b5bb-d293a2f4968e","Type":"ContainerStarted","Data":"ce232c98e558cde096c3773fdf45527c12d2186b93523955c4dcec02a60da871"}
Apr 16 22:43:02.140777 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:02.140331 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-bghm4"
Apr 16 22:43:02.155595 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:02.155542 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-bghm4" podStartSLOduration=2.155523945 podStartE2EDuration="2.155523945s" podCreationTimestamp="2026-04-16 22:43:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:43:02.153409706 +0000 UTC m=+1772.521815858" watchObservedRunningTime="2026-04-16 22:43:02.155523945 +0000 UTC m=+1772.523930097"
Apr 16 22:43:02.305634 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:02.305604 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-zfz79_46cb706e-dcb9-4950-aa25-14e582448ea8/node-ca/0.log"
Apr 16 22:43:03.143286 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:03.143255 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-t6fzj_cd3dff7e-3d30-4c96-8e98-be1bceff5295/istio-proxy/0.log"
Apr 16 22:43:03.645452 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:03.645399 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-ffxpx_d146454c-862c-4665-b411-fd4c29e30335/serve-healthcheck-canary/0.log"
Apr 16 22:43:04.380841 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:04.380808 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vdsxv_031bb873-9d06-48ea-b341-4885a796a0eb/kube-rbac-proxy/0.log"
Apr 16 22:43:04.403905 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:04.403888 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vdsxv_031bb873-9d06-48ea-b341-4885a796a0eb/exporter/0.log"
Apr 16 22:43:04.424678 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:04.424659 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vdsxv_031bb873-9d06-48ea-b341-4885a796a0eb/extractor/0.log"
Apr 16 22:43:06.955469 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:06.955437 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5cdcb589b5-sdjkx_f45534a0-756e-4125-b237-74268df6b42d/manager/0.log"
Apr 16 22:43:07.482674 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:07.482639 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-84d7d5cfc6-4xdxd_3a52ec15-1ac7-4dad-b763-9ce2039d5f3c/manager/0.log"
Apr 16 22:43:08.153954 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:08.153918 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-6fvlv/perf-node-gather-daemonset-bghm4"
Apr 16 22:43:13.690582 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:13.690544 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8pm2c_ede793eb-64b1-4045-a60c-349b6c07e08b/kube-multus/0.log"
Apr 16 22:43:13.866659 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:13.866628 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dx88j_38e7a963-729e-40af-91e7-9fa6910bc258/kube-multus-additional-cni-plugins/0.log"
Apr 16 22:43:13.887174 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:13.887148 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dx88j_38e7a963-729e-40af-91e7-9fa6910bc258/egress-router-binary-copy/0.log"
Apr 16 22:43:13.906262 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:13.906238 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dx88j_38e7a963-729e-40af-91e7-9fa6910bc258/cni-plugins/0.log"
Apr 16 22:43:13.929857 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:13.929831 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dx88j_38e7a963-729e-40af-91e7-9fa6910bc258/bond-cni-plugin/0.log"
Apr 16 22:43:13.949926 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:13.949860 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dx88j_38e7a963-729e-40af-91e7-9fa6910bc258/routeoverride-cni/0.log"
Apr 16 22:43:13.969122 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:13.969098 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dx88j_38e7a963-729e-40af-91e7-9fa6910bc258/whereabouts-cni-bincopy/0.log"
Apr 16 22:43:13.987826 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:13.987805 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dx88j_38e7a963-729e-40af-91e7-9fa6910bc258/whereabouts-cni/0.log"
Apr 16 22:43:14.311150 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:14.311101 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wvq6s_c220c5af-4b42-4b44-a789-17aa37d44b90/network-metrics-daemon/0.log"
Apr 16 22:43:14.330689 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:14.330642 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wvq6s_c220c5af-4b42-4b44-a789-17aa37d44b90/kube-rbac-proxy/0.log"
Apr 16 22:43:15.638176 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:15.638142 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hm4t5_cdab8cce-9b55-478d-b1b5-740aa9746143/ovn-controller/0.log"
Apr 16 22:43:15.658056 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:15.658026 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hm4t5_cdab8cce-9b55-478d-b1b5-740aa9746143/ovn-acl-logging/0.log"
Apr 16 22:43:15.665713 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:15.665690 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hm4t5_cdab8cce-9b55-478d-b1b5-740aa9746143/ovn-acl-logging/1.log"
Apr 16 22:43:15.684522 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:15.684498 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hm4t5_cdab8cce-9b55-478d-b1b5-740aa9746143/kube-rbac-proxy-node/0.log"
Apr 16 22:43:15.703709 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:15.703684 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hm4t5_cdab8cce-9b55-478d-b1b5-740aa9746143/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 22:43:15.721123 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:15.721100 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hm4t5_cdab8cce-9b55-478d-b1b5-740aa9746143/northd/0.log"
Apr 16 22:43:15.740139 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:15.740109 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hm4t5_cdab8cce-9b55-478d-b1b5-740aa9746143/nbdb/0.log"
Apr 16 22:43:15.759849 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:15.759829 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hm4t5_cdab8cce-9b55-478d-b1b5-740aa9746143/sbdb/0.log"
Apr 16 22:43:15.876084 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:15.876054 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hm4t5_cdab8cce-9b55-478d-b1b5-740aa9746143/ovnkube-controller/0.log"
Apr 16 22:43:16.973071 ip-10-0-133-72 kubenswrapper[2574]: I0416 22:43:16.973043 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-4pbqm_65a893c9-3b9b-48c6-a82b-6236d443cacf/network-check-target-container/0.log"