Apr 20 20:05:17.462151 ip-10-0-130-227 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 20:05:17.920557 ip-10-0-130-227 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:05:17.920557 ip-10-0-130-227 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 20:05:17.920557 ip-10-0-130-227 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:05:17.920557 ip-10-0-130-227 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 20:05:17.920557 ip-10-0-130-227 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:05:17.923057 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.922953 2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 20:05:17.928095 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928075 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:05:17.928095 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928093 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:05:17.928095 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928096 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:05:17.928095 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928099 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:05:17.928095 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928103 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:05:17.928282 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928106 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:05:17.928282 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928109 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:05:17.928282 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928112 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:05:17.928282 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928115 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:05:17.928282 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928117 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:05:17.928282 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928120 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:05:17.928282 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928123 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:05:17.928282 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928125 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:05:17.928282 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928128 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:05:17.928282 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928131 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:05:17.928282 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928133 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:05:17.928282 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928136 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:05:17.928282 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928139 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:05:17.928282 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928141 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:05:17.928282 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928144 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:05:17.928282 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928146 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:05:17.928282 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928149 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:05:17.928282 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928153 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:05:17.928282 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928157 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:05:17.928737 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928160 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:05:17.928737 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928166 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:05:17.928737 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928168 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:05:17.928737 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928171 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:05:17.928737 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928174 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:05:17.928737 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928176 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:05:17.928737 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928179 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:05:17.928737 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928182 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:05:17.928737 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928184 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:05:17.928737 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928187 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:05:17.928737 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928190 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:05:17.928737 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928193 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:05:17.928737 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928195 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:05:17.928737 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928198 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:05:17.928737 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928202 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:05:17.928737 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928205 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:05:17.928737 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928209 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:05:17.928737 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928212 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:05:17.928737 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928215 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:05:17.928737 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928218 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:05:17.929279 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928221 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:05:17.929279 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928224 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:05:17.929279 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928226 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:05:17.929279 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928229 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:05:17.929279 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928232 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:05:17.929279 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928235 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:05:17.929279 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928237 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:05:17.929279 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928240 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:05:17.929279 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928242 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:05:17.929279 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928245 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:05:17.929279 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928248 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:05:17.929279 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928250 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:05:17.929279 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928253 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:05:17.929279 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928255 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:05:17.929279 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928258 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:05:17.929279 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928260 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:05:17.929279 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928263 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:05:17.929279 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928265 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:05:17.929279 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928268 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:05:17.929279 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928270 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:05:17.929769 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928273 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:05:17.929769 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928275 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:05:17.929769 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928278 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:05:17.929769 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928280 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:05:17.929769 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928304 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:05:17.929769 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928309 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:05:17.929769 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928313 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:05:17.929769 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928316 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:05:17.929769 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928319 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:05:17.929769 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928322 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:05:17.929769 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928325 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:05:17.929769 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928328 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:05:17.929769 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928330 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:05:17.929769 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928333 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:05:17.929769 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928336 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:05:17.929769 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928339 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:05:17.929769 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928342 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:05:17.929769 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928345 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:05:17.929769 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928347 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:05:17.929769 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928350 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:05:17.930281 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928352 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:05:17.930281 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928355 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:05:17.930281 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928730 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:05:17.930281 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928734 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:05:17.930281 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928737 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:05:17.930281 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928740 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:05:17.930281 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928743 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:05:17.930281 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928746 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:05:17.930281 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928749 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:05:17.930281 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928751 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:05:17.930281 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928754 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:05:17.930281 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928757 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:05:17.930281 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928760 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:05:17.930281 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928762 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:05:17.930281 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928765 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:05:17.930281 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928767 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:05:17.930281 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928770 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:05:17.930281 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928772 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:05:17.930281 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928778 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:05:17.930281 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928782 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:05:17.930761 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928785 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:05:17.930761 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928788 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:05:17.930761 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928791 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:05:17.930761 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928793 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:05:17.930761 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928796 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:05:17.930761 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928798 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:05:17.930761 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928802 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:05:17.930761 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928804 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:05:17.930761 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928807 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:05:17.930761 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928809 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:05:17.930761 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928812 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:05:17.930761 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928814 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:05:17.930761 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928817 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:05:17.930761 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928819 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:05:17.930761 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928822 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:05:17.930761 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928824 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:05:17.930761 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928826 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:05:17.930761 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928829 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:05:17.930761 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928831 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:05:17.930761 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928834 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:05:17.931274 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928836 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:05:17.931274 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928838 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:05:17.931274 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928842 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:05:17.931274 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928844 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:05:17.931274 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928847 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:05:17.931274 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928865 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:05:17.931274 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928870 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:05:17.931274 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928874 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:05:17.931274 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928877 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:05:17.931274 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928880 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:05:17.931274 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928883 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:05:17.931274 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928886 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:05:17.931274 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928889 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:05:17.931274 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928892 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:05:17.931274 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928895 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:05:17.931274 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928898 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:05:17.931274 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928900 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:05:17.931274 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928903 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:05:17.931274 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928906 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:05:17.931767 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928911 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:05:17.931767 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928913 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:05:17.931767 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928916 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:05:17.931767 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928919 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:05:17.931767 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928921 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:05:17.931767 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928924 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:05:17.931767 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928926 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:05:17.931767 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928929 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:05:17.931767 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928932 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:05:17.931767 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928934 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:05:17.931767 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928936 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:05:17.931767 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928939 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:05:17.931767 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928942 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:05:17.931767 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928945 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:05:17.931767 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928949 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:05:17.931767 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928952 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:05:17.931767 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928954 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:05:17.931767 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928957 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:05:17.931767 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928959 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:05:17.932253 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928962 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:05:17.932253 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928964 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:05:17.932253 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928967 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:05:17.932253 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928970 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:05:17.932253 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928972 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:05:17.932253 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928975 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:05:17.932253 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928977 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:05:17.932253 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928980 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:05:17.932253 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928983 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:05:17.932253 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.928985 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:05:17.932253 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929774 2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 20:05:17.932253 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929783 2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 20:05:17.932253 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929795 2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 20:05:17.932253 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929800 2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 20:05:17.932253 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929805 2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 20:05:17.932253 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929809 2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 20:05:17.932253 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929813 2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 20:05:17.932253 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929818 2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 20:05:17.932253 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929821 2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 20:05:17.932253 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929824 2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 20:05:17.932253 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929827 2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 20:05:17.932762 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929831 2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 20:05:17.932762 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929834 2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 20:05:17.932762 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929837 2573 flags.go:64] FLAG: --cgroup-root=""
Apr 20 20:05:17.932762 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929840 2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 20:05:17.932762 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929843 2573 flags.go:64] FLAG: --client-ca-file=""
Apr 20 20:05:17.932762 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929846 2573 flags.go:64] FLAG: --cloud-config=""
Apr 20 20:05:17.932762 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929865 2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 20:05:17.932762 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929869 2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 20:05:17.932762 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929873 2573 flags.go:64] FLAG: --cluster-domain=""
Apr 20 20:05:17.932762 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929876 2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 20:05:17.932762 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929879 2573 flags.go:64] FLAG: --config-dir=""
Apr 20 20:05:17.932762 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929882 2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 20:05:17.932762 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929886 2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 20:05:17.932762 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929890 2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 20:05:17.932762 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929893 2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 20:05:17.932762 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929896 2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 20:05:17.932762 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929899 2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 20:05:17.932762 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929902 2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 20:05:17.932762 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929905 2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 20:05:17.932762 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929908 2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 20:05:17.932762 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929911 2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 20:05:17.932762 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929914 2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 20:05:17.932762 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929919 2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 20:05:17.932762 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929922 2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 20:05:17.932762 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929925 2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 20:05:17.933398 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929927 2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 20:05:17.933398 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929931 2573 flags.go:64] FLAG: --enable-server="true"
Apr 20 20:05:17.933398 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929934 2573 flags.go:64] FLAG:
--enforce-node-allocatable="[pods]" Apr 20 20:05:17.933398 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929939 2573 flags.go:64] FLAG: --event-burst="100" Apr 20 20:05:17.933398 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929942 2573 flags.go:64] FLAG: --event-qps="50" Apr 20 20:05:17.933398 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929945 2573 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 20 20:05:17.933398 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929948 2573 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 20 20:05:17.933398 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929956 2573 flags.go:64] FLAG: --eviction-hard="" Apr 20 20:05:17.933398 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929960 2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 20 20:05:17.933398 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929963 2573 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 20 20:05:17.933398 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929966 2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 20 20:05:17.933398 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929969 2573 flags.go:64] FLAG: --eviction-soft="" Apr 20 20:05:17.933398 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929972 2573 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 20 20:05:17.933398 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929977 2573 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 20 20:05:17.933398 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929980 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 20 20:05:17.933398 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929983 2573 flags.go:64] FLAG: --experimental-mounter-path="" Apr 20 20:05:17.933398 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929986 2573 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 20 20:05:17.933398 ip-10-0-130-227 
kubenswrapper[2573]: I0420 20:05:17.929989 2573 flags.go:64] FLAG: --fail-swap-on="true" Apr 20 20:05:17.933398 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.929992 2573 flags.go:64] FLAG: --feature-gates="" Apr 20 20:05:17.933398 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930000 2573 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 20:05:17.933398 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930003 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 20:05:17.933398 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930006 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 20:05:17.933398 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930009 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 20:05:17.933398 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930013 2573 flags.go:64] FLAG: --healthz-port="10248" Apr 20 20:05:17.933398 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930016 2573 flags.go:64] FLAG: --help="false" Apr 20 20:05:17.934020 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930019 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-130-227.ec2.internal" Apr 20 20:05:17.934020 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930022 2573 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 20:05:17.934020 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930024 2573 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 20:05:17.934020 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930028 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 20:05:17.934020 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930031 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 20:05:17.934020 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930034 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" 
Apr 20 20:05:17.934020 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930037 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 20:05:17.934020 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930040 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 20:05:17.934020 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930043 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 20:05:17.934020 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930046 2573 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 20:05:17.934020 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930050 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 20:05:17.934020 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930053 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 20:05:17.934020 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930055 2573 flags.go:64] FLAG: --kube-reserved="" Apr 20 20:05:17.934020 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930058 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 20:05:17.934020 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930062 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 20:05:17.934020 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930065 2573 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 20:05:17.934020 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930068 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 20:05:17.934020 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930071 2573 flags.go:64] FLAG: --lock-file="" Apr 20 20:05:17.934020 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930074 2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 20:05:17.934020 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930077 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 20:05:17.934020 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930081 2573 
flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 20:05:17.934020 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930087 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 20:05:17.934020 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930090 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 20:05:17.934587 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930092 2573 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 20:05:17.934587 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930095 2573 flags.go:64] FLAG: --logging-format="text" Apr 20 20:05:17.934587 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930098 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 20:05:17.934587 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930102 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 20:05:17.934587 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930105 2573 flags.go:64] FLAG: --manifest-url="" Apr 20 20:05:17.934587 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930108 2573 flags.go:64] FLAG: --manifest-url-header="" Apr 20 20:05:17.934587 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930112 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 20:05:17.934587 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930115 2573 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 20:05:17.934587 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930119 2573 flags.go:64] FLAG: --max-pods="110" Apr 20 20:05:17.934587 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930122 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 20:05:17.934587 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930125 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 20:05:17.934587 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930128 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 
20:05:17.934587 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930131 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 20:05:17.934587 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930134 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 20:05:17.934587 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930137 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 20:05:17.934587 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930140 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 20:05:17.934587 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930148 2573 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 20:05:17.934587 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930151 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 20:05:17.934587 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930153 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 20:05:17.934587 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930157 2573 flags.go:64] FLAG: --pod-cidr="" Apr 20 20:05:17.934587 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930160 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 20:05:17.934587 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930166 2573 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 20:05:17.934587 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930169 2573 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 20:05:17.934587 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930172 2573 flags.go:64] FLAG: --pods-per-core="0" Apr 20 20:05:17.935186 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930177 2573 flags.go:64] FLAG: --port="10250" Apr 20 20:05:17.935186 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930180 2573 flags.go:64] FLAG: 
--protect-kernel-defaults="false" Apr 20 20:05:17.935186 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930183 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-04c4c97a0f613016f" Apr 20 20:05:17.935186 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930186 2573 flags.go:64] FLAG: --qos-reserved="" Apr 20 20:05:17.935186 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930189 2573 flags.go:64] FLAG: --read-only-port="10255" Apr 20 20:05:17.935186 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930193 2573 flags.go:64] FLAG: --register-node="true" Apr 20 20:05:17.935186 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930196 2573 flags.go:64] FLAG: --register-schedulable="true" Apr 20 20:05:17.935186 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930199 2573 flags.go:64] FLAG: --register-with-taints="" Apr 20 20:05:17.935186 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930203 2573 flags.go:64] FLAG: --registry-burst="10" Apr 20 20:05:17.935186 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930205 2573 flags.go:64] FLAG: --registry-qps="5" Apr 20 20:05:17.935186 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930208 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 20 20:05:17.935186 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930211 2573 flags.go:64] FLAG: --reserved-memory="" Apr 20 20:05:17.935186 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930215 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 20:05:17.935186 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930218 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 20:05:17.935186 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930221 2573 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 20:05:17.935186 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930224 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 20:05:17.935186 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930227 2573 
flags.go:64] FLAG: --runonce="false" Apr 20 20:05:17.935186 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930229 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 20:05:17.935186 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930233 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 20:05:17.935186 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930236 2573 flags.go:64] FLAG: --seccomp-default="false" Apr 20 20:05:17.935186 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930239 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 20:05:17.935186 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930242 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 20:05:17.935186 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930245 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 20:05:17.935186 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930248 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 20:05:17.935186 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930251 2573 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 20:05:17.935186 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930253 2573 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 20:05:17.935832 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930256 2573 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 20:05:17.935832 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930259 2573 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 20:05:17.935832 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930262 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 20:05:17.935832 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930265 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 20:05:17.935832 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930268 2573 flags.go:64] FLAG: --system-cgroups="" Apr 20 
20:05:17.935832 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930271 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 20:05:17.935832 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930278 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 20:05:17.935832 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930281 2573 flags.go:64] FLAG: --tls-cert-file="" Apr 20 20:05:17.935832 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930284 2573 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 20:05:17.935832 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930288 2573 flags.go:64] FLAG: --tls-min-version="" Apr 20 20:05:17.935832 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930291 2573 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 20:05:17.935832 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930295 2573 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 20:05:17.935832 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930298 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 20:05:17.935832 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930301 2573 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 20:05:17.935832 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930304 2573 flags.go:64] FLAG: --v="2" Apr 20 20:05:17.935832 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930308 2573 flags.go:64] FLAG: --version="false" Apr 20 20:05:17.935832 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930312 2573 flags.go:64] FLAG: --vmodule="" Apr 20 20:05:17.935832 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930316 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 20:05:17.935832 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.930320 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 20:05:17.935832 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930413 2573 
feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 20:05:17.935832 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930417 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 20:05:17.935832 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930420 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 20:05:17.935832 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930422 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 20:05:17.935832 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930425 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 20:05:17.936433 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930428 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 20:05:17.936433 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930430 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 20:05:17.936433 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930433 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 20:05:17.936433 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930436 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 20:05:17.936433 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930438 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 20:05:17.936433 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930441 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 20:05:17.936433 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930444 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 20:05:17.936433 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930446 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 20:05:17.936433 ip-10-0-130-227 
kubenswrapper[2573]: W0420 20:05:17.930449 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 20:05:17.936433 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930452 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 20:05:17.936433 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930454 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 20:05:17.936433 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930457 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 20:05:17.936433 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930460 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 20:05:17.936433 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930463 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 20:05:17.936433 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930466 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 20:05:17.936433 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930469 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 20:05:17.936433 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930472 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 20:05:17.936433 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930475 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 20:05:17.936433 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930477 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 20:05:17.936433 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930480 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 20:05:17.937033 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930483 2573 feature_gate.go:328] unrecognized feature gate: 
RouteAdvertisements Apr 20 20:05:17.937033 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930485 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 20:05:17.937033 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930489 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 20:05:17.937033 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930492 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 20:05:17.937033 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930495 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 20:05:17.937033 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930498 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 20:05:17.937033 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930501 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 20:05:17.937033 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930503 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 20:05:17.937033 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930506 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 20:05:17.937033 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930508 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 20:05:17.937033 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930511 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 20:05:17.937033 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930514 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 20:05:17.937033 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930516 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 20:05:17.937033 
ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930518 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 20:05:17.937033 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930521 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 20:05:17.937033 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930523 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 20:05:17.937033 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930526 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 20:05:17.937033 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930529 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 20:05:17.937033 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930531 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 20:05:17.937503 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930534 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 20:05:17.937503 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930536 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 20:05:17.937503 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930539 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 20:05:17.937503 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930541 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 20:05:17.937503 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930544 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 20:05:17.937503 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930547 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 20:05:17.937503 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930550 2573 feature_gate.go:328] 
unrecognized feature gate: ManagedBootImagesAWS Apr 20 20:05:17.937503 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930554 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 20:05:17.937503 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930557 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 20:05:17.937503 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930560 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 20:05:17.937503 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930562 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 20:05:17.937503 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930565 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 20 20:05:17.937503 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930568 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 20:05:17.937503 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930570 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 20:05:17.937503 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930574 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 20:05:17.937503 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930578 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 20:05:17.937503 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930581 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 20:05:17.937503 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930584 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 20:05:17.937503 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930587 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 20:05:17.938000 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930590 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 20:05:17.938000 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930593 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 20:05:17.938000 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930595 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 20:05:17.938000 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930598 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 20:05:17.938000 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930601 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 20:05:17.938000 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930604 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 20:05:17.938000 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930606 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 20:05:17.938000 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930609 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 20:05:17.938000 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930612 2573 feature_gate.go:328] unrecognized feature 
gate: BootImageSkewEnforcement
Apr 20 20:05:17.938000 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930614 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:05:17.938000 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930617 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:05:17.938000 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930620 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:05:17.938000 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930622 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:05:17.938000 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930625 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:05:17.938000 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930627 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:05:17.938000 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930630 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:05:17.938000 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930632 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:05:17.938000 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930635 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:05:17.938000 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930637 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:05:17.938000 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930640 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:05:17.938486 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930645 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:05:17.938486 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930647 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:05:17.938486 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.930650 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:05:17.938486 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.931350 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 20:05:17.938486 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.937865 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 20:05:17.938486 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.937883 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 20:05:17.938486 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.937933 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:05:17.938486 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.937938 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:05:17.938486 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.937941 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:05:17.938486 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.937944 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:05:17.938486 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.937947 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:05:17.938486 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.937950 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:05:17.938486 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.937952 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:05:17.938486 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.937955 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:05:17.938486 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.937958 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:05:17.938935 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.937961 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:05:17.938935 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.937963 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:05:17.938935 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.937966 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:05:17.938935 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.937969 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:05:17.938935 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.937971 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:05:17.938935 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.937974 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:05:17.938935 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.937976 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:05:17.938935 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.937979 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:05:17.938935 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.937981 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:05:17.938935 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.937984 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:05:17.938935 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.937987 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:05:17.938935 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.937989 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:05:17.938935 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.937992 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:05:17.938935 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.937995 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:05:17.938935 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.937997 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:05:17.938935 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938000 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:05:17.938935 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938006 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:05:17.938935 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938009 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:05:17.938935 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938012 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:05:17.938935 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938015 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:05:17.939443 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938017 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:05:17.939443 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938021 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:05:17.939443 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938024 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:05:17.939443 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938027 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:05:17.939443 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938031 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:05:17.939443 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938035 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:05:17.939443 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938039 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:05:17.939443 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938042 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:05:17.939443 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938045 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:05:17.939443 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938048 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:05:17.939443 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938050 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:05:17.939443 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938053 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:05:17.939443 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938056 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:05:17.939443 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938058 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:05:17.939443 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938061 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:05:17.939443 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938063 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:05:17.939443 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938066 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:05:17.939443 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938069 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:05:17.939443 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938072 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:05:17.939443 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938074 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:05:17.939949 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938077 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:05:17.939949 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938079 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:05:17.939949 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938082 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:05:17.939949 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938084 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:05:17.939949 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938087 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:05:17.939949 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938089 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:05:17.939949 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938092 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:05:17.939949 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938095 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:05:17.939949 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938098 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:05:17.939949 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938101 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:05:17.939949 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938105 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:05:17.939949 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938109 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:05:17.939949 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938112 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:05:17.939949 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938115 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:05:17.939949 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938118 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:05:17.939949 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938120 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:05:17.939949 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938123 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:05:17.939949 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938126 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:05:17.939949 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938129 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:05:17.940428 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938131 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:05:17.940428 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938134 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:05:17.940428 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938137 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:05:17.940428 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938139 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:05:17.940428 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938142 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:05:17.940428 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938144 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:05:17.940428 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938147 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:05:17.940428 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938150 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:05:17.940428 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938152 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:05:17.940428 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938155 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:05:17.940428 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938157 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:05:17.940428 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938160 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:05:17.940428 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938162 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:05:17.940428 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938165 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:05:17.940428 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938167 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:05:17.940428 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938170 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:05:17.940428 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938173 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:05:17.940428 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938175 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:05:17.940974 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.938180 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 20:05:17.940974 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938279 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:05:17.940974 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938283 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:05:17.940974 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938287 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:05:17.940974 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938290 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:05:17.940974 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938302 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:05:17.940974 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938305 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:05:17.940974 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938308 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:05:17.940974 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938311 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:05:17.940974 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938313 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:05:17.940974 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938316 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:05:17.940974 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938319 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:05:17.940974 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938322 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:05:17.940974 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938324 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:05:17.940974 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938327 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:05:17.940974 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938330 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:05:17.941415 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938332 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:05:17.941415 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938335 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:05:17.941415 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938337 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:05:17.941415 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938340 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:05:17.941415 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938342 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:05:17.941415 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938345 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:05:17.941415 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938348 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:05:17.941415 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938350 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:05:17.941415 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938353 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:05:17.941415 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938356 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:05:17.941415 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938358 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:05:17.941415 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938361 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:05:17.941415 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938364 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:05:17.941415 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938366 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:05:17.941415 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938369 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:05:17.941415 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938371 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:05:17.941415 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938374 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:05:17.941415 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938377 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:05:17.941415 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938380 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:05:17.941916 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938383 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:05:17.941916 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938386 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:05:17.941916 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938389 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:05:17.941916 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938395 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:05:17.941916 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938398 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:05:17.941916 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938400 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:05:17.941916 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938403 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:05:17.941916 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938406 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:05:17.941916 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938409 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:05:17.941916 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938414 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:05:17.941916 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938417 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:05:17.941916 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938420 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:05:17.941916 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938423 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:05:17.941916 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938426 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:05:17.941916 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938428 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:05:17.941916 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938431 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:05:17.941916 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938433 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:05:17.941916 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938436 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:05:17.941916 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938438 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:05:17.942381 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938441 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:05:17.942381 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938462 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:05:17.942381 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938465 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:05:17.942381 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938469 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:05:17.942381 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938471 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:05:17.942381 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938475 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:05:17.942381 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938478 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:05:17.942381 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938481 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:05:17.942381 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938483 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:05:17.942381 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938486 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:05:17.942381 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938489 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:05:17.942381 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938493 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:05:17.942381 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938496 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:05:17.942381 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938499 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:05:17.942381 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938501 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:05:17.942381 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938504 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:05:17.942381 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938507 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:05:17.942381 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938510 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:05:17.942381 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938512 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:05:17.942846 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938515 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:05:17.942846 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938518 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:05:17.942846 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938520 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:05:17.942846 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938523 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:05:17.942846 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938525 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:05:17.942846 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938528 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:05:17.942846 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938530 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:05:17.942846 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938533 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:05:17.942846 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938536 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:05:17.942846 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938538 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:05:17.942846 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938541 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:05:17.942846 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938543 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:05:17.942846 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938546 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:05:17.942846 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:17.938548 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:05:17.942846 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.938553 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 20:05:17.942846 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.938647 2573 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 20:05:17.943337 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.940656 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 20:05:17.943337 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.941603 2573 server.go:1019] "Starting client certificate rotation"
Apr 20 20:05:17.943337 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.941701 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 20:05:17.943337 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.941738 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 20:05:17.968402 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.968383 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 20:05:17.972571 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.972548 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 20:05:17.988424 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.988404 2573 log.go:25] "Validated CRI v1 runtime API"
Apr 20 20:05:17.994158 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.994139 2573 log.go:25] "Validated CRI v1 image API"
Apr 20 20:05:17.995305 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.995290 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 20:05:17.997863 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.997834 2573 fs.go:135] Filesystem UUIDs: map[05969783-1654-4012-9090-1793cd8f4f07:/dev/nvme0n1p3 25085ce9-e74b-4075-862b-3ad96966b028:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 20 20:05:17.997921 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.997866 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 20:05:17.998606 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:17.998592 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 20:05:18.003659 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.003551 2573 manager.go:217] Machine: {Timestamp:2026-04-20 20:05:18.001520864 +0000 UTC m=+0.416813597 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099499 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2251c665f0427470cd4e8335c0be39 SystemUUID:ec2251c6-65f0-4274-70cd-4e8335c0be39 BootID:80fd5bd4-5dc7-4892-a4f5-b13b971a1461 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:d0:5d:82:c6:f7 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:d0:5d:82:c6:f7 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:1e:a1:03:c5:41:72 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 20:05:18.003659 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.003648 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 20:05:18.003791 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.003749 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 20:05:18.006227 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.006203 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 20:05:18.006365 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.006230 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-227.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 20 20:05:18.006412 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.006375 2573 topology_manager.go:138] "Creating topology manager with none policy" Apr 20 20:05:18.006412 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.006384 2573 container_manager_linux.go:306] "Creating device plugin manager" Apr 20 20:05:18.006412 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.006397 2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 20:05:18.007401 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.007391 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 20:05:18.008949 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.008940 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 20 20:05:18.009058 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.009048 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 20 20:05:18.011795 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.011784 2573 kubelet.go:491] "Attempting to sync node with API server" Apr 20 20:05:18.011845 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.011801 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 20 20:05:18.011845 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.011821 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 20 20:05:18.011932 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.011847 2573 kubelet.go:397] "Adding apiserver pod source" Apr 20 20:05:18.011932 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.011881 2573 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 20 20:05:18.012906 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.012891 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 20:05:18.012906 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.012908 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 20:05:18.015988 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.015972 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 20:05:18.017353 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.017339 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 20:05:18.019113 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.019094 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 20:05:18.019207 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.019122 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 20:05:18.019207 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.019133 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 20:05:18.019207 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.019143 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 20:05:18.019207 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.019152 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 20:05:18.019207 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.019163 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 20:05:18.019207 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.019176 2573 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 20 20:05:18.019207 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.019185 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 20:05:18.019207 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.019196 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 20:05:18.019207 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.019208 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 20:05:18.019515 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.019224 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 20:05:18.019515 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.019240 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 20:05:18.020310 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.020297 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 20:05:18.020364 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.020314 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 20:05:18.023335 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.023296 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-qnxjw" Apr 20 20:05:18.024607 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.024589 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-227.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 20:05:18.024765 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:18.024743 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" 
at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 20:05:18.024821 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:18.024774 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-227.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 20:05:18.025110 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.025094 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 20:05:18.025176 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.025166 2573 server.go:1295] "Started kubelet" Apr 20 20:05:18.025364 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.025326 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 20:05:18.025742 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.025299 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 20:05:18.025785 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.025762 2573 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 20:05:18.026099 ip-10-0-130-227 systemd[1]: Started Kubernetes Kubelet. 
Apr 20 20:05:18.026895 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.026833 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 20:05:18.028280 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.028265 2573 server.go:317] "Adding debug handlers to kubelet server" Apr 20 20:05:18.028483 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.028463 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-qnxjw" Apr 20 20:05:18.030960 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:18.029776 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-227.ec2.internal.18a82959a6cf9220 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-227.ec2.internal,UID:ip-10-0-130-227.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-130-227.ec2.internal,},FirstTimestamp:2026-04-20 20:05:18.025110048 +0000 UTC m=+0.440402785,LastTimestamp:2026-04-20 20:05:18.025110048 +0000 UTC m=+0.440402785,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-227.ec2.internal,}" Apr 20 20:05:18.032991 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.032971 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 20:05:18.033614 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.033601 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 20:05:18.034304 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.034287 2573 volume_manager.go:295] "The 
desired_state_of_world populator starts" Apr 20 20:05:18.034402 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.034308 2573 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 20:05:18.034456 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.034285 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 20:05:18.034456 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:18.034441 2573 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 20 20:05:18.034539 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.034466 2573 factory.go:55] Registering systemd factory Apr 20 20:05:18.034539 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.034483 2573 factory.go:223] Registration of the systemd container factory successfully Apr 20 20:05:18.034539 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.034470 2573 reconstruct.go:97] "Volume reconstruction finished" Apr 20 20:05:18.034539 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.034515 2573 reconciler.go:26] "Reconciler: start to sync state" Apr 20 20:05:18.034539 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:18.034501 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-227.ec2.internal\" not found" Apr 20 20:05:18.034803 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.034686 2573 factory.go:153] Registering CRI-O factory Apr 20 20:05:18.034803 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.034697 2573 factory.go:223] Registration of the crio container factory successfully Apr 20 20:05:18.034803 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.034745 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such 
file or directory Apr 20 20:05:18.034803 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.034764 2573 factory.go:103] Registering Raw factory Apr 20 20:05:18.034803 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.034776 2573 manager.go:1196] Started watching for new ooms in manager Apr 20 20:05:18.035204 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.035191 2573 manager.go:319] Starting recovery of all containers Apr 20 20:05:18.045054 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.045022 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:05:18.045195 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.045179 2573 manager.go:324] Recovery completed Apr 20 20:05:18.047234 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:18.047191 2573 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 20 20:05:18.047965 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:18.047944 2573 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-130-227.ec2.internal\" not found" node="ip-10-0-130-227.ec2.internal" Apr 20 20:05:18.050575 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.050563 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:05:18.053257 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.053241 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:05:18.053323 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.053272 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:05:18.053323 ip-10-0-130-227 kubenswrapper[2573]: 
I0420 20:05:18.053282 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:05:18.053757 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.053744 2573 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 20 20:05:18.053757 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.053754 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 20 20:05:18.053864 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.053788 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 20 20:05:18.056263 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.056252 2573 policy_none.go:49] "None policy: Start" Apr 20 20:05:18.056298 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.056267 2573 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 20:05:18.056298 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.056277 2573 state_mem.go:35] "Initializing new in-memory state store" Apr 20 20:05:18.082384 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.082366 2573 manager.go:341] "Starting Device Plugin manager" Apr 20 20:05:18.082481 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:18.082398 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 20:05:18.082481 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.082407 2573 server.go:85] "Starting device plugin registration server" Apr 20 20:05:18.082627 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.082613 2573 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 20 20:05:18.082678 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.082628 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 20 20:05:18.082719 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.082706 2573 plugin_watcher.go:51] "Plugin Watcher Start" 
path="/var/lib/kubelet/plugins_registry" Apr 20 20:05:18.082808 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.082791 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 20:05:18.082808 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.082804 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 20:05:18.083199 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:18.083181 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 20 20:05:18.083286 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:18.083220 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-227.ec2.internal\" not found" Apr 20 20:05:18.164717 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.164686 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 20 20:05:18.165763 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.165744 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 20 20:05:18.165845 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.165772 2573 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 20:05:18.165845 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.165791 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 20 20:05:18.165845 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.165798 2573 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 20:05:18.166003 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:18.165875 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 20:05:18.169327 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.169308 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:05:18.183286 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.183248 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:05:18.189739 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.189725 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:05:18.189824 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.189756 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:05:18.189824 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.189772 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:05:18.189824 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.189800 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-227.ec2.internal" Apr 20 20:05:18.198057 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.198043 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-227.ec2.internal" Apr 20 20:05:18.198116 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:18.198062 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-227.ec2.internal\": node \"ip-10-0-130-227.ec2.internal\" not found" Apr 20 
20:05:18.212990 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:18.212967 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-227.ec2.internal\" not found" Apr 20 20:05:18.266952 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.266924 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-227.ec2.internal"] Apr 20 20:05:18.267015 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.266989 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:05:18.267765 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.267752 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:05:18.267827 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.267777 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:05:18.267827 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.267787 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:05:18.268965 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.268953 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:05:18.269127 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.269112 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal" Apr 20 20:05:18.269163 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.269143 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:05:18.269618 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.269593 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:05:18.269703 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.269620 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:05:18.269703 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.269600 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:05:18.269703 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.269635 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:05:18.269703 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.269650 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:05:18.269703 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.269662 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:05:18.270737 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.270720 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-227.ec2.internal" Apr 20 20:05:18.270813 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.270748 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 20:05:18.271443 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.271425 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasSufficientMemory" Apr 20 20:05:18.271521 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.271455 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 20:05:18.271521 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.271466 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeHasSufficientPID" Apr 20 20:05:18.301405 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:18.301382 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-227.ec2.internal\" not found" node="ip-10-0-130-227.ec2.internal" Apr 20 20:05:18.305722 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:18.305707 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-227.ec2.internal\" not found" node="ip-10-0-130-227.ec2.internal" Apr 20 20:05:18.313882 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:18.313868 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-227.ec2.internal\" not found" Apr 20 20:05:18.335957 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.335939 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0de14f2d8a312df1fd0f43b1fd02a43e-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal\" (UID: \"0de14f2d8a312df1fd0f43b1fd02a43e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal" Apr 20 20:05:18.336024 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.335961 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0de14f2d8a312df1fd0f43b1fd02a43e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal\" (UID: \"0de14f2d8a312df1fd0f43b1fd02a43e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal" Apr 20 20:05:18.336024 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.335979 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1e12d5376c1eb684628dd9cf17dac4a9-config\") pod \"kube-apiserver-proxy-ip-10-0-130-227.ec2.internal\" (UID: \"1e12d5376c1eb684628dd9cf17dac4a9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-227.ec2.internal" Apr 20 20:05:18.414451 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:18.414430 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-227.ec2.internal\" not found" Apr 20 20:05:18.437000 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.436952 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0de14f2d8a312df1fd0f43b1fd02a43e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal\" (UID: \"0de14f2d8a312df1fd0f43b1fd02a43e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal" Apr 20 20:05:18.437000 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.436979 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/0de14f2d8a312df1fd0f43b1fd02a43e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal\" (UID: \"0de14f2d8a312df1fd0f43b1fd02a43e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal" Apr 20 20:05:18.437000 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.436994 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1e12d5376c1eb684628dd9cf17dac4a9-config\") pod \"kube-apiserver-proxy-ip-10-0-130-227.ec2.internal\" (UID: \"1e12d5376c1eb684628dd9cf17dac4a9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-227.ec2.internal" Apr 20 20:05:18.437135 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.437031 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1e12d5376c1eb684628dd9cf17dac4a9-config\") pod \"kube-apiserver-proxy-ip-10-0-130-227.ec2.internal\" (UID: \"1e12d5376c1eb684628dd9cf17dac4a9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-227.ec2.internal" Apr 20 20:05:18.437135 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.437051 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0de14f2d8a312df1fd0f43b1fd02a43e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal\" (UID: \"0de14f2d8a312df1fd0f43b1fd02a43e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal" Apr 20 20:05:18.437135 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.437053 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0de14f2d8a312df1fd0f43b1fd02a43e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal\" (UID: \"0de14f2d8a312df1fd0f43b1fd02a43e\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal" Apr 20 20:05:18.515322 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:18.515284 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-227.ec2.internal\" not found" Apr 20 20:05:18.603820 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.603786 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal" Apr 20 20:05:18.608611 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.608585 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-227.ec2.internal" Apr 20 20:05:18.616186 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:18.616168 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-227.ec2.internal\" not found" Apr 20 20:05:18.716741 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:18.716688 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-227.ec2.internal\" not found" Apr 20 20:05:18.817167 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:18.817143 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-227.ec2.internal\" not found" Apr 20 20:05:18.905075 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.905049 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:05:18.918152 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:18.918130 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-227.ec2.internal\" not found" Apr 20 20:05:18.941595 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.941574 2573 transport.go:147] "Certificate rotation detected, shutting down client 
connections to start using new credentials" Apr 20 20:05:18.942140 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.941685 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 20:05:18.942140 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.941727 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 20:05:18.942140 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:18.941727 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 20:05:19.019161 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:19.019127 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-227.ec2.internal\" not found" Apr 20 20:05:19.031156 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:19.031013 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 20:00:18 +0000 UTC" deadline="2027-11-09 15:37:08.461922408 +0000 UTC" Apr 20 20:05:19.031156 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:19.031150 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13627h31m49.430776913s" Apr 20 20:05:19.033471 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:19.033451 2573 certificate_manager.go:566] 
"Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 20 20:05:19.043371 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:19.043350 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 20:05:19.063607 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:19.063587 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-qc2vr" Apr 20 20:05:19.069085 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:19.069069 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-qc2vr" Apr 20 20:05:19.087567 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:19.087526 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0de14f2d8a312df1fd0f43b1fd02a43e.slice/crio-8bbaca88127d348a462b91a20462eab55c3058a5dccca14ad82d6d40f355b754 WatchSource:0}: Error finding container 8bbaca88127d348a462b91a20462eab55c3058a5dccca14ad82d6d40f355b754: Status 404 returned error can't find the container with id 8bbaca88127d348a462b91a20462eab55c3058a5dccca14ad82d6d40f355b754 Apr 20 20:05:19.087847 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:19.087828 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e12d5376c1eb684628dd9cf17dac4a9.slice/crio-8eb4a80d2d6b02a6c5be18858a7b6bfef32647407306287cf80e900f0672b6e2 WatchSource:0}: Error finding container 8eb4a80d2d6b02a6c5be18858a7b6bfef32647407306287cf80e900f0672b6e2: Status 404 returned error can't find the container with id 8eb4a80d2d6b02a6c5be18858a7b6bfef32647407306287cf80e900f0672b6e2 Apr 20 20:05:19.092268 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:19.092255 2573 provider.go:93] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Apr 20 20:05:19.119877 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:19.119844 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-227.ec2.internal\" not found" Apr 20 20:05:19.168913 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:19.168869 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal" event={"ID":"0de14f2d8a312df1fd0f43b1fd02a43e","Type":"ContainerStarted","Data":"8bbaca88127d348a462b91a20462eab55c3058a5dccca14ad82d6d40f355b754"} Apr 20 20:05:19.169751 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:19.169732 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-227.ec2.internal" event={"ID":"1e12d5376c1eb684628dd9cf17dac4a9","Type":"ContainerStarted","Data":"8eb4a80d2d6b02a6c5be18858a7b6bfef32647407306287cf80e900f0672b6e2"} Apr 20 20:05:19.220958 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:19.220940 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-227.ec2.internal\" not found" Apr 20 20:05:19.321426 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:19.321352 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-227.ec2.internal\" not found" Apr 20 20:05:19.338161 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:19.338142 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:05:19.434287 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:19.434265 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal" Apr 20 20:05:19.445719 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:19.445702 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's 
hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 20:05:19.446651 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:19.446640 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-227.ec2.internal" Apr 20 20:05:19.455359 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:19.455342 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 20:05:19.883885 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:19.883847 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:05:19.948866 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:19.948822 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:05:20.013326 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.013300 2573 apiserver.go:52] "Watching apiserver" Apr 20 20:05:20.024620 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.024596 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 20:05:20.027597 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.027566 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-ovn-kubernetes/ovnkube-node-ks7s9","kube-system/global-pull-secret-syncer-k96mb","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmvq6","openshift-dns/node-resolver-pgbf5","openshift-multus/multus-6ksh7","openshift-multus/multus-additional-cni-plugins-rcsnj","openshift-network-operator/iptables-alerter-n68c4","kube-system/konnectivity-agent-lvg7k","kube-system/kube-apiserver-proxy-ip-10-0-130-227.ec2.internal","openshift-cluster-node-tuning-operator/tuned-qb6xj","openshift-image-registry/node-ca-knsf9","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal","openshift-multus/network-metrics-daemon-npkgv","openshift-network-diagnostics/network-check-target-hwpzm"] Apr 20 20:05:20.030098 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.030073 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-n68c4" Apr 20 20:05:20.030217 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.030158 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npkgv" Apr 20 20:05:20.030274 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:20.030225 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npkgv" podUID="923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d" Apr 20 20:05:20.031287 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.031269 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmvq6" Apr 20 20:05:20.032789 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.032766 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-pgbf5" Apr 20 20:05:20.033961 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.033939 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.035266 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.035245 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rcsnj" Apr 20 20:05:20.035559 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.035536 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 20:05:20.035559 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.035542 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 20:05:20.035691 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.035605 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 20:05:20.035691 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.035680 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 20:05:20.036077 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.035907 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 20:05:20.036077 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.035956 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-nh28c\"" Apr 20 20:05:20.036236 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.036217 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 20:05:20.036512 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.036484 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hwpzm" Apr 20 20:05:20.036602 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:20.036550 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hwpzm" podUID="983cba91-1490-41d1-acd9-67e8ffb4ce55" Apr 20 20:05:20.036827 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.036807 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 20:05:20.037319 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.037280 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 20:05:20.037407 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.037330 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 20:05:20.037407 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.037353 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-5pn6f\"" Apr 20 20:05:20.038964 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.038117 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-74xqh\"" Apr 20 20:05:20.038964 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.038122 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 20:05:20.038964 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.038311 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 20:05:20.038964 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.038361 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 20:05:20.038964 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.038455 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-7xljf\"" Apr 20 20:05:20.038964 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.038539 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 20:05:20.038964 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.038672 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 20:05:20.039287 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.039074 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-7q7bq\"" Apr 20 20:05:20.039415 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.039392 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.040763 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.040652 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-k96mb" Apr 20 20:05:20.040763 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:20.040722 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k96mb" podUID="d1323090-2026-43df-829f-115ae2bc0438" Apr 20 20:05:20.041900 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.041881 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 20:05:20.042256 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.042234 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 20:05:20.042435 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.042418 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 20:05:20.042694 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.042676 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 20:05:20.042694 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.042690 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 20:05:20.043653 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.043636 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.043941 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.043919 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-knsf9" Apr 20 20:05:20.044141 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.044124 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 20:05:20.044260 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.044241 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-s4fh9\"" Apr 20 20:05:20.045538 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.045522 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-lvg7k" Apr 20 20:05:20.046064 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.046044 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-zjqw7\"" Apr 20 20:05:20.046273 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.046254 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 20:05:20.046273 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.046260 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 20:05:20.046601 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.046580 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-s66wg\"" Apr 20 20:05:20.046713 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.046603 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 20:05:20.047398 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.047376 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 20:05:20.047490 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.047376 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 20:05:20.047490 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.047475 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-cnibin\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.047590 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.047547 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-host-var-lib-cni-multus\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.047590 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.047584 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-host-run-k8s-cni-cncf-io\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.049173 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.047742 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c02e57e8-2b76-4827-a61a-dac826a87aa2-host-slash\") pod \"iptables-alerter-n68c4\" (UID: \"c02e57e8-2b76-4827-a61a-dac826a87aa2\") " pod="openshift-network-operator/iptables-alerter-n68c4" Apr 20 20:05:20.049173 ip-10-0-130-227 
kubenswrapper[2573]: I0420 20:05:20.047783 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6e906053-dbcb-4d63-8f1f-4eb6a911e9e3-socket-dir\") pod \"aws-ebs-csi-driver-node-gmvq6\" (UID: \"6e906053-dbcb-4d63-8f1f-4eb6a911e9e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmvq6" Apr 20 20:05:20.049173 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.047819 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6e906053-dbcb-4d63-8f1f-4eb6a911e9e3-device-dir\") pod \"aws-ebs-csi-driver-node-gmvq6\" (UID: \"6e906053-dbcb-4d63-8f1f-4eb6a911e9e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmvq6" Apr 20 20:05:20.049173 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.047867 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s27cz\" (UniqueName: \"kubernetes.io/projected/6e906053-dbcb-4d63-8f1f-4eb6a911e9e3-kube-api-access-s27cz\") pod \"aws-ebs-csi-driver-node-gmvq6\" (UID: \"6e906053-dbcb-4d63-8f1f-4eb6a911e9e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmvq6" Apr 20 20:05:20.049173 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.047899 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-system-cni-dir\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.049173 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.047930 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-os-release\") pod 
\"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.049173 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.047958 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e906053-dbcb-4d63-8f1f-4eb6a911e9e3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gmvq6\" (UID: \"6e906053-dbcb-4d63-8f1f-4eb6a911e9e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmvq6" Apr 20 20:05:20.049173 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.048037 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6e906053-dbcb-4d63-8f1f-4eb6a911e9e3-registration-dir\") pod \"aws-ebs-csi-driver-node-gmvq6\" (UID: \"6e906053-dbcb-4d63-8f1f-4eb6a911e9e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmvq6" Apr 20 20:05:20.049173 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.048066 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-multus-cni-dir\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.049173 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.048096 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-host-var-lib-cni-bin\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.049173 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.048193 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-multus-socket-dir-parent\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.049173 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.048224 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-hostroot\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.049173 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.048248 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-multus-conf-dir\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.049173 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.048275 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 20:05:20.049173 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.048292 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-host-run-multus-certs\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.049173 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.048349 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 20:05:20.049173 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.048455 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sknh8\" (UniqueName: \"kubernetes.io/projected/923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d-kube-api-access-sknh8\") pod \"network-metrics-daemon-npkgv\" (UID: \"923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d\") " pod="openshift-multus/network-metrics-daemon-npkgv" Apr 20 20:05:20.049173 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.048484 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f81027c8-2ac8-4e5e-b754-45c3af3ec095-cni-binary-copy\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.050078 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.048550 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-qlnrh\"" Apr 20 20:05:20.050078 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.048636 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ba60b2b3-08e4-40aa-842f-6be514920597-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rcsnj\" (UID: \"ba60b2b3-08e4-40aa-842f-6be514920597\") " pod="openshift-multus/multus-additional-cni-plugins-rcsnj" Apr 20 20:05:20.050078 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.048655 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ba60b2b3-08e4-40aa-842f-6be514920597-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rcsnj\" (UID: \"ba60b2b3-08e4-40aa-842f-6be514920597\") " pod="openshift-multus/multus-additional-cni-plugins-rcsnj" Apr 20 20:05:20.050078 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.048756 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ba60b2b3-08e4-40aa-842f-6be514920597-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rcsnj\" (UID: \"ba60b2b3-08e4-40aa-842f-6be514920597\") " pod="openshift-multus/multus-additional-cni-plugins-rcsnj" Apr 20 20:05:20.050078 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.048949 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f81027c8-2ac8-4e5e-b754-45c3af3ec095-multus-daemon-config\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.050078 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.048981 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-etc-kubernetes\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.050078 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.049144 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s24p2\" (UniqueName: \"kubernetes.io/projected/f81027c8-2ac8-4e5e-b754-45c3af3ec095-kube-api-access-s24p2\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.050078 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.049189 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-host-var-lib-kubelet\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " 
pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.050078 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.049257 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba60b2b3-08e4-40aa-842f-6be514920597-system-cni-dir\") pod \"multus-additional-cni-plugins-rcsnj\" (UID: \"ba60b2b3-08e4-40aa-842f-6be514920597\") " pod="openshift-multus/multus-additional-cni-plugins-rcsnj" Apr 20 20:05:20.050078 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.049315 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ba60b2b3-08e4-40aa-842f-6be514920597-cnibin\") pod \"multus-additional-cni-plugins-rcsnj\" (UID: \"ba60b2b3-08e4-40aa-842f-6be514920597\") " pod="openshift-multus/multus-additional-cni-plugins-rcsnj" Apr 20 20:05:20.050078 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.049356 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ba60b2b3-08e4-40aa-842f-6be514920597-os-release\") pod \"multus-additional-cni-plugins-rcsnj\" (UID: \"ba60b2b3-08e4-40aa-842f-6be514920597\") " pod="openshift-multus/multus-additional-cni-plugins-rcsnj" Apr 20 20:05:20.050078 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.049389 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4thph\" (UniqueName: \"kubernetes.io/projected/ba60b2b3-08e4-40aa-842f-6be514920597-kube-api-access-4thph\") pod \"multus-additional-cni-plugins-rcsnj\" (UID: \"ba60b2b3-08e4-40aa-842f-6be514920597\") " pod="openshift-multus/multus-additional-cni-plugins-rcsnj" Apr 20 20:05:20.050078 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.049423 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6e906053-dbcb-4d63-8f1f-4eb6a911e9e3-etc-selinux\") pod \"aws-ebs-csi-driver-node-gmvq6\" (UID: \"6e906053-dbcb-4d63-8f1f-4eb6a911e9e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmvq6" Apr 20 20:05:20.050078 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.049457 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-259kr\" (UniqueName: \"kubernetes.io/projected/d645d5ae-1405-421d-8f27-65e056976e28-kube-api-access-259kr\") pod \"node-resolver-pgbf5\" (UID: \"d645d5ae-1405-421d-8f27-65e056976e28\") " pod="openshift-dns/node-resolver-pgbf5" Apr 20 20:05:20.050078 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.049484 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-host-run-netns\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.050078 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.049520 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d-metrics-certs\") pod \"network-metrics-daemon-npkgv\" (UID: \"923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d\") " pod="openshift-multus/network-metrics-daemon-npkgv" Apr 20 20:05:20.050738 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.049552 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6e906053-dbcb-4d63-8f1f-4eb6a911e9e3-sys-fs\") pod \"aws-ebs-csi-driver-node-gmvq6\" (UID: \"6e906053-dbcb-4d63-8f1f-4eb6a911e9e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmvq6" Apr 20 20:05:20.050738 ip-10-0-130-227 
kubenswrapper[2573]: I0420 20:05:20.049584 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d645d5ae-1405-421d-8f27-65e056976e28-tmp-dir\") pod \"node-resolver-pgbf5\" (UID: \"d645d5ae-1405-421d-8f27-65e056976e28\") " pod="openshift-dns/node-resolver-pgbf5" Apr 20 20:05:20.050738 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.049614 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ba60b2b3-08e4-40aa-842f-6be514920597-cni-binary-copy\") pod \"multus-additional-cni-plugins-rcsnj\" (UID: \"ba60b2b3-08e4-40aa-842f-6be514920597\") " pod="openshift-multus/multus-additional-cni-plugins-rcsnj" Apr 20 20:05:20.050738 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.049697 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c02e57e8-2b76-4827-a61a-dac826a87aa2-iptables-alerter-script\") pod \"iptables-alerter-n68c4\" (UID: \"c02e57e8-2b76-4827-a61a-dac826a87aa2\") " pod="openshift-network-operator/iptables-alerter-n68c4" Apr 20 20:05:20.050738 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.049731 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfx75\" (UniqueName: \"kubernetes.io/projected/c02e57e8-2b76-4827-a61a-dac826a87aa2-kube-api-access-qfx75\") pod \"iptables-alerter-n68c4\" (UID: \"c02e57e8-2b76-4827-a61a-dac826a87aa2\") " pod="openshift-network-operator/iptables-alerter-n68c4" Apr 20 20:05:20.050738 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.049783 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d645d5ae-1405-421d-8f27-65e056976e28-hosts-file\") pod 
\"node-resolver-pgbf5\" (UID: \"d645d5ae-1405-421d-8f27-65e056976e28\") " pod="openshift-dns/node-resolver-pgbf5" Apr 20 20:05:20.069898 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.069874 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 20:00:19 +0000 UTC" deadline="2027-09-23 16:42:47.208278524 +0000 UTC" Apr 20 20:05:20.069898 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.069896 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12500h37m27.138384725s" Apr 20 20:05:20.135485 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.135432 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 20:05:20.150835 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.150809 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-run-systemd\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.150958 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.150844 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-run-ovn\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.150958 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.150880 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-node-log\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.150958 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.150903 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.150958 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.150927 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/54652553-5a61-482e-b688-ce64c57b917b-env-overrides\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.150958 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.150949 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/54652553-5a61-482e-b688-ce64c57b917b-ovnkube-script-lib\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.151166 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.151004 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-host-var-lib-kubelet\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.151166 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.151045 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/ba60b2b3-08e4-40aa-842f-6be514920597-system-cni-dir\") pod \"multus-additional-cni-plugins-rcsnj\" (UID: \"ba60b2b3-08e4-40aa-842f-6be514920597\") " pod="openshift-multus/multus-additional-cni-plugins-rcsnj" Apr 20 20:05:20.151166 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.151085 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba60b2b3-08e4-40aa-842f-6be514920597-system-cni-dir\") pod \"multus-additional-cni-plugins-rcsnj\" (UID: \"ba60b2b3-08e4-40aa-842f-6be514920597\") " pod="openshift-multus/multus-additional-cni-plugins-rcsnj" Apr 20 20:05:20.151166 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.151100 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-host-var-lib-kubelet\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.151166 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.151151 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ba60b2b3-08e4-40aa-842f-6be514920597-os-release\") pod \"multus-additional-cni-plugins-rcsnj\" (UID: \"ba60b2b3-08e4-40aa-842f-6be514920597\") " pod="openshift-multus/multus-additional-cni-plugins-rcsnj" Apr 20 20:05:20.151369 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.151182 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/26b28667-df5c-4e73-a021-1d3b5430daaf-etc-tuned\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.151369 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.151210 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-log-socket\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.151369 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.151232 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-host-cni-bin\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.151369 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.151279 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d-metrics-certs\") pod \"network-metrics-daemon-npkgv\" (UID: \"923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d\") " pod="openshift-multus/network-metrics-daemon-npkgv" Apr 20 20:05:20.151369 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.151311 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6e906053-dbcb-4d63-8f1f-4eb6a911e9e3-sys-fs\") pod \"aws-ebs-csi-driver-node-gmvq6\" (UID: \"6e906053-dbcb-4d63-8f1f-4eb6a911e9e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmvq6" Apr 20 20:05:20.151369 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.151314 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ba60b2b3-08e4-40aa-842f-6be514920597-os-release\") pod \"multus-additional-cni-plugins-rcsnj\" (UID: \"ba60b2b3-08e4-40aa-842f-6be514920597\") " pod="openshift-multus/multus-additional-cni-plugins-rcsnj" Apr 20 20:05:20.151369 ip-10-0-130-227 
kubenswrapper[2573]: I0420 20:05:20.151339 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/26b28667-df5c-4e73-a021-1d3b5430daaf-lib-modules\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.151369 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.151365 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-cnibin\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.151715 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.151381 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6e906053-dbcb-4d63-8f1f-4eb6a911e9e3-sys-fs\") pod \"aws-ebs-csi-driver-node-gmvq6\" (UID: \"6e906053-dbcb-4d63-8f1f-4eb6a911e9e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmvq6" Apr 20 20:05:20.151715 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.151392 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-host-var-lib-cni-multus\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.151715 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:20.151413 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:20.151715 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.151417 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/26b28667-df5c-4e73-a021-1d3b5430daaf-etc-systemd\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.151715 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.151442 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/54652553-5a61-482e-b688-ce64c57b917b-ovn-node-metrics-cert\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.151715 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.151473 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c02e57e8-2b76-4827-a61a-dac826a87aa2-host-slash\") pod \"iptables-alerter-n68c4\" (UID: \"c02e57e8-2b76-4827-a61a-dac826a87aa2\") " pod="openshift-network-operator/iptables-alerter-n68c4" Apr 20 20:05:20.151715 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.151446 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-cnibin\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.151715 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.151478 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-host-var-lib-cni-multus\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.151715 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:20.151525 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d-metrics-certs podName:923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d nodeName:}" failed. No retries permitted until 2026-04-20 20:05:20.651476319 +0000 UTC m=+3.066769041 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d-metrics-certs") pod "network-metrics-daemon-npkgv" (UID: "923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:20.151715 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.151535 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c02e57e8-2b76-4827-a61a-dac826a87aa2-host-slash\") pod \"iptables-alerter-n68c4\" (UID: \"c02e57e8-2b76-4827-a61a-dac826a87aa2\") " pod="openshift-network-operator/iptables-alerter-n68c4" Apr 20 20:05:20.151715 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.151573 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6e906053-dbcb-4d63-8f1f-4eb6a911e9e3-socket-dir\") pod \"aws-ebs-csi-driver-node-gmvq6\" (UID: \"6e906053-dbcb-4d63-8f1f-4eb6a911e9e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmvq6" Apr 20 20:05:20.151715 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.151601 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6e906053-dbcb-4d63-8f1f-4eb6a911e9e3-device-dir\") pod \"aws-ebs-csi-driver-node-gmvq6\" (UID: \"6e906053-dbcb-4d63-8f1f-4eb6a911e9e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmvq6" Apr 20 20:05:20.151715 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.151627 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-host-run-multus-certs\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.151715 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.151684 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-host-run-multus-certs\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.152340 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.151801 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6e906053-dbcb-4d63-8f1f-4eb6a911e9e3-socket-dir\") pod \"aws-ebs-csi-driver-node-gmvq6\" (UID: \"6e906053-dbcb-4d63-8f1f-4eb6a911e9e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmvq6" Apr 20 20:05:20.152340 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.151837 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6e906053-dbcb-4d63-8f1f-4eb6a911e9e3-device-dir\") pod \"aws-ebs-csi-driver-node-gmvq6\" (UID: \"6e906053-dbcb-4d63-8f1f-4eb6a911e9e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmvq6" Apr 20 20:05:20.152340 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.151887 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/26b28667-df5c-4e73-a021-1d3b5430daaf-etc-kubernetes\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.152340 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.151917 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/26b28667-df5c-4e73-a021-1d3b5430daaf-host\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.152340 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.151972 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qjbv\" (UniqueName: \"kubernetes.io/projected/26b28667-df5c-4e73-a021-1d3b5430daaf-kube-api-access-4qjbv\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.152340 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.152028 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e906053-dbcb-4d63-8f1f-4eb6a911e9e3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gmvq6\" (UID: \"6e906053-dbcb-4d63-8f1f-4eb6a911e9e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmvq6" Apr 20 20:05:20.152340 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.152067 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6e906053-dbcb-4d63-8f1f-4eb6a911e9e3-registration-dir\") pod \"aws-ebs-csi-driver-node-gmvq6\" (UID: \"6e906053-dbcb-4d63-8f1f-4eb6a911e9e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmvq6" Apr 20 20:05:20.152340 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.152080 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e906053-dbcb-4d63-8f1f-4eb6a911e9e3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gmvq6\" (UID: \"6e906053-dbcb-4d63-8f1f-4eb6a911e9e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmvq6" Apr 20 20:05:20.152340 ip-10-0-130-227 
kubenswrapper[2573]: I0420 20:05:20.152121 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6e906053-dbcb-4d63-8f1f-4eb6a911e9e3-registration-dir\") pod \"aws-ebs-csi-driver-node-gmvq6\" (UID: \"6e906053-dbcb-4d63-8f1f-4eb6a911e9e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmvq6" Apr 20 20:05:20.152340 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.152121 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-host-var-lib-cni-bin\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.152340 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.152167 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-host-var-lib-cni-bin\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.152340 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.152202 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-hostroot\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.152340 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.152252 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ba60b2b3-08e4-40aa-842f-6be514920597-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rcsnj\" (UID: \"ba60b2b3-08e4-40aa-842f-6be514920597\") " 
pod="openshift-multus/multus-additional-cni-plugins-rcsnj" Apr 20 20:05:20.152340 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.152271 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-hostroot\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.152340 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.152296 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d1323090-2026-43df-829f-115ae2bc0438-kubelet-config\") pod \"global-pull-secret-syncer-k96mb\" (UID: \"d1323090-2026-43df-829f-115ae2bc0438\") " pod="kube-system/global-pull-secret-syncer-k96mb" Apr 20 20:05:20.152340 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.152323 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/26b28667-df5c-4e73-a021-1d3b5430daaf-etc-sysctl-d\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.153106 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.152347 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-systemd-units\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.153106 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.152397 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/26b28667-df5c-4e73-a021-1d3b5430daaf-run\") pod \"tuned-qb6xj\" (UID: 
\"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.153106 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.152425 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sknh8\" (UniqueName: \"kubernetes.io/projected/923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d-kube-api-access-sknh8\") pod \"network-metrics-daemon-npkgv\" (UID: \"923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d\") " pod="openshift-multus/network-metrics-daemon-npkgv" Apr 20 20:05:20.153106 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.152450 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ba60b2b3-08e4-40aa-842f-6be514920597-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rcsnj\" (UID: \"ba60b2b3-08e4-40aa-842f-6be514920597\") " pod="openshift-multus/multus-additional-cni-plugins-rcsnj" Apr 20 20:05:20.153106 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.152479 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/09577d69-c26a-4291-a3bd-a9d8b123245a-konnectivity-ca\") pod \"konnectivity-agent-lvg7k\" (UID: \"09577d69-c26a-4291-a3bd-a9d8b123245a\") " pod="kube-system/konnectivity-agent-lvg7k" Apr 20 20:05:20.153106 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.152504 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-host-run-netns\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.153106 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.152530 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-etc-openvswitch\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.153106 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.152559 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/54652553-5a61-482e-b688-ce64c57b917b-ovnkube-config\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.153106 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.152588 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-run-openvswitch\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.153106 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.152632 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7f6b4fe9-415b-4c1d-91f4-70456b92ec7e-serviceca\") pod \"node-ca-knsf9\" (UID: \"7f6b4fe9-415b-4c1d-91f4-70456b92ec7e\") " pod="openshift-image-registry/node-ca-knsf9" Apr 20 20:05:20.153106 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.152658 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ba60b2b3-08e4-40aa-842f-6be514920597-cnibin\") pod \"multus-additional-cni-plugins-rcsnj\" (UID: \"ba60b2b3-08e4-40aa-842f-6be514920597\") " pod="openshift-multus/multus-additional-cni-plugins-rcsnj" Apr 20 20:05:20.153106 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.152687 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4thph\" (UniqueName: \"kubernetes.io/projected/ba60b2b3-08e4-40aa-842f-6be514920597-kube-api-access-4thph\") pod \"multus-additional-cni-plugins-rcsnj\" (UID: \"ba60b2b3-08e4-40aa-842f-6be514920597\") " pod="openshift-multus/multus-additional-cni-plugins-rcsnj" Apr 20 20:05:20.153106 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.152708 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ba60b2b3-08e4-40aa-842f-6be514920597-cnibin\") pod \"multus-additional-cni-plugins-rcsnj\" (UID: \"ba60b2b3-08e4-40aa-842f-6be514920597\") " pod="openshift-multus/multus-additional-cni-plugins-rcsnj" Apr 20 20:05:20.153106 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.152713 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6e906053-dbcb-4d63-8f1f-4eb6a911e9e3-etc-selinux\") pod \"aws-ebs-csi-driver-node-gmvq6\" (UID: \"6e906053-dbcb-4d63-8f1f-4eb6a911e9e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmvq6" Apr 20 20:05:20.153106 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.152787 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ba60b2b3-08e4-40aa-842f-6be514920597-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rcsnj\" (UID: \"ba60b2b3-08e4-40aa-842f-6be514920597\") " pod="openshift-multus/multus-additional-cni-plugins-rcsnj" Apr 20 20:05:20.153106 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.152794 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-259kr\" (UniqueName: \"kubernetes.io/projected/d645d5ae-1405-421d-8f27-65e056976e28-kube-api-access-259kr\") pod \"node-resolver-pgbf5\" (UID: 
\"d645d5ae-1405-421d-8f27-65e056976e28\") " pod="openshift-dns/node-resolver-pgbf5" Apr 20 20:05:20.153106 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.152894 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-host-run-netns\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.153952 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.152898 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6e906053-dbcb-4d63-8f1f-4eb6a911e9e3-etc-selinux\") pod \"aws-ebs-csi-driver-node-gmvq6\" (UID: \"6e906053-dbcb-4d63-8f1f-4eb6a911e9e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmvq6" Apr 20 20:05:20.153952 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.152923 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/09577d69-c26a-4291-a3bd-a9d8b123245a-agent-certs\") pod \"konnectivity-agent-lvg7k\" (UID: \"09577d69-c26a-4291-a3bd-a9d8b123245a\") " pod="kube-system/konnectivity-agent-lvg7k" Apr 20 20:05:20.153952 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.152957 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22hj6\" (UniqueName: \"kubernetes.io/projected/983cba91-1490-41d1-acd9-67e8ffb4ce55-kube-api-access-22hj6\") pod \"network-check-target-hwpzm\" (UID: \"983cba91-1490-41d1-acd9-67e8ffb4ce55\") " pod="openshift-network-diagnostics/network-check-target-hwpzm" Apr 20 20:05:20.153952 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.152959 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-host-run-netns\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.153952 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.152982 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-host-run-ovn-kubernetes\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.153952 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.152999 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ba60b2b3-08e4-40aa-842f-6be514920597-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rcsnj\" (UID: \"ba60b2b3-08e4-40aa-842f-6be514920597\") " pod="openshift-multus/multus-additional-cni-plugins-rcsnj" Apr 20 20:05:20.153952 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153016 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-652gx\" (UniqueName: \"kubernetes.io/projected/54652553-5a61-482e-b688-ce64c57b917b-kube-api-access-652gx\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.153952 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153053 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rhvx\" (UniqueName: \"kubernetes.io/projected/7f6b4fe9-415b-4c1d-91f4-70456b92ec7e-kube-api-access-6rhvx\") pod \"node-ca-knsf9\" (UID: \"7f6b4fe9-415b-4c1d-91f4-70456b92ec7e\") " pod="openshift-image-registry/node-ca-knsf9" Apr 20 20:05:20.153952 ip-10-0-130-227 kubenswrapper[2573]: I0420 
20:05:20.153085 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d645d5ae-1405-421d-8f27-65e056976e28-tmp-dir\") pod \"node-resolver-pgbf5\" (UID: \"d645d5ae-1405-421d-8f27-65e056976e28\") " pod="openshift-dns/node-resolver-pgbf5" Apr 20 20:05:20.153952 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153111 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ba60b2b3-08e4-40aa-842f-6be514920597-cni-binary-copy\") pod \"multus-additional-cni-plugins-rcsnj\" (UID: \"ba60b2b3-08e4-40aa-842f-6be514920597\") " pod="openshift-multus/multus-additional-cni-plugins-rcsnj" Apr 20 20:05:20.153952 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153127 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c02e57e8-2b76-4827-a61a-dac826a87aa2-iptables-alerter-script\") pod \"iptables-alerter-n68c4\" (UID: \"c02e57e8-2b76-4827-a61a-dac826a87aa2\") " pod="openshift-network-operator/iptables-alerter-n68c4" Apr 20 20:05:20.153952 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153147 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qfx75\" (UniqueName: \"kubernetes.io/projected/c02e57e8-2b76-4827-a61a-dac826a87aa2-kube-api-access-qfx75\") pod \"iptables-alerter-n68c4\" (UID: \"c02e57e8-2b76-4827-a61a-dac826a87aa2\") " pod="openshift-network-operator/iptables-alerter-n68c4" Apr 20 20:05:20.153952 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153170 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d645d5ae-1405-421d-8f27-65e056976e28-hosts-file\") pod \"node-resolver-pgbf5\" (UID: \"d645d5ae-1405-421d-8f27-65e056976e28\") " pod="openshift-dns/node-resolver-pgbf5" Apr 20 
20:05:20.153952 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153195 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-host-run-k8s-cni-cncf-io\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.153952 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153219 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s27cz\" (UniqueName: \"kubernetes.io/projected/6e906053-dbcb-4d63-8f1f-4eb6a911e9e3-kube-api-access-s27cz\") pod \"aws-ebs-csi-driver-node-gmvq6\" (UID: \"6e906053-dbcb-4d63-8f1f-4eb6a911e9e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmvq6" Apr 20 20:05:20.153952 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153243 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-system-cni-dir\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.153952 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153266 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-os-release\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.154702 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153283 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-host-run-k8s-cni-cncf-io\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 
20:05:20.154702 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153309 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-multus-socket-dir-parent\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.154702 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153333 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-etc-kubernetes\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.154702 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153354 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d645d5ae-1405-421d-8f27-65e056976e28-hosts-file\") pod \"node-resolver-pgbf5\" (UID: \"d645d5ae-1405-421d-8f27-65e056976e28\") " pod="openshift-dns/node-resolver-pgbf5" Apr 20 20:05:20.154702 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153359 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s24p2\" (UniqueName: \"kubernetes.io/projected/f81027c8-2ac8-4e5e-b754-45c3af3ec095-kube-api-access-s24p2\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.154702 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153385 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d1323090-2026-43df-829f-115ae2bc0438-dbus\") pod \"global-pull-secret-syncer-k96mb\" (UID: \"d1323090-2026-43df-829f-115ae2bc0438\") " pod="kube-system/global-pull-secret-syncer-k96mb" Apr 20 20:05:20.154702 
ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153409 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-os-release\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.154702 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153414 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-multus-cni-dir\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.154702 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153426 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-etc-kubernetes\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.154702 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153414 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-multus-socket-dir-parent\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.154702 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153384 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d645d5ae-1405-421d-8f27-65e056976e28-tmp-dir\") pod \"node-resolver-pgbf5\" (UID: \"d645d5ae-1405-421d-8f27-65e056976e28\") " pod="openshift-dns/node-resolver-pgbf5" Apr 20 20:05:20.154702 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153463 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-multus-conf-dir\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.154702 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153471 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-multus-cni-dir\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.154702 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153484 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-system-cni-dir\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.154702 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153584 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d1323090-2026-43df-829f-115ae2bc0438-original-pull-secret\") pod \"global-pull-secret-syncer-k96mb\" (UID: \"d1323090-2026-43df-829f-115ae2bc0438\") " pod="kube-system/global-pull-secret-syncer-k96mb" Apr 20 20:05:20.154702 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153595 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f81027c8-2ac8-4e5e-b754-45c3af3ec095-multus-conf-dir\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.154702 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153632 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ba60b2b3-08e4-40aa-842f-6be514920597-cni-binary-copy\") pod \"multus-additional-cni-plugins-rcsnj\" (UID: \"ba60b2b3-08e4-40aa-842f-6be514920597\") " pod="openshift-multus/multus-additional-cni-plugins-rcsnj" Apr 20 20:05:20.154702 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153659 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/26b28667-df5c-4e73-a021-1d3b5430daaf-var-lib-kubelet\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.155230 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153673 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c02e57e8-2b76-4827-a61a-dac826a87aa2-iptables-alerter-script\") pod \"iptables-alerter-n68c4\" (UID: \"c02e57e8-2b76-4827-a61a-dac826a87aa2\") " pod="openshift-network-operator/iptables-alerter-n68c4" Apr 20 20:05:20.155230 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153687 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/26b28667-df5c-4e73-a021-1d3b5430daaf-tmp\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.155230 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153709 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-host-slash\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.155230 ip-10-0-130-227 kubenswrapper[2573]: I0420 
20:05:20.153725 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-host-cni-netd\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.155230 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153753 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f6b4fe9-415b-4c1d-91f4-70456b92ec7e-host\") pod \"node-ca-knsf9\" (UID: \"7f6b4fe9-415b-4c1d-91f4-70456b92ec7e\") " pod="openshift-image-registry/node-ca-knsf9" Apr 20 20:05:20.155230 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153783 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26b28667-df5c-4e73-a021-1d3b5430daaf-sys\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.155230 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153830 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f81027c8-2ac8-4e5e-b754-45c3af3ec095-cni-binary-copy\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.155230 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153876 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ba60b2b3-08e4-40aa-842f-6be514920597-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rcsnj\" (UID: \"ba60b2b3-08e4-40aa-842f-6be514920597\") " pod="openshift-multus/multus-additional-cni-plugins-rcsnj" Apr 20 20:05:20.155230 
ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153907 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/26b28667-df5c-4e73-a021-1d3b5430daaf-etc-modprobe-d\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.155230 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153932 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/26b28667-df5c-4e73-a021-1d3b5430daaf-etc-sysconfig\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.155230 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153957 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/26b28667-df5c-4e73-a021-1d3b5430daaf-etc-sysctl-conf\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.155230 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.153981 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-host-kubelet\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.155230 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.154005 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ba60b2b3-08e4-40aa-842f-6be514920597-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rcsnj\" (UID: 
\"ba60b2b3-08e4-40aa-842f-6be514920597\") " pod="openshift-multus/multus-additional-cni-plugins-rcsnj" Apr 20 20:05:20.155230 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.154016 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-var-lib-openvswitch\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.155230 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.154071 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f81027c8-2ac8-4e5e-b754-45c3af3ec095-multus-daemon-config\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.155230 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.154297 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f81027c8-2ac8-4e5e-b754-45c3af3ec095-cni-binary-copy\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.155230 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.154541 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f81027c8-2ac8-4e5e-b754-45c3af3ec095-multus-daemon-config\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.162806 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.162784 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 20:05:20.166289 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.166266 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sknh8\" (UniqueName: \"kubernetes.io/projected/923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d-kube-api-access-sknh8\") pod \"network-metrics-daemon-npkgv\" (UID: \"923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d\") " pod="openshift-multus/network-metrics-daemon-npkgv" Apr 20 20:05:20.166375 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.166313 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s24p2\" (UniqueName: \"kubernetes.io/projected/f81027c8-2ac8-4e5e-b754-45c3af3ec095-kube-api-access-s24p2\") pod \"multus-6ksh7\" (UID: \"f81027c8-2ac8-4e5e-b754-45c3af3ec095\") " pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.166375 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.166325 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s27cz\" (UniqueName: \"kubernetes.io/projected/6e906053-dbcb-4d63-8f1f-4eb6a911e9e3-kube-api-access-s27cz\") pod \"aws-ebs-csi-driver-node-gmvq6\" (UID: \"6e906053-dbcb-4d63-8f1f-4eb6a911e9e3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmvq6" Apr 20 20:05:20.166375 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.166325 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-259kr\" (UniqueName: \"kubernetes.io/projected/d645d5ae-1405-421d-8f27-65e056976e28-kube-api-access-259kr\") pod \"node-resolver-pgbf5\" (UID: \"d645d5ae-1405-421d-8f27-65e056976e28\") " pod="openshift-dns/node-resolver-pgbf5" Apr 20 20:05:20.167017 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.166996 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4thph\" (UniqueName: 
\"kubernetes.io/projected/ba60b2b3-08e4-40aa-842f-6be514920597-kube-api-access-4thph\") pod \"multus-additional-cni-plugins-rcsnj\" (UID: \"ba60b2b3-08e4-40aa-842f-6be514920597\") " pod="openshift-multus/multus-additional-cni-plugins-rcsnj" Apr 20 20:05:20.168247 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.168230 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfx75\" (UniqueName: \"kubernetes.io/projected/c02e57e8-2b76-4827-a61a-dac826a87aa2-kube-api-access-qfx75\") pod \"iptables-alerter-n68c4\" (UID: \"c02e57e8-2b76-4827-a61a-dac826a87aa2\") " pod="openshift-network-operator/iptables-alerter-n68c4" Apr 20 20:05:20.254978 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.254943 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/54652553-5a61-482e-b688-ce64c57b917b-ovnkube-script-lib\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.255097 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.254985 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/26b28667-df5c-4e73-a021-1d3b5430daaf-etc-tuned\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.255097 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255009 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-log-socket\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.255097 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255033 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-host-cni-bin\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.255097 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255064 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/26b28667-df5c-4e73-a021-1d3b5430daaf-lib-modules\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.255097 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255080 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/26b28667-df5c-4e73-a021-1d3b5430daaf-etc-systemd\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.255097 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255084 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-log-socket\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.255343 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255095 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/54652553-5a61-482e-b688-ce64c57b917b-ovn-node-metrics-cert\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.255343 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255150 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/26b28667-df5c-4e73-a021-1d3b5430daaf-etc-systemd\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.255343 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255156 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-host-cni-bin\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.255343 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255152 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/26b28667-df5c-4e73-a021-1d3b5430daaf-etc-kubernetes\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.255343 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255183 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/26b28667-df5c-4e73-a021-1d3b5430daaf-etc-kubernetes\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.255343 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255196 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/26b28667-df5c-4e73-a021-1d3b5430daaf-host\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.255343 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255221 2573 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-4qjbv\" (UniqueName: \"kubernetes.io/projected/26b28667-df5c-4e73-a021-1d3b5430daaf-kube-api-access-4qjbv\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.255343 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255245 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/26b28667-df5c-4e73-a021-1d3b5430daaf-lib-modules\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.255343 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255249 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d1323090-2026-43df-829f-115ae2bc0438-kubelet-config\") pod \"global-pull-secret-syncer-k96mb\" (UID: \"d1323090-2026-43df-829f-115ae2bc0438\") " pod="kube-system/global-pull-secret-syncer-k96mb" Apr 20 20:05:20.255343 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255274 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/26b28667-df5c-4e73-a021-1d3b5430daaf-etc-sysctl-d\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.255343 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255298 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-systemd-units\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.255343 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255310 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d1323090-2026-43df-829f-115ae2bc0438-kubelet-config\") pod \"global-pull-secret-syncer-k96mb\" (UID: \"d1323090-2026-43df-829f-115ae2bc0438\") " pod="kube-system/global-pull-secret-syncer-k96mb" Apr 20 20:05:20.255343 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255320 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/26b28667-df5c-4e73-a021-1d3b5430daaf-run\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.255343 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255348 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-systemd-units\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.256012 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255348 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/09577d69-c26a-4291-a3bd-a9d8b123245a-konnectivity-ca\") pod \"konnectivity-agent-lvg7k\" (UID: \"09577d69-c26a-4291-a3bd-a9d8b123245a\") " pod="kube-system/konnectivity-agent-lvg7k" Apr 20 20:05:20.256012 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255381 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-host-run-netns\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.256012 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255407 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-etc-openvswitch\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.256012 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255406 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/26b28667-df5c-4e73-a021-1d3b5430daaf-etc-sysctl-d\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.256012 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255430 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/54652553-5a61-482e-b688-ce64c57b917b-ovnkube-config\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.256012 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255455 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-host-run-netns\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.256012 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255492 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-etc-openvswitch\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.256012 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255457 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-run-openvswitch\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.256012 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255529 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7f6b4fe9-415b-4c1d-91f4-70456b92ec7e-serviceca\") pod \"node-ca-knsf9\" (UID: \"7f6b4fe9-415b-4c1d-91f4-70456b92ec7e\") " pod="openshift-image-registry/node-ca-knsf9" Apr 20 20:05:20.256012 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255556 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/09577d69-c26a-4291-a3bd-a9d8b123245a-agent-certs\") pod \"konnectivity-agent-lvg7k\" (UID: \"09577d69-c26a-4291-a3bd-a9d8b123245a\") " pod="kube-system/konnectivity-agent-lvg7k" Apr 20 20:05:20.256012 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255280 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/26b28667-df5c-4e73-a021-1d3b5430daaf-host\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.256012 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255579 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22hj6\" (UniqueName: \"kubernetes.io/projected/983cba91-1490-41d1-acd9-67e8ffb4ce55-kube-api-access-22hj6\") pod \"network-check-target-hwpzm\" (UID: \"983cba91-1490-41d1-acd9-67e8ffb4ce55\") " pod="openshift-network-diagnostics/network-check-target-hwpzm" Apr 20 20:05:20.256012 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255607 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-host-run-ovn-kubernetes\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.256012 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255631 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-652gx\" (UniqueName: \"kubernetes.io/projected/54652553-5a61-482e-b688-ce64c57b917b-kube-api-access-652gx\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.256012 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255649 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/54652553-5a61-482e-b688-ce64c57b917b-ovnkube-script-lib\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.256012 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255550 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/26b28667-df5c-4e73-a021-1d3b5430daaf-run\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.256012 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255897 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/09577d69-c26a-4291-a3bd-a9d8b123245a-konnectivity-ca\") pod \"konnectivity-agent-lvg7k\" (UID: \"09577d69-c26a-4291-a3bd-a9d8b123245a\") " pod="kube-system/konnectivity-agent-lvg7k" Apr 20 20:05:20.256744 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255655 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6rhvx\" (UniqueName: \"kubernetes.io/projected/7f6b4fe9-415b-4c1d-91f4-70456b92ec7e-kube-api-access-6rhvx\") pod \"node-ca-knsf9\" (UID: \"7f6b4fe9-415b-4c1d-91f4-70456b92ec7e\") " pod="openshift-image-registry/node-ca-knsf9" Apr 20 20:05:20.256744 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255976 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/54652553-5a61-482e-b688-ce64c57b917b-ovnkube-config\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.256744 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.255998 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d1323090-2026-43df-829f-115ae2bc0438-dbus\") pod \"global-pull-secret-syncer-k96mb\" (UID: \"d1323090-2026-43df-829f-115ae2bc0438\") " pod="kube-system/global-pull-secret-syncer-k96mb" Apr 20 20:05:20.256744 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256011 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7f6b4fe9-415b-4c1d-91f4-70456b92ec7e-serviceca\") pod \"node-ca-knsf9\" (UID: \"7f6b4fe9-415b-4c1d-91f4-70456b92ec7e\") " pod="openshift-image-registry/node-ca-knsf9" Apr 20 20:05:20.256744 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256026 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d1323090-2026-43df-829f-115ae2bc0438-original-pull-secret\") pod \"global-pull-secret-syncer-k96mb\" (UID: \"d1323090-2026-43df-829f-115ae2bc0438\") " pod="kube-system/global-pull-secret-syncer-k96mb" Apr 20 20:05:20.256744 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256028 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-run-openvswitch\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.256744 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256061 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/26b28667-df5c-4e73-a021-1d3b5430daaf-var-lib-kubelet\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.256744 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256069 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-host-run-ovn-kubernetes\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.256744 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256089 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/26b28667-df5c-4e73-a021-1d3b5430daaf-tmp\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.256744 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:20.256100 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 20:05:20.256744 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256108 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d1323090-2026-43df-829f-115ae2bc0438-dbus\") pod 
\"global-pull-secret-syncer-k96mb\" (UID: \"d1323090-2026-43df-829f-115ae2bc0438\") " pod="kube-system/global-pull-secret-syncer-k96mb" Apr 20 20:05:20.256744 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256114 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-host-slash\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.256744 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256123 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/26b28667-df5c-4e73-a021-1d3b5430daaf-var-lib-kubelet\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.256744 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:20.256151 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1323090-2026-43df-829f-115ae2bc0438-original-pull-secret podName:d1323090-2026-43df-829f-115ae2bc0438 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:20.756133076 +0000 UTC m=+3.171425801 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d1323090-2026-43df-829f-115ae2bc0438-original-pull-secret") pod "global-pull-secret-syncer-k96mb" (UID: "d1323090-2026-43df-829f-115ae2bc0438") : object "kube-system"/"original-pull-secret" not registered Apr 20 20:05:20.256744 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256173 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-host-cni-netd\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.256744 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256183 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-host-slash\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.256744 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256199 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f6b4fe9-415b-4c1d-91f4-70456b92ec7e-host\") pod \"node-ca-knsf9\" (UID: \"7f6b4fe9-415b-4c1d-91f4-70456b92ec7e\") " pod="openshift-image-registry/node-ca-knsf9" Apr 20 20:05:20.257354 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256237 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-host-cni-netd\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.257354 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256227 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26b28667-df5c-4e73-a021-1d3b5430daaf-sys\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.257354 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256272 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/26b28667-df5c-4e73-a021-1d3b5430daaf-etc-modprobe-d\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.257354 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256279 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f6b4fe9-415b-4c1d-91f4-70456b92ec7e-host\") pod \"node-ca-knsf9\" (UID: \"7f6b4fe9-415b-4c1d-91f4-70456b92ec7e\") " pod="openshift-image-registry/node-ca-knsf9" Apr 20 20:05:20.257354 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256301 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/26b28667-df5c-4e73-a021-1d3b5430daaf-etc-sysconfig\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.257354 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256340 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26b28667-df5c-4e73-a021-1d3b5430daaf-sys\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.257354 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256353 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/26b28667-df5c-4e73-a021-1d3b5430daaf-etc-sysconfig\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.257354 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256390 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/26b28667-df5c-4e73-a021-1d3b5430daaf-etc-modprobe-d\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.257354 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256435 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/26b28667-df5c-4e73-a021-1d3b5430daaf-etc-sysctl-conf\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.257354 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256467 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-host-kubelet\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.257354 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256511 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-host-kubelet\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.257354 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256545 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-var-lib-openvswitch\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.257354 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256571 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-run-systemd\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.257354 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256579 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/26b28667-df5c-4e73-a021-1d3b5430daaf-etc-sysctl-conf\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.257354 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256594 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-run-ovn\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.257354 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256617 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-node-log\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.257354 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256625 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-var-lib-openvswitch\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.257354 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256642 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.258112 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256667 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/54652553-5a61-482e-b688-ce64c57b917b-env-overrides\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.258112 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256688 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-run-ovn\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.258112 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256735 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-run-systemd\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.258112 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256751 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.258112 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.256892 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/54652553-5a61-482e-b688-ce64c57b917b-node-log\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.258112 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.257159 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/54652553-5a61-482e-b688-ce64c57b917b-env-overrides\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.258112 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.257865 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/26b28667-df5c-4e73-a021-1d3b5430daaf-etc-tuned\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.258112 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.257870 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/54652553-5a61-482e-b688-ce64c57b917b-ovn-node-metrics-cert\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.258521 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.258501 2573 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/26b28667-df5c-4e73-a021-1d3b5430daaf-tmp\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.258706 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.258688 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/09577d69-c26a-4291-a3bd-a9d8b123245a-agent-certs\") pod \"konnectivity-agent-lvg7k\" (UID: \"09577d69-c26a-4291-a3bd-a9d8b123245a\") " pod="kube-system/konnectivity-agent-lvg7k" Apr 20 20:05:20.270953 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:20.270931 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:05:20.271032 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:20.270960 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:05:20.271032 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:20.270973 2573 projected.go:194] Error preparing data for projected volume kube-api-access-22hj6 for pod openshift-network-diagnostics/network-check-target-hwpzm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:20.271138 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:20.271043 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/983cba91-1490-41d1-acd9-67e8ffb4ce55-kube-api-access-22hj6 podName:983cba91-1490-41d1-acd9-67e8ffb4ce55 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:20.771026521 +0000 UTC m=+3.186319247 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-22hj6" (UniqueName: "kubernetes.io/projected/983cba91-1490-41d1-acd9-67e8ffb4ce55-kube-api-access-22hj6") pod "network-check-target-hwpzm" (UID: "983cba91-1490-41d1-acd9-67e8ffb4ce55") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:20.272679 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.272660 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qjbv\" (UniqueName: \"kubernetes.io/projected/26b28667-df5c-4e73-a021-1d3b5430daaf-kube-api-access-4qjbv\") pod \"tuned-qb6xj\" (UID: \"26b28667-df5c-4e73-a021-1d3b5430daaf\") " pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.272963 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.272942 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rhvx\" (UniqueName: \"kubernetes.io/projected/7f6b4fe9-415b-4c1d-91f4-70456b92ec7e-kube-api-access-6rhvx\") pod \"node-ca-knsf9\" (UID: \"7f6b4fe9-415b-4c1d-91f4-70456b92ec7e\") " pod="openshift-image-registry/node-ca-knsf9" Apr 20 20:05:20.273708 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.273687 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-652gx\" (UniqueName: \"kubernetes.io/projected/54652553-5a61-482e-b688-ce64c57b917b-kube-api-access-652gx\") pod \"ovnkube-node-ks7s9\" (UID: \"54652553-5a61-482e-b688-ce64c57b917b\") " pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.342959 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.342930 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-n68c4" Apr 20 20:05:20.350578 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.350553 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmvq6" Apr 20 20:05:20.360381 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.360356 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-pgbf5" Apr 20 20:05:20.363945 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.363923 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6ksh7" Apr 20 20:05:20.371477 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.371457 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rcsnj" Apr 20 20:05:20.377173 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.377157 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:05:20.384686 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.384671 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" Apr 20 20:05:20.392220 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.392173 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-knsf9" Apr 20 20:05:20.398705 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.398686 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-lvg7k" Apr 20 20:05:20.659567 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.659489 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d-metrics-certs\") pod \"network-metrics-daemon-npkgv\" (UID: \"923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d\") " pod="openshift-multus/network-metrics-daemon-npkgv" Apr 20 20:05:20.659723 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:20.659643 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:20.659723 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:20.659712 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d-metrics-certs podName:923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d nodeName:}" failed. No retries permitted until 2026-04-20 20:05:21.659691801 +0000 UTC m=+4.074984541 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d-metrics-certs") pod "network-metrics-daemon-npkgv" (UID: "923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:20.760322 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.760281 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d1323090-2026-43df-829f-115ae2bc0438-original-pull-secret\") pod \"global-pull-secret-syncer-k96mb\" (UID: \"d1323090-2026-43df-829f-115ae2bc0438\") " pod="kube-system/global-pull-secret-syncer-k96mb" Apr 20 20:05:20.760472 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:20.760381 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 20:05:20.760472 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:20.760434 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1323090-2026-43df-829f-115ae2bc0438-original-pull-secret podName:d1323090-2026-43df-829f-115ae2bc0438 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:21.760416406 +0000 UTC m=+4.175709125 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d1323090-2026-43df-829f-115ae2bc0438-original-pull-secret") pod "global-pull-secret-syncer-k96mb" (UID: "d1323090-2026-43df-829f-115ae2bc0438") : object "kube-system"/"original-pull-secret" not registered Apr 20 20:05:20.777036 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:20.777007 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54652553_5a61_482e_b688_ce64c57b917b.slice/crio-7482eb21eb8782c80efe539d9944c8a23371ef0a3756971718e8fbed246cf3fb WatchSource:0}: Error finding container 7482eb21eb8782c80efe539d9944c8a23371ef0a3756971718e8fbed246cf3fb: Status 404 returned error can't find the container with id 7482eb21eb8782c80efe539d9944c8a23371ef0a3756971718e8fbed246cf3fb Apr 20 20:05:20.778069 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:20.778039 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc02e57e8_2b76_4827_a61a_dac826a87aa2.slice/crio-17036e9d1c5016ee7a72f437fd92b9f7696428e19dfe1f6489df7182f8f57a81 WatchSource:0}: Error finding container 17036e9d1c5016ee7a72f437fd92b9f7696428e19dfe1f6489df7182f8f57a81: Status 404 returned error can't find the container with id 17036e9d1c5016ee7a72f437fd92b9f7696428e19dfe1f6489df7182f8f57a81 Apr 20 20:05:20.780276 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:20.780235 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f6b4fe9_415b_4c1d_91f4_70456b92ec7e.slice/crio-8ace9403efc669dc320293e920c0c2565dd767d1c14ca373b7ff3e56e0e4bfeb WatchSource:0}: Error finding container 8ace9403efc669dc320293e920c0c2565dd767d1c14ca373b7ff3e56e0e4bfeb: Status 404 returned error can't find the container with id 8ace9403efc669dc320293e920c0c2565dd767d1c14ca373b7ff3e56e0e4bfeb Apr 20 
20:05:20.783154 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:20.783134 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09577d69_c26a_4291_a3bd_a9d8b123245a.slice/crio-0976715c580da6a297b3e64496c9efe549936d8c9b4061a2743f906f26ce2cae WatchSource:0}: Error finding container 0976715c580da6a297b3e64496c9efe549936d8c9b4061a2743f906f26ce2cae: Status 404 returned error can't find the container with id 0976715c580da6a297b3e64496c9efe549936d8c9b4061a2743f906f26ce2cae Apr 20 20:05:20.783865 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:20.783836 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26b28667_df5c_4e73_a021_1d3b5430daaf.slice/crio-31eae430d481a0d99980ff08d921f4e5b13e231b3e02ca48b79db4818e6b10b9 WatchSource:0}: Error finding container 31eae430d481a0d99980ff08d921f4e5b13e231b3e02ca48b79db4818e6b10b9: Status 404 returned error can't find the container with id 31eae430d481a0d99980ff08d921f4e5b13e231b3e02ca48b79db4818e6b10b9 Apr 20 20:05:20.786004 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:20.785709 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e906053_dbcb_4d63_8f1f_4eb6a911e9e3.slice/crio-5e8942656f954c48582a6e0496f15e6aabafbf4ee953c1fec8b18a86e77c0b65 WatchSource:0}: Error finding container 5e8942656f954c48582a6e0496f15e6aabafbf4ee953c1fec8b18a86e77c0b65: Status 404 returned error can't find the container with id 5e8942656f954c48582a6e0496f15e6aabafbf4ee953c1fec8b18a86e77c0b65 Apr 20 20:05:20.786361 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:20.786342 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba60b2b3_08e4_40aa_842f_6be514920597.slice/crio-a3d12cd0a01e8dc23d8ed10f4011c0d8297e28b4f6c69f92b83013f05afa471e 
WatchSource:0}: Error finding container a3d12cd0a01e8dc23d8ed10f4011c0d8297e28b4f6c69f92b83013f05afa471e: Status 404 returned error can't find the container with id a3d12cd0a01e8dc23d8ed10f4011c0d8297e28b4f6c69f92b83013f05afa471e Apr 20 20:05:20.787117 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:20.786909 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf81027c8_2ac8_4e5e_b754_45c3af3ec095.slice/crio-858b412178d53708d39dbd2229b2e1fe2089debf8b5a8c2f675c21dc12341b8a WatchSource:0}: Error finding container 858b412178d53708d39dbd2229b2e1fe2089debf8b5a8c2f675c21dc12341b8a: Status 404 returned error can't find the container with id 858b412178d53708d39dbd2229b2e1fe2089debf8b5a8c2f675c21dc12341b8a Apr 20 20:05:20.788262 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:20.788143 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd645d5ae_1405_421d_8f27_65e056976e28.slice/crio-61dbafd86b1fb98764916ef3f063c7cc9db0765ba3bf825a682483f01af76855 WatchSource:0}: Error finding container 61dbafd86b1fb98764916ef3f063c7cc9db0765ba3bf825a682483f01af76855: Status 404 returned error can't find the container with id 61dbafd86b1fb98764916ef3f063c7cc9db0765ba3bf825a682483f01af76855 Apr 20 20:05:20.861312 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.861289 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22hj6\" (UniqueName: \"kubernetes.io/projected/983cba91-1490-41d1-acd9-67e8ffb4ce55-kube-api-access-22hj6\") pod \"network-check-target-hwpzm\" (UID: \"983cba91-1490-41d1-acd9-67e8ffb4ce55\") " pod="openshift-network-diagnostics/network-check-target-hwpzm" Apr 20 20:05:20.861465 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:20.861447 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:05:20.861529 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:20.861471 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:05:20.861529 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:20.861483 2573 projected.go:194] Error preparing data for projected volume kube-api-access-22hj6 for pod openshift-network-diagnostics/network-check-target-hwpzm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:20.861602 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:20.861537 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/983cba91-1490-41d1-acd9-67e8ffb4ce55-kube-api-access-22hj6 podName:983cba91-1490-41d1-acd9-67e8ffb4ce55 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:21.861519523 +0000 UTC m=+4.276812256 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-22hj6" (UniqueName: "kubernetes.io/projected/983cba91-1490-41d1-acd9-67e8ffb4ce55-kube-api-access-22hj6") pod "network-check-target-hwpzm" (UID: "983cba91-1490-41d1-acd9-67e8ffb4ce55") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:20.927745 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:20.927684 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:05:21.070510 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:21.070477 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 20:00:19 +0000 UTC" deadline="2027-11-10 01:13:10.555750175 +0000 UTC" Apr 20 20:05:21.070510 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:21.070505 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13637h7m49.485248777s" Apr 20 20:05:21.167226 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:21.166783 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hwpzm" Apr 20 20:05:21.167226 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:21.166900 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hwpzm" podUID="983cba91-1490-41d1-acd9-67e8ffb4ce55" Apr 20 20:05:21.174066 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:21.174032 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pgbf5" event={"ID":"d645d5ae-1405-421d-8f27-65e056976e28","Type":"ContainerStarted","Data":"61dbafd86b1fb98764916ef3f063c7cc9db0765ba3bf825a682483f01af76855"} Apr 20 20:05:21.175737 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:21.175552 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6ksh7" event={"ID":"f81027c8-2ac8-4e5e-b754-45c3af3ec095","Type":"ContainerStarted","Data":"858b412178d53708d39dbd2229b2e1fe2089debf8b5a8c2f675c21dc12341b8a"} Apr 20 20:05:21.177837 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:21.177781 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmvq6" event={"ID":"6e906053-dbcb-4d63-8f1f-4eb6a911e9e3","Type":"ContainerStarted","Data":"5e8942656f954c48582a6e0496f15e6aabafbf4ee953c1fec8b18a86e77c0b65"} Apr 20 20:05:21.179721 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:21.179695 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" event={"ID":"26b28667-df5c-4e73-a021-1d3b5430daaf","Type":"ContainerStarted","Data":"31eae430d481a0d99980ff08d921f4e5b13e231b3e02ca48b79db4818e6b10b9"} Apr 20 20:05:21.181711 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:21.181683 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-n68c4" event={"ID":"c02e57e8-2b76-4827-a61a-dac826a87aa2","Type":"ContainerStarted","Data":"17036e9d1c5016ee7a72f437fd92b9f7696428e19dfe1f6489df7182f8f57a81"} Apr 20 20:05:21.183590 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:21.183566 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-130-227.ec2.internal" event={"ID":"1e12d5376c1eb684628dd9cf17dac4a9","Type":"ContainerStarted","Data":"3da7a09847dea07c96b85ce27f594df37bb41999549348e568595b729fe65172"} Apr 20 20:05:21.186284 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:21.186259 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rcsnj" event={"ID":"ba60b2b3-08e4-40aa-842f-6be514920597","Type":"ContainerStarted","Data":"a3d12cd0a01e8dc23d8ed10f4011c0d8297e28b4f6c69f92b83013f05afa471e"} Apr 20 20:05:21.192494 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:21.192470 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-lvg7k" event={"ID":"09577d69-c26a-4291-a3bd-a9d8b123245a","Type":"ContainerStarted","Data":"0976715c580da6a297b3e64496c9efe549936d8c9b4061a2743f906f26ce2cae"} Apr 20 20:05:21.194949 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:21.194925 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-knsf9" event={"ID":"7f6b4fe9-415b-4c1d-91f4-70456b92ec7e","Type":"ContainerStarted","Data":"8ace9403efc669dc320293e920c0c2565dd767d1c14ca373b7ff3e56e0e4bfeb"} Apr 20 20:05:21.199617 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:21.199563 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-227.ec2.internal" podStartSLOduration=2.199547393 podStartE2EDuration="2.199547393s" podCreationTimestamp="2026-04-20 20:05:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:05:21.198636041 +0000 UTC m=+3.613928784" watchObservedRunningTime="2026-04-20 20:05:21.199547393 +0000 UTC m=+3.614840136" Apr 20 20:05:21.199952 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:21.199931 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" event={"ID":"54652553-5a61-482e-b688-ce64c57b917b","Type":"ContainerStarted","Data":"7482eb21eb8782c80efe539d9944c8a23371ef0a3756971718e8fbed246cf3fb"} Apr 20 20:05:21.668662 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:21.667814 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d-metrics-certs\") pod \"network-metrics-daemon-npkgv\" (UID: \"923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d\") " pod="openshift-multus/network-metrics-daemon-npkgv" Apr 20 20:05:21.668662 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:21.668008 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:21.668662 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:21.668066 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d-metrics-certs podName:923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d nodeName:}" failed. No retries permitted until 2026-04-20 20:05:23.668048827 +0000 UTC m=+6.083341560 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d-metrics-certs") pod "network-metrics-daemon-npkgv" (UID: "923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:21.769112 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:21.768474 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d1323090-2026-43df-829f-115ae2bc0438-original-pull-secret\") pod \"global-pull-secret-syncer-k96mb\" (UID: \"d1323090-2026-43df-829f-115ae2bc0438\") " pod="kube-system/global-pull-secret-syncer-k96mb" Apr 20 20:05:21.769112 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:21.768644 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 20:05:21.769112 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:21.768756 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1323090-2026-43df-829f-115ae2bc0438-original-pull-secret podName:d1323090-2026-43df-829f-115ae2bc0438 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:23.76873693 +0000 UTC m=+6.184029654 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d1323090-2026-43df-829f-115ae2bc0438-original-pull-secret") pod "global-pull-secret-syncer-k96mb" (UID: "d1323090-2026-43df-829f-115ae2bc0438") : object "kube-system"/"original-pull-secret" not registered Apr 20 20:05:21.870052 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:21.869318 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22hj6\" (UniqueName: \"kubernetes.io/projected/983cba91-1490-41d1-acd9-67e8ffb4ce55-kube-api-access-22hj6\") pod \"network-check-target-hwpzm\" (UID: \"983cba91-1490-41d1-acd9-67e8ffb4ce55\") " pod="openshift-network-diagnostics/network-check-target-hwpzm" Apr 20 20:05:21.870052 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:21.869478 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:05:21.870052 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:21.869515 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:05:21.870052 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:21.869529 2573 projected.go:194] Error preparing data for projected volume kube-api-access-22hj6 for pod openshift-network-diagnostics/network-check-target-hwpzm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:21.870052 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:21.869601 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/983cba91-1490-41d1-acd9-67e8ffb4ce55-kube-api-access-22hj6 podName:983cba91-1490-41d1-acd9-67e8ffb4ce55 nodeName:}" failed. 
No retries permitted until 2026-04-20 20:05:23.869583139 +0000 UTC m=+6.284875873 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-22hj6" (UniqueName: "kubernetes.io/projected/983cba91-1490-41d1-acd9-67e8ffb4ce55-kube-api-access-22hj6") pod "network-check-target-hwpzm" (UID: "983cba91-1490-41d1-acd9-67e8ffb4ce55") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:22.167589 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:22.166985 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npkgv" Apr 20 20:05:22.167589 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:22.167134 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npkgv" podUID="923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d" Apr 20 20:05:22.167589 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:22.167270 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k96mb" Apr 20 20:05:22.167589 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:22.167388 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-k96mb" podUID="d1323090-2026-43df-829f-115ae2bc0438" Apr 20 20:05:22.224492 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:22.224458 2573 generic.go:358] "Generic (PLEG): container finished" podID="0de14f2d8a312df1fd0f43b1fd02a43e" containerID="68da9c4225eb9e002d9f3f3e6ffcdd25251ebfdee53f252d2e262878b7016462" exitCode=0 Apr 20 20:05:22.224665 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:22.224542 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal" event={"ID":"0de14f2d8a312df1fd0f43b1fd02a43e","Type":"ContainerDied","Data":"68da9c4225eb9e002d9f3f3e6ffcdd25251ebfdee53f252d2e262878b7016462"} Apr 20 20:05:23.166421 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:23.166388 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hwpzm" Apr 20 20:05:23.166596 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:23.166517 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hwpzm" podUID="983cba91-1490-41d1-acd9-67e8ffb4ce55" Apr 20 20:05:23.241673 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:23.241638 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal" event={"ID":"0de14f2d8a312df1fd0f43b1fd02a43e","Type":"ContainerStarted","Data":"c33f45f7de2c145112f609b2cfc528a5d19f679f06930ac5f5e6526afea0eb0e"} Apr 20 20:05:23.683377 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:23.683338 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d-metrics-certs\") pod \"network-metrics-daemon-npkgv\" (UID: \"923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d\") " pod="openshift-multus/network-metrics-daemon-npkgv" Apr 20 20:05:23.683553 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:23.683495 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:23.683609 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:23.683563 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d-metrics-certs podName:923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d nodeName:}" failed. No retries permitted until 2026-04-20 20:05:27.683545615 +0000 UTC m=+10.098838349 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d-metrics-certs") pod "network-metrics-daemon-npkgv" (UID: "923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:23.784529 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:23.784494 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d1323090-2026-43df-829f-115ae2bc0438-original-pull-secret\") pod \"global-pull-secret-syncer-k96mb\" (UID: \"d1323090-2026-43df-829f-115ae2bc0438\") " pod="kube-system/global-pull-secret-syncer-k96mb" Apr 20 20:05:23.784696 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:23.784677 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 20:05:23.784767 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:23.784750 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1323090-2026-43df-829f-115ae2bc0438-original-pull-secret podName:d1323090-2026-43df-829f-115ae2bc0438 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:27.78472985 +0000 UTC m=+10.200022589 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d1323090-2026-43df-829f-115ae2bc0438-original-pull-secret") pod "global-pull-secret-syncer-k96mb" (UID: "d1323090-2026-43df-829f-115ae2bc0438") : object "kube-system"/"original-pull-secret" not registered Apr 20 20:05:23.885349 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:23.885308 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22hj6\" (UniqueName: \"kubernetes.io/projected/983cba91-1490-41d1-acd9-67e8ffb4ce55-kube-api-access-22hj6\") pod \"network-check-target-hwpzm\" (UID: \"983cba91-1490-41d1-acd9-67e8ffb4ce55\") " pod="openshift-network-diagnostics/network-check-target-hwpzm" Apr 20 20:05:23.885504 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:23.885447 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:05:23.885504 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:23.885472 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:05:23.885504 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:23.885486 2573 projected.go:194] Error preparing data for projected volume kube-api-access-22hj6 for pod openshift-network-diagnostics/network-check-target-hwpzm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:23.885650 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:23.885547 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/983cba91-1490-41d1-acd9-67e8ffb4ce55-kube-api-access-22hj6 podName:983cba91-1490-41d1-acd9-67e8ffb4ce55 nodeName:}" failed. 
No retries permitted until 2026-04-20 20:05:27.88552954 +0000 UTC m=+10.300822275 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-22hj6" (UniqueName: "kubernetes.io/projected/983cba91-1490-41d1-acd9-67e8ffb4ce55-kube-api-access-22hj6") pod "network-check-target-hwpzm" (UID: "983cba91-1490-41d1-acd9-67e8ffb4ce55") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:24.167010 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:24.166979 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npkgv" Apr 20 20:05:24.167194 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:24.167122 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npkgv" podUID="923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d" Apr 20 20:05:24.168320 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:24.168287 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k96mb" Apr 20 20:05:24.168463 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:24.168442 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-k96mb" podUID="d1323090-2026-43df-829f-115ae2bc0438" Apr 20 20:05:25.167419 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:25.166932 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hwpzm" Apr 20 20:05:25.167419 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:25.167057 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hwpzm" podUID="983cba91-1490-41d1-acd9-67e8ffb4ce55" Apr 20 20:05:26.168194 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:26.168161 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k96mb" Apr 20 20:05:26.168648 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:26.168300 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k96mb" podUID="d1323090-2026-43df-829f-115ae2bc0438" Apr 20 20:05:26.168648 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:26.168380 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-npkgv" Apr 20 20:05:26.168648 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:26.168474 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npkgv" podUID="923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d" Apr 20 20:05:27.166044 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:27.166007 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hwpzm" Apr 20 20:05:27.166285 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:27.166108 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hwpzm" podUID="983cba91-1490-41d1-acd9-67e8ffb4ce55" Apr 20 20:05:27.717581 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:27.716996 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d-metrics-certs\") pod \"network-metrics-daemon-npkgv\" (UID: \"923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d\") " pod="openshift-multus/network-metrics-daemon-npkgv" Apr 20 20:05:27.717581 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:27.717153 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:27.717581 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:27.717218 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d-metrics-certs podName:923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d nodeName:}" failed. No retries permitted until 2026-04-20 20:05:35.717199595 +0000 UTC m=+18.132492326 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d-metrics-certs") pod "network-metrics-daemon-npkgv" (UID: "923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:27.818433 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:27.818302 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d1323090-2026-43df-829f-115ae2bc0438-original-pull-secret\") pod \"global-pull-secret-syncer-k96mb\" (UID: \"d1323090-2026-43df-829f-115ae2bc0438\") " pod="kube-system/global-pull-secret-syncer-k96mb" Apr 20 20:05:27.818582 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:27.818444 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 20:05:27.818582 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:27.818512 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1323090-2026-43df-829f-115ae2bc0438-original-pull-secret podName:d1323090-2026-43df-829f-115ae2bc0438 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:35.818492936 +0000 UTC m=+18.233785656 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d1323090-2026-43df-829f-115ae2bc0438-original-pull-secret") pod "global-pull-secret-syncer-k96mb" (UID: "d1323090-2026-43df-829f-115ae2bc0438") : object "kube-system"/"original-pull-secret" not registered Apr 20 20:05:27.918982 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:27.918948 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22hj6\" (UniqueName: \"kubernetes.io/projected/983cba91-1490-41d1-acd9-67e8ffb4ce55-kube-api-access-22hj6\") pod \"network-check-target-hwpzm\" (UID: \"983cba91-1490-41d1-acd9-67e8ffb4ce55\") " pod="openshift-network-diagnostics/network-check-target-hwpzm" Apr 20 20:05:27.919137 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:27.919078 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:05:27.919137 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:27.919098 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:05:27.919137 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:27.919111 2573 projected.go:194] Error preparing data for projected volume kube-api-access-22hj6 for pod openshift-network-diagnostics/network-check-target-hwpzm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:27.919293 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:27.919175 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/983cba91-1490-41d1-acd9-67e8ffb4ce55-kube-api-access-22hj6 podName:983cba91-1490-41d1-acd9-67e8ffb4ce55 nodeName:}" failed. 
No retries permitted until 2026-04-20 20:05:35.919157015 +0000 UTC m=+18.334449759 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-22hj6" (UniqueName: "kubernetes.io/projected/983cba91-1490-41d1-acd9-67e8ffb4ce55-kube-api-access-22hj6") pod "network-check-target-hwpzm" (UID: "983cba91-1490-41d1-acd9-67e8ffb4ce55") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:28.168560 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:28.167716 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npkgv" Apr 20 20:05:28.168560 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:28.168063 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npkgv" podUID="923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d" Apr 20 20:05:28.168560 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:28.168425 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k96mb" Apr 20 20:05:28.168560 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:28.168515 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-k96mb" podUID="d1323090-2026-43df-829f-115ae2bc0438" Apr 20 20:05:29.166748 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:29.166717 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hwpzm" Apr 20 20:05:29.167144 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:29.166824 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hwpzm" podUID="983cba91-1490-41d1-acd9-67e8ffb4ce55" Apr 20 20:05:30.166202 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:30.166174 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npkgv" Apr 20 20:05:30.166353 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:30.166288 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npkgv" podUID="923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d" Apr 20 20:05:30.166406 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:30.166348 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-k96mb" Apr 20 20:05:30.166481 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:30.166459 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k96mb" podUID="d1323090-2026-43df-829f-115ae2bc0438" Apr 20 20:05:31.166093 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:31.166066 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hwpzm" Apr 20 20:05:31.166508 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:31.166180 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hwpzm" podUID="983cba91-1490-41d1-acd9-67e8ffb4ce55" Apr 20 20:05:32.166496 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:32.166004 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k96mb" Apr 20 20:05:32.166496 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:32.166004 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-npkgv" Apr 20 20:05:32.166496 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:32.166130 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k96mb" podUID="d1323090-2026-43df-829f-115ae2bc0438" Apr 20 20:05:32.166496 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:32.166219 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npkgv" podUID="923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d" Apr 20 20:05:33.166820 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:33.166792 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hwpzm" Apr 20 20:05:33.167181 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:33.166908 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hwpzm" podUID="983cba91-1490-41d1-acd9-67e8ffb4ce55" Apr 20 20:05:34.166141 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:34.166107 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-npkgv" Apr 20 20:05:34.166322 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:34.166107 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k96mb" Apr 20 20:05:34.166322 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:34.166222 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npkgv" podUID="923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d" Apr 20 20:05:34.166432 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:34.166349 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k96mb" podUID="d1323090-2026-43df-829f-115ae2bc0438" Apr 20 20:05:35.166160 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:35.166127 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hwpzm" Apr 20 20:05:35.166609 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:35.166256 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hwpzm" podUID="983cba91-1490-41d1-acd9-67e8ffb4ce55" Apr 20 20:05:35.779079 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:35.779048 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d-metrics-certs\") pod \"network-metrics-daemon-npkgv\" (UID: \"923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d\") " pod="openshift-multus/network-metrics-daemon-npkgv" Apr 20 20:05:35.779246 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:35.779159 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:35.779246 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:35.779212 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d-metrics-certs podName:923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d nodeName:}" failed. No retries permitted until 2026-04-20 20:05:51.779197485 +0000 UTC m=+34.194490205 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d-metrics-certs") pod "network-metrics-daemon-npkgv" (UID: "923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:35.879403 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:35.879298 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d1323090-2026-43df-829f-115ae2bc0438-original-pull-secret\") pod \"global-pull-secret-syncer-k96mb\" (UID: \"d1323090-2026-43df-829f-115ae2bc0438\") " pod="kube-system/global-pull-secret-syncer-k96mb" Apr 20 20:05:35.879575 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:35.879420 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 20:05:35.879575 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:35.879500 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1323090-2026-43df-829f-115ae2bc0438-original-pull-secret podName:d1323090-2026-43df-829f-115ae2bc0438 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:51.879479612 +0000 UTC m=+34.294772333 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d1323090-2026-43df-829f-115ae2bc0438-original-pull-secret") pod "global-pull-secret-syncer-k96mb" (UID: "d1323090-2026-43df-829f-115ae2bc0438") : object "kube-system"/"original-pull-secret" not registered Apr 20 20:05:35.980081 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:35.980047 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22hj6\" (UniqueName: \"kubernetes.io/projected/983cba91-1490-41d1-acd9-67e8ffb4ce55-kube-api-access-22hj6\") pod \"network-check-target-hwpzm\" (UID: \"983cba91-1490-41d1-acd9-67e8ffb4ce55\") " pod="openshift-network-diagnostics/network-check-target-hwpzm" Apr 20 20:05:35.980240 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:35.980199 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:05:35.980240 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:35.980218 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:05:35.980240 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:35.980227 2573 projected.go:194] Error preparing data for projected volume kube-api-access-22hj6 for pod openshift-network-diagnostics/network-check-target-hwpzm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:35.980377 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:35.980273 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/983cba91-1490-41d1-acd9-67e8ffb4ce55-kube-api-access-22hj6 podName:983cba91-1490-41d1-acd9-67e8ffb4ce55 nodeName:}" failed. 
No retries permitted until 2026-04-20 20:05:51.980261225 +0000 UTC m=+34.395553945 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-22hj6" (UniqueName: "kubernetes.io/projected/983cba91-1490-41d1-acd9-67e8ffb4ce55-kube-api-access-22hj6") pod "network-check-target-hwpzm" (UID: "983cba91-1490-41d1-acd9-67e8ffb4ce55") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:36.166153 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:36.166072 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k96mb" Apr 20 20:05:36.166300 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:36.166072 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npkgv" Apr 20 20:05:36.166300 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:36.166203 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k96mb" podUID="d1323090-2026-43df-829f-115ae2bc0438" Apr 20 20:05:36.166300 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:36.166255 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-npkgv" podUID="923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d" Apr 20 20:05:37.166184 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:37.166152 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hwpzm" Apr 20 20:05:37.166372 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:37.166263 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hwpzm" podUID="983cba91-1490-41d1-acd9-67e8ffb4ce55" Apr 20 20:05:38.167233 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:38.167015 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npkgv" Apr 20 20:05:38.167766 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:38.167069 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k96mb" Apr 20 20:05:38.167766 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:38.167342 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-npkgv" podUID="923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d" Apr 20 20:05:38.167766 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:38.167375 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k96mb" podUID="d1323090-2026-43df-829f-115ae2bc0438" Apr 20 20:05:38.266812 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:38.266784 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pgbf5" event={"ID":"d645d5ae-1405-421d-8f27-65e056976e28","Type":"ContainerStarted","Data":"9d87f032f24fd5eaa00b26bff498988c16ecd32bbfbc4ff77ad2fda6753610dd"} Apr 20 20:05:38.268226 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:38.268203 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6ksh7" event={"ID":"f81027c8-2ac8-4e5e-b754-45c3af3ec095","Type":"ContainerStarted","Data":"0276057fddfc5717f378cfd2822970834b1224840e98e64a186ceb8dfd1c8a06"} Apr 20 20:05:38.269572 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:38.269552 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmvq6" event={"ID":"6e906053-dbcb-4d63-8f1f-4eb6a911e9e3","Type":"ContainerStarted","Data":"2a1548c9cc040184312ae20ced9a56bbd1023b6374ba33816e89df122716a470"} Apr 20 20:05:38.270868 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:38.270826 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" event={"ID":"26b28667-df5c-4e73-a021-1d3b5430daaf","Type":"ContainerStarted","Data":"ca60e64aa3503355db977be738ad17de84ccbe1e42db32e6aa310f7f1808351a"} Apr 20 20:05:38.272156 ip-10-0-130-227 kubenswrapper[2573]: I0420 
20:05:38.272129 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rcsnj" event={"ID":"ba60b2b3-08e4-40aa-842f-6be514920597","Type":"ContainerStarted","Data":"44bdd54402b35c9e2e68a52c07eb08732d0a8cab47c48380338d7ddc214b0f17"} Apr 20 20:05:38.273375 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:38.273357 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-lvg7k" event={"ID":"09577d69-c26a-4291-a3bd-a9d8b123245a","Type":"ContainerStarted","Data":"b56854208ae5bf327b3b7a64f8039a18677e50e17aee5f4ccb3ce560b2aa31aa"} Apr 20 20:05:38.274707 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:38.274678 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-knsf9" event={"ID":"7f6b4fe9-415b-4c1d-91f4-70456b92ec7e","Type":"ContainerStarted","Data":"d72b73b2f08810b827d0f0fe8c7852b65dd835151974ff22cd1eab143dc3d342"} Apr 20 20:05:38.276501 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:38.276486 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ks7s9_54652553-5a61-482e-b688-ce64c57b917b/ovn-acl-logging/0.log" Apr 20 20:05:38.276760 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:38.276744 2573 generic.go:358] "Generic (PLEG): container finished" podID="54652553-5a61-482e-b688-ce64c57b917b" containerID="ee17b6e484f173bd489663d2afc720b04355a0443145bf31d62411c7602f129d" exitCode=1 Apr 20 20:05:38.276813 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:38.276770 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" event={"ID":"54652553-5a61-482e-b688-ce64c57b917b","Type":"ContainerStarted","Data":"ad7688f62ebf72a1c68979667609ae2739223d545bcc88f7f1f8dbc5a60fc19f"} Apr 20 20:05:38.276813 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:38.276784 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" 
event={"ID":"54652553-5a61-482e-b688-ce64c57b917b","Type":"ContainerDied","Data":"ee17b6e484f173bd489663d2afc720b04355a0443145bf31d62411c7602f129d"} Apr 20 20:05:38.276813 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:38.276794 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" event={"ID":"54652553-5a61-482e-b688-ce64c57b917b","Type":"ContainerStarted","Data":"6bf3f887895d66b3e418d6043a2f8046df5ee696572553a6799064beb368cd30"} Apr 20 20:05:38.282326 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:38.282288 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-227.ec2.internal" podStartSLOduration=19.28227797 podStartE2EDuration="19.28227797s" podCreationTimestamp="2026-04-20 20:05:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:05:23.26306871 +0000 UTC m=+5.678361466" watchObservedRunningTime="2026-04-20 20:05:38.28227797 +0000 UTC m=+20.697570711" Apr 20 20:05:38.282660 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:38.282636 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-pgbf5" podStartSLOduration=7.980107405 podStartE2EDuration="20.282629073s" podCreationTimestamp="2026-04-20 20:05:18 +0000 UTC" firstStartedPulling="2026-04-20 20:05:20.791699314 +0000 UTC m=+3.206992035" lastFinishedPulling="2026-04-20 20:05:33.094220979 +0000 UTC m=+15.509513703" observedRunningTime="2026-04-20 20:05:38.282124193 +0000 UTC m=+20.697416936" watchObservedRunningTime="2026-04-20 20:05:38.282629073 +0000 UTC m=+20.697921815" Apr 20 20:05:38.302782 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:38.302747 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-qb6xj" podStartSLOduration=3.297346475 
podStartE2EDuration="20.302737333s" podCreationTimestamp="2026-04-20 20:05:18 +0000 UTC" firstStartedPulling="2026-04-20 20:05:20.786276273 +0000 UTC m=+3.201568998" lastFinishedPulling="2026-04-20 20:05:37.791667121 +0000 UTC m=+20.206959856" observedRunningTime="2026-04-20 20:05:38.30212229 +0000 UTC m=+20.717415032" watchObservedRunningTime="2026-04-20 20:05:38.302737333 +0000 UTC m=+20.718030075" Apr 20 20:05:38.316798 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:38.316762 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-knsf9" podStartSLOduration=8.005075632 podStartE2EDuration="20.316752257s" podCreationTimestamp="2026-04-20 20:05:18 +0000 UTC" firstStartedPulling="2026-04-20 20:05:20.782543106 +0000 UTC m=+3.197835850" lastFinishedPulling="2026-04-20 20:05:33.094219736 +0000 UTC m=+15.509512475" observedRunningTime="2026-04-20 20:05:38.316622371 +0000 UTC m=+20.731915122" watchObservedRunningTime="2026-04-20 20:05:38.316752257 +0000 UTC m=+20.732044999" Apr 20 20:05:38.351146 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:38.351104 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-lvg7k" podStartSLOduration=3.344294915 podStartE2EDuration="20.351088377s" podCreationTimestamp="2026-04-20 20:05:18 +0000 UTC" firstStartedPulling="2026-04-20 20:05:20.784693291 +0000 UTC m=+3.199986013" lastFinishedPulling="2026-04-20 20:05:37.791486746 +0000 UTC m=+20.206779475" observedRunningTime="2026-04-20 20:05:38.332780548 +0000 UTC m=+20.748073291" watchObservedRunningTime="2026-04-20 20:05:38.351088377 +0000 UTC m=+20.766381121" Apr 20 20:05:38.376543 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:38.376500 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-6ksh7" podStartSLOduration=3.337779315 podStartE2EDuration="20.376485191s" podCreationTimestamp="2026-04-20 20:05:18 +0000 UTC" 
firstStartedPulling="2026-04-20 20:05:20.790371952 +0000 UTC m=+3.205664688" lastFinishedPulling="2026-04-20 20:05:37.829077838 +0000 UTC m=+20.244370564" observedRunningTime="2026-04-20 20:05:38.35091162 +0000 UTC m=+20.766204364" watchObservedRunningTime="2026-04-20 20:05:38.376485191 +0000 UTC m=+20.791777933" Apr 20 20:05:39.166992 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:39.166778 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hwpzm" Apr 20 20:05:39.167114 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:39.167016 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hwpzm" podUID="983cba91-1490-41d1-acd9-67e8ffb4ce55" Apr 20 20:05:39.279478 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:39.279451 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-n68c4" event={"ID":"c02e57e8-2b76-4827-a61a-dac826a87aa2","Type":"ContainerStarted","Data":"c022fc71f4cf784bd097f2821dad58805b5c1b6034debe23cff36eb51d361f66"} Apr 20 20:05:39.280630 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:39.280610 2573 generic.go:358] "Generic (PLEG): container finished" podID="ba60b2b3-08e4-40aa-842f-6be514920597" containerID="44bdd54402b35c9e2e68a52c07eb08732d0a8cab47c48380338d7ddc214b0f17" exitCode=0 Apr 20 20:05:39.280707 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:39.280677 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rcsnj" event={"ID":"ba60b2b3-08e4-40aa-842f-6be514920597","Type":"ContainerDied","Data":"44bdd54402b35c9e2e68a52c07eb08732d0a8cab47c48380338d7ddc214b0f17"} Apr 20 
20:05:39.282827 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:39.282811 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ks7s9_54652553-5a61-482e-b688-ce64c57b917b/ovn-acl-logging/0.log" Apr 20 20:05:39.283103 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:39.283083 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" event={"ID":"54652553-5a61-482e-b688-ce64c57b917b","Type":"ContainerStarted","Data":"3dbbaf8fc450e2721c637d5a006752cbbc43b7ab22cfb50d35542a2242552886"} Apr 20 20:05:39.283173 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:39.283109 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" event={"ID":"54652553-5a61-482e-b688-ce64c57b917b","Type":"ContainerStarted","Data":"2a30f6bd30dc6b2309bb03dc146e00844ed220e88c0a49d605d4221831511705"} Apr 20 20:05:39.283173 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:39.283123 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" event={"ID":"54652553-5a61-482e-b688-ce64c57b917b","Type":"ContainerStarted","Data":"968496a0717c53cba89f1c9cc020c518532085feffa8cead54ebf1fb52faa890"} Apr 20 20:05:39.496795 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:39.496775 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 20:05:40.095106 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:40.094982 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T20:05:39.496791566Z","UUID":"83c6608f-2169-4be4-86eb-961dfb2863fb","Handler":null,"Name":"","Endpoint":""} Apr 20 20:05:40.099016 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:40.098990 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to 
validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 20:05:40.099016 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:40.099023 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 20:05:40.166136 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:40.166003 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npkgv" Apr 20 20:05:40.166273 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:40.166137 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npkgv" podUID="923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d" Apr 20 20:05:40.166273 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:40.166208 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k96mb" Apr 20 20:05:40.166390 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:40.166306 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-k96mb" podUID="d1323090-2026-43df-829f-115ae2bc0438" Apr 20 20:05:40.288046 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:40.288008 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmvq6" event={"ID":"6e906053-dbcb-4d63-8f1f-4eb6a911e9e3","Type":"ContainerStarted","Data":"b045742b239d7643ffd5a29c7cbf8a3a215cfa2e9d9eee257642751fd0f00b43"} Apr 20 20:05:41.166831 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:41.166759 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hwpzm" Apr 20 20:05:41.166998 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:41.166914 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hwpzm" podUID="983cba91-1490-41d1-acd9-67e8ffb4ce55" Apr 20 20:05:41.293080 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:41.293052 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ks7s9_54652553-5a61-482e-b688-ce64c57b917b/ovn-acl-logging/0.log" Apr 20 20:05:41.293700 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:41.293403 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" event={"ID":"54652553-5a61-482e-b688-ce64c57b917b","Type":"ContainerStarted","Data":"805a17523da3f14c89396d3441bf0e1f822726f0cc7e08fd02e241cb627a96f9"} Apr 20 20:05:41.295652 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:41.295626 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmvq6" event={"ID":"6e906053-dbcb-4d63-8f1f-4eb6a911e9e3","Type":"ContainerStarted","Data":"ddf1dc6f46e7ef6d7142f8322ecbe47352aceec2c90a4f8418d580a7d13c324e"} Apr 20 20:05:41.316831 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:41.316775 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gmvq6" podStartSLOduration=3.340463587 podStartE2EDuration="23.316759251s" podCreationTimestamp="2026-04-20 20:05:18 +0000 UTC" firstStartedPulling="2026-04-20 20:05:20.78749632 +0000 UTC m=+3.202789041" lastFinishedPulling="2026-04-20 20:05:40.763791985 +0000 UTC m=+23.179084705" observedRunningTime="2026-04-20 20:05:41.316673219 +0000 UTC m=+23.731965960" watchObservedRunningTime="2026-04-20 20:05:41.316759251 +0000 UTC m=+23.732051994" Apr 20 20:05:41.317334 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:41.317300 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-n68c4" podStartSLOduration=6.306841537 podStartE2EDuration="23.31728619s" 
podCreationTimestamp="2026-04-20 20:05:18 +0000 UTC" firstStartedPulling="2026-04-20 20:05:20.780941743 +0000 UTC m=+3.196234463" lastFinishedPulling="2026-04-20 20:05:37.791386391 +0000 UTC m=+20.206679116" observedRunningTime="2026-04-20 20:05:40.303580718 +0000 UTC m=+22.718873479" watchObservedRunningTime="2026-04-20 20:05:41.31728619 +0000 UTC m=+23.732578932"
Apr 20 20:05:42.170227 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:42.170198 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npkgv"
Apr 20 20:05:42.170414 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:42.170199 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k96mb"
Apr 20 20:05:42.170414 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:42.170315 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npkgv" podUID="923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d"
Apr 20 20:05:42.170414 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:42.170385 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k96mb" podUID="d1323090-2026-43df-829f-115ae2bc0438"
Apr 20 20:05:42.607258 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:42.607223 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-lvg7k"
Apr 20 20:05:42.607848 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:42.607825 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-lvg7k"
Apr 20 20:05:43.166888 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:43.166848 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hwpzm"
Apr 20 20:05:43.167063 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:43.166959 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hwpzm" podUID="983cba91-1490-41d1-acd9-67e8ffb4ce55"
Apr 20 20:05:43.299541 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:43.299510 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-lvg7k"
Apr 20 20:05:43.300219 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:43.300199 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-lvg7k"
Apr 20 20:05:44.166722 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:44.166539 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npkgv"
Apr 20 20:05:44.167156 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:44.166543 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k96mb"
Apr 20 20:05:44.167156 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:44.166823 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npkgv" podUID="923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d"
Apr 20 20:05:44.167156 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:44.166879 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k96mb" podUID="d1323090-2026-43df-829f-115ae2bc0438"
Apr 20 20:05:44.302419 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:44.302395 2573 generic.go:358] "Generic (PLEG): container finished" podID="ba60b2b3-08e4-40aa-842f-6be514920597" containerID="4fbb5816da44e643d6e4aa4ba6bfd6f003398af32521098437c18370beca82ec" exitCode=0
Apr 20 20:05:44.302547 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:44.302444 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rcsnj" event={"ID":"ba60b2b3-08e4-40aa-842f-6be514920597","Type":"ContainerDied","Data":"4fbb5816da44e643d6e4aa4ba6bfd6f003398af32521098437c18370beca82ec"}
Apr 20 20:05:44.305473 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:44.305456 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ks7s9_54652553-5a61-482e-b688-ce64c57b917b/ovn-acl-logging/0.log"
Apr 20 20:05:44.305876 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:44.305841 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" event={"ID":"54652553-5a61-482e-b688-ce64c57b917b","Type":"ContainerStarted","Data":"cd6e888c0bacbce8db570664df2d2d25dc59a51de0109c9b2d31dd5252574ef2"}
Apr 20 20:05:44.306222 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:44.306203 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9"
Apr 20 20:05:44.306326 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:44.306230 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9"
Apr 20 20:05:44.306400 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:44.306334 2573 scope.go:117] "RemoveContainer" containerID="ee17b6e484f173bd489663d2afc720b04355a0443145bf31d62411c7602f129d"
Apr 20 20:05:44.321440 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:44.321421 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9"
Apr 20 20:05:45.166774 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:45.166744 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hwpzm"
Apr 20 20:05:45.167255 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:45.166877 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hwpzm" podUID="983cba91-1490-41d1-acd9-67e8ffb4ce55"
Apr 20 20:05:45.311002 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:45.310978 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ks7s9_54652553-5a61-482e-b688-ce64c57b917b/ovn-acl-logging/0.log"
Apr 20 20:05:45.311422 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:45.311395 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" event={"ID":"54652553-5a61-482e-b688-ce64c57b917b","Type":"ContainerStarted","Data":"f7b2aec9a943baf60563fb4f1d303f968a7ba45ad7d9cf8d391018b7003df2fe"}
Apr 20 20:05:45.311596 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:45.311582 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9"
Apr 20 20:05:45.325612 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:45.325590 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9"
Apr 20 20:05:45.341451 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:45.341413 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" podStartSLOduration=10.230783726 podStartE2EDuration="27.341397473s" podCreationTimestamp="2026-04-20 20:05:18 +0000 UTC" firstStartedPulling="2026-04-20 20:05:20.778679703 +0000 UTC m=+3.193972426" lastFinishedPulling="2026-04-20 20:05:37.889293447 +0000 UTC m=+20.304586173" observedRunningTime="2026-04-20 20:05:45.341291952 +0000 UTC m=+27.756584695" watchObservedRunningTime="2026-04-20 20:05:45.341397473 +0000 UTC m=+27.756690218"
Apr 20 20:05:45.396769 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:45.396745 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-k96mb"]
Apr 20 20:05:45.397025 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:45.397003 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k96mb"
Apr 20 20:05:45.397115 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:45.397094 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k96mb" podUID="d1323090-2026-43df-829f-115ae2bc0438"
Apr 20 20:05:45.397412 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:45.397391 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hwpzm"]
Apr 20 20:05:45.397483 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:45.397469 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hwpzm"
Apr 20 20:05:45.397573 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:45.397551 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hwpzm" podUID="983cba91-1490-41d1-acd9-67e8ffb4ce55"
Apr 20 20:05:45.398180 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:45.398158 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-npkgv"]
Apr 20 20:05:45.398271 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:45.398254 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npkgv"
Apr 20 20:05:45.398384 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:45.398350 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npkgv" podUID="923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d"
Apr 20 20:05:47.166239 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:47.166172 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k96mb"
Apr 20 20:05:47.166239 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:47.166202 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npkgv"
Apr 20 20:05:47.166239 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:47.166208 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hwpzm"
Apr 20 20:05:47.166787 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:47.166289 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k96mb" podUID="d1323090-2026-43df-829f-115ae2bc0438"
Apr 20 20:05:47.166787 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:47.166399 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npkgv" podUID="923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d"
Apr 20 20:05:47.166787 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:47.166457 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hwpzm" podUID="983cba91-1490-41d1-acd9-67e8ffb4ce55"
Apr 20 20:05:47.316839 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:47.316806 2573 generic.go:358] "Generic (PLEG): container finished" podID="ba60b2b3-08e4-40aa-842f-6be514920597" containerID="5fd09806a2e1e9801ce2b612ea00976cf7dced4615da8559f3ce3c413f0aae63" exitCode=0
Apr 20 20:05:47.317000 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:47.316890 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rcsnj" event={"ID":"ba60b2b3-08e4-40aa-842f-6be514920597","Type":"ContainerDied","Data":"5fd09806a2e1e9801ce2b612ea00976cf7dced4615da8559f3ce3c413f0aae63"}
Apr 20 20:05:49.166540 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:49.166509 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k96mb"
Apr 20 20:05:49.166540 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:49.166529 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hwpzm"
Apr 20 20:05:49.167164 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:49.166509 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npkgv"
Apr 20 20:05:49.167164 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:49.166618 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-k96mb" podUID="d1323090-2026-43df-829f-115ae2bc0438"
Apr 20 20:05:49.167164 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:49.166687 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-npkgv" podUID="923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d"
Apr 20 20:05:49.167164 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:49.166739 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hwpzm" podUID="983cba91-1490-41d1-acd9-67e8ffb4ce55"
Apr 20 20:05:49.322883 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:49.322835 2573 generic.go:358] "Generic (PLEG): container finished" podID="ba60b2b3-08e4-40aa-842f-6be514920597" containerID="f0df1722d2f8cf7d78267de52851e5f15508e71de3667039930bdfc9a65435f5" exitCode=0
Apr 20 20:05:49.323022 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:49.322889 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rcsnj" event={"ID":"ba60b2b3-08e4-40aa-842f-6be514920597","Type":"ContainerDied","Data":"f0df1722d2f8cf7d78267de52851e5f15508e71de3667039930bdfc9a65435f5"}
Apr 20 20:05:50.920604 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:50.920524 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-227.ec2.internal" event="NodeReady"
Apr 20 20:05:50.921137 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:50.920669 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 20 20:05:50.967642 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:50.967614 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-x7nqw"]
Apr 20 20:05:50.986800 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:50.986777 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-67td9"]
Apr 20 20:05:50.986981 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:50.986962 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-x7nqw"
Apr 20 20:05:50.989597 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:50.989568 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 20 20:05:50.990286 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:50.989776 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 20 20:05:50.990286 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:50.989976 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-c4d6d\""
Apr 20 20:05:51.007616 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.007594 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-x7nqw"]
Apr 20 20:05:51.007714 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.007622 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-67td9"]
Apr 20 20:05:51.007772 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.007737 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-67td9"
Apr 20 20:05:51.010822 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.010619 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 20 20:05:51.010822 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.010678 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 20 20:05:51.010822 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.010726 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 20 20:05:51.010822 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.010619 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-5cv42\""
Apr 20 20:05:51.092517 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.092485 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ddb92e25-31c0-49bc-9084-b1a08aad3877-config-volume\") pod \"dns-default-x7nqw\" (UID: \"ddb92e25-31c0-49bc-9084-b1a08aad3877\") " pod="openshift-dns/dns-default-x7nqw"
Apr 20 20:05:51.092679 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.092527 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddb92e25-31c0-49bc-9084-b1a08aad3877-metrics-tls\") pod \"dns-default-x7nqw\" (UID: \"ddb92e25-31c0-49bc-9084-b1a08aad3877\") " pod="openshift-dns/dns-default-x7nqw"
Apr 20 20:05:51.092679 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.092553 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htttg\" (UniqueName: \"kubernetes.io/projected/ddb92e25-31c0-49bc-9084-b1a08aad3877-kube-api-access-htttg\") pod \"dns-default-x7nqw\" (UID: \"ddb92e25-31c0-49bc-9084-b1a08aad3877\") " pod="openshift-dns/dns-default-x7nqw"
Apr 20 20:05:51.092679 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.092586 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ddb92e25-31c0-49bc-9084-b1a08aad3877-tmp-dir\") pod \"dns-default-x7nqw\" (UID: \"ddb92e25-31c0-49bc-9084-b1a08aad3877\") " pod="openshift-dns/dns-default-x7nqw"
Apr 20 20:05:51.092679 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.092611 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9104f378-d15a-480e-aae0-cb20f3c35f2c-cert\") pod \"ingress-canary-67td9\" (UID: \"9104f378-d15a-480e-aae0-cb20f3c35f2c\") " pod="openshift-ingress-canary/ingress-canary-67td9"
Apr 20 20:05:51.092828 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.092704 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmr26\" (UniqueName: \"kubernetes.io/projected/9104f378-d15a-480e-aae0-cb20f3c35f2c-kube-api-access-nmr26\") pod \"ingress-canary-67td9\" (UID: \"9104f378-d15a-480e-aae0-cb20f3c35f2c\") " pod="openshift-ingress-canary/ingress-canary-67td9"
Apr 20 20:05:51.166934 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.166905 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k96mb"
Apr 20 20:05:51.166934 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.166922 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npkgv"
Apr 20 20:05:51.167172 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.166905 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hwpzm"
Apr 20 20:05:51.170746 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.170450 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-lvnms\""
Apr 20 20:05:51.170746 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.170481 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 20:05:51.170746 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.170516 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 20:05:51.170746 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.170533 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 20 20:05:51.170746 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.170479 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 20:05:51.170746 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.170479 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jt5w7\""
Apr 20 20:05:51.192988 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.192954 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nmr26\" (UniqueName: \"kubernetes.io/projected/9104f378-d15a-480e-aae0-cb20f3c35f2c-kube-api-access-nmr26\") pod \"ingress-canary-67td9\" (UID: \"9104f378-d15a-480e-aae0-cb20f3c35f2c\") " pod="openshift-ingress-canary/ingress-canary-67td9"
Apr 20 20:05:51.193100 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.193019 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ddb92e25-31c0-49bc-9084-b1a08aad3877-config-volume\") pod \"dns-default-x7nqw\" (UID: \"ddb92e25-31c0-49bc-9084-b1a08aad3877\") " pod="openshift-dns/dns-default-x7nqw"
Apr 20 20:05:51.193100 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.193047 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddb92e25-31c0-49bc-9084-b1a08aad3877-metrics-tls\") pod \"dns-default-x7nqw\" (UID: \"ddb92e25-31c0-49bc-9084-b1a08aad3877\") " pod="openshift-dns/dns-default-x7nqw"
Apr 20 20:05:51.193100 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.193074 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-htttg\" (UniqueName: \"kubernetes.io/projected/ddb92e25-31c0-49bc-9084-b1a08aad3877-kube-api-access-htttg\") pod \"dns-default-x7nqw\" (UID: \"ddb92e25-31c0-49bc-9084-b1a08aad3877\") " pod="openshift-dns/dns-default-x7nqw"
Apr 20 20:05:51.193254 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.193100 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ddb92e25-31c0-49bc-9084-b1a08aad3877-tmp-dir\") pod \"dns-default-x7nqw\" (UID: \"ddb92e25-31c0-49bc-9084-b1a08aad3877\") " pod="openshift-dns/dns-default-x7nqw"
Apr 20 20:05:51.193254 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:51.193201 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 20:05:51.193348 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:51.193259 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddb92e25-31c0-49bc-9084-b1a08aad3877-metrics-tls podName:ddb92e25-31c0-49bc-9084-b1a08aad3877 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:51.69323945 +0000 UTC m=+34.108532184 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ddb92e25-31c0-49bc-9084-b1a08aad3877-metrics-tls") pod "dns-default-x7nqw" (UID: "ddb92e25-31c0-49bc-9084-b1a08aad3877") : secret "dns-default-metrics-tls" not found
Apr 20 20:05:51.193451 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.193406 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9104f378-d15a-480e-aae0-cb20f3c35f2c-cert\") pod \"ingress-canary-67td9\" (UID: \"9104f378-d15a-480e-aae0-cb20f3c35f2c\") " pod="openshift-ingress-canary/ingress-canary-67td9"
Apr 20 20:05:51.193530 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:51.193499 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 20:05:51.193576 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.193526 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ddb92e25-31c0-49bc-9084-b1a08aad3877-tmp-dir\") pod \"dns-default-x7nqw\" (UID: \"ddb92e25-31c0-49bc-9084-b1a08aad3877\") " pod="openshift-dns/dns-default-x7nqw"
Apr 20 20:05:51.193576 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:51.193554 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9104f378-d15a-480e-aae0-cb20f3c35f2c-cert podName:9104f378-d15a-480e-aae0-cb20f3c35f2c nodeName:}" failed. No retries permitted until 2026-04-20 20:05:51.693539915 +0000 UTC m=+34.108832649 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9104f378-d15a-480e-aae0-cb20f3c35f2c-cert") pod "ingress-canary-67td9" (UID: "9104f378-d15a-480e-aae0-cb20f3c35f2c") : secret "canary-serving-cert" not found
Apr 20 20:05:51.193652 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.193605 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ddb92e25-31c0-49bc-9084-b1a08aad3877-config-volume\") pod \"dns-default-x7nqw\" (UID: \"ddb92e25-31c0-49bc-9084-b1a08aad3877\") " pod="openshift-dns/dns-default-x7nqw"
Apr 20 20:05:51.206125 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.205995 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-htttg\" (UniqueName: \"kubernetes.io/projected/ddb92e25-31c0-49bc-9084-b1a08aad3877-kube-api-access-htttg\") pod \"dns-default-x7nqw\" (UID: \"ddb92e25-31c0-49bc-9084-b1a08aad3877\") " pod="openshift-dns/dns-default-x7nqw"
Apr 20 20:05:51.206264 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.206053 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmr26\" (UniqueName: \"kubernetes.io/projected/9104f378-d15a-480e-aae0-cb20f3c35f2c-kube-api-access-nmr26\") pod \"ingress-canary-67td9\" (UID: \"9104f378-d15a-480e-aae0-cb20f3c35f2c\") " pod="openshift-ingress-canary/ingress-canary-67td9"
Apr 20 20:05:51.696672 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.696622 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddb92e25-31c0-49bc-9084-b1a08aad3877-metrics-tls\") pod \"dns-default-x7nqw\" (UID: \"ddb92e25-31c0-49bc-9084-b1a08aad3877\") " pod="openshift-dns/dns-default-x7nqw"
Apr 20 20:05:51.696672 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.696663 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9104f378-d15a-480e-aae0-cb20f3c35f2c-cert\") pod \"ingress-canary-67td9\" (UID: \"9104f378-d15a-480e-aae0-cb20f3c35f2c\") " pod="openshift-ingress-canary/ingress-canary-67td9"
Apr 20 20:05:51.697112 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:51.696802 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 20:05:51.697112 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:51.696818 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 20:05:51.697112 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:51.696896 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddb92e25-31c0-49bc-9084-b1a08aad3877-metrics-tls podName:ddb92e25-31c0-49bc-9084-b1a08aad3877 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:52.696872623 +0000 UTC m=+35.112165344 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ddb92e25-31c0-49bc-9084-b1a08aad3877-metrics-tls") pod "dns-default-x7nqw" (UID: "ddb92e25-31c0-49bc-9084-b1a08aad3877") : secret "dns-default-metrics-tls" not found
Apr 20 20:05:51.697112 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:51.696917 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9104f378-d15a-480e-aae0-cb20f3c35f2c-cert podName:9104f378-d15a-480e-aae0-cb20f3c35f2c nodeName:}" failed. No retries permitted until 2026-04-20 20:05:52.696907623 +0000 UTC m=+35.112200348 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9104f378-d15a-480e-aae0-cb20f3c35f2c-cert") pod "ingress-canary-67td9" (UID: "9104f378-d15a-480e-aae0-cb20f3c35f2c") : secret "canary-serving-cert" not found
Apr 20 20:05:51.797525 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.797494 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d-metrics-certs\") pod \"network-metrics-daemon-npkgv\" (UID: \"923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d\") " pod="openshift-multus/network-metrics-daemon-npkgv"
Apr 20 20:05:51.797725 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:51.797646 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 20 20:05:51.797725 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:51.797717 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d-metrics-certs podName:923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d nodeName:}" failed. No retries permitted until 2026-04-20 20:06:23.79769563 +0000 UTC m=+66.212988367 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d-metrics-certs") pod "network-metrics-daemon-npkgv" (UID: "923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d") : secret "metrics-daemon-secret" not found
Apr 20 20:05:51.897957 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.897922 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d1323090-2026-43df-829f-115ae2bc0438-original-pull-secret\") pod \"global-pull-secret-syncer-k96mb\" (UID: \"d1323090-2026-43df-829f-115ae2bc0438\") " pod="kube-system/global-pull-secret-syncer-k96mb"
Apr 20 20:05:51.900888 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.900844 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d1323090-2026-43df-829f-115ae2bc0438-original-pull-secret\") pod \"global-pull-secret-syncer-k96mb\" (UID: \"d1323090-2026-43df-829f-115ae2bc0438\") " pod="kube-system/global-pull-secret-syncer-k96mb"
Apr 20 20:05:51.998333 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:51.998264 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22hj6\" (UniqueName: \"kubernetes.io/projected/983cba91-1490-41d1-acd9-67e8ffb4ce55-kube-api-access-22hj6\") pod \"network-check-target-hwpzm\" (UID: \"983cba91-1490-41d1-acd9-67e8ffb4ce55\") " pod="openshift-network-diagnostics/network-check-target-hwpzm"
Apr 20 20:05:52.001021 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:52.000987 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-22hj6\" (UniqueName: \"kubernetes.io/projected/983cba91-1490-41d1-acd9-67e8ffb4ce55-kube-api-access-22hj6\") pod \"network-check-target-hwpzm\" (UID: \"983cba91-1490-41d1-acd9-67e8ffb4ce55\") " pod="openshift-network-diagnostics/network-check-target-hwpzm"
Apr 20 20:05:52.078943 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:52.078913 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-k96mb"
Apr 20 20:05:52.084742 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:52.084715 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hwpzm"
Apr 20 20:05:52.251797 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:52.251731 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-k96mb"]
Apr 20 20:05:52.254574 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:52.254530 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hwpzm"]
Apr 20 20:05:52.256324 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:52.256298 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1323090_2026_43df_829f_115ae2bc0438.slice/crio-4bf476c6ee47de154387e5d3f6d01be1520716851476de6dac803b5c29ccfc6c WatchSource:0}: Error finding container 4bf476c6ee47de154387e5d3f6d01be1520716851476de6dac803b5c29ccfc6c: Status 404 returned error can't find the container with id 4bf476c6ee47de154387e5d3f6d01be1520716851476de6dac803b5c29ccfc6c
Apr 20 20:05:52.258573 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:05:52.258541 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod983cba91_1490_41d1_acd9_67e8ffb4ce55.slice/crio-bd791a3da03e8f04027891a62211b7dd7ba6e84e0ce93f280da8e9c1e491d2f1 WatchSource:0}: Error finding container bd791a3da03e8f04027891a62211b7dd7ba6e84e0ce93f280da8e9c1e491d2f1: Status 404 returned error can't find the container with id bd791a3da03e8f04027891a62211b7dd7ba6e84e0ce93f280da8e9c1e491d2f1
Apr 20 20:05:52.331728 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:52.331696 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hwpzm" event={"ID":"983cba91-1490-41d1-acd9-67e8ffb4ce55","Type":"ContainerStarted","Data":"bd791a3da03e8f04027891a62211b7dd7ba6e84e0ce93f280da8e9c1e491d2f1"}
Apr 20 20:05:52.332831 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:52.332805 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-k96mb" event={"ID":"d1323090-2026-43df-829f-115ae2bc0438","Type":"ContainerStarted","Data":"4bf476c6ee47de154387e5d3f6d01be1520716851476de6dac803b5c29ccfc6c"}
Apr 20 20:05:52.703564 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:52.703526 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddb92e25-31c0-49bc-9084-b1a08aad3877-metrics-tls\") pod \"dns-default-x7nqw\" (UID: \"ddb92e25-31c0-49bc-9084-b1a08aad3877\") " pod="openshift-dns/dns-default-x7nqw"
Apr 20 20:05:52.703791 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:52.703573 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9104f378-d15a-480e-aae0-cb20f3c35f2c-cert\") pod \"ingress-canary-67td9\" (UID: \"9104f378-d15a-480e-aae0-cb20f3c35f2c\") " pod="openshift-ingress-canary/ingress-canary-67td9"
Apr 20 20:05:52.703791 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:52.703681 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 20:05:52.703791 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:52.703708 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 20:05:52.703791 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:52.703759 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddb92e25-31c0-49bc-9084-b1a08aad3877-metrics-tls podName:ddb92e25-31c0-49bc-9084-b1a08aad3877 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:54.703741167 +0000 UTC m=+37.119033890 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ddb92e25-31c0-49bc-9084-b1a08aad3877-metrics-tls") pod "dns-default-x7nqw" (UID: "ddb92e25-31c0-49bc-9084-b1a08aad3877") : secret "dns-default-metrics-tls" not found
Apr 20 20:05:52.703791 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:52.703775 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9104f378-d15a-480e-aae0-cb20f3c35f2c-cert podName:9104f378-d15a-480e-aae0-cb20f3c35f2c nodeName:}" failed. No retries permitted until 2026-04-20 20:05:54.703769432 +0000 UTC m=+37.119062151 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9104f378-d15a-480e-aae0-cb20f3c35f2c-cert") pod "ingress-canary-67td9" (UID: "9104f378-d15a-480e-aae0-cb20f3c35f2c") : secret "canary-serving-cert" not found
Apr 20 20:05:54.720648 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:54.720617 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddb92e25-31c0-49bc-9084-b1a08aad3877-metrics-tls\") pod \"dns-default-x7nqw\" (UID: \"ddb92e25-31c0-49bc-9084-b1a08aad3877\") " pod="openshift-dns/dns-default-x7nqw"
Apr 20 20:05:54.721269 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:54.720661 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9104f378-d15a-480e-aae0-cb20f3c35f2c-cert\") pod \"ingress-canary-67td9\" (UID: \"9104f378-d15a-480e-aae0-cb20f3c35f2c\") " pod="openshift-ingress-canary/ingress-canary-67td9"
Apr 20 20:05:54.721269 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:54.720778 2573
secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:05:54.721269 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:54.720840 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9104f378-d15a-480e-aae0-cb20f3c35f2c-cert podName:9104f378-d15a-480e-aae0-cb20f3c35f2c nodeName:}" failed. No retries permitted until 2026-04-20 20:05:58.72082231 +0000 UTC m=+41.136115051 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9104f378-d15a-480e-aae0-cb20f3c35f2c-cert") pod "ingress-canary-67td9" (UID: "9104f378-d15a-480e-aae0-cb20f3c35f2c") : secret "canary-serving-cert" not found Apr 20 20:05:54.721269 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:54.720779 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:05:54.721269 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:54.720936 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddb92e25-31c0-49bc-9084-b1a08aad3877-metrics-tls podName:ddb92e25-31c0-49bc-9084-b1a08aad3877 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:58.720915382 +0000 UTC m=+41.136208111 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ddb92e25-31c0-49bc-9084-b1a08aad3877-metrics-tls") pod "dns-default-x7nqw" (UID: "ddb92e25-31c0-49bc-9084-b1a08aad3877") : secret "dns-default-metrics-tls" not found Apr 20 20:05:58.751439 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:58.751402 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddb92e25-31c0-49bc-9084-b1a08aad3877-metrics-tls\") pod \"dns-default-x7nqw\" (UID: \"ddb92e25-31c0-49bc-9084-b1a08aad3877\") " pod="openshift-dns/dns-default-x7nqw" Apr 20 20:05:58.751439 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:58.751444 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9104f378-d15a-480e-aae0-cb20f3c35f2c-cert\") pod \"ingress-canary-67td9\" (UID: \"9104f378-d15a-480e-aae0-cb20f3c35f2c\") " pod="openshift-ingress-canary/ingress-canary-67td9" Apr 20 20:05:58.751843 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:58.751555 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:05:58.751843 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:58.751615 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9104f378-d15a-480e-aae0-cb20f3c35f2c-cert podName:9104f378-d15a-480e-aae0-cb20f3c35f2c nodeName:}" failed. No retries permitted until 2026-04-20 20:06:06.751601735 +0000 UTC m=+49.166894458 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9104f378-d15a-480e-aae0-cb20f3c35f2c-cert") pod "ingress-canary-67td9" (UID: "9104f378-d15a-480e-aae0-cb20f3c35f2c") : secret "canary-serving-cert" not found Apr 20 20:05:58.751843 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:58.751556 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:05:58.751843 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:05:58.751698 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddb92e25-31c0-49bc-9084-b1a08aad3877-metrics-tls podName:ddb92e25-31c0-49bc-9084-b1a08aad3877 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:06.751669168 +0000 UTC m=+49.166961902 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ddb92e25-31c0-49bc-9084-b1a08aad3877-metrics-tls") pod "dns-default-x7nqw" (UID: "ddb92e25-31c0-49bc-9084-b1a08aad3877") : secret "dns-default-metrics-tls" not found Apr 20 20:05:59.346696 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:59.346661 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hwpzm" event={"ID":"983cba91-1490-41d1-acd9-67e8ffb4ce55","Type":"ContainerStarted","Data":"3e9fcaefc74430b60f7c0ebbf410ecda1026e87f41235a904e7c03b7ca471a45"} Apr 20 20:05:59.346933 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:59.346905 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-hwpzm" Apr 20 20:05:59.347978 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:59.347959 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-k96mb" 
event={"ID":"d1323090-2026-43df-829f-115ae2bc0438","Type":"ContainerStarted","Data":"5ae8123d45fa7e0c47f45f764971d83a56c3d3588d91e364257b1622cd680063"} Apr 20 20:05:59.350171 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:59.350147 2573 generic.go:358] "Generic (PLEG): container finished" podID="ba60b2b3-08e4-40aa-842f-6be514920597" containerID="2c1581edbacbaccaf632bcedaaf1d586b93e40019fdfdd3f75aee4963a580a80" exitCode=0 Apr 20 20:05:59.350266 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:59.350186 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rcsnj" event={"ID":"ba60b2b3-08e4-40aa-842f-6be514920597","Type":"ContainerDied","Data":"2c1581edbacbaccaf632bcedaaf1d586b93e40019fdfdd3f75aee4963a580a80"} Apr 20 20:05:59.386729 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:59.386691 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-hwpzm" podStartSLOduration=35.1161224 podStartE2EDuration="41.386680292s" podCreationTimestamp="2026-04-20 20:05:18 +0000 UTC" firstStartedPulling="2026-04-20 20:05:52.261477224 +0000 UTC m=+34.676769958" lastFinishedPulling="2026-04-20 20:05:58.532035129 +0000 UTC m=+40.947327850" observedRunningTime="2026-04-20 20:05:59.36132995 +0000 UTC m=+41.776622693" watchObservedRunningTime="2026-04-20 20:05:59.386680292 +0000 UTC m=+41.801973032" Apr 20 20:05:59.400052 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:05:59.400013 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-k96mb" podStartSLOduration=35.117727009 podStartE2EDuration="41.399999673s" podCreationTimestamp="2026-04-20 20:05:18 +0000 UTC" firstStartedPulling="2026-04-20 20:05:52.25932131 +0000 UTC m=+34.674614044" lastFinishedPulling="2026-04-20 20:05:58.541593988 +0000 UTC m=+40.956886708" observedRunningTime="2026-04-20 20:05:59.399924627 +0000 UTC m=+41.815217372" 
watchObservedRunningTime="2026-04-20 20:05:59.399999673 +0000 UTC m=+41.815292414" Apr 20 20:06:00.354548 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:06:00.354510 2573 generic.go:358] "Generic (PLEG): container finished" podID="ba60b2b3-08e4-40aa-842f-6be514920597" containerID="7b6495b5d9b6c600602ed7113337314ad6eae975fc5f1c650613f1816281fc18" exitCode=0 Apr 20 20:06:00.354925 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:06:00.354582 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rcsnj" event={"ID":"ba60b2b3-08e4-40aa-842f-6be514920597","Type":"ContainerDied","Data":"7b6495b5d9b6c600602ed7113337314ad6eae975fc5f1c650613f1816281fc18"} Apr 20 20:06:01.359429 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:06:01.359395 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rcsnj" event={"ID":"ba60b2b3-08e4-40aa-842f-6be514920597","Type":"ContainerStarted","Data":"85abfe2228c24f5722e73c565ee16e279e636ebd8a1f592a7768d2cc716974c6"} Apr 20 20:06:01.383324 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:06:01.383283 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-rcsnj" podStartSLOduration=5.641317167 podStartE2EDuration="43.383269657s" podCreationTimestamp="2026-04-20 20:05:18 +0000 UTC" firstStartedPulling="2026-04-20 20:05:20.790183803 +0000 UTC m=+3.205476537" lastFinishedPulling="2026-04-20 20:05:58.532136293 +0000 UTC m=+40.947429027" observedRunningTime="2026-04-20 20:06:01.381878451 +0000 UTC m=+43.797171190" watchObservedRunningTime="2026-04-20 20:06:01.383269657 +0000 UTC m=+43.798562398" Apr 20 20:06:06.804148 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:06:06.804111 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddb92e25-31c0-49bc-9084-b1a08aad3877-metrics-tls\") pod \"dns-default-x7nqw\" 
(UID: \"ddb92e25-31c0-49bc-9084-b1a08aad3877\") " pod="openshift-dns/dns-default-x7nqw" Apr 20 20:06:06.804148 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:06:06.804153 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9104f378-d15a-480e-aae0-cb20f3c35f2c-cert\") pod \"ingress-canary-67td9\" (UID: \"9104f378-d15a-480e-aae0-cb20f3c35f2c\") " pod="openshift-ingress-canary/ingress-canary-67td9" Apr 20 20:06:06.804586 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:06:06.804256 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:06:06.804586 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:06:06.804271 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:06:06.804586 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:06:06.804307 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9104f378-d15a-480e-aae0-cb20f3c35f2c-cert podName:9104f378-d15a-480e-aae0-cb20f3c35f2c nodeName:}" failed. No retries permitted until 2026-04-20 20:06:22.804294689 +0000 UTC m=+65.219587409 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9104f378-d15a-480e-aae0-cb20f3c35f2c-cert") pod "ingress-canary-67td9" (UID: "9104f378-d15a-480e-aae0-cb20f3c35f2c") : secret "canary-serving-cert" not found Apr 20 20:06:06.804586 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:06:06.804338 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddb92e25-31c0-49bc-9084-b1a08aad3877-metrics-tls podName:ddb92e25-31c0-49bc-9084-b1a08aad3877 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:22.804320366 +0000 UTC m=+65.219613100 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ddb92e25-31c0-49bc-9084-b1a08aad3877-metrics-tls") pod "dns-default-x7nqw" (UID: "ddb92e25-31c0-49bc-9084-b1a08aad3877") : secret "dns-default-metrics-tls" not found Apr 20 20:06:17.327785 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:06:17.327758 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ks7s9" Apr 20 20:06:22.808188 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:06:22.808146 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddb92e25-31c0-49bc-9084-b1a08aad3877-metrics-tls\") pod \"dns-default-x7nqw\" (UID: \"ddb92e25-31c0-49bc-9084-b1a08aad3877\") " pod="openshift-dns/dns-default-x7nqw" Apr 20 20:06:22.808188 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:06:22.808187 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9104f378-d15a-480e-aae0-cb20f3c35f2c-cert\") pod \"ingress-canary-67td9\" (UID: \"9104f378-d15a-480e-aae0-cb20f3c35f2c\") " pod="openshift-ingress-canary/ingress-canary-67td9" Apr 20 20:06:22.808737 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:06:22.808287 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:06:22.808737 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:06:22.808316 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:06:22.808737 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:06:22.808348 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9104f378-d15a-480e-aae0-cb20f3c35f2c-cert podName:9104f378-d15a-480e-aae0-cb20f3c35f2c nodeName:}" failed. 
No retries permitted until 2026-04-20 20:06:54.808334011 +0000 UTC m=+97.223626731 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9104f378-d15a-480e-aae0-cb20f3c35f2c-cert") pod "ingress-canary-67td9" (UID: "9104f378-d15a-480e-aae0-cb20f3c35f2c") : secret "canary-serving-cert" not found Apr 20 20:06:22.808737 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:06:22.808397 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddb92e25-31c0-49bc-9084-b1a08aad3877-metrics-tls podName:ddb92e25-31c0-49bc-9084-b1a08aad3877 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:54.808380109 +0000 UTC m=+97.223672835 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ddb92e25-31c0-49bc-9084-b1a08aad3877-metrics-tls") pod "dns-default-x7nqw" (UID: "ddb92e25-31c0-49bc-9084-b1a08aad3877") : secret "dns-default-metrics-tls" not found Apr 20 20:06:23.812987 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:06:23.812953 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d-metrics-certs\") pod \"network-metrics-daemon-npkgv\" (UID: \"923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d\") " pod="openshift-multus/network-metrics-daemon-npkgv" Apr 20 20:06:23.813356 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:06:23.813059 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 20:06:23.813356 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:06:23.813121 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d-metrics-certs podName:923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d nodeName:}" failed. 
No retries permitted until 2026-04-20 20:07:27.813107041 +0000 UTC m=+130.228399761 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d-metrics-certs") pod "network-metrics-daemon-npkgv" (UID: "923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d") : secret "metrics-daemon-secret" not found Apr 20 20:06:30.357273 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:06:30.357100 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-hwpzm" Apr 20 20:06:54.818506 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:06:54.818468 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddb92e25-31c0-49bc-9084-b1a08aad3877-metrics-tls\") pod \"dns-default-x7nqw\" (UID: \"ddb92e25-31c0-49bc-9084-b1a08aad3877\") " pod="openshift-dns/dns-default-x7nqw" Apr 20 20:06:54.818506 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:06:54.818506 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9104f378-d15a-480e-aae0-cb20f3c35f2c-cert\") pod \"ingress-canary-67td9\" (UID: \"9104f378-d15a-480e-aae0-cb20f3c35f2c\") " pod="openshift-ingress-canary/ingress-canary-67td9" Apr 20 20:06:54.819023 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:06:54.818608 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:06:54.819023 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:06:54.818610 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:06:54.819023 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:06:54.818670 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ddb92e25-31c0-49bc-9084-b1a08aad3877-metrics-tls 
podName:ddb92e25-31c0-49bc-9084-b1a08aad3877 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:58.818654767 +0000 UTC m=+161.233947496 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ddb92e25-31c0-49bc-9084-b1a08aad3877-metrics-tls") pod "dns-default-x7nqw" (UID: "ddb92e25-31c0-49bc-9084-b1a08aad3877") : secret "dns-default-metrics-tls" not found Apr 20 20:06:54.819023 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:06:54.818683 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9104f378-d15a-480e-aae0-cb20f3c35f2c-cert podName:9104f378-d15a-480e-aae0-cb20f3c35f2c nodeName:}" failed. No retries permitted until 2026-04-20 20:07:58.81867702 +0000 UTC m=+161.233969740 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9104f378-d15a-480e-aae0-cb20f3c35f2c-cert") pod "ingress-canary-67td9" (UID: "9104f378-d15a-480e-aae0-cb20f3c35f2c") : secret "canary-serving-cert" not found Apr 20 20:07:13.068271 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.068236 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mpjng"] Apr 20 20:07:13.070073 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.070058 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mpjng" Apr 20 20:07:13.072166 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.072141 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-59745ff96d-mll2m"] Apr 20 20:07:13.072752 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.072726 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 20 20:07:13.072871 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.072798 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 20 20:07:13.072941 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.072897 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-zdmkm\"" Apr 20 20:07:13.073005 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.072961 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 20 20:07:13.073178 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.073161 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 20 20:07:13.074393 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.074375 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-59745ff96d-mll2m" Apr 20 20:07:13.076658 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.076638 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 20 20:07:13.076658 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.076645 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 20 20:07:13.076821 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.076690 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 20 20:07:13.077265 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.077250 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-89zhh\"" Apr 20 20:07:13.077361 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.077255 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 20:07:13.077539 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.077522 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 20 20:07:13.077596 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.077577 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 20:07:13.080740 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.080722 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mpjng"] Apr 20 20:07:13.087015 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.086998 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-59745ff96d-mll2m"] Apr 
20 20:07:13.139485 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.139460 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1da8a235-923c-4d64-a78a-ee3ef677d15d-default-certificate\") pod \"router-default-59745ff96d-mll2m\" (UID: \"1da8a235-923c-4d64-a78a-ee3ef677d15d\") " pod="openshift-ingress/router-default-59745ff96d-mll2m" Apr 20 20:07:13.139620 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.139488 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1da8a235-923c-4d64-a78a-ee3ef677d15d-service-ca-bundle\") pod \"router-default-59745ff96d-mll2m\" (UID: \"1da8a235-923c-4d64-a78a-ee3ef677d15d\") " pod="openshift-ingress/router-default-59745ff96d-mll2m" Apr 20 20:07:13.139620 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.139507 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1da8a235-923c-4d64-a78a-ee3ef677d15d-stats-auth\") pod \"router-default-59745ff96d-mll2m\" (UID: \"1da8a235-923c-4d64-a78a-ee3ef677d15d\") " pod="openshift-ingress/router-default-59745ff96d-mll2m" Apr 20 20:07:13.139620 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.139524 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjjvh\" (UniqueName: \"kubernetes.io/projected/1da8a235-923c-4d64-a78a-ee3ef677d15d-kube-api-access-rjjvh\") pod \"router-default-59745ff96d-mll2m\" (UID: \"1da8a235-923c-4d64-a78a-ee3ef677d15d\") " pod="openshift-ingress/router-default-59745ff96d-mll2m" Apr 20 20:07:13.139737 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.139622 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/06eaab54-dbb4-4d67-8fc9-d22d03b1e5a8-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-mpjng\" (UID: \"06eaab54-dbb4-4d67-8fc9-d22d03b1e5a8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mpjng" Apr 20 20:07:13.139737 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.139649 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pddzr\" (UniqueName: \"kubernetes.io/projected/06eaab54-dbb4-4d67-8fc9-d22d03b1e5a8-kube-api-access-pddzr\") pod \"kube-storage-version-migrator-operator-6769c5d45-mpjng\" (UID: \"06eaab54-dbb4-4d67-8fc9-d22d03b1e5a8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mpjng" Apr 20 20:07:13.139737 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.139697 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06eaab54-dbb4-4d67-8fc9-d22d03b1e5a8-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-mpjng\" (UID: \"06eaab54-dbb4-4d67-8fc9-d22d03b1e5a8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mpjng" Apr 20 20:07:13.139737 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.139719 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1da8a235-923c-4d64-a78a-ee3ef677d15d-metrics-certs\") pod \"router-default-59745ff96d-mll2m\" (UID: \"1da8a235-923c-4d64-a78a-ee3ef677d15d\") " pod="openshift-ingress/router-default-59745ff96d-mll2m" Apr 20 20:07:13.165078 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.165038 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zl54z"] Apr 20 
20:07:13.167018 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.167002 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zl54z"
Apr 20 20:07:13.169900 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.169868 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 20 20:07:13.169900 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.169896 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 20 20:07:13.170054 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.169969 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 20 20:07:13.170104 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.170059 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-gbcwr\""
Apr 20 20:07:13.170157 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.170126 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 20 20:07:13.175876 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.175840 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zl54z"]
Apr 20 20:07:13.239958 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.239931 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06eaab54-dbb4-4d67-8fc9-d22d03b1e5a8-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-mpjng\" (UID: \"06eaab54-dbb4-4d67-8fc9-d22d03b1e5a8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mpjng"
Apr 20 20:07:13.239958 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.239959 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pddzr\" (UniqueName: \"kubernetes.io/projected/06eaab54-dbb4-4d67-8fc9-d22d03b1e5a8-kube-api-access-pddzr\") pod \"kube-storage-version-migrator-operator-6769c5d45-mpjng\" (UID: \"06eaab54-dbb4-4d67-8fc9-d22d03b1e5a8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mpjng"
Apr 20 20:07:13.240131 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.239998 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06eaab54-dbb4-4d67-8fc9-d22d03b1e5a8-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-mpjng\" (UID: \"06eaab54-dbb4-4d67-8fc9-d22d03b1e5a8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mpjng"
Apr 20 20:07:13.240131 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.240017 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1da8a235-923c-4d64-a78a-ee3ef677d15d-metrics-certs\") pod \"router-default-59745ff96d-mll2m\" (UID: \"1da8a235-923c-4d64-a78a-ee3ef677d15d\") " pod="openshift-ingress/router-default-59745ff96d-mll2m"
Apr 20 20:07:13.240131 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.240036 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51239262-468e-4240-9144-dfb1b3010a21-serving-cert\") pod \"service-ca-operator-d6fc45fc5-zl54z\" (UID: \"51239262-468e-4240-9144-dfb1b3010a21\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zl54z"
Apr 20 20:07:13.240131 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.240057 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51239262-468e-4240-9144-dfb1b3010a21-config\") pod \"service-ca-operator-d6fc45fc5-zl54z\" (UID: \"51239262-468e-4240-9144-dfb1b3010a21\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zl54z"
Apr 20 20:07:13.240131 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:13.240103 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 20:07:13.240374 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:13.240180 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1da8a235-923c-4d64-a78a-ee3ef677d15d-metrics-certs podName:1da8a235-923c-4d64-a78a-ee3ef677d15d nodeName:}" failed. No retries permitted until 2026-04-20 20:07:13.740160127 +0000 UTC m=+116.155452872 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1da8a235-923c-4d64-a78a-ee3ef677d15d-metrics-certs") pod "router-default-59745ff96d-mll2m" (UID: "1da8a235-923c-4d64-a78a-ee3ef677d15d") : secret "router-metrics-certs-default" not found
Apr 20 20:07:13.240374 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.240107 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1da8a235-923c-4d64-a78a-ee3ef677d15d-default-certificate\") pod \"router-default-59745ff96d-mll2m\" (UID: \"1da8a235-923c-4d64-a78a-ee3ef677d15d\") " pod="openshift-ingress/router-default-59745ff96d-mll2m"
Apr 20 20:07:13.240374 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.240271 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1da8a235-923c-4d64-a78a-ee3ef677d15d-service-ca-bundle\") pod \"router-default-59745ff96d-mll2m\" (UID: \"1da8a235-923c-4d64-a78a-ee3ef677d15d\") " pod="openshift-ingress/router-default-59745ff96d-mll2m"
Apr 20 20:07:13.240374 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.240299 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1da8a235-923c-4d64-a78a-ee3ef677d15d-stats-auth\") pod \"router-default-59745ff96d-mll2m\" (UID: \"1da8a235-923c-4d64-a78a-ee3ef677d15d\") " pod="openshift-ingress/router-default-59745ff96d-mll2m"
Apr 20 20:07:13.240374 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.240327 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rjjvh\" (UniqueName: \"kubernetes.io/projected/1da8a235-923c-4d64-a78a-ee3ef677d15d-kube-api-access-rjjvh\") pod \"router-default-59745ff96d-mll2m\" (UID: \"1da8a235-923c-4d64-a78a-ee3ef677d15d\") " pod="openshift-ingress/router-default-59745ff96d-mll2m"
Apr 20 20:07:13.240633 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.240433 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cckf7\" (UniqueName: \"kubernetes.io/projected/51239262-468e-4240-9144-dfb1b3010a21-kube-api-access-cckf7\") pod \"service-ca-operator-d6fc45fc5-zl54z\" (UID: \"51239262-468e-4240-9144-dfb1b3010a21\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zl54z"
Apr 20 20:07:13.240633 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.240449 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06eaab54-dbb4-4d67-8fc9-d22d03b1e5a8-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-mpjng\" (UID: \"06eaab54-dbb4-4d67-8fc9-d22d03b1e5a8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mpjng"
Apr 20 20:07:13.240633 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:13.240466 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1da8a235-923c-4d64-a78a-ee3ef677d15d-service-ca-bundle podName:1da8a235-923c-4d64-a78a-ee3ef677d15d nodeName:}" failed. No retries permitted until 2026-04-20 20:07:13.740437048 +0000 UTC m=+116.155729978 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1da8a235-923c-4d64-a78a-ee3ef677d15d-service-ca-bundle") pod "router-default-59745ff96d-mll2m" (UID: "1da8a235-923c-4d64-a78a-ee3ef677d15d") : configmap references non-existent config key: service-ca.crt
Apr 20 20:07:13.243083 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.243059 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06eaab54-dbb4-4d67-8fc9-d22d03b1e5a8-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-mpjng\" (UID: \"06eaab54-dbb4-4d67-8fc9-d22d03b1e5a8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mpjng"
Apr 20 20:07:13.243278 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.243260 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/1da8a235-923c-4d64-a78a-ee3ef677d15d-stats-auth\") pod \"router-default-59745ff96d-mll2m\" (UID: \"1da8a235-923c-4d64-a78a-ee3ef677d15d\") " pod="openshift-ingress/router-default-59745ff96d-mll2m"
Apr 20 20:07:13.243319 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.243290 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/1da8a235-923c-4d64-a78a-ee3ef677d15d-default-certificate\") pod \"router-default-59745ff96d-mll2m\" (UID: \"1da8a235-923c-4d64-a78a-ee3ef677d15d\") " pod="openshift-ingress/router-default-59745ff96d-mll2m"
Apr 20 20:07:13.250817 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.250798 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjjvh\" (UniqueName: \"kubernetes.io/projected/1da8a235-923c-4d64-a78a-ee3ef677d15d-kube-api-access-rjjvh\") pod \"router-default-59745ff96d-mll2m\" (UID: \"1da8a235-923c-4d64-a78a-ee3ef677d15d\") " pod="openshift-ingress/router-default-59745ff96d-mll2m"
Apr 20 20:07:13.251156 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.251134 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pddzr\" (UniqueName: \"kubernetes.io/projected/06eaab54-dbb4-4d67-8fc9-d22d03b1e5a8-kube-api-access-pddzr\") pod \"kube-storage-version-migrator-operator-6769c5d45-mpjng\" (UID: \"06eaab54-dbb4-4d67-8fc9-d22d03b1e5a8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mpjng"
Apr 20 20:07:13.341646 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.341583 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cckf7\" (UniqueName: \"kubernetes.io/projected/51239262-468e-4240-9144-dfb1b3010a21-kube-api-access-cckf7\") pod \"service-ca-operator-d6fc45fc5-zl54z\" (UID: \"51239262-468e-4240-9144-dfb1b3010a21\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zl54z"
Apr 20 20:07:13.341741 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.341645 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51239262-468e-4240-9144-dfb1b3010a21-serving-cert\") pod \"service-ca-operator-d6fc45fc5-zl54z\" (UID: \"51239262-468e-4240-9144-dfb1b3010a21\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zl54z"
Apr 20 20:07:13.341741 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.341667 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51239262-468e-4240-9144-dfb1b3010a21-config\") pod \"service-ca-operator-d6fc45fc5-zl54z\" (UID: \"51239262-468e-4240-9144-dfb1b3010a21\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zl54z"
Apr 20 20:07:13.342214 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.342197 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51239262-468e-4240-9144-dfb1b3010a21-config\") pod \"service-ca-operator-d6fc45fc5-zl54z\" (UID: \"51239262-468e-4240-9144-dfb1b3010a21\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zl54z"
Apr 20 20:07:13.343520 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.343503 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51239262-468e-4240-9144-dfb1b3010a21-serving-cert\") pod \"service-ca-operator-d6fc45fc5-zl54z\" (UID: \"51239262-468e-4240-9144-dfb1b3010a21\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zl54z"
Apr 20 20:07:13.349500 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.349481 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cckf7\" (UniqueName: \"kubernetes.io/projected/51239262-468e-4240-9144-dfb1b3010a21-kube-api-access-cckf7\") pod \"service-ca-operator-d6fc45fc5-zl54z\" (UID: \"51239262-468e-4240-9144-dfb1b3010a21\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zl54z"
Apr 20 20:07:13.380703 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.380665 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-6zzrt"]
Apr 20 20:07:13.381970 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.381938 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mpjng"
Apr 20 20:07:13.383999 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.383982 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-qv9dl"]
Apr 20 20:07:13.384157 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.384141 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6zzrt"
Apr 20 20:07:13.386195 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.386141 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qwgxr"]
Apr 20 20:07:13.386307 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.386290 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qv9dl"
Apr 20 20:07:13.386588 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.386566 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-tjcq9\""
Apr 20 20:07:13.388236 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.388218 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qwgxr"
Apr 20 20:07:13.388694 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.388678 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 20 20:07:13.389043 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.389026 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 20 20:07:13.389150 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.389131 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-cz5k5\""
Apr 20 20:07:13.391313 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.391044 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 20 20:07:13.391313 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.391082 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 20 20:07:13.391313 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.391083 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-bd69k\""
Apr 20 20:07:13.391313 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.391272 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 20 20:07:13.391613 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.391590 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-6zzrt"]
Apr 20 20:07:13.399982 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.399963 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qwgxr"]
Apr 20 20:07:13.400701 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.400683 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-qv9dl"]
Apr 20 20:07:13.442316 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.442285 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxq8q\" (UniqueName: \"kubernetes.io/projected/a2d2805c-7734-4363-99a7-f1fc0f7b91a5-kube-api-access-nxq8q\") pod \"network-check-source-8894fc9bd-6zzrt\" (UID: \"a2d2805c-7734-4363-99a7-f1fc0f7b91a5\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6zzrt"
Apr 20 20:07:13.442455 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.442344 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/14f2d422-2dc1-4c64-8da4-e881d350c667-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qv9dl\" (UID: \"14f2d422-2dc1-4c64-8da4-e881d350c667\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qv9dl"
Apr 20 20:07:13.442455 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.442412 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/14f2d422-2dc1-4c64-8da4-e881d350c667-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-qv9dl\" (UID: \"14f2d422-2dc1-4c64-8da4-e881d350c667\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qv9dl"
Apr 20 20:07:13.442566 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.442501 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dfd612b2-5ec7-4a68-9cd6-29a94ae37e78-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qwgxr\" (UID: \"dfd612b2-5ec7-4a68-9cd6-29a94ae37e78\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qwgxr"
Apr 20 20:07:13.442566 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.442528 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c2lj\" (UniqueName: \"kubernetes.io/projected/dfd612b2-5ec7-4a68-9cd6-29a94ae37e78-kube-api-access-7c2lj\") pod \"cluster-samples-operator-6dc5bdb6b4-qwgxr\" (UID: \"dfd612b2-5ec7-4a68-9cd6-29a94ae37e78\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qwgxr"
Apr 20 20:07:13.475302 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.475280 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zl54z"
Apr 20 20:07:13.496510 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.496484 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mpjng"]
Apr 20 20:07:13.499769 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:07:13.499744 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06eaab54_dbb4_4d67_8fc9_d22d03b1e5a8.slice/crio-3e538f4b64c95af1aa94ff7ee3a9f43966921efdb9339392aa92f21dc0d6b910 WatchSource:0}: Error finding container 3e538f4b64c95af1aa94ff7ee3a9f43966921efdb9339392aa92f21dc0d6b910: Status 404 returned error can't find the container with id 3e538f4b64c95af1aa94ff7ee3a9f43966921efdb9339392aa92f21dc0d6b910
Apr 20 20:07:13.543334 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.543308 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nxq8q\" (UniqueName: \"kubernetes.io/projected/a2d2805c-7734-4363-99a7-f1fc0f7b91a5-kube-api-access-nxq8q\") pod \"network-check-source-8894fc9bd-6zzrt\" (UID: \"a2d2805c-7734-4363-99a7-f1fc0f7b91a5\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6zzrt"
Apr 20 20:07:13.543441 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.543350 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/14f2d422-2dc1-4c64-8da4-e881d350c667-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qv9dl\" (UID: \"14f2d422-2dc1-4c64-8da4-e881d350c667\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qv9dl"
Apr 20 20:07:13.543441 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.543385 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/14f2d422-2dc1-4c64-8da4-e881d350c667-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-qv9dl\" (UID: \"14f2d422-2dc1-4c64-8da4-e881d350c667\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qv9dl"
Apr 20 20:07:13.543523 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.543441 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dfd612b2-5ec7-4a68-9cd6-29a94ae37e78-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qwgxr\" (UID: \"dfd612b2-5ec7-4a68-9cd6-29a94ae37e78\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qwgxr"
Apr 20 20:07:13.543523 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.543469 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7c2lj\" (UniqueName: \"kubernetes.io/projected/dfd612b2-5ec7-4a68-9cd6-29a94ae37e78-kube-api-access-7c2lj\") pod \"cluster-samples-operator-6dc5bdb6b4-qwgxr\" (UID: \"dfd612b2-5ec7-4a68-9cd6-29a94ae37e78\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qwgxr"
Apr 20 20:07:13.543523 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:13.543502 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 20:07:13.543671 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:13.543569 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14f2d422-2dc1-4c64-8da4-e881d350c667-networking-console-plugin-cert podName:14f2d422-2dc1-4c64-8da4-e881d350c667 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:14.043550506 +0000 UTC m=+116.458843245 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/14f2d422-2dc1-4c64-8da4-e881d350c667-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-qv9dl" (UID: "14f2d422-2dc1-4c64-8da4-e881d350c667") : secret "networking-console-plugin-cert" not found
Apr 20 20:07:13.543671 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:13.543586 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 20:07:13.543671 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:13.543656 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfd612b2-5ec7-4a68-9cd6-29a94ae37e78-samples-operator-tls podName:dfd612b2-5ec7-4a68-9cd6-29a94ae37e78 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:14.043643016 +0000 UTC m=+116.458935741 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/dfd612b2-5ec7-4a68-9cd6-29a94ae37e78-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-qwgxr" (UID: "dfd612b2-5ec7-4a68-9cd6-29a94ae37e78") : secret "samples-operator-tls" not found
Apr 20 20:07:13.544216 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.544194 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/14f2d422-2dc1-4c64-8da4-e881d350c667-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-qv9dl\" (UID: \"14f2d422-2dc1-4c64-8da4-e881d350c667\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qv9dl"
Apr 20 20:07:13.557880 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.557830 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxq8q\" (UniqueName: \"kubernetes.io/projected/a2d2805c-7734-4363-99a7-f1fc0f7b91a5-kube-api-access-nxq8q\") pod \"network-check-source-8894fc9bd-6zzrt\" (UID: \"a2d2805c-7734-4363-99a7-f1fc0f7b91a5\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6zzrt"
Apr 20 20:07:13.558044 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.558024 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c2lj\" (UniqueName: \"kubernetes.io/projected/dfd612b2-5ec7-4a68-9cd6-29a94ae37e78-kube-api-access-7c2lj\") pod \"cluster-samples-operator-6dc5bdb6b4-qwgxr\" (UID: \"dfd612b2-5ec7-4a68-9cd6-29a94ae37e78\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qwgxr"
Apr 20 20:07:13.586246 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.586221 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zl54z"]
Apr 20 20:07:13.589103 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:07:13.589079 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51239262_468e_4240_9144_dfb1b3010a21.slice/crio-1f1eb42d1728c9431afbee17d065a7da69ceb788fdc397f18bad1e3e5bc83055 WatchSource:0}: Error finding container 1f1eb42d1728c9431afbee17d065a7da69ceb788fdc397f18bad1e3e5bc83055: Status 404 returned error can't find the container with id 1f1eb42d1728c9431afbee17d065a7da69ceb788fdc397f18bad1e3e5bc83055
Apr 20 20:07:13.695634 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.695604 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6zzrt"
Apr 20 20:07:13.744461 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.744435 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1da8a235-923c-4d64-a78a-ee3ef677d15d-metrics-certs\") pod \"router-default-59745ff96d-mll2m\" (UID: \"1da8a235-923c-4d64-a78a-ee3ef677d15d\") " pod="openshift-ingress/router-default-59745ff96d-mll2m"
Apr 20 20:07:13.744598 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.744512 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1da8a235-923c-4d64-a78a-ee3ef677d15d-service-ca-bundle\") pod \"router-default-59745ff96d-mll2m\" (UID: \"1da8a235-923c-4d64-a78a-ee3ef677d15d\") " pod="openshift-ingress/router-default-59745ff96d-mll2m"
Apr 20 20:07:13.744598 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:13.744570 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 20:07:13.744693 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:13.744648 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1da8a235-923c-4d64-a78a-ee3ef677d15d-metrics-certs podName:1da8a235-923c-4d64-a78a-ee3ef677d15d nodeName:}" failed. No retries permitted until 2026-04-20 20:07:14.744627685 +0000 UTC m=+117.159920414 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1da8a235-923c-4d64-a78a-ee3ef677d15d-metrics-certs") pod "router-default-59745ff96d-mll2m" (UID: "1da8a235-923c-4d64-a78a-ee3ef677d15d") : secret "router-metrics-certs-default" not found
Apr 20 20:07:13.744763 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:13.744745 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1da8a235-923c-4d64-a78a-ee3ef677d15d-service-ca-bundle podName:1da8a235-923c-4d64-a78a-ee3ef677d15d nodeName:}" failed. No retries permitted until 2026-04-20 20:07:14.744731103 +0000 UTC m=+117.160023823 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1da8a235-923c-4d64-a78a-ee3ef677d15d-service-ca-bundle") pod "router-default-59745ff96d-mll2m" (UID: "1da8a235-923c-4d64-a78a-ee3ef677d15d") : configmap references non-existent config key: service-ca.crt
Apr 20 20:07:13.802418 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:13.802393 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-6zzrt"]
Apr 20 20:07:13.805245 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:07:13.805220 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2d2805c_7734_4363_99a7_f1fc0f7b91a5.slice/crio-2196cbeacffb754f4f5f36c7d0f1b9a72500df4eceb69525d13ebbdd5fb49e55 WatchSource:0}: Error finding container 2196cbeacffb754f4f5f36c7d0f1b9a72500df4eceb69525d13ebbdd5fb49e55: Status 404 returned error can't find the container with id 2196cbeacffb754f4f5f36c7d0f1b9a72500df4eceb69525d13ebbdd5fb49e55
Apr 20 20:07:14.047642 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:14.047599 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dfd612b2-5ec7-4a68-9cd6-29a94ae37e78-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qwgxr\" (UID: \"dfd612b2-5ec7-4a68-9cd6-29a94ae37e78\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qwgxr"
Apr 20 20:07:14.047841 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:14.047699 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/14f2d422-2dc1-4c64-8da4-e881d350c667-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qv9dl\" (UID: \"14f2d422-2dc1-4c64-8da4-e881d350c667\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qv9dl"
Apr 20 20:07:14.047841 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:14.047758 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 20:07:14.047841 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:14.047830 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 20:07:14.048031 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:14.047839 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfd612b2-5ec7-4a68-9cd6-29a94ae37e78-samples-operator-tls podName:dfd612b2-5ec7-4a68-9cd6-29a94ae37e78 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:15.047817391 +0000 UTC m=+117.463110125 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/dfd612b2-5ec7-4a68-9cd6-29a94ae37e78-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-qwgxr" (UID: "dfd612b2-5ec7-4a68-9cd6-29a94ae37e78") : secret "samples-operator-tls" not found
Apr 20 20:07:14.048031 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:14.047916 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14f2d422-2dc1-4c64-8da4-e881d350c667-networking-console-plugin-cert podName:14f2d422-2dc1-4c64-8da4-e881d350c667 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:15.047901711 +0000 UTC m=+117.463194440 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/14f2d422-2dc1-4c64-8da4-e881d350c667-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-qv9dl" (UID: "14f2d422-2dc1-4c64-8da4-e881d350c667") : secret "networking-console-plugin-cert" not found
Apr 20 20:07:14.498652 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:14.498436 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6zzrt" event={"ID":"a2d2805c-7734-4363-99a7-f1fc0f7b91a5","Type":"ContainerStarted","Data":"a8f6a2789ac0079bc1d0e09464cfbc2fdd2ed33c1ae28c71c0ba7cdd3e06addb"}
Apr 20 20:07:14.498652 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:14.498479 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6zzrt" event={"ID":"a2d2805c-7734-4363-99a7-f1fc0f7b91a5","Type":"ContainerStarted","Data":"2196cbeacffb754f4f5f36c7d0f1b9a72500df4eceb69525d13ebbdd5fb49e55"}
Apr 20 20:07:14.499705 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:14.499683 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zl54z" event={"ID":"51239262-468e-4240-9144-dfb1b3010a21","Type":"ContainerStarted","Data":"1f1eb42d1728c9431afbee17d065a7da69ceb788fdc397f18bad1e3e5bc83055"}
Apr 20 20:07:14.500706 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:14.500681 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mpjng" event={"ID":"06eaab54-dbb4-4d67-8fc9-d22d03b1e5a8","Type":"ContainerStarted","Data":"3e538f4b64c95af1aa94ff7ee3a9f43966921efdb9339392aa92f21dc0d6b910"}
Apr 20 20:07:14.513389 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:14.513332 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6zzrt" podStartSLOduration=1.5133171 podStartE2EDuration="1.5133171s" podCreationTimestamp="2026-04-20 20:07:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:07:14.512961526 +0000 UTC m=+116.928254291" watchObservedRunningTime="2026-04-20 20:07:14.5133171 +0000 UTC m=+116.928609845"
Apr 20 20:07:14.754069 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:14.753989 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1da8a235-923c-4d64-a78a-ee3ef677d15d-metrics-certs\") pod \"router-default-59745ff96d-mll2m\" (UID: \"1da8a235-923c-4d64-a78a-ee3ef677d15d\") " pod="openshift-ingress/router-default-59745ff96d-mll2m"
Apr 20 20:07:14.754069 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:14.754057 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1da8a235-923c-4d64-a78a-ee3ef677d15d-service-ca-bundle\") pod \"router-default-59745ff96d-mll2m\" (UID: \"1da8a235-923c-4d64-a78a-ee3ef677d15d\") " pod="openshift-ingress/router-default-59745ff96d-mll2m"
Apr 20 20:07:14.754374 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:14.754240 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1da8a235-923c-4d64-a78a-ee3ef677d15d-service-ca-bundle podName:1da8a235-923c-4d64-a78a-ee3ef677d15d nodeName:}" failed. No retries permitted until 2026-04-20 20:07:16.75422007 +0000 UTC m=+119.169512815 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1da8a235-923c-4d64-a78a-ee3ef677d15d-service-ca-bundle") pod "router-default-59745ff96d-mll2m" (UID: "1da8a235-923c-4d64-a78a-ee3ef677d15d") : configmap references non-existent config key: service-ca.crt
Apr 20 20:07:14.754374 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:14.754290 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 20:07:14.754374 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:14.754341 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1da8a235-923c-4d64-a78a-ee3ef677d15d-metrics-certs podName:1da8a235-923c-4d64-a78a-ee3ef677d15d nodeName:}" failed. No retries permitted until 2026-04-20 20:07:16.754327347 +0000 UTC m=+119.169620079 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1da8a235-923c-4d64-a78a-ee3ef677d15d-metrics-certs") pod "router-default-59745ff96d-mll2m" (UID: "1da8a235-923c-4d64-a78a-ee3ef677d15d") : secret "router-metrics-certs-default" not found
Apr 20 20:07:15.056740 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:15.056702 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dfd612b2-5ec7-4a68-9cd6-29a94ae37e78-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qwgxr\" (UID: \"dfd612b2-5ec7-4a68-9cd6-29a94ae37e78\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qwgxr"
Apr 20 20:07:15.056942 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:15.056764 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/14f2d422-2dc1-4c64-8da4-e881d350c667-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qv9dl\" (UID: \"14f2d422-2dc1-4c64-8da4-e881d350c667\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qv9dl"
Apr 20 20:07:15.056942 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:15.056867 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 20:07:15.056942 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:15.056918 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14f2d422-2dc1-4c64-8da4-e881d350c667-networking-console-plugin-cert podName:14f2d422-2dc1-4c64-8da4-e881d350c667 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:17.056905255 +0000 UTC m=+119.472197974 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/14f2d422-2dc1-4c64-8da4-e881d350c667-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-qv9dl" (UID: "14f2d422-2dc1-4c64-8da4-e881d350c667") : secret "networking-console-plugin-cert" not found Apr 20 20:07:15.057120 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:15.056865 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 20:07:15.057120 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:15.057002 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfd612b2-5ec7-4a68-9cd6-29a94ae37e78-samples-operator-tls podName:dfd612b2-5ec7-4a68-9cd6-29a94ae37e78 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:17.056987232 +0000 UTC m=+119.472279972 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/dfd612b2-5ec7-4a68-9cd6-29a94ae37e78-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-qwgxr" (UID: "dfd612b2-5ec7-4a68-9cd6-29a94ae37e78") : secret "samples-operator-tls" not found Apr 20 20:07:16.507171 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:16.507080 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mpjng" event={"ID":"06eaab54-dbb4-4d67-8fc9-d22d03b1e5a8","Type":"ContainerStarted","Data":"60eb11c9c90683d16f71df52ee3f36b9010a8d3b5363098677ca552ce21ed3c2"} Apr 20 20:07:16.508504 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:16.508482 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zl54z" 
event={"ID":"51239262-468e-4240-9144-dfb1b3010a21","Type":"ContainerStarted","Data":"3dcb9190fc65a8d0623b061dd317350c10d51bb1555acb959133f4d7e977bc36"} Apr 20 20:07:16.523245 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:16.523196 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mpjng" podStartSLOduration=0.859336581 podStartE2EDuration="3.523181668s" podCreationTimestamp="2026-04-20 20:07:13 +0000 UTC" firstStartedPulling="2026-04-20 20:07:13.501915422 +0000 UTC m=+115.917208145" lastFinishedPulling="2026-04-20 20:07:16.165760497 +0000 UTC m=+118.581053232" observedRunningTime="2026-04-20 20:07:16.521847244 +0000 UTC m=+118.937139987" watchObservedRunningTime="2026-04-20 20:07:16.523181668 +0000 UTC m=+118.938474411" Apr 20 20:07:16.536839 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:16.536798 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zl54z" podStartSLOduration=0.960506573 podStartE2EDuration="3.536787741s" podCreationTimestamp="2026-04-20 20:07:13 +0000 UTC" firstStartedPulling="2026-04-20 20:07:13.590804513 +0000 UTC m=+116.006097232" lastFinishedPulling="2026-04-20 20:07:16.167085664 +0000 UTC m=+118.582378400" observedRunningTime="2026-04-20 20:07:16.536298659 +0000 UTC m=+118.951591402" watchObservedRunningTime="2026-04-20 20:07:16.536787741 +0000 UTC m=+118.952080484" Apr 20 20:07:16.773229 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:16.773141 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1da8a235-923c-4d64-a78a-ee3ef677d15d-metrics-certs\") pod \"router-default-59745ff96d-mll2m\" (UID: \"1da8a235-923c-4d64-a78a-ee3ef677d15d\") " pod="openshift-ingress/router-default-59745ff96d-mll2m" Apr 20 20:07:16.773229 ip-10-0-130-227 
kubenswrapper[2573]: I0420 20:07:16.773217 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1da8a235-923c-4d64-a78a-ee3ef677d15d-service-ca-bundle\") pod \"router-default-59745ff96d-mll2m\" (UID: \"1da8a235-923c-4d64-a78a-ee3ef677d15d\") " pod="openshift-ingress/router-default-59745ff96d-mll2m" Apr 20 20:07:16.773428 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:16.773269 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 20:07:16.773428 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:16.773338 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1da8a235-923c-4d64-a78a-ee3ef677d15d-metrics-certs podName:1da8a235-923c-4d64-a78a-ee3ef677d15d nodeName:}" failed. No retries permitted until 2026-04-20 20:07:20.773319299 +0000 UTC m=+123.188612023 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1da8a235-923c-4d64-a78a-ee3ef677d15d-metrics-certs") pod "router-default-59745ff96d-mll2m" (UID: "1da8a235-923c-4d64-a78a-ee3ef677d15d") : secret "router-metrics-certs-default" not found Apr 20 20:07:16.773428 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:16.773398 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1da8a235-923c-4d64-a78a-ee3ef677d15d-service-ca-bundle podName:1da8a235-923c-4d64-a78a-ee3ef677d15d nodeName:}" failed. No retries permitted until 2026-04-20 20:07:20.773377362 +0000 UTC m=+123.188670084 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1da8a235-923c-4d64-a78a-ee3ef677d15d-service-ca-bundle") pod "router-default-59745ff96d-mll2m" (UID: "1da8a235-923c-4d64-a78a-ee3ef677d15d") : configmap references non-existent config key: service-ca.crt Apr 20 20:07:17.075514 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:17.075477 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dfd612b2-5ec7-4a68-9cd6-29a94ae37e78-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qwgxr\" (UID: \"dfd612b2-5ec7-4a68-9cd6-29a94ae37e78\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qwgxr" Apr 20 20:07:17.075685 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:17.075544 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/14f2d422-2dc1-4c64-8da4-e881d350c667-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qv9dl\" (UID: \"14f2d422-2dc1-4c64-8da4-e881d350c667\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qv9dl" Apr 20 20:07:17.075685 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:17.075624 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 20:07:17.075685 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:17.075679 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 20:07:17.075828 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:17.075689 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfd612b2-5ec7-4a68-9cd6-29a94ae37e78-samples-operator-tls podName:dfd612b2-5ec7-4a68-9cd6-29a94ae37e78 nodeName:}" failed. 
No retries permitted until 2026-04-20 20:07:21.075675399 +0000 UTC m=+123.490968119 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/dfd612b2-5ec7-4a68-9cd6-29a94ae37e78-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-qwgxr" (UID: "dfd612b2-5ec7-4a68-9cd6-29a94ae37e78") : secret "samples-operator-tls" not found Apr 20 20:07:17.075828 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:17.075722 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14f2d422-2dc1-4c64-8da4-e881d350c667-networking-console-plugin-cert podName:14f2d422-2dc1-4c64-8da4-e881d350c667 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:21.075710656 +0000 UTC m=+123.491003376 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/14f2d422-2dc1-4c64-8da4-e881d350c667-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-qv9dl" (UID: "14f2d422-2dc1-4c64-8da4-e881d350c667") : secret "networking-console-plugin-cert" not found Apr 20 20:07:19.220326 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:19.220298 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-pgbf5_d645d5ae-1405-421d-8f27-65e056976e28/dns-node-resolver/0.log" Apr 20 20:07:20.019591 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:20.019568 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-knsf9_7f6b4fe9-415b-4c1d-91f4-70456b92ec7e/node-ca/0.log" Apr 20 20:07:20.805238 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:20.805201 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1da8a235-923c-4d64-a78a-ee3ef677d15d-service-ca-bundle\") pod \"router-default-59745ff96d-mll2m\" (UID: \"1da8a235-923c-4d64-a78a-ee3ef677d15d\") " 
pod="openshift-ingress/router-default-59745ff96d-mll2m" Apr 20 20:07:20.805596 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:20.805290 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1da8a235-923c-4d64-a78a-ee3ef677d15d-metrics-certs\") pod \"router-default-59745ff96d-mll2m\" (UID: \"1da8a235-923c-4d64-a78a-ee3ef677d15d\") " pod="openshift-ingress/router-default-59745ff96d-mll2m" Apr 20 20:07:20.805596 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:20.805360 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1da8a235-923c-4d64-a78a-ee3ef677d15d-service-ca-bundle podName:1da8a235-923c-4d64-a78a-ee3ef677d15d nodeName:}" failed. No retries permitted until 2026-04-20 20:07:28.805341902 +0000 UTC m=+131.220634623 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1da8a235-923c-4d64-a78a-ee3ef677d15d-service-ca-bundle") pod "router-default-59745ff96d-mll2m" (UID: "1da8a235-923c-4d64-a78a-ee3ef677d15d") : configmap references non-existent config key: service-ca.crt Apr 20 20:07:20.805596 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:20.805384 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 20:07:20.805596 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:20.805431 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1da8a235-923c-4d64-a78a-ee3ef677d15d-metrics-certs podName:1da8a235-923c-4d64-a78a-ee3ef677d15d nodeName:}" failed. No retries permitted until 2026-04-20 20:07:28.805418372 +0000 UTC m=+131.220711092 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1da8a235-923c-4d64-a78a-ee3ef677d15d-metrics-certs") pod "router-default-59745ff96d-mll2m" (UID: "1da8a235-923c-4d64-a78a-ee3ef677d15d") : secret "router-metrics-certs-default" not found Apr 20 20:07:21.107290 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:21.107201 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dfd612b2-5ec7-4a68-9cd6-29a94ae37e78-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qwgxr\" (UID: \"dfd612b2-5ec7-4a68-9cd6-29a94ae37e78\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qwgxr" Apr 20 20:07:21.107290 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:21.107263 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/14f2d422-2dc1-4c64-8da4-e881d350c667-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qv9dl\" (UID: \"14f2d422-2dc1-4c64-8da4-e881d350c667\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qv9dl" Apr 20 20:07:21.107470 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:21.107359 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 20:07:21.107470 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:21.107444 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14f2d422-2dc1-4c64-8da4-e881d350c667-networking-console-plugin-cert podName:14f2d422-2dc1-4c64-8da4-e881d350c667 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:29.107426725 +0000 UTC m=+131.522719448 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/14f2d422-2dc1-4c64-8da4-e881d350c667-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-qv9dl" (UID: "14f2d422-2dc1-4c64-8da4-e881d350c667") : secret "networking-console-plugin-cert" not found Apr 20 20:07:21.107470 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:21.107359 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 20:07:21.107567 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:21.107495 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfd612b2-5ec7-4a68-9cd6-29a94ae37e78-samples-operator-tls podName:dfd612b2-5ec7-4a68-9cd6-29a94ae37e78 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:29.107484471 +0000 UTC m=+131.522777201 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/dfd612b2-5ec7-4a68-9cd6-29a94ae37e78-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-qwgxr" (UID: "dfd612b2-5ec7-4a68-9cd6-29a94ae37e78") : secret "samples-operator-tls" not found Apr 20 20:07:27.859037 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:27.858994 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d-metrics-certs\") pod \"network-metrics-daemon-npkgv\" (UID: \"923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d\") " pod="openshift-multus/network-metrics-daemon-npkgv" Apr 20 20:07:27.859402 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:27.859137 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 20:07:27.859402 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:27.859206 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d-metrics-certs podName:923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d nodeName:}" failed. No retries permitted until 2026-04-20 20:09:29.859188792 +0000 UTC m=+252.274481512 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d-metrics-certs") pod "network-metrics-daemon-npkgv" (UID: "923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d") : secret "metrics-daemon-secret" not found Apr 20 20:07:28.867716 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:28.867685 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1da8a235-923c-4d64-a78a-ee3ef677d15d-service-ca-bundle\") pod \"router-default-59745ff96d-mll2m\" (UID: \"1da8a235-923c-4d64-a78a-ee3ef677d15d\") " pod="openshift-ingress/router-default-59745ff96d-mll2m" Apr 20 20:07:28.868128 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:28.867800 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1da8a235-923c-4d64-a78a-ee3ef677d15d-metrics-certs\") pod \"router-default-59745ff96d-mll2m\" (UID: \"1da8a235-923c-4d64-a78a-ee3ef677d15d\") " pod="openshift-ingress/router-default-59745ff96d-mll2m" Apr 20 20:07:28.868128 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:28.867878 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1da8a235-923c-4d64-a78a-ee3ef677d15d-service-ca-bundle podName:1da8a235-923c-4d64-a78a-ee3ef677d15d nodeName:}" failed. No retries permitted until 2026-04-20 20:07:44.867840973 +0000 UTC m=+147.283133692 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/1da8a235-923c-4d64-a78a-ee3ef677d15d-service-ca-bundle") pod "router-default-59745ff96d-mll2m" (UID: "1da8a235-923c-4d64-a78a-ee3ef677d15d") : configmap references non-existent config key: service-ca.crt Apr 20 20:07:28.868128 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:28.867931 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 20:07:28.868128 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:28.867989 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1da8a235-923c-4d64-a78a-ee3ef677d15d-metrics-certs podName:1da8a235-923c-4d64-a78a-ee3ef677d15d nodeName:}" failed. No retries permitted until 2026-04-20 20:07:44.86797463 +0000 UTC m=+147.283267350 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1da8a235-923c-4d64-a78a-ee3ef677d15d-metrics-certs") pod "router-default-59745ff96d-mll2m" (UID: "1da8a235-923c-4d64-a78a-ee3ef677d15d") : secret "router-metrics-certs-default" not found Apr 20 20:07:29.170164 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:29.170076 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/14f2d422-2dc1-4c64-8da4-e881d350c667-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qv9dl\" (UID: \"14f2d422-2dc1-4c64-8da4-e881d350c667\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qv9dl" Apr 20 20:07:29.170311 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:29.170170 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dfd612b2-5ec7-4a68-9cd6-29a94ae37e78-samples-operator-tls\") pod 
\"cluster-samples-operator-6dc5bdb6b4-qwgxr\" (UID: \"dfd612b2-5ec7-4a68-9cd6-29a94ae37e78\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qwgxr" Apr 20 20:07:29.170311 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:29.170201 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 20:07:29.170311 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:29.170259 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14f2d422-2dc1-4c64-8da4-e881d350c667-networking-console-plugin-cert podName:14f2d422-2dc1-4c64-8da4-e881d350c667 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:45.170241911 +0000 UTC m=+147.585534636 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/14f2d422-2dc1-4c64-8da4-e881d350c667-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-qv9dl" (UID: "14f2d422-2dc1-4c64-8da4-e881d350c667") : secret "networking-console-plugin-cert" not found Apr 20 20:07:29.172506 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:29.172479 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/dfd612b2-5ec7-4a68-9cd6-29a94ae37e78-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-qwgxr\" (UID: \"dfd612b2-5ec7-4a68-9cd6-29a94ae37e78\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qwgxr" Apr 20 20:07:29.311304 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:29.311282 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-bd69k\"" Apr 20 20:07:29.319504 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:29.319483 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qwgxr" Apr 20 20:07:29.430400 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:29.430341 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qwgxr"] Apr 20 20:07:29.539116 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:29.539087 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qwgxr" event={"ID":"dfd612b2-5ec7-4a68-9cd6-29a94ae37e78","Type":"ContainerStarted","Data":"e261fddab1b01e6f280fa5597357e7a7d5cd98ba81a8f2f0a1ffc6c9e04a1c66"} Apr 20 20:07:31.545675 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:31.545634 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qwgxr" event={"ID":"dfd612b2-5ec7-4a68-9cd6-29a94ae37e78","Type":"ContainerStarted","Data":"1519d742fb5d7f40b636614354dd2c6bac08ebcb5ac0f768f9f969675e8c3f4f"} Apr 20 20:07:31.545675 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:31.545676 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qwgxr" event={"ID":"dfd612b2-5ec7-4a68-9cd6-29a94ae37e78","Type":"ContainerStarted","Data":"6c6cf8b3d60c66e2e829a7a2958faa19b6ee87b3eac55b4491cdca7383fa5705"} Apr 20 20:07:31.561970 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:31.561923 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-qwgxr" podStartSLOduration=17.036139124 podStartE2EDuration="18.561911063s" podCreationTimestamp="2026-04-20 20:07:13 +0000 UTC" firstStartedPulling="2026-04-20 20:07:29.472174597 +0000 UTC m=+131.887467317" lastFinishedPulling="2026-04-20 20:07:30.997946524 +0000 UTC m=+133.413239256" observedRunningTime="2026-04-20 
20:07:31.561283385 +0000 UTC m=+133.976576137" watchObservedRunningTime="2026-04-20 20:07:31.561911063 +0000 UTC m=+133.977203804" Apr 20 20:07:41.648166 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.648131 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-zzlvj"] Apr 20 20:07:41.650864 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.650837 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-zzlvj" Apr 20 20:07:41.653335 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.653311 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6754dfd45f-tfgml"] Apr 20 20:07:41.653611 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.653582 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 20:07:41.653701 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.653584 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 20:07:41.653701 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.653591 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 20:07:41.653798 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.653722 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 20:07:41.654606 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.654586 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-5f2m5\"" Apr 20 20:07:41.655349 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.655333 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6754dfd45f-tfgml" Apr 20 20:07:41.657732 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.657712 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 20:07:41.657830 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.657772 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 20:07:41.658172 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.657896 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-j7xv8\"" Apr 20 20:07:41.658172 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.658064 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 20:07:41.664722 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.664701 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-zzlvj"] Apr 20 20:07:41.666045 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.666027 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 20:07:41.670067 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.670047 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6754dfd45f-tfgml"] Apr 20 20:07:41.767360 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.767322 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4cc09a9f-b574-4629-92a5-1121adb73396-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-zzlvj\" (UID: \"4cc09a9f-b574-4629-92a5-1121adb73396\") " 
pod="openshift-insights/insights-runtime-extractor-zzlvj" Apr 20 20:07:41.767360 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.767356 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/84920716-a237-4371-8e3b-ecc46291eb90-registry-tls\") pod \"image-registry-6754dfd45f-tfgml\" (UID: \"84920716-a237-4371-8e3b-ecc46291eb90\") " pod="openshift-image-registry/image-registry-6754dfd45f-tfgml" Apr 20 20:07:41.767569 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.767384 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84920716-a237-4371-8e3b-ecc46291eb90-trusted-ca\") pod \"image-registry-6754dfd45f-tfgml\" (UID: \"84920716-a237-4371-8e3b-ecc46291eb90\") " pod="openshift-image-registry/image-registry-6754dfd45f-tfgml" Apr 20 20:07:41.767569 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.767468 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4cc09a9f-b574-4629-92a5-1121adb73396-crio-socket\") pod \"insights-runtime-extractor-zzlvj\" (UID: \"4cc09a9f-b574-4629-92a5-1121adb73396\") " pod="openshift-insights/insights-runtime-extractor-zzlvj" Apr 20 20:07:41.767569 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.767513 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/84920716-a237-4371-8e3b-ecc46291eb90-installation-pull-secrets\") pod \"image-registry-6754dfd45f-tfgml\" (UID: \"84920716-a237-4371-8e3b-ecc46291eb90\") " pod="openshift-image-registry/image-registry-6754dfd45f-tfgml" Apr 20 20:07:41.767674 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.767580 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/84920716-a237-4371-8e3b-ecc46291eb90-bound-sa-token\") pod \"image-registry-6754dfd45f-tfgml\" (UID: \"84920716-a237-4371-8e3b-ecc46291eb90\") " pod="openshift-image-registry/image-registry-6754dfd45f-tfgml" Apr 20 20:07:41.767674 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.767597 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4cc09a9f-b574-4629-92a5-1121adb73396-data-volume\") pod \"insights-runtime-extractor-zzlvj\" (UID: \"4cc09a9f-b574-4629-92a5-1121adb73396\") " pod="openshift-insights/insights-runtime-extractor-zzlvj" Apr 20 20:07:41.767674 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.767639 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw564\" (UniqueName: \"kubernetes.io/projected/84920716-a237-4371-8e3b-ecc46291eb90-kube-api-access-bw564\") pod \"image-registry-6754dfd45f-tfgml\" (UID: \"84920716-a237-4371-8e3b-ecc46291eb90\") " pod="openshift-image-registry/image-registry-6754dfd45f-tfgml" Apr 20 20:07:41.767674 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.767659 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkfc2\" (UniqueName: \"kubernetes.io/projected/4cc09a9f-b574-4629-92a5-1121adb73396-kube-api-access-kkfc2\") pod \"insights-runtime-extractor-zzlvj\" (UID: \"4cc09a9f-b574-4629-92a5-1121adb73396\") " pod="openshift-insights/insights-runtime-extractor-zzlvj" Apr 20 20:07:41.767808 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.767684 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/84920716-a237-4371-8e3b-ecc46291eb90-registry-certificates\") pod \"image-registry-6754dfd45f-tfgml\" (UID: \"84920716-a237-4371-8e3b-ecc46291eb90\") " pod="openshift-image-registry/image-registry-6754dfd45f-tfgml" Apr 20 20:07:41.767808 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.767760 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/84920716-a237-4371-8e3b-ecc46291eb90-ca-trust-extracted\") pod \"image-registry-6754dfd45f-tfgml\" (UID: \"84920716-a237-4371-8e3b-ecc46291eb90\") " pod="openshift-image-registry/image-registry-6754dfd45f-tfgml" Apr 20 20:07:41.767808 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.767798 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/84920716-a237-4371-8e3b-ecc46291eb90-image-registry-private-configuration\") pod \"image-registry-6754dfd45f-tfgml\" (UID: \"84920716-a237-4371-8e3b-ecc46291eb90\") " pod="openshift-image-registry/image-registry-6754dfd45f-tfgml" Apr 20 20:07:41.767949 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.767818 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4cc09a9f-b574-4629-92a5-1121adb73396-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-zzlvj\" (UID: \"4cc09a9f-b574-4629-92a5-1121adb73396\") " pod="openshift-insights/insights-runtime-extractor-zzlvj" Apr 20 20:07:41.868345 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.868315 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/84920716-a237-4371-8e3b-ecc46291eb90-image-registry-private-configuration\") pod \"image-registry-6754dfd45f-tfgml\" 
(UID: \"84920716-a237-4371-8e3b-ecc46291eb90\") " pod="openshift-image-registry/image-registry-6754dfd45f-tfgml" Apr 20 20:07:41.868345 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.868346 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4cc09a9f-b574-4629-92a5-1121adb73396-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-zzlvj\" (UID: \"4cc09a9f-b574-4629-92a5-1121adb73396\") " pod="openshift-insights/insights-runtime-extractor-zzlvj" Apr 20 20:07:41.868540 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.868366 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4cc09a9f-b574-4629-92a5-1121adb73396-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-zzlvj\" (UID: \"4cc09a9f-b574-4629-92a5-1121adb73396\") " pod="openshift-insights/insights-runtime-extractor-zzlvj" Apr 20 20:07:41.868540 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.868490 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/84920716-a237-4371-8e3b-ecc46291eb90-registry-tls\") pod \"image-registry-6754dfd45f-tfgml\" (UID: \"84920716-a237-4371-8e3b-ecc46291eb90\") " pod="openshift-image-registry/image-registry-6754dfd45f-tfgml" Apr 20 20:07:41.868540 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.868527 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84920716-a237-4371-8e3b-ecc46291eb90-trusted-ca\") pod \"image-registry-6754dfd45f-tfgml\" (UID: \"84920716-a237-4371-8e3b-ecc46291eb90\") " pod="openshift-image-registry/image-registry-6754dfd45f-tfgml" Apr 20 20:07:41.868653 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.868557 2573 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4cc09a9f-b574-4629-92a5-1121adb73396-crio-socket\") pod \"insights-runtime-extractor-zzlvj\" (UID: \"4cc09a9f-b574-4629-92a5-1121adb73396\") " pod="openshift-insights/insights-runtime-extractor-zzlvj" Apr 20 20:07:41.868653 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.868605 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/84920716-a237-4371-8e3b-ecc46291eb90-installation-pull-secrets\") pod \"image-registry-6754dfd45f-tfgml\" (UID: \"84920716-a237-4371-8e3b-ecc46291eb90\") " pod="openshift-image-registry/image-registry-6754dfd45f-tfgml" Apr 20 20:07:41.868653 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.868642 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/84920716-a237-4371-8e3b-ecc46291eb90-bound-sa-token\") pod \"image-registry-6754dfd45f-tfgml\" (UID: \"84920716-a237-4371-8e3b-ecc46291eb90\") " pod="openshift-image-registry/image-registry-6754dfd45f-tfgml" Apr 20 20:07:41.868768 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.868665 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4cc09a9f-b574-4629-92a5-1121adb73396-data-volume\") pod \"insights-runtime-extractor-zzlvj\" (UID: \"4cc09a9f-b574-4629-92a5-1121adb73396\") " pod="openshift-insights/insights-runtime-extractor-zzlvj" Apr 20 20:07:41.868768 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.868705 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bw564\" (UniqueName: \"kubernetes.io/projected/84920716-a237-4371-8e3b-ecc46291eb90-kube-api-access-bw564\") pod \"image-registry-6754dfd45f-tfgml\" (UID: \"84920716-a237-4371-8e3b-ecc46291eb90\") " 
pod="openshift-image-registry/image-registry-6754dfd45f-tfgml" Apr 20 20:07:41.868768 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.868717 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4cc09a9f-b574-4629-92a5-1121adb73396-crio-socket\") pod \"insights-runtime-extractor-zzlvj\" (UID: \"4cc09a9f-b574-4629-92a5-1121adb73396\") " pod="openshift-insights/insights-runtime-extractor-zzlvj" Apr 20 20:07:41.868768 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.868733 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kkfc2\" (UniqueName: \"kubernetes.io/projected/4cc09a9f-b574-4629-92a5-1121adb73396-kube-api-access-kkfc2\") pod \"insights-runtime-extractor-zzlvj\" (UID: \"4cc09a9f-b574-4629-92a5-1121adb73396\") " pod="openshift-insights/insights-runtime-extractor-zzlvj" Apr 20 20:07:41.868768 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.868760 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/84920716-a237-4371-8e3b-ecc46291eb90-registry-certificates\") pod \"image-registry-6754dfd45f-tfgml\" (UID: \"84920716-a237-4371-8e3b-ecc46291eb90\") " pod="openshift-image-registry/image-registry-6754dfd45f-tfgml" Apr 20 20:07:41.869052 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.868843 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/84920716-a237-4371-8e3b-ecc46291eb90-ca-trust-extracted\") pod \"image-registry-6754dfd45f-tfgml\" (UID: \"84920716-a237-4371-8e3b-ecc46291eb90\") " pod="openshift-image-registry/image-registry-6754dfd45f-tfgml" Apr 20 20:07:41.869285 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.869260 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/84920716-a237-4371-8e3b-ecc46291eb90-ca-trust-extracted\") pod \"image-registry-6754dfd45f-tfgml\" (UID: \"84920716-a237-4371-8e3b-ecc46291eb90\") " pod="openshift-image-registry/image-registry-6754dfd45f-tfgml" Apr 20 20:07:41.869717 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.869679 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4cc09a9f-b574-4629-92a5-1121adb73396-data-volume\") pod \"insights-runtime-extractor-zzlvj\" (UID: \"4cc09a9f-b574-4629-92a5-1121adb73396\") " pod="openshift-insights/insights-runtime-extractor-zzlvj" Apr 20 20:07:41.869823 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.869800 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84920716-a237-4371-8e3b-ecc46291eb90-trusted-ca\") pod \"image-registry-6754dfd45f-tfgml\" (UID: \"84920716-a237-4371-8e3b-ecc46291eb90\") " pod="openshift-image-registry/image-registry-6754dfd45f-tfgml" Apr 20 20:07:41.870010 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.869990 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/84920716-a237-4371-8e3b-ecc46291eb90-registry-certificates\") pod \"image-registry-6754dfd45f-tfgml\" (UID: \"84920716-a237-4371-8e3b-ecc46291eb90\") " pod="openshift-image-registry/image-registry-6754dfd45f-tfgml" Apr 20 20:07:41.870068 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.870025 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4cc09a9f-b574-4629-92a5-1121adb73396-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-zzlvj\" (UID: \"4cc09a9f-b574-4629-92a5-1121adb73396\") " pod="openshift-insights/insights-runtime-extractor-zzlvj" Apr 20 20:07:41.871262 ip-10-0-130-227 kubenswrapper[2573]: I0420 
20:07:41.871234 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/84920716-a237-4371-8e3b-ecc46291eb90-installation-pull-secrets\") pod \"image-registry-6754dfd45f-tfgml\" (UID: \"84920716-a237-4371-8e3b-ecc46291eb90\") " pod="openshift-image-registry/image-registry-6754dfd45f-tfgml" Apr 20 20:07:41.871262 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.871241 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/84920716-a237-4371-8e3b-ecc46291eb90-image-registry-private-configuration\") pod \"image-registry-6754dfd45f-tfgml\" (UID: \"84920716-a237-4371-8e3b-ecc46291eb90\") " pod="openshift-image-registry/image-registry-6754dfd45f-tfgml" Apr 20 20:07:41.871381 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.871309 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4cc09a9f-b574-4629-92a5-1121adb73396-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-zzlvj\" (UID: \"4cc09a9f-b574-4629-92a5-1121adb73396\") " pod="openshift-insights/insights-runtime-extractor-zzlvj" Apr 20 20:07:41.871441 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.871423 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/84920716-a237-4371-8e3b-ecc46291eb90-registry-tls\") pod \"image-registry-6754dfd45f-tfgml\" (UID: \"84920716-a237-4371-8e3b-ecc46291eb90\") " pod="openshift-image-registry/image-registry-6754dfd45f-tfgml" Apr 20 20:07:41.877121 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.877083 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/84920716-a237-4371-8e3b-ecc46291eb90-bound-sa-token\") pod 
\"image-registry-6754dfd45f-tfgml\" (UID: \"84920716-a237-4371-8e3b-ecc46291eb90\") " pod="openshift-image-registry/image-registry-6754dfd45f-tfgml" Apr 20 20:07:41.877301 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.877279 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw564\" (UniqueName: \"kubernetes.io/projected/84920716-a237-4371-8e3b-ecc46291eb90-kube-api-access-bw564\") pod \"image-registry-6754dfd45f-tfgml\" (UID: \"84920716-a237-4371-8e3b-ecc46291eb90\") " pod="openshift-image-registry/image-registry-6754dfd45f-tfgml" Apr 20 20:07:41.877412 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.877396 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkfc2\" (UniqueName: \"kubernetes.io/projected/4cc09a9f-b574-4629-92a5-1121adb73396-kube-api-access-kkfc2\") pod \"insights-runtime-extractor-zzlvj\" (UID: \"4cc09a9f-b574-4629-92a5-1121adb73396\") " pod="openshift-insights/insights-runtime-extractor-zzlvj" Apr 20 20:07:41.961551 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.961488 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-zzlvj" Apr 20 20:07:41.968289 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:41.968256 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6754dfd45f-tfgml" Apr 20 20:07:42.087487 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:42.087454 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-zzlvj"] Apr 20 20:07:42.091334 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:07:42.091306 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cc09a9f_b574_4629_92a5_1121adb73396.slice/crio-7a234b28b1b60e23f208e206493569484e47872af6e917809dcda8b8f0f430bc WatchSource:0}: Error finding container 7a234b28b1b60e23f208e206493569484e47872af6e917809dcda8b8f0f430bc: Status 404 returned error can't find the container with id 7a234b28b1b60e23f208e206493569484e47872af6e917809dcda8b8f0f430bc Apr 20 20:07:42.105209 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:42.105188 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6754dfd45f-tfgml"] Apr 20 20:07:42.107755 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:07:42.107734 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84920716_a237_4371_8e3b_ecc46291eb90.slice/crio-579f9a618f37825b8ecc76eca50c7e442718f015143096da0b04981a7844a460 WatchSource:0}: Error finding container 579f9a618f37825b8ecc76eca50c7e442718f015143096da0b04981a7844a460: Status 404 returned error can't find the container with id 579f9a618f37825b8ecc76eca50c7e442718f015143096da0b04981a7844a460 Apr 20 20:07:42.570746 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:42.570711 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6754dfd45f-tfgml" event={"ID":"84920716-a237-4371-8e3b-ecc46291eb90","Type":"ContainerStarted","Data":"506c41d4cdceb4905fc6f6018f6767e530a7d2193092c232ffe722af6f1607e9"} Apr 20 20:07:42.570746 ip-10-0-130-227 
kubenswrapper[2573]: I0420 20:07:42.570747 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6754dfd45f-tfgml" event={"ID":"84920716-a237-4371-8e3b-ecc46291eb90","Type":"ContainerStarted","Data":"579f9a618f37825b8ecc76eca50c7e442718f015143096da0b04981a7844a460"} Apr 20 20:07:42.571001 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:42.570792 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6754dfd45f-tfgml" Apr 20 20:07:42.571886 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:42.571841 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zzlvj" event={"ID":"4cc09a9f-b574-4629-92a5-1121adb73396","Type":"ContainerStarted","Data":"6678ed7540ff54cb0cb9ac0c4306abff94859d4c0cfd1ccef2fd34c019f80c52"} Apr 20 20:07:42.571985 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:42.571889 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zzlvj" event={"ID":"4cc09a9f-b574-4629-92a5-1121adb73396","Type":"ContainerStarted","Data":"7a234b28b1b60e23f208e206493569484e47872af6e917809dcda8b8f0f430bc"} Apr 20 20:07:42.589574 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:42.589386 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6754dfd45f-tfgml" podStartSLOduration=1.589370102 podStartE2EDuration="1.589370102s" podCreationTimestamp="2026-04-20 20:07:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:07:42.589193901 +0000 UTC m=+145.004486645" watchObservedRunningTime="2026-04-20 20:07:42.589370102 +0000 UTC m=+145.004662845" Apr 20 20:07:43.576139 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:43.576103 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-zzlvj" event={"ID":"4cc09a9f-b574-4629-92a5-1121adb73396","Type":"ContainerStarted","Data":"e8e6ded3c164d6f5c1746e9e8b74cce9dc0717eaf32b6050c951df712c3099ed"} Apr 20 20:07:44.580399 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:44.580325 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-zzlvj" event={"ID":"4cc09a9f-b574-4629-92a5-1121adb73396","Type":"ContainerStarted","Data":"fa0497ca9e697a27805fb545b3e848159678b1e84302b092bc27da1630238a76"} Apr 20 20:07:44.597342 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:44.597293 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-zzlvj" podStartSLOduration=1.456155721 podStartE2EDuration="3.597278821s" podCreationTimestamp="2026-04-20 20:07:41 +0000 UTC" firstStartedPulling="2026-04-20 20:07:42.157125155 +0000 UTC m=+144.572417874" lastFinishedPulling="2026-04-20 20:07:44.298248249 +0000 UTC m=+146.713540974" observedRunningTime="2026-04-20 20:07:44.596997541 +0000 UTC m=+147.012290283" watchObservedRunningTime="2026-04-20 20:07:44.597278821 +0000 UTC m=+147.012571563" Apr 20 20:07:44.894547 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:44.894478 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1da8a235-923c-4d64-a78a-ee3ef677d15d-service-ca-bundle\") pod \"router-default-59745ff96d-mll2m\" (UID: \"1da8a235-923c-4d64-a78a-ee3ef677d15d\") " pod="openshift-ingress/router-default-59745ff96d-mll2m" Apr 20 20:07:44.894547 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:44.894532 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1da8a235-923c-4d64-a78a-ee3ef677d15d-metrics-certs\") pod \"router-default-59745ff96d-mll2m\" (UID: \"1da8a235-923c-4d64-a78a-ee3ef677d15d\") " 
pod="openshift-ingress/router-default-59745ff96d-mll2m" Apr 20 20:07:44.895140 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:44.895121 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1da8a235-923c-4d64-a78a-ee3ef677d15d-service-ca-bundle\") pod \"router-default-59745ff96d-mll2m\" (UID: \"1da8a235-923c-4d64-a78a-ee3ef677d15d\") " pod="openshift-ingress/router-default-59745ff96d-mll2m" Apr 20 20:07:44.896724 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:44.896706 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1da8a235-923c-4d64-a78a-ee3ef677d15d-metrics-certs\") pod \"router-default-59745ff96d-mll2m\" (UID: \"1da8a235-923c-4d64-a78a-ee3ef677d15d\") " pod="openshift-ingress/router-default-59745ff96d-mll2m" Apr 20 20:07:45.187336 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:45.187272 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-89zhh\"" Apr 20 20:07:45.195519 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:45.195494 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-59745ff96d-mll2m" Apr 20 20:07:45.197322 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:45.197293 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/14f2d422-2dc1-4c64-8da4-e881d350c667-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qv9dl\" (UID: \"14f2d422-2dc1-4c64-8da4-e881d350c667\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qv9dl" Apr 20 20:07:45.199581 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:45.199562 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/14f2d422-2dc1-4c64-8da4-e881d350c667-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qv9dl\" (UID: \"14f2d422-2dc1-4c64-8da4-e881d350c667\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qv9dl" Apr 20 20:07:45.206034 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:45.206013 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-cz5k5\"" Apr 20 20:07:45.214207 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:45.214189 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qv9dl" Apr 20 20:07:45.319751 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:45.319725 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-59745ff96d-mll2m"] Apr 20 20:07:45.323058 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:07:45.323006 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1da8a235_923c_4d64_a78a_ee3ef677d15d.slice/crio-42021aa828a5649185f41668d11d3924dc2b95dec093993e101cf79de782a9c2 WatchSource:0}: Error finding container 42021aa828a5649185f41668d11d3924dc2b95dec093993e101cf79de782a9c2: Status 404 returned error can't find the container with id 42021aa828a5649185f41668d11d3924dc2b95dec093993e101cf79de782a9c2 Apr 20 20:07:45.335404 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:45.335380 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-qv9dl"] Apr 20 20:07:45.338159 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:07:45.338130 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14f2d422_2dc1_4c64_8da4_e881d350c667.slice/crio-42e66a7f0228a8c50bb93fa7e872e03b5def3137edd782835396b84dfc1e1e31 WatchSource:0}: Error finding container 42e66a7f0228a8c50bb93fa7e872e03b5def3137edd782835396b84dfc1e1e31: Status 404 returned error can't find the container with id 42e66a7f0228a8c50bb93fa7e872e03b5def3137edd782835396b84dfc1e1e31 Apr 20 20:07:45.583472 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:45.583441 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qv9dl" event={"ID":"14f2d422-2dc1-4c64-8da4-e881d350c667","Type":"ContainerStarted","Data":"42e66a7f0228a8c50bb93fa7e872e03b5def3137edd782835396b84dfc1e1e31"} Apr 20 
20:07:45.588045 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:45.586246 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-59745ff96d-mll2m" event={"ID":"1da8a235-923c-4d64-a78a-ee3ef677d15d","Type":"ContainerStarted","Data":"8989bdc42ec3b9019d1116bf4f0593de3ba2ed72fe3853562b0f33b52d32b78a"} Apr 20 20:07:45.588045 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:45.586281 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-59745ff96d-mll2m" event={"ID":"1da8a235-923c-4d64-a78a-ee3ef677d15d","Type":"ContainerStarted","Data":"42021aa828a5649185f41668d11d3924dc2b95dec093993e101cf79de782a9c2"} Apr 20 20:07:45.605014 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:45.604976 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-59745ff96d-mll2m" podStartSLOduration=32.604965042 podStartE2EDuration="32.604965042s" podCreationTimestamp="2026-04-20 20:07:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:07:45.604283295 +0000 UTC m=+148.019576060" watchObservedRunningTime="2026-04-20 20:07:45.604965042 +0000 UTC m=+148.020257783" Apr 20 20:07:46.196259 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:46.196218 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-59745ff96d-mll2m" Apr 20 20:07:46.198967 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:46.198947 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-59745ff96d-mll2m" Apr 20 20:07:46.588578 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:46.588550 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qv9dl" 
event={"ID":"14f2d422-2dc1-4c64-8da4-e881d350c667","Type":"ContainerStarted","Data":"3fef46f631d90101e68f64370b1f44d3603fd87524f6d04256844a8be22ed11a"} Apr 20 20:07:46.588969 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:46.588747 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-59745ff96d-mll2m" Apr 20 20:07:46.590025 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:46.590004 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-59745ff96d-mll2m" Apr 20 20:07:46.625498 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:46.625451 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qv9dl" podStartSLOduration=32.631179334 podStartE2EDuration="33.625437073s" podCreationTimestamp="2026-04-20 20:07:13 +0000 UTC" firstStartedPulling="2026-04-20 20:07:45.340005388 +0000 UTC m=+147.755298112" lastFinishedPulling="2026-04-20 20:07:46.334263131 +0000 UTC m=+148.749555851" observedRunningTime="2026-04-20 20:07:46.605056144 +0000 UTC m=+149.020348917" watchObservedRunningTime="2026-04-20 20:07:46.625437073 +0000 UTC m=+149.040729815" Apr 20 20:07:47.528208 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:47.528176 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rc42b"] Apr 20 20:07:47.532167 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:47.532152 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rc42b"
Apr 20 20:07:47.534644 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:47.534623 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 20 20:07:47.534749 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:47.534737 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-bkvsh\""
Apr 20 20:07:47.538376 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:47.538338 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rc42b"]
Apr 20 20:07:47.716958 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:47.716923 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f056ed27-f10a-4fac-bf71-741c0d02ce55-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-rc42b\" (UID: \"f056ed27-f10a-4fac-bf71-741c0d02ce55\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rc42b"
Apr 20 20:07:47.818214 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:47.818143 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f056ed27-f10a-4fac-bf71-741c0d02ce55-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-rc42b\" (UID: \"f056ed27-f10a-4fac-bf71-741c0d02ce55\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rc42b"
Apr 20 20:07:47.818331 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:47.818265 2573 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Apr 20 20:07:47.818331 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:47.818330 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f056ed27-f10a-4fac-bf71-741c0d02ce55-tls-certificates podName:f056ed27-f10a-4fac-bf71-741c0d02ce55 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:48.318312325 +0000 UTC m=+150.733605063 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/f056ed27-f10a-4fac-bf71-741c0d02ce55-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-rc42b" (UID: "f056ed27-f10a-4fac-bf71-741c0d02ce55") : secret "prometheus-operator-admission-webhook-tls" not found
Apr 20 20:07:48.322701 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:48.322666 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f056ed27-f10a-4fac-bf71-741c0d02ce55-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-rc42b\" (UID: \"f056ed27-f10a-4fac-bf71-741c0d02ce55\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rc42b"
Apr 20 20:07:48.325092 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:48.325060 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f056ed27-f10a-4fac-bf71-741c0d02ce55-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-rc42b\" (UID: \"f056ed27-f10a-4fac-bf71-741c0d02ce55\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rc42b"
Apr 20 20:07:48.441044 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:48.441016 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rc42b"
Apr 20 20:07:48.558939 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:48.558911 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rc42b"]
Apr 20 20:07:48.563796 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:07:48.563766 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf056ed27_f10a_4fac_bf71_741c0d02ce55.slice/crio-52256534abe9cfd40c5bab32932dda51151bfa4c9452acc033b49dbd7c220245 WatchSource:0}: Error finding container 52256534abe9cfd40c5bab32932dda51151bfa4c9452acc033b49dbd7c220245: Status 404 returned error can't find the container with id 52256534abe9cfd40c5bab32932dda51151bfa4c9452acc033b49dbd7c220245
Apr 20 20:07:48.593674 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:48.593648 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rc42b" event={"ID":"f056ed27-f10a-4fac-bf71-741c0d02ce55","Type":"ContainerStarted","Data":"52256534abe9cfd40c5bab32932dda51151bfa4c9452acc033b49dbd7c220245"}
Apr 20 20:07:50.599661 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:50.599622 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rc42b" event={"ID":"f056ed27-f10a-4fac-bf71-741c0d02ce55","Type":"ContainerStarted","Data":"04b2a5230183bca93114687c1c19637cf2c4dccd76b451fac41ba0f520992f1b"}
Apr 20 20:07:50.600095 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:50.599886 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rc42b"
Apr 20 20:07:50.604226 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:50.604202 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rc42b"
Apr 20 20:07:50.615633 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:50.615592 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-rc42b" podStartSLOduration=2.424994091 podStartE2EDuration="3.615581195s" podCreationTimestamp="2026-04-20 20:07:47 +0000 UTC" firstStartedPulling="2026-04-20 20:07:48.565505569 +0000 UTC m=+150.980798290" lastFinishedPulling="2026-04-20 20:07:49.756092674 +0000 UTC m=+152.171385394" observedRunningTime="2026-04-20 20:07:50.614347011 +0000 UTC m=+153.029639753" watchObservedRunningTime="2026-04-20 20:07:50.615581195 +0000 UTC m=+153.030873934"
Apr 20 20:07:53.997393 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:53.997357 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-x7nqw" podUID="ddb92e25-31c0-49bc-9084-b1a08aad3877"
Apr 20 20:07:54.016655 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:54.016623 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-67td9" podUID="9104f378-d15a-480e-aae0-cb20f3c35f2c"
Apr 20 20:07:54.190245 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:54.190215 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-npkgv" podUID="923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d"
Apr 20 20:07:54.612376 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:54.612345 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-x7nqw"
Apr 20 20:07:56.958152 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:56.958073 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-djt8k"]
Apr 20 20:07:56.961426 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:56.961402 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-djt8k"
Apr 20 20:07:56.964573 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:56.964552 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 20 20:07:56.964701 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:56.964654 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 20 20:07:56.965751 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:56.965733 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 20 20:07:56.965874 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:56.965816 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 20 20:07:56.965874 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:56.965825 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 20 20:07:56.965971 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:56.965819 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-tdh48\""
Apr 20 20:07:56.966791 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:56.966767 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-cxwts"]
Apr 20 20:07:56.969672 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:56.969652 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-cxwts"
Apr 20 20:07:56.972225 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:56.972203 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 20 20:07:56.972325 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:56.972248 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 20 20:07:56.972325 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:56.972274 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 20 20:07:56.972325 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:56.972310 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-h4n7h\""
Apr 20 20:07:56.973779 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:56.973761 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-szw4p"]
Apr 20 20:07:56.977691 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:56.977671 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-djt8k"]
Apr 20 20:07:56.977790 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:56.977759 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-szw4p"
Apr 20 20:07:56.980022 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:56.980002 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 20 20:07:56.980105 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:56.980038 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 20 20:07:56.980171 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:56.980104 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-m8drx\""
Apr 20 20:07:56.980477 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:56.980456 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 20 20:07:56.988230 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:56.988214 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-szw4p"]
Apr 20 20:07:57.084264 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.084235 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/69b6688a-37f6-4688-a5fd-d2ef9e639109-node-exporter-textfile\") pod \"node-exporter-cxwts\" (UID: \"69b6688a-37f6-4688-a5fd-d2ef9e639109\") " pod="openshift-monitoring/node-exporter-cxwts"
Apr 20 20:07:57.084264 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.084263 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/69b6688a-37f6-4688-a5fd-d2ef9e639109-metrics-client-ca\") pod \"node-exporter-cxwts\" (UID: \"69b6688a-37f6-4688-a5fd-d2ef9e639109\") " pod="openshift-monitoring/node-exporter-cxwts"
Apr 20 20:07:57.084435 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.084281 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6nc9\" (UniqueName: \"kubernetes.io/projected/a39b1ed6-ec22-43b1-83cd-adbacbe16fdd-kube-api-access-n6nc9\") pod \"openshift-state-metrics-9d44df66c-djt8k\" (UID: \"a39b1ed6-ec22-43b1-83cd-adbacbe16fdd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-djt8k"
Apr 20 20:07:57.084435 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.084332 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b659d2f4-02dc-4e48-8c03-8ad876c8b7d9-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-szw4p\" (UID: \"b659d2f4-02dc-4e48-8c03-8ad876c8b7d9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-szw4p"
Apr 20 20:07:57.084435 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.084362 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/69b6688a-37f6-4688-a5fd-d2ef9e639109-node-exporter-wtmp\") pod \"node-exporter-cxwts\" (UID: \"69b6688a-37f6-4688-a5fd-d2ef9e639109\") " pod="openshift-monitoring/node-exporter-cxwts"
Apr 20 20:07:57.084435 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.084396 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a39b1ed6-ec22-43b1-83cd-adbacbe16fdd-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-djt8k\" (UID: \"a39b1ed6-ec22-43b1-83cd-adbacbe16fdd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-djt8k"
Apr 20 20:07:57.084435 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.084432 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/69b6688a-37f6-4688-a5fd-d2ef9e639109-node-exporter-accelerators-collector-config\") pod \"node-exporter-cxwts\" (UID: \"69b6688a-37f6-4688-a5fd-d2ef9e639109\") " pod="openshift-monitoring/node-exporter-cxwts"
Apr 20 20:07:57.084606 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.084452 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a39b1ed6-ec22-43b1-83cd-adbacbe16fdd-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-djt8k\" (UID: \"a39b1ed6-ec22-43b1-83cd-adbacbe16fdd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-djt8k"
Apr 20 20:07:57.084606 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.084494 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl4nm\" (UniqueName: \"kubernetes.io/projected/b659d2f4-02dc-4e48-8c03-8ad876c8b7d9-kube-api-access-nl4nm\") pod \"kube-state-metrics-69db897b98-szw4p\" (UID: \"b659d2f4-02dc-4e48-8c03-8ad876c8b7d9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-szw4p"
Apr 20 20:07:57.084606 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.084526 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/69b6688a-37f6-4688-a5fd-d2ef9e639109-sys\") pod \"node-exporter-cxwts\" (UID: \"69b6688a-37f6-4688-a5fd-d2ef9e639109\") " pod="openshift-monitoring/node-exporter-cxwts"
Apr 20 20:07:57.084606 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.084558 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/69b6688a-37f6-4688-a5fd-d2ef9e639109-node-exporter-tls\") pod \"node-exporter-cxwts\" (UID: \"69b6688a-37f6-4688-a5fd-d2ef9e639109\") " pod="openshift-monitoring/node-exporter-cxwts"
Apr 20 20:07:57.084606 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.084583 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jhsw\" (UniqueName: \"kubernetes.io/projected/69b6688a-37f6-4688-a5fd-d2ef9e639109-kube-api-access-9jhsw\") pod \"node-exporter-cxwts\" (UID: \"69b6688a-37f6-4688-a5fd-d2ef9e639109\") " pod="openshift-monitoring/node-exporter-cxwts"
Apr 20 20:07:57.084765 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.084609 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a39b1ed6-ec22-43b1-83cd-adbacbe16fdd-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-djt8k\" (UID: \"a39b1ed6-ec22-43b1-83cd-adbacbe16fdd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-djt8k"
Apr 20 20:07:57.084765 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.084630 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b659d2f4-02dc-4e48-8c03-8ad876c8b7d9-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-szw4p\" (UID: \"b659d2f4-02dc-4e48-8c03-8ad876c8b7d9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-szw4p"
Apr 20 20:07:57.084765 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.084651 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/b659d2f4-02dc-4e48-8c03-8ad876c8b7d9-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-szw4p\" (UID: \"b659d2f4-02dc-4e48-8c03-8ad876c8b7d9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-szw4p"
Apr 20 20:07:57.084765 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.084713 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/69b6688a-37f6-4688-a5fd-d2ef9e639109-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cxwts\" (UID: \"69b6688a-37f6-4688-a5fd-d2ef9e639109\") " pod="openshift-monitoring/node-exporter-cxwts"
Apr 20 20:07:57.084765 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.084731 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/b659d2f4-02dc-4e48-8c03-8ad876c8b7d9-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-szw4p\" (UID: \"b659d2f4-02dc-4e48-8c03-8ad876c8b7d9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-szw4p"
Apr 20 20:07:57.084765 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.084750 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b659d2f4-02dc-4e48-8c03-8ad876c8b7d9-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-szw4p\" (UID: \"b659d2f4-02dc-4e48-8c03-8ad876c8b7d9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-szw4p"
Apr 20 20:07:57.084988 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.084790 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/69b6688a-37f6-4688-a5fd-d2ef9e639109-root\") pod \"node-exporter-cxwts\" (UID: \"69b6688a-37f6-4688-a5fd-d2ef9e639109\") " pod="openshift-monitoring/node-exporter-cxwts"
Apr 20 20:07:57.185479 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.185446 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/69b6688a-37f6-4688-a5fd-d2ef9e639109-node-exporter-accelerators-collector-config\") pod \"node-exporter-cxwts\" (UID: \"69b6688a-37f6-4688-a5fd-d2ef9e639109\") " pod="openshift-monitoring/node-exporter-cxwts"
Apr 20 20:07:57.185584 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.185483 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a39b1ed6-ec22-43b1-83cd-adbacbe16fdd-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-djt8k\" (UID: \"a39b1ed6-ec22-43b1-83cd-adbacbe16fdd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-djt8k"
Apr 20 20:07:57.185584 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.185515 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nl4nm\" (UniqueName: \"kubernetes.io/projected/b659d2f4-02dc-4e48-8c03-8ad876c8b7d9-kube-api-access-nl4nm\") pod \"kube-state-metrics-69db897b98-szw4p\" (UID: \"b659d2f4-02dc-4e48-8c03-8ad876c8b7d9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-szw4p"
Apr 20 20:07:57.185584 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.185555 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/69b6688a-37f6-4688-a5fd-d2ef9e639109-sys\") pod \"node-exporter-cxwts\" (UID: \"69b6688a-37f6-4688-a5fd-d2ef9e639109\") " pod="openshift-monitoring/node-exporter-cxwts"
Apr 20 20:07:57.185584 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.185577 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/69b6688a-37f6-4688-a5fd-d2ef9e639109-node-exporter-tls\") pod \"node-exporter-cxwts\" (UID: \"69b6688a-37f6-4688-a5fd-d2ef9e639109\") " pod="openshift-monitoring/node-exporter-cxwts"
Apr 20 20:07:57.185797 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.185599 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jhsw\" (UniqueName: \"kubernetes.io/projected/69b6688a-37f6-4688-a5fd-d2ef9e639109-kube-api-access-9jhsw\") pod \"node-exporter-cxwts\" (UID: \"69b6688a-37f6-4688-a5fd-d2ef9e639109\") " pod="openshift-monitoring/node-exporter-cxwts"
Apr 20 20:07:57.185797 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.185624 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a39b1ed6-ec22-43b1-83cd-adbacbe16fdd-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-djt8k\" (UID: \"a39b1ed6-ec22-43b1-83cd-adbacbe16fdd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-djt8k"
Apr 20 20:07:57.185797 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.185656 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/69b6688a-37f6-4688-a5fd-d2ef9e639109-sys\") pod \"node-exporter-cxwts\" (UID: \"69b6688a-37f6-4688-a5fd-d2ef9e639109\") " pod="openshift-monitoring/node-exporter-cxwts"
Apr 20 20:07:57.185797 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.185729 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b659d2f4-02dc-4e48-8c03-8ad876c8b7d9-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-szw4p\" (UID: \"b659d2f4-02dc-4e48-8c03-8ad876c8b7d9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-szw4p"
Apr 20 20:07:57.185797 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:57.185739 2573 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 20 20:07:57.185797 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.185771 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/b659d2f4-02dc-4e48-8c03-8ad876c8b7d9-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-szw4p\" (UID: \"b659d2f4-02dc-4e48-8c03-8ad876c8b7d9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-szw4p"
Apr 20 20:07:57.186140 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:57.185818 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69b6688a-37f6-4688-a5fd-d2ef9e639109-node-exporter-tls podName:69b6688a-37f6-4688-a5fd-d2ef9e639109 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:57.685793889 +0000 UTC m=+160.101086626 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/69b6688a-37f6-4688-a5fd-d2ef9e639109-node-exporter-tls") pod "node-exporter-cxwts" (UID: "69b6688a-37f6-4688-a5fd-d2ef9e639109") : secret "node-exporter-tls" not found
Apr 20 20:07:57.186140 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.185911 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/69b6688a-37f6-4688-a5fd-d2ef9e639109-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cxwts\" (UID: \"69b6688a-37f6-4688-a5fd-d2ef9e639109\") " pod="openshift-monitoring/node-exporter-cxwts"
Apr 20 20:07:57.186140 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.185954 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/b659d2f4-02dc-4e48-8c03-8ad876c8b7d9-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-szw4p\" (UID: \"b659d2f4-02dc-4e48-8c03-8ad876c8b7d9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-szw4p"
Apr 20 20:07:57.186140 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.185983 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b659d2f4-02dc-4e48-8c03-8ad876c8b7d9-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-szw4p\" (UID: \"b659d2f4-02dc-4e48-8c03-8ad876c8b7d9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-szw4p"
Apr 20 20:07:57.186140 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.186014 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/69b6688a-37f6-4688-a5fd-d2ef9e639109-root\") pod \"node-exporter-cxwts\" (UID: \"69b6688a-37f6-4688-a5fd-d2ef9e639109\") " pod="openshift-monitoring/node-exporter-cxwts"
Apr 20 20:07:57.186140 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.186046 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/69b6688a-37f6-4688-a5fd-d2ef9e639109-node-exporter-textfile\") pod \"node-exporter-cxwts\" (UID: \"69b6688a-37f6-4688-a5fd-d2ef9e639109\") " pod="openshift-monitoring/node-exporter-cxwts"
Apr 20 20:07:57.186140 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.186074 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/69b6688a-37f6-4688-a5fd-d2ef9e639109-metrics-client-ca\") pod \"node-exporter-cxwts\" (UID: \"69b6688a-37f6-4688-a5fd-d2ef9e639109\") " pod="openshift-monitoring/node-exporter-cxwts"
Apr 20 20:07:57.186140 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.186099 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n6nc9\" (UniqueName: \"kubernetes.io/projected/a39b1ed6-ec22-43b1-83cd-adbacbe16fdd-kube-api-access-n6nc9\") pod \"openshift-state-metrics-9d44df66c-djt8k\" (UID: \"a39b1ed6-ec22-43b1-83cd-adbacbe16fdd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-djt8k"
Apr 20 20:07:57.186140 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.186129 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b659d2f4-02dc-4e48-8c03-8ad876c8b7d9-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-szw4p\" (UID: \"b659d2f4-02dc-4e48-8c03-8ad876c8b7d9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-szw4p"
Apr 20 20:07:57.186555 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.186157 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/69b6688a-37f6-4688-a5fd-d2ef9e639109-node-exporter-wtmp\") pod \"node-exporter-cxwts\" (UID: \"69b6688a-37f6-4688-a5fd-d2ef9e639109\") " pod="openshift-monitoring/node-exporter-cxwts"
Apr 20 20:07:57.186555 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.186182 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/69b6688a-37f6-4688-a5fd-d2ef9e639109-node-exporter-accelerators-collector-config\") pod \"node-exporter-cxwts\" (UID: \"69b6688a-37f6-4688-a5fd-d2ef9e639109\") " pod="openshift-monitoring/node-exporter-cxwts"
Apr 20 20:07:57.186555 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.186196 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a39b1ed6-ec22-43b1-83cd-adbacbe16fdd-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-djt8k\" (UID: \"a39b1ed6-ec22-43b1-83cd-adbacbe16fdd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-djt8k"
Apr 20 20:07:57.186555 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.186488 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b659d2f4-02dc-4e48-8c03-8ad876c8b7d9-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-szw4p\" (UID: \"b659d2f4-02dc-4e48-8c03-8ad876c8b7d9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-szw4p"
Apr 20 20:07:57.186555 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.186555 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/69b6688a-37f6-4688-a5fd-d2ef9e639109-root\") pod \"node-exporter-cxwts\" (UID: \"69b6688a-37f6-4688-a5fd-d2ef9e639109\") " pod="openshift-monitoring/node-exporter-cxwts"
Apr 20 20:07:57.186822 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.186568 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a39b1ed6-ec22-43b1-83cd-adbacbe16fdd-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-djt8k\" (UID: \"a39b1ed6-ec22-43b1-83cd-adbacbe16fdd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-djt8k"
Apr 20 20:07:57.186822 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.186574 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/b659d2f4-02dc-4e48-8c03-8ad876c8b7d9-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-szw4p\" (UID: \"b659d2f4-02dc-4e48-8c03-8ad876c8b7d9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-szw4p"
Apr 20 20:07:57.186822 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:57.186666 2573 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 20 20:07:57.186822 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:57.186675 2573 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 20 20:07:57.186822 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.186678 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/69b6688a-37f6-4688-a5fd-d2ef9e639109-node-exporter-wtmp\") pod \"node-exporter-cxwts\" (UID: \"69b6688a-37f6-4688-a5fd-d2ef9e639109\") " pod="openshift-monitoring/node-exporter-cxwts"
Apr 20 20:07:57.186822 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:57.186716 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b659d2f4-02dc-4e48-8c03-8ad876c8b7d9-kube-state-metrics-tls podName:b659d2f4-02dc-4e48-8c03-8ad876c8b7d9 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:57.686699368 +0000 UTC m=+160.101992089 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/b659d2f4-02dc-4e48-8c03-8ad876c8b7d9-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-szw4p" (UID: "b659d2f4-02dc-4e48-8c03-8ad876c8b7d9") : secret "kube-state-metrics-tls" not found
Apr 20 20:07:57.186822 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:07:57.186734 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a39b1ed6-ec22-43b1-83cd-adbacbe16fdd-openshift-state-metrics-tls podName:a39b1ed6-ec22-43b1-83cd-adbacbe16fdd nodeName:}" failed. No retries permitted until 2026-04-20 20:07:57.686724526 +0000 UTC m=+160.102017249 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/a39b1ed6-ec22-43b1-83cd-adbacbe16fdd-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-djt8k" (UID: "a39b1ed6-ec22-43b1-83cd-adbacbe16fdd") : secret "openshift-state-metrics-tls" not found
Apr 20 20:07:57.187240 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.186903 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/69b6688a-37f6-4688-a5fd-d2ef9e639109-node-exporter-textfile\") pod \"node-exporter-cxwts\" (UID: \"69b6688a-37f6-4688-a5fd-d2ef9e639109\") " pod="openshift-monitoring/node-exporter-cxwts"
Apr 20 20:07:57.187240 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.186957 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/69b6688a-37f6-4688-a5fd-d2ef9e639109-metrics-client-ca\") pod \"node-exporter-cxwts\" (UID: \"69b6688a-37f6-4688-a5fd-d2ef9e639109\") " pod="openshift-monitoring/node-exporter-cxwts"
Apr 20 20:07:57.187240 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.187056 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/b659d2f4-02dc-4e48-8c03-8ad876c8b7d9-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-szw4p\" (UID: \"b659d2f4-02dc-4e48-8c03-8ad876c8b7d9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-szw4p"
Apr 20 20:07:57.188231 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.188202 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a39b1ed6-ec22-43b1-83cd-adbacbe16fdd-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-djt8k\" (UID: \"a39b1ed6-ec22-43b1-83cd-adbacbe16fdd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-djt8k"
Apr 20 20:07:57.188426 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.188405 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/69b6688a-37f6-4688-a5fd-d2ef9e639109-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cxwts\" (UID: \"69b6688a-37f6-4688-a5fd-d2ef9e639109\") " pod="openshift-monitoring/node-exporter-cxwts"
Apr 20 20:07:57.188995 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.188979 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b659d2f4-02dc-4e48-8c03-8ad876c8b7d9-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-szw4p\" (UID: \"b659d2f4-02dc-4e48-8c03-8ad876c8b7d9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-szw4p"
Apr 20 20:07:57.196110 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.196088 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6nc9\" (UniqueName: \"kubernetes.io/projected/a39b1ed6-ec22-43b1-83cd-adbacbe16fdd-kube-api-access-n6nc9\") pod \"openshift-state-metrics-9d44df66c-djt8k\" (UID: \"a39b1ed6-ec22-43b1-83cd-adbacbe16fdd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-djt8k"
Apr 20 20:07:57.196365 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.196345 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jhsw\" (UniqueName: \"kubernetes.io/projected/69b6688a-37f6-4688-a5fd-d2ef9e639109-kube-api-access-9jhsw\") pod \"node-exporter-cxwts\" (UID: \"69b6688a-37f6-4688-a5fd-d2ef9e639109\") " pod="openshift-monitoring/node-exporter-cxwts"
Apr 20 20:07:57.196628 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.196610 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl4nm\" (UniqueName: \"kubernetes.io/projected/b659d2f4-02dc-4e48-8c03-8ad876c8b7d9-kube-api-access-nl4nm\") pod \"kube-state-metrics-69db897b98-szw4p\" (UID: \"b659d2f4-02dc-4e48-8c03-8ad876c8b7d9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-szw4p"
Apr 20 20:07:57.689638 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.689606 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b659d2f4-02dc-4e48-8c03-8ad876c8b7d9-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-szw4p\" (UID: \"b659d2f4-02dc-4e48-8c03-8ad876c8b7d9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-szw4p"
Apr 20 20:07:57.689638 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.689644 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a39b1ed6-ec22-43b1-83cd-adbacbe16fdd-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-djt8k\" (UID: \"a39b1ed6-ec22-43b1-83cd-adbacbe16fdd\") "
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-djt8k" Apr 20 20:07:57.689846 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.689686 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/69b6688a-37f6-4688-a5fd-d2ef9e639109-node-exporter-tls\") pod \"node-exporter-cxwts\" (UID: \"69b6688a-37f6-4688-a5fd-d2ef9e639109\") " pod="openshift-monitoring/node-exporter-cxwts" Apr 20 20:07:57.692040 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.692015 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/b659d2f4-02dc-4e48-8c03-8ad876c8b7d9-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-szw4p\" (UID: \"b659d2f4-02dc-4e48-8c03-8ad876c8b7d9\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-szw4p" Apr 20 20:07:57.692040 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.692016 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/69b6688a-37f6-4688-a5fd-d2ef9e639109-node-exporter-tls\") pod \"node-exporter-cxwts\" (UID: \"69b6688a-37f6-4688-a5fd-d2ef9e639109\") " pod="openshift-monitoring/node-exporter-cxwts" Apr 20 20:07:57.692180 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.692017 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a39b1ed6-ec22-43b1-83cd-adbacbe16fdd-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-djt8k\" (UID: \"a39b1ed6-ec22-43b1-83cd-adbacbe16fdd\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-djt8k" Apr 20 20:07:57.870554 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.870531 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-djt8k" Apr 20 20:07:57.879205 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.879187 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-cxwts" Apr 20 20:07:57.885789 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:57.885770 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-szw4p" Apr 20 20:07:57.886325 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:07:57.886283 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69b6688a_37f6_4688_a5fd_d2ef9e639109.slice/crio-06166ad418d5a16fc66d386e3c0abe6861f0b155fbc9660c79697ff54cb9acbf WatchSource:0}: Error finding container 06166ad418d5a16fc66d386e3c0abe6861f0b155fbc9660c79697ff54cb9acbf: Status 404 returned error can't find the container with id 06166ad418d5a16fc66d386e3c0abe6861f0b155fbc9660c79697ff54cb9acbf Apr 20 20:07:58.012380 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.012358 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-djt8k"] Apr 20 20:07:58.014378 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:07:58.014353 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda39b1ed6_ec22_43b1_83cd_adbacbe16fdd.slice/crio-41862aadbc2cd964c97b654869dd6a009d035a8caa8be2fad6db2715d2610fa9 WatchSource:0}: Error finding container 41862aadbc2cd964c97b654869dd6a009d035a8caa8be2fad6db2715d2610fa9: Status 404 returned error can't find the container with id 41862aadbc2cd964c97b654869dd6a009d035a8caa8be2fad6db2715d2610fa9 Apr 20 20:07:58.030759 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.030735 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/kube-state-metrics-69db897b98-szw4p"] Apr 20 20:07:58.033973 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:07:58.033950 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb659d2f4_02dc_4e48_8c03_8ad876c8b7d9.slice/crio-c2fbe6bd845e6528af69fc133f7102cc4ac3317e2e624129a32750a0bb291f1a WatchSource:0}: Error finding container c2fbe6bd845e6528af69fc133f7102cc4ac3317e2e624129a32750a0bb291f1a: Status 404 returned error can't find the container with id c2fbe6bd845e6528af69fc133f7102cc4ac3317e2e624129a32750a0bb291f1a Apr 20 20:07:58.142498 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.142472 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 20:07:58.152542 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.152520 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.157296 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.157272 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 20 20:07:58.157405 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.157281 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-slp46\"" Apr 20 20:07:58.157405 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.157378 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 20 20:07:58.157524 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.157508 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 20 20:07:58.157684 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.157643 2573 reflector.go:430] 
"Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 20 20:07:58.157684 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.157665 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 20 20:07:58.157826 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.157805 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 20 20:07:58.157934 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.157919 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 20 20:07:58.167088 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.167060 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 20 20:07:58.172155 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.172137 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 20 20:07:58.178276 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.178257 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 20:07:58.294934 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.294908 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/77bfa676-1f7d-43e7-ac4a-280f6e507c80-tls-assets\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.295048 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.294942 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.295048 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.294990 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdq5m\" (UniqueName: \"kubernetes.io/projected/77bfa676-1f7d-43e7-ac4a-280f6e507c80-kube-api-access-tdq5m\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.295144 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.295041 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.295144 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.295085 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77bfa676-1f7d-43e7-ac4a-280f6e507c80-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.295204 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.295169 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/77bfa676-1f7d-43e7-ac4a-280f6e507c80-config-out\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.295234 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.295211 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.295268 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.295246 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.295300 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.295278 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-config-volume\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.295340 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.295306 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/77bfa676-1f7d-43e7-ac4a-280f6e507c80-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.295377 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.295348 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-web-config\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.295407 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.295397 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.295441 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.295420 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/77bfa676-1f7d-43e7-ac4a-280f6e507c80-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.395827 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.395798 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-web-config\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.395975 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.395873 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.395975 ip-10-0-130-227 
kubenswrapper[2573]: I0420 20:07:58.395904 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/77bfa676-1f7d-43e7-ac4a-280f6e507c80-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.395975 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.395935 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/77bfa676-1f7d-43e7-ac4a-280f6e507c80-tls-assets\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.395975 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.395961 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.396206 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.396007 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tdq5m\" (UniqueName: \"kubernetes.io/projected/77bfa676-1f7d-43e7-ac4a-280f6e507c80-kube-api-access-tdq5m\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.396206 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.396036 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: 
\"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.396206 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.396069 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77bfa676-1f7d-43e7-ac4a-280f6e507c80-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.396206 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.396126 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/77bfa676-1f7d-43e7-ac4a-280f6e507c80-config-out\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.396206 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.396153 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.396206 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.396184 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.396499 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.396222 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-config-volume\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.396499 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.396253 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/77bfa676-1f7d-43e7-ac4a-280f6e507c80-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.396704 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.396666 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/77bfa676-1f7d-43e7-ac4a-280f6e507c80-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.397057 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.397030 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/77bfa676-1f7d-43e7-ac4a-280f6e507c80-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.397785 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.397760 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77bfa676-1f7d-43e7-ac4a-280f6e507c80-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.399151 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.399030 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.399509 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.399485 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.399772 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.399752 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-web-config\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.401296 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.400282 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.401296 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.400446 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/77bfa676-1f7d-43e7-ac4a-280f6e507c80-config-out\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.401761 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.401734 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.401989 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.401965 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.402678 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.402651 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-config-volume\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.403054 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.403032 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/77bfa676-1f7d-43e7-ac4a-280f6e507c80-tls-assets\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.407222 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.407201 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdq5m\" (UniqueName: \"kubernetes.io/projected/77bfa676-1f7d-43e7-ac4a-280f6e507c80-kube-api-access-tdq5m\") pod \"alertmanager-main-0\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.462711 ip-10-0-130-227 kubenswrapper[2573]: 
I0420 20:07:58.462688 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:07:58.622704 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.622667 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-szw4p" event={"ID":"b659d2f4-02dc-4e48-8c03-8ad876c8b7d9","Type":"ContainerStarted","Data":"c2fbe6bd845e6528af69fc133f7102cc4ac3317e2e624129a32750a0bb291f1a"} Apr 20 20:07:58.623739 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.623713 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cxwts" event={"ID":"69b6688a-37f6-4688-a5fd-d2ef9e639109","Type":"ContainerStarted","Data":"06166ad418d5a16fc66d386e3c0abe6861f0b155fbc9660c79697ff54cb9acbf"} Apr 20 20:07:58.625445 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.625341 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-djt8k" event={"ID":"a39b1ed6-ec22-43b1-83cd-adbacbe16fdd","Type":"ContainerStarted","Data":"6925a5266c36adc614e70fdccf14ad0e991bc896d0aa3b931741689314c20ad1"} Apr 20 20:07:58.625445 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.625365 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-djt8k" event={"ID":"a39b1ed6-ec22-43b1-83cd-adbacbe16fdd","Type":"ContainerStarted","Data":"fa67bdeedf3c89bbf5c9ce405d8d69b9449c83a4ba2f1007798f73d4ea1da3c8"} Apr 20 20:07:58.625445 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.625373 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-djt8k" event={"ID":"a39b1ed6-ec22-43b1-83cd-adbacbe16fdd","Type":"ContainerStarted","Data":"41862aadbc2cd964c97b654869dd6a009d035a8caa8be2fad6db2715d2610fa9"} Apr 20 20:07:58.739291 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.739269 2573 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 20:07:58.741786 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:07:58.741757 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77bfa676_1f7d_43e7_ac4a_280f6e507c80.slice/crio-7805db9d18cdbd2e6c84eeaf976c524753445bb1c2ade6672f02db83e39275f4 WatchSource:0}: Error finding container 7805db9d18cdbd2e6c84eeaf976c524753445bb1c2ade6672f02db83e39275f4: Status 404 returned error can't find the container with id 7805db9d18cdbd2e6c84eeaf976c524753445bb1c2ade6672f02db83e39275f4 Apr 20 20:07:58.900111 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.900040 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddb92e25-31c0-49bc-9084-b1a08aad3877-metrics-tls\") pod \"dns-default-x7nqw\" (UID: \"ddb92e25-31c0-49bc-9084-b1a08aad3877\") " pod="openshift-dns/dns-default-x7nqw" Apr 20 20:07:58.900226 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.900143 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9104f378-d15a-480e-aae0-cb20f3c35f2c-cert\") pod \"ingress-canary-67td9\" (UID: \"9104f378-d15a-480e-aae0-cb20f3c35f2c\") " pod="openshift-ingress-canary/ingress-canary-67td9" Apr 20 20:07:58.902595 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.902571 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ddb92e25-31c0-49bc-9084-b1a08aad3877-metrics-tls\") pod \"dns-default-x7nqw\" (UID: \"ddb92e25-31c0-49bc-9084-b1a08aad3877\") " pod="openshift-dns/dns-default-x7nqw" Apr 20 20:07:58.902595 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:58.902586 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/9104f378-d15a-480e-aae0-cb20f3c35f2c-cert\") pod \"ingress-canary-67td9\" (UID: \"9104f378-d15a-480e-aae0-cb20f3c35f2c\") " pod="openshift-ingress-canary/ingress-canary-67td9"
Apr 20 20:07:59.115926 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:59.115896 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-c4d6d\""
Apr 20 20:07:59.124253 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:59.124224 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-x7nqw"
Apr 20 20:07:59.286235 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:59.286203 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-x7nqw"]
Apr 20 20:07:59.290062 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:07:59.290028 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddb92e25_31c0_49bc_9084_b1a08aad3877.slice/crio-6cc1f9ecc8440e7b9a8168eab6997b8de4b32e9c5fb67c6cc68ce3347a88a5df WatchSource:0}: Error finding container 6cc1f9ecc8440e7b9a8168eab6997b8de4b32e9c5fb67c6cc68ce3347a88a5df: Status 404 returned error can't find the container with id 6cc1f9ecc8440e7b9a8168eab6997b8de4b32e9c5fb67c6cc68ce3347a88a5df
Apr 20 20:07:59.629342 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:59.629303 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x7nqw" event={"ID":"ddb92e25-31c0-49bc-9084-b1a08aad3877","Type":"ContainerStarted","Data":"6cc1f9ecc8440e7b9a8168eab6997b8de4b32e9c5fb67c6cc68ce3347a88a5df"}
Apr 20 20:07:59.630518 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:59.630480 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"77bfa676-1f7d-43e7-ac4a-280f6e507c80","Type":"ContainerStarted","Data":"7805db9d18cdbd2e6c84eeaf976c524753445bb1c2ade6672f02db83e39275f4"}
Apr 20 20:07:59.632142 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:59.632114 2573 generic.go:358] "Generic (PLEG): container finished" podID="69b6688a-37f6-4688-a5fd-d2ef9e639109" containerID="9645c5224eb2a713f8d8bf1836bb08c34c80dfca58927cb9884e7a602ec79736" exitCode=0
Apr 20 20:07:59.632281 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:59.632196 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cxwts" event={"ID":"69b6688a-37f6-4688-a5fd-d2ef9e639109","Type":"ContainerDied","Data":"9645c5224eb2a713f8d8bf1836bb08c34c80dfca58927cb9884e7a602ec79736"}
Apr 20 20:07:59.634248 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:59.634223 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-djt8k" event={"ID":"a39b1ed6-ec22-43b1-83cd-adbacbe16fdd","Type":"ContainerStarted","Data":"fbb7e56662b0f5b50852fad6416409235acdfc7520c2e67eebc0c163d5e8eec6"}
Apr 20 20:07:59.678577 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:07:59.678531 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-djt8k" podStartSLOduration=2.640851544 podStartE2EDuration="3.678519475s" podCreationTimestamp="2026-04-20 20:07:56 +0000 UTC" firstStartedPulling="2026-04-20 20:07:58.118217335 +0000 UTC m=+160.533510056" lastFinishedPulling="2026-04-20 20:07:59.155885267 +0000 UTC m=+161.571177987" observedRunningTime="2026-04-20 20:07:59.677203147 +0000 UTC m=+162.092495891" watchObservedRunningTime="2026-04-20 20:07:59.678519475 +0000 UTC m=+162.093812216"
Apr 20 20:08:00.639253 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:00.639214 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-szw4p" event={"ID":"b659d2f4-02dc-4e48-8c03-8ad876c8b7d9","Type":"ContainerStarted","Data":"3b8a83fb6283212328e6b43292785956ec7dffabccaac0d577f76bd5360304ee"}
Apr 20 20:08:00.639684 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:00.639262 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-szw4p" event={"ID":"b659d2f4-02dc-4e48-8c03-8ad876c8b7d9","Type":"ContainerStarted","Data":"de77a2d0cc434abf3d4af2842c6089f10eedd6941ad0c1b472943d90317f0eff"}
Apr 20 20:08:00.639684 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:00.639278 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-szw4p" event={"ID":"b659d2f4-02dc-4e48-8c03-8ad876c8b7d9","Type":"ContainerStarted","Data":"74a87dfb56650f3927f34839c796884ad4f665c74cc666623186066f87a00668"}
Apr 20 20:08:00.641632 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:00.641604 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cxwts" event={"ID":"69b6688a-37f6-4688-a5fd-d2ef9e639109","Type":"ContainerStarted","Data":"35c5aeb965b350bd1c956471d2b5e23c2fd9c61ed3bdeb453cc944a88f2dc1bd"}
Apr 20 20:08:00.641632 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:00.641635 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cxwts" event={"ID":"69b6688a-37f6-4688-a5fd-d2ef9e639109","Type":"ContainerStarted","Data":"1691d2503ff058874114b04998229c0c07ca9c23d47016116f10cf1b6859130d"}
Apr 20 20:08:00.643094 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:00.643066 2573 generic.go:358] "Generic (PLEG): container finished" podID="77bfa676-1f7d-43e7-ac4a-280f6e507c80" containerID="626945f52344057886b0ad39b84207866aea778c72fcd1c49e20de6d3fbcc59b" exitCode=0
Apr 20 20:08:00.643199 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:00.643149 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"77bfa676-1f7d-43e7-ac4a-280f6e507c80","Type":"ContainerDied","Data":"626945f52344057886b0ad39b84207866aea778c72fcd1c49e20de6d3fbcc59b"}
Apr 20 20:08:00.661788 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:00.661746 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-szw4p" podStartSLOduration=2.749514637 podStartE2EDuration="4.661730295s" podCreationTimestamp="2026-04-20 20:07:56 +0000 UTC" firstStartedPulling="2026-04-20 20:07:58.035482113 +0000 UTC m=+160.450774833" lastFinishedPulling="2026-04-20 20:07:59.947697757 +0000 UTC m=+162.362990491" observedRunningTime="2026-04-20 20:08:00.660621468 +0000 UTC m=+163.075914211" watchObservedRunningTime="2026-04-20 20:08:00.661730295 +0000 UTC m=+163.077023036"
Apr 20 20:08:00.708217 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:00.708172 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-cxwts" podStartSLOduration=3.940631629 podStartE2EDuration="4.708160272s" podCreationTimestamp="2026-04-20 20:07:56 +0000 UTC" firstStartedPulling="2026-04-20 20:07:57.889829559 +0000 UTC m=+160.305122279" lastFinishedPulling="2026-04-20 20:07:58.657358189 +0000 UTC m=+161.072650922" observedRunningTime="2026-04-20 20:08:00.707515886 +0000 UTC m=+163.122808693" watchObservedRunningTime="2026-04-20 20:08:00.708160272 +0000 UTC m=+163.123453013"
Apr 20 20:08:01.648125 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:01.648085 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x7nqw" event={"ID":"ddb92e25-31c0-49bc-9084-b1a08aad3877","Type":"ContainerStarted","Data":"df08c4c784e85fd3cf4eb8e52f74b9c5d314e76d8746734f5fb441581a6e61c4"}
Apr 20 20:08:01.648125 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:01.648132 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x7nqw" event={"ID":"ddb92e25-31c0-49bc-9084-b1a08aad3877","Type":"ContainerStarted","Data":"e5244e70feff31aa4a729f5aadb2fe1b5b385bb9ea3c9c2f0af1b133d37f73e6"}
Apr 20 20:08:01.671861 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:01.671792 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-x7nqw" podStartSLOduration=129.970714833 podStartE2EDuration="2m11.671773572s" podCreationTimestamp="2026-04-20 20:05:50 +0000 UTC" firstStartedPulling="2026-04-20 20:07:59.292570402 +0000 UTC m=+161.707863137" lastFinishedPulling="2026-04-20 20:08:00.993629157 +0000 UTC m=+163.408921876" observedRunningTime="2026-04-20 20:08:01.669076677 +0000 UTC m=+164.084369420" watchObservedRunningTime="2026-04-20 20:08:01.671773572 +0000 UTC m=+164.087066316"
Apr 20 20:08:01.972454 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:01.972385 2573 patch_prober.go:28] interesting pod/image-registry-6754dfd45f-tfgml container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 20 20:08:01.972592 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:01.972446 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6754dfd45f-tfgml" podUID="84920716-a237-4371-8e3b-ecc46291eb90" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:08:02.653303 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:02.653266 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"77bfa676-1f7d-43e7-ac4a-280f6e507c80","Type":"ContainerStarted","Data":"5a0ab59b0723d106e21c06d480057a2f61d48b11fef5a638d4775acf5e10c594"}
Apr 20 20:08:02.653303 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:02.653306 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"77bfa676-1f7d-43e7-ac4a-280f6e507c80","Type":"ContainerStarted","Data":"6a65ccab9aaf326bab1421e96d037da631ffcc45897cf11c0fa14f5f148478d4"}
Apr 20 20:08:02.653697 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:02.653317 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"77bfa676-1f7d-43e7-ac4a-280f6e507c80","Type":"ContainerStarted","Data":"9a80d73a3c076c512db1929090438926c7a76e493f8ddc56eec99020dba216b2"}
Apr 20 20:08:02.653697 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:02.653326 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"77bfa676-1f7d-43e7-ac4a-280f6e507c80","Type":"ContainerStarted","Data":"a0d01c7244004ca80dfe2d60d63a5c8c44a69aa992e72eb3bd0371e0f28adda9"}
Apr 20 20:08:02.653697 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:02.653335 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"77bfa676-1f7d-43e7-ac4a-280f6e507c80","Type":"ContainerStarted","Data":"57a58c42deef08889c0f64c255dec7cde51db5b7da08909388991d0c586409a2"}
Apr 20 20:08:02.653697 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:02.653522 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-x7nqw"
Apr 20 20:08:03.580725 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:03.580696 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6754dfd45f-tfgml"
Apr 20 20:08:03.661542 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:03.661504 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"77bfa676-1f7d-43e7-ac4a-280f6e507c80","Type":"ContainerStarted","Data":"7f39626ab2240c1782f612a7f5be4d43cbb7b60f7a5fafd86edb034555120daf"}
Apr 20 20:08:03.689808 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:03.689762 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.441923499 podStartE2EDuration="5.689744571s" podCreationTimestamp="2026-04-20 20:07:58 +0000 UTC" firstStartedPulling="2026-04-20 20:07:58.743841025 +0000 UTC m=+161.159133746" lastFinishedPulling="2026-04-20 20:08:02.991662082 +0000 UTC m=+165.406954818" observedRunningTime="2026-04-20 20:08:03.688208241 +0000 UTC m=+166.103500984" watchObservedRunningTime="2026-04-20 20:08:03.689744571 +0000 UTC m=+166.105037314"
Apr 20 20:08:05.166381 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:05.166336 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npkgv"
Apr 20 20:08:09.166674 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:09.166641 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-67td9"
Apr 20 20:08:09.169360 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:09.169340 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-5cv42\""
Apr 20 20:08:09.177803 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:09.177779 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-67td9"
Apr 20 20:08:09.289896 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:09.289873 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-67td9"]
Apr 20 20:08:09.292026 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:08:09.292000 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9104f378_d15a_480e_aae0_cb20f3c35f2c.slice/crio-202a49fded6160c8ebea1e5fe207d2f190ec0870240ac2307177ecad3cbda31b WatchSource:0}: Error finding container 202a49fded6160c8ebea1e5fe207d2f190ec0870240ac2307177ecad3cbda31b: Status 404 returned error can't find the container with id 202a49fded6160c8ebea1e5fe207d2f190ec0870240ac2307177ecad3cbda31b
Apr 20 20:08:09.542136 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:09.542110 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-77c6w"]
Apr 20 20:08:09.546725 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:09.546705 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-77c6w"
Apr 20 20:08:09.550359 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:09.550341 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 20 20:08:09.550452 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:09.550378 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 20 20:08:09.550663 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:09.550646 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-kwdc6\""
Apr 20 20:08:09.557372 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:09.557348 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-77c6w"]
Apr 20 20:08:09.586714 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:09.586690 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7p9g\" (UniqueName: \"kubernetes.io/projected/f0e0866b-98b5-43bb-811b-a80d0fe3e428-kube-api-access-h7p9g\") pod \"downloads-6bcc868b7-77c6w\" (UID: \"f0e0866b-98b5-43bb-811b-a80d0fe3e428\") " pod="openshift-console/downloads-6bcc868b7-77c6w"
Apr 20 20:08:09.679401 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:09.679373 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-67td9" event={"ID":"9104f378-d15a-480e-aae0-cb20f3c35f2c","Type":"ContainerStarted","Data":"202a49fded6160c8ebea1e5fe207d2f190ec0870240ac2307177ecad3cbda31b"}
Apr 20 20:08:09.687748 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:09.687730 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7p9g\" (UniqueName: \"kubernetes.io/projected/f0e0866b-98b5-43bb-811b-a80d0fe3e428-kube-api-access-h7p9g\") pod \"downloads-6bcc868b7-77c6w\" (UID: \"f0e0866b-98b5-43bb-811b-a80d0fe3e428\") " pod="openshift-console/downloads-6bcc868b7-77c6w"
Apr 20 20:08:09.696496 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:09.696476 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7p9g\" (UniqueName: \"kubernetes.io/projected/f0e0866b-98b5-43bb-811b-a80d0fe3e428-kube-api-access-h7p9g\") pod \"downloads-6bcc868b7-77c6w\" (UID: \"f0e0866b-98b5-43bb-811b-a80d0fe3e428\") " pod="openshift-console/downloads-6bcc868b7-77c6w"
Apr 20 20:08:09.856311 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:09.856238 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-77c6w"
Apr 20 20:08:09.993773 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:09.993742 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-77c6w"]
Apr 20 20:08:09.996381 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:08:09.996348 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0e0866b_98b5_43bb_811b_a80d0fe3e428.slice/crio-657856ef8681ca80edd98c9a240cac37ba81de7c7cada095d8fc97d125e964f0 WatchSource:0}: Error finding container 657856ef8681ca80edd98c9a240cac37ba81de7c7cada095d8fc97d125e964f0: Status 404 returned error can't find the container with id 657856ef8681ca80edd98c9a240cac37ba81de7c7cada095d8fc97d125e964f0
Apr 20 20:08:10.683831 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:10.683792 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-77c6w" event={"ID":"f0e0866b-98b5-43bb-811b-a80d0fe3e428","Type":"ContainerStarted","Data":"657856ef8681ca80edd98c9a240cac37ba81de7c7cada095d8fc97d125e964f0"}
Apr 20 20:08:11.689220 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:11.689176 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-67td9" event={"ID":"9104f378-d15a-480e-aae0-cb20f3c35f2c","Type":"ContainerStarted","Data":"c9a28a4d6f03f7469f8b10f6ab645302c2a13ecf19dcc286f5285865221ced44"}
Apr 20 20:08:11.705386 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:11.705339 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-67td9" podStartSLOduration=140.09514593 podStartE2EDuration="2m21.70532303s" podCreationTimestamp="2026-04-20 20:05:50 +0000 UTC" firstStartedPulling="2026-04-20 20:08:09.294335491 +0000 UTC m=+171.709628211" lastFinishedPulling="2026-04-20 20:08:10.904512591 +0000 UTC m=+173.319805311" observedRunningTime="2026-04-20 20:08:11.704626928 +0000 UTC m=+174.119919672" watchObservedRunningTime="2026-04-20 20:08:11.70532303 +0000 UTC m=+174.120615766"
Apr 20 20:08:12.663794 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:12.663761 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-x7nqw"
Apr 20 20:08:16.539818 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:16.539787 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-57ccf95dfd-njgpv"]
Apr 20 20:08:16.543143 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:16.543123 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57ccf95dfd-njgpv"
Apr 20 20:08:16.547984 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:16.547887 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-xjcr2\""
Apr 20 20:08:16.547984 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:16.547935 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 20 20:08:16.547984 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:16.547947 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 20 20:08:16.548206 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:16.547940 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 20 20:08:16.548206 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:16.547889 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 20 20:08:16.548311 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:16.548275 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 20 20:08:16.553375 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:16.553349 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57ccf95dfd-njgpv"]
Apr 20 20:08:16.649766 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:16.649729 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-oauth-serving-cert\") pod \"console-57ccf95dfd-njgpv\" (UID: \"b50ee53b-7d97-4c27-8a3e-58ce35cc34ba\") " pod="openshift-console/console-57ccf95dfd-njgpv"
Apr 20 20:08:16.649958 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:16.649777 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-console-oauth-config\") pod \"console-57ccf95dfd-njgpv\" (UID: \"b50ee53b-7d97-4c27-8a3e-58ce35cc34ba\") " pod="openshift-console/console-57ccf95dfd-njgpv"
Apr 20 20:08:16.649958 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:16.649913 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-service-ca\") pod \"console-57ccf95dfd-njgpv\" (UID: \"b50ee53b-7d97-4c27-8a3e-58ce35cc34ba\") " pod="openshift-console/console-57ccf95dfd-njgpv"
Apr 20 20:08:16.650080 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:16.649959 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-console-config\") pod \"console-57ccf95dfd-njgpv\" (UID: \"b50ee53b-7d97-4c27-8a3e-58ce35cc34ba\") " pod="openshift-console/console-57ccf95dfd-njgpv"
Apr 20 20:08:16.650080 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:16.650007 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp8m7\" (UniqueName: \"kubernetes.io/projected/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-kube-api-access-qp8m7\") pod \"console-57ccf95dfd-njgpv\" (UID: \"b50ee53b-7d97-4c27-8a3e-58ce35cc34ba\") " pod="openshift-console/console-57ccf95dfd-njgpv"
Apr 20 20:08:16.650080 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:16.650050 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-console-serving-cert\") pod \"console-57ccf95dfd-njgpv\" (UID: \"b50ee53b-7d97-4c27-8a3e-58ce35cc34ba\") " pod="openshift-console/console-57ccf95dfd-njgpv"
Apr 20 20:08:16.751013 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:16.750981 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-console-serving-cert\") pod \"console-57ccf95dfd-njgpv\" (UID: \"b50ee53b-7d97-4c27-8a3e-58ce35cc34ba\") " pod="openshift-console/console-57ccf95dfd-njgpv"
Apr 20 20:08:16.751170 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:16.751032 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-oauth-serving-cert\") pod \"console-57ccf95dfd-njgpv\" (UID: \"b50ee53b-7d97-4c27-8a3e-58ce35cc34ba\") " pod="openshift-console/console-57ccf95dfd-njgpv"
Apr 20 20:08:16.751170 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:16.751053 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-console-oauth-config\") pod \"console-57ccf95dfd-njgpv\" (UID: \"b50ee53b-7d97-4c27-8a3e-58ce35cc34ba\") " pod="openshift-console/console-57ccf95dfd-njgpv"
Apr 20 20:08:16.751170 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:16.751091 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-service-ca\") pod \"console-57ccf95dfd-njgpv\" (UID: \"b50ee53b-7d97-4c27-8a3e-58ce35cc34ba\") " pod="openshift-console/console-57ccf95dfd-njgpv"
Apr 20 20:08:16.751170 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:16.751114 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-console-config\") pod \"console-57ccf95dfd-njgpv\" (UID: \"b50ee53b-7d97-4c27-8a3e-58ce35cc34ba\") " pod="openshift-console/console-57ccf95dfd-njgpv"
Apr 20 20:08:16.751361 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:16.751333 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qp8m7\" (UniqueName: \"kubernetes.io/projected/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-kube-api-access-qp8m7\") pod \"console-57ccf95dfd-njgpv\" (UID: \"b50ee53b-7d97-4c27-8a3e-58ce35cc34ba\") " pod="openshift-console/console-57ccf95dfd-njgpv"
Apr 20 20:08:16.751778 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:16.751754 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-oauth-serving-cert\") pod \"console-57ccf95dfd-njgpv\" (UID: \"b50ee53b-7d97-4c27-8a3e-58ce35cc34ba\") " pod="openshift-console/console-57ccf95dfd-njgpv"
Apr 20 20:08:16.751926 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:16.751871 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-service-ca\") pod \"console-57ccf95dfd-njgpv\" (UID: \"b50ee53b-7d97-4c27-8a3e-58ce35cc34ba\") " pod="openshift-console/console-57ccf95dfd-njgpv"
Apr 20 20:08:16.751926 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:16.751872 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-console-config\") pod \"console-57ccf95dfd-njgpv\" (UID: \"b50ee53b-7d97-4c27-8a3e-58ce35cc34ba\") " pod="openshift-console/console-57ccf95dfd-njgpv"
Apr 20 20:08:16.753409 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:16.753382 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-console-serving-cert\") pod \"console-57ccf95dfd-njgpv\" (UID: \"b50ee53b-7d97-4c27-8a3e-58ce35cc34ba\") " pod="openshift-console/console-57ccf95dfd-njgpv"
Apr 20 20:08:16.753503 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:16.753457 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-console-oauth-config\") pod \"console-57ccf95dfd-njgpv\" (UID: \"b50ee53b-7d97-4c27-8a3e-58ce35cc34ba\") " pod="openshift-console/console-57ccf95dfd-njgpv"
Apr 20 20:08:16.758803 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:16.758783 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp8m7\" (UniqueName: \"kubernetes.io/projected/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-kube-api-access-qp8m7\") pod \"console-57ccf95dfd-njgpv\" (UID: \"b50ee53b-7d97-4c27-8a3e-58ce35cc34ba\") " pod="openshift-console/console-57ccf95dfd-njgpv"
Apr 20 20:08:16.854545 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:16.854481 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57ccf95dfd-njgpv"
Apr 20 20:08:16.984429 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:16.984401 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57ccf95dfd-njgpv"]
Apr 20 20:08:16.987926 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:08:16.987840 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb50ee53b_7d97_4c27_8a3e_58ce35cc34ba.slice/crio-c464adf12abee1a372d53c02ae23d6421208d9c14c2f209a9035cdc8c3309a9e WatchSource:0}: Error finding container c464adf12abee1a372d53c02ae23d6421208d9c14c2f209a9035cdc8c3309a9e: Status 404 returned error can't find the container with id c464adf12abee1a372d53c02ae23d6421208d9c14c2f209a9035cdc8c3309a9e
Apr 20 20:08:17.710602 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:17.710564 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57ccf95dfd-njgpv" event={"ID":"b50ee53b-7d97-4c27-8a3e-58ce35cc34ba","Type":"ContainerStarted","Data":"c464adf12abee1a372d53c02ae23d6421208d9c14c2f209a9035cdc8c3309a9e"}
Apr 20 20:08:22.513147 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:22.513109 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-qwgxr_dfd612b2-5ec7-4a68-9cd6-29a94ae37e78/cluster-samples-operator/0.log"
Apr 20 20:08:22.520566 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:22.520539 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-qwgxr_dfd612b2-5ec7-4a68-9cd6-29a94ae37e78/cluster-samples-operator-watch/0.log"
Apr 20 20:08:27.273198 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:27.273155 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-67877d55c5-d2cnd"]
Apr 20 20:08:27.277923 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:27.277896 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-67877d55c5-d2cnd"
Apr 20 20:08:27.291213 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:27.291188 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 20 20:08:27.292663 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:27.292638 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67877d55c5-d2cnd"]
Apr 20 20:08:27.359090 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:27.359055 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58wcf\" (UniqueName: \"kubernetes.io/projected/9c1e48be-77e7-4c56-97df-9f0139870ab8-kube-api-access-58wcf\") pod \"console-67877d55c5-d2cnd\" (UID: \"9c1e48be-77e7-4c56-97df-9f0139870ab8\") " pod="openshift-console/console-67877d55c5-d2cnd"
Apr 20 20:08:27.359263 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:27.359172 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9c1e48be-77e7-4c56-97df-9f0139870ab8-service-ca\") pod \"console-67877d55c5-d2cnd\" (UID: \"9c1e48be-77e7-4c56-97df-9f0139870ab8\") " pod="openshift-console/console-67877d55c5-d2cnd"
Apr 20 20:08:27.359263 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:27.359217 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9c1e48be-77e7-4c56-97df-9f0139870ab8-console-config\") pod \"console-67877d55c5-d2cnd\" (UID: \"9c1e48be-77e7-4c56-97df-9f0139870ab8\") " pod="openshift-console/console-67877d55c5-d2cnd"
Apr 20 20:08:27.359263 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:27.359254 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9c1e48be-77e7-4c56-97df-9f0139870ab8-console-oauth-config\") pod \"console-67877d55c5-d2cnd\" (UID: \"9c1e48be-77e7-4c56-97df-9f0139870ab8\") " pod="openshift-console/console-67877d55c5-d2cnd"
Apr 20 20:08:27.359416 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:27.359289 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9c1e48be-77e7-4c56-97df-9f0139870ab8-oauth-serving-cert\") pod \"console-67877d55c5-d2cnd\" (UID: \"9c1e48be-77e7-4c56-97df-9f0139870ab8\") " pod="openshift-console/console-67877d55c5-d2cnd"
Apr 20 20:08:27.359416 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:27.359351 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c1e48be-77e7-4c56-97df-9f0139870ab8-trusted-ca-bundle\") pod \"console-67877d55c5-d2cnd\" (UID: \"9c1e48be-77e7-4c56-97df-9f0139870ab8\") " pod="openshift-console/console-67877d55c5-d2cnd"
Apr 20 20:08:27.359499 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:27.359409 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c1e48be-77e7-4c56-97df-9f0139870ab8-console-serving-cert\") pod \"console-67877d55c5-d2cnd\" (UID: \"9c1e48be-77e7-4c56-97df-9f0139870ab8\") " pod="openshift-console/console-67877d55c5-d2cnd"
Apr 20 20:08:27.460573 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:27.460462 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58wcf\" (UniqueName: \"kubernetes.io/projected/9c1e48be-77e7-4c56-97df-9f0139870ab8-kube-api-access-58wcf\") pod \"console-67877d55c5-d2cnd\" (UID: \"9c1e48be-77e7-4c56-97df-9f0139870ab8\") " pod="openshift-console/console-67877d55c5-d2cnd"
Apr 20 20:08:27.460757 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:27.460648 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9c1e48be-77e7-4c56-97df-9f0139870ab8-service-ca\") pod \"console-67877d55c5-d2cnd\" (UID: \"9c1e48be-77e7-4c56-97df-9f0139870ab8\") " pod="openshift-console/console-67877d55c5-d2cnd"
Apr 20 20:08:27.460757 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:27.460685 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9c1e48be-77e7-4c56-97df-9f0139870ab8-console-config\") pod \"console-67877d55c5-d2cnd\" (UID: \"9c1e48be-77e7-4c56-97df-9f0139870ab8\") " pod="openshift-console/console-67877d55c5-d2cnd"
Apr 20 20:08:27.460757 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:27.460724 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9c1e48be-77e7-4c56-97df-9f0139870ab8-console-oauth-config\") pod \"console-67877d55c5-d2cnd\" (UID: \"9c1e48be-77e7-4c56-97df-9f0139870ab8\") " pod="openshift-console/console-67877d55c5-d2cnd"
Apr 20 20:08:27.460948 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:27.460755 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9c1e48be-77e7-4c56-97df-9f0139870ab8-oauth-serving-cert\") pod \"console-67877d55c5-d2cnd\" (UID: \"9c1e48be-77e7-4c56-97df-9f0139870ab8\") " pod="openshift-console/console-67877d55c5-d2cnd"
Apr 20 20:08:27.460948 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:27.460820 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c1e48be-77e7-4c56-97df-9f0139870ab8-trusted-ca-bundle\") pod \"console-67877d55c5-d2cnd\" (UID: \"9c1e48be-77e7-4c56-97df-9f0139870ab8\") "
pod="openshift-console/console-67877d55c5-d2cnd" Apr 20 20:08:27.460948 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:27.460891 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c1e48be-77e7-4c56-97df-9f0139870ab8-console-serving-cert\") pod \"console-67877d55c5-d2cnd\" (UID: \"9c1e48be-77e7-4c56-97df-9f0139870ab8\") " pod="openshift-console/console-67877d55c5-d2cnd" Apr 20 20:08:27.461664 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:27.461635 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9c1e48be-77e7-4c56-97df-9f0139870ab8-oauth-serving-cert\") pod \"console-67877d55c5-d2cnd\" (UID: \"9c1e48be-77e7-4c56-97df-9f0139870ab8\") " pod="openshift-console/console-67877d55c5-d2cnd" Apr 20 20:08:27.461848 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:27.461773 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c1e48be-77e7-4c56-97df-9f0139870ab8-trusted-ca-bundle\") pod \"console-67877d55c5-d2cnd\" (UID: \"9c1e48be-77e7-4c56-97df-9f0139870ab8\") " pod="openshift-console/console-67877d55c5-d2cnd" Apr 20 20:08:27.461848 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:27.461777 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9c1e48be-77e7-4c56-97df-9f0139870ab8-console-config\") pod \"console-67877d55c5-d2cnd\" (UID: \"9c1e48be-77e7-4c56-97df-9f0139870ab8\") " pod="openshift-console/console-67877d55c5-d2cnd" Apr 20 20:08:27.462035 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:27.462002 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9c1e48be-77e7-4c56-97df-9f0139870ab8-service-ca\") pod \"console-67877d55c5-d2cnd\" (UID: 
\"9c1e48be-77e7-4c56-97df-9f0139870ab8\") " pod="openshift-console/console-67877d55c5-d2cnd" Apr 20 20:08:27.463748 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:27.463724 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c1e48be-77e7-4c56-97df-9f0139870ab8-console-serving-cert\") pod \"console-67877d55c5-d2cnd\" (UID: \"9c1e48be-77e7-4c56-97df-9f0139870ab8\") " pod="openshift-console/console-67877d55c5-d2cnd" Apr 20 20:08:27.463923 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:27.463905 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9c1e48be-77e7-4c56-97df-9f0139870ab8-console-oauth-config\") pod \"console-67877d55c5-d2cnd\" (UID: \"9c1e48be-77e7-4c56-97df-9f0139870ab8\") " pod="openshift-console/console-67877d55c5-d2cnd" Apr 20 20:08:27.470687 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:27.470663 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-58wcf\" (UniqueName: \"kubernetes.io/projected/9c1e48be-77e7-4c56-97df-9f0139870ab8-kube-api-access-58wcf\") pod \"console-67877d55c5-d2cnd\" (UID: \"9c1e48be-77e7-4c56-97df-9f0139870ab8\") " pod="openshift-console/console-67877d55c5-d2cnd" Apr 20 20:08:27.589373 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:27.589334 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67877d55c5-d2cnd" Apr 20 20:08:27.741839 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:27.741794 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57ccf95dfd-njgpv" event={"ID":"b50ee53b-7d97-4c27-8a3e-58ce35cc34ba","Type":"ContainerStarted","Data":"ee903a8a0376bff277a7ed9b843a11592683cc96fe3471fc8630fae527ae1115"} Apr 20 20:08:27.743472 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:27.743442 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-77c6w" event={"ID":"f0e0866b-98b5-43bb-811b-a80d0fe3e428","Type":"ContainerStarted","Data":"dbad9b72f3f96dfc48c7e0bf70eeadf31d6c8da808ba401dea077c842642f61a"} Apr 20 20:08:27.743699 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:27.743665 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-77c6w" Apr 20 20:08:27.749416 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:27.749393 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67877d55c5-d2cnd"] Apr 20 20:08:27.752664 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:08:27.752637 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c1e48be_77e7_4c56_97df_9f0139870ab8.slice/crio-c1730c428532539c36aa6a78bca3c985543069c49c0623e4aff2f9fa7c203c9a WatchSource:0}: Error finding container c1730c428532539c36aa6a78bca3c985543069c49c0623e4aff2f9fa7c203c9a: Status 404 returned error can't find the container with id c1730c428532539c36aa6a78bca3c985543069c49c0623e4aff2f9fa7c203c9a Apr 20 20:08:27.756010 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:27.755971 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-77c6w" Apr 20 20:08:27.773546 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:27.773391 2573 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-57ccf95dfd-njgpv" podStartSLOduration=2.071286928 podStartE2EDuration="11.773373778s" podCreationTimestamp="2026-04-20 20:08:16 +0000 UTC" firstStartedPulling="2026-04-20 20:08:16.990269567 +0000 UTC m=+179.405562289" lastFinishedPulling="2026-04-20 20:08:26.692356416 +0000 UTC m=+189.107649139" observedRunningTime="2026-04-20 20:08:27.77244042 +0000 UTC m=+190.187733174" watchObservedRunningTime="2026-04-20 20:08:27.773373778 +0000 UTC m=+190.188666522" Apr 20 20:08:27.798792 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:27.798635 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-77c6w" podStartSLOduration=2.06554226 podStartE2EDuration="18.798615947s" podCreationTimestamp="2026-04-20 20:08:09 +0000 UTC" firstStartedPulling="2026-04-20 20:08:09.998578346 +0000 UTC m=+172.413871066" lastFinishedPulling="2026-04-20 20:08:26.731652032 +0000 UTC m=+189.146944753" observedRunningTime="2026-04-20 20:08:27.797392794 +0000 UTC m=+190.212685536" watchObservedRunningTime="2026-04-20 20:08:27.798615947 +0000 UTC m=+190.213908690" Apr 20 20:08:28.748941 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:28.748893 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67877d55c5-d2cnd" event={"ID":"9c1e48be-77e7-4c56-97df-9f0139870ab8","Type":"ContainerStarted","Data":"47f2c7023ba7661fbc234ee9181e44f86801292f0619b2b0f69aa0aaae3ba467"} Apr 20 20:08:28.748941 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:28.748952 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67877d55c5-d2cnd" event={"ID":"9c1e48be-77e7-4c56-97df-9f0139870ab8","Type":"ContainerStarted","Data":"c1730c428532539c36aa6a78bca3c985543069c49c0623e4aff2f9fa7c203c9a"} Apr 20 20:08:28.774481 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:28.774425 2573 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-67877d55c5-d2cnd" podStartSLOduration=1.774408518 podStartE2EDuration="1.774408518s" podCreationTimestamp="2026-04-20 20:08:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:08:28.771955201 +0000 UTC m=+191.187247956" watchObservedRunningTime="2026-04-20 20:08:28.774408518 +0000 UTC m=+191.189701263" Apr 20 20:08:36.855590 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:36.855536 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-57ccf95dfd-njgpv" Apr 20 20:08:36.856371 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:36.855674 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-57ccf95dfd-njgpv" Apr 20 20:08:36.869991 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:36.869965 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-57ccf95dfd-njgpv" Apr 20 20:08:37.590317 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:37.590287 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-67877d55c5-d2cnd" Apr 20 20:08:37.590497 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:37.590365 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-67877d55c5-d2cnd" Apr 20 20:08:37.594895 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:37.594876 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-67877d55c5-d2cnd" Apr 20 20:08:37.782650 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:37.782621 2573 generic.go:358] "Generic (PLEG): container finished" podID="06eaab54-dbb4-4d67-8fc9-d22d03b1e5a8" 
containerID="60eb11c9c90683d16f71df52ee3f36b9010a8d3b5363098677ca552ce21ed3c2" exitCode=0 Apr 20 20:08:37.782797 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:37.782693 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mpjng" event={"ID":"06eaab54-dbb4-4d67-8fc9-d22d03b1e5a8","Type":"ContainerDied","Data":"60eb11c9c90683d16f71df52ee3f36b9010a8d3b5363098677ca552ce21ed3c2"} Apr 20 20:08:37.783008 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:37.782988 2573 scope.go:117] "RemoveContainer" containerID="60eb11c9c90683d16f71df52ee3f36b9010a8d3b5363098677ca552ce21ed3c2" Apr 20 20:08:37.786755 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:37.786734 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-57ccf95dfd-njgpv" Apr 20 20:08:37.787269 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:37.787253 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-67877d55c5-d2cnd" Apr 20 20:08:37.897225 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:37.897192 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-57ccf95dfd-njgpv"] Apr 20 20:08:38.787337 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:38.787305 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mpjng" event={"ID":"06eaab54-dbb4-4d67-8fc9-d22d03b1e5a8","Type":"ContainerStarted","Data":"7a6f5f7bc2a69505e4b1dd221f03045da3a04283ef2b36ed59119b7cc8a78394"} Apr 20 20:08:42.800596 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:42.800559 2573 generic.go:358] "Generic (PLEG): container finished" podID="51239262-468e-4240-9144-dfb1b3010a21" containerID="3dcb9190fc65a8d0623b061dd317350c10d51bb1555acb959133f4d7e977bc36" exitCode=0 Apr 20 20:08:42.801048 ip-10-0-130-227 
kubenswrapper[2573]: I0420 20:08:42.800611 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zl54z" event={"ID":"51239262-468e-4240-9144-dfb1b3010a21","Type":"ContainerDied","Data":"3dcb9190fc65a8d0623b061dd317350c10d51bb1555acb959133f4d7e977bc36"} Apr 20 20:08:42.801048 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:42.801021 2573 scope.go:117] "RemoveContainer" containerID="3dcb9190fc65a8d0623b061dd317350c10d51bb1555acb959133f4d7e977bc36" Apr 20 20:08:43.805229 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:08:43.805197 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zl54z" event={"ID":"51239262-468e-4240-9144-dfb1b3010a21","Type":"ContainerStarted","Data":"2c5f4da4237ad31ecee4255a5c6bb4193fd16d0d5f2ccc3f3098a60d7a0107fd"} Apr 20 20:09:04.808989 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:04.808927 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-57ccf95dfd-njgpv" podUID="b50ee53b-7d97-4c27-8a3e-58ce35cc34ba" containerName="console" containerID="cri-o://ee903a8a0376bff277a7ed9b843a11592683cc96fe3471fc8630fae527ae1115" gracePeriod=15 Apr 20 20:09:05.071768 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:05.071747 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-57ccf95dfd-njgpv_b50ee53b-7d97-4c27-8a3e-58ce35cc34ba/console/0.log" Apr 20 20:09:05.071911 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:05.071817 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-57ccf95dfd-njgpv" Apr 20 20:09:05.183071 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:05.183046 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp8m7\" (UniqueName: \"kubernetes.io/projected/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-kube-api-access-qp8m7\") pod \"b50ee53b-7d97-4c27-8a3e-58ce35cc34ba\" (UID: \"b50ee53b-7d97-4c27-8a3e-58ce35cc34ba\") " Apr 20 20:09:05.183186 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:05.183103 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-service-ca\") pod \"b50ee53b-7d97-4c27-8a3e-58ce35cc34ba\" (UID: \"b50ee53b-7d97-4c27-8a3e-58ce35cc34ba\") " Apr 20 20:09:05.183186 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:05.183133 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-console-serving-cert\") pod \"b50ee53b-7d97-4c27-8a3e-58ce35cc34ba\" (UID: \"b50ee53b-7d97-4c27-8a3e-58ce35cc34ba\") " Apr 20 20:09:05.183186 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:05.183153 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-oauth-serving-cert\") pod \"b50ee53b-7d97-4c27-8a3e-58ce35cc34ba\" (UID: \"b50ee53b-7d97-4c27-8a3e-58ce35cc34ba\") " Apr 20 20:09:05.183186 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:05.183179 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-console-config\") pod \"b50ee53b-7d97-4c27-8a3e-58ce35cc34ba\" (UID: \"b50ee53b-7d97-4c27-8a3e-58ce35cc34ba\") " Apr 20 20:09:05.183407 
ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:05.183245 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-console-oauth-config\") pod \"b50ee53b-7d97-4c27-8a3e-58ce35cc34ba\" (UID: \"b50ee53b-7d97-4c27-8a3e-58ce35cc34ba\") " Apr 20 20:09:05.183528 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:05.183481 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-service-ca" (OuterVolumeSpecName: "service-ca") pod "b50ee53b-7d97-4c27-8a3e-58ce35cc34ba" (UID: "b50ee53b-7d97-4c27-8a3e-58ce35cc34ba"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:05.183582 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:05.183557 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b50ee53b-7d97-4c27-8a3e-58ce35cc34ba" (UID: "b50ee53b-7d97-4c27-8a3e-58ce35cc34ba"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:05.183616 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:05.183585 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-console-config" (OuterVolumeSpecName: "console-config") pod "b50ee53b-7d97-4c27-8a3e-58ce35cc34ba" (UID: "b50ee53b-7d97-4c27-8a3e-58ce35cc34ba"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:05.185344 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:05.185314 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b50ee53b-7d97-4c27-8a3e-58ce35cc34ba" (UID: "b50ee53b-7d97-4c27-8a3e-58ce35cc34ba"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:05.185438 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:05.185366 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-kube-api-access-qp8m7" (OuterVolumeSpecName: "kube-api-access-qp8m7") pod "b50ee53b-7d97-4c27-8a3e-58ce35cc34ba" (UID: "b50ee53b-7d97-4c27-8a3e-58ce35cc34ba"). InnerVolumeSpecName "kube-api-access-qp8m7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:09:05.185438 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:05.185371 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b50ee53b-7d97-4c27-8a3e-58ce35cc34ba" (UID: "b50ee53b-7d97-4c27-8a3e-58ce35cc34ba"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:05.283979 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:05.283955 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-service-ca\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:09:05.283979 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:05.283977 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-console-serving-cert\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:09:05.284120 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:05.283987 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-oauth-serving-cert\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:09:05.284120 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:05.283998 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-console-config\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:09:05.284120 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:05.284006 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-console-oauth-config\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:09:05.284120 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:05.284015 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qp8m7\" (UniqueName: \"kubernetes.io/projected/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba-kube-api-access-qp8m7\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:09:05.872545 ip-10-0-130-227 
kubenswrapper[2573]: I0420 20:09:05.872520 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-57ccf95dfd-njgpv_b50ee53b-7d97-4c27-8a3e-58ce35cc34ba/console/0.log" Apr 20 20:09:05.873053 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:05.872557 2573 generic.go:358] "Generic (PLEG): container finished" podID="b50ee53b-7d97-4c27-8a3e-58ce35cc34ba" containerID="ee903a8a0376bff277a7ed9b843a11592683cc96fe3471fc8630fae527ae1115" exitCode=2 Apr 20 20:09:05.873053 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:05.872587 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57ccf95dfd-njgpv" event={"ID":"b50ee53b-7d97-4c27-8a3e-58ce35cc34ba","Type":"ContainerDied","Data":"ee903a8a0376bff277a7ed9b843a11592683cc96fe3471fc8630fae527ae1115"} Apr 20 20:09:05.873053 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:05.872629 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57ccf95dfd-njgpv" event={"ID":"b50ee53b-7d97-4c27-8a3e-58ce35cc34ba","Type":"ContainerDied","Data":"c464adf12abee1a372d53c02ae23d6421208d9c14c2f209a9035cdc8c3309a9e"} Apr 20 20:09:05.873053 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:05.872636 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-57ccf95dfd-njgpv" Apr 20 20:09:05.873053 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:05.872646 2573 scope.go:117] "RemoveContainer" containerID="ee903a8a0376bff277a7ed9b843a11592683cc96fe3471fc8630fae527ae1115" Apr 20 20:09:05.881145 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:05.881126 2573 scope.go:117] "RemoveContainer" containerID="ee903a8a0376bff277a7ed9b843a11592683cc96fe3471fc8630fae527ae1115" Apr 20 20:09:05.881407 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:09:05.881385 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee903a8a0376bff277a7ed9b843a11592683cc96fe3471fc8630fae527ae1115\": container with ID starting with ee903a8a0376bff277a7ed9b843a11592683cc96fe3471fc8630fae527ae1115 not found: ID does not exist" containerID="ee903a8a0376bff277a7ed9b843a11592683cc96fe3471fc8630fae527ae1115" Apr 20 20:09:05.881456 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:05.881414 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee903a8a0376bff277a7ed9b843a11592683cc96fe3471fc8630fae527ae1115"} err="failed to get container status \"ee903a8a0376bff277a7ed9b843a11592683cc96fe3471fc8630fae527ae1115\": rpc error: code = NotFound desc = could not find container \"ee903a8a0376bff277a7ed9b843a11592683cc96fe3471fc8630fae527ae1115\": container with ID starting with ee903a8a0376bff277a7ed9b843a11592683cc96fe3471fc8630fae527ae1115 not found: ID does not exist" Apr 20 20:09:05.892922 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:05.892903 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-57ccf95dfd-njgpv"] Apr 20 20:09:05.896320 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:05.896300 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-57ccf95dfd-njgpv"] Apr 20 20:09:06.171338 ip-10-0-130-227 kubenswrapper[2573]: 
I0420 20:09:06.171268 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b50ee53b-7d97-4c27-8a3e-58ce35cc34ba" path="/var/lib/kubelet/pods/b50ee53b-7d97-4c27-8a3e-58ce35cc34ba/volumes" Apr 20 20:09:17.405525 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:17.405488 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 20:09:17.405933 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:17.405886 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="77bfa676-1f7d-43e7-ac4a-280f6e507c80" containerName="alertmanager" containerID="cri-o://57a58c42deef08889c0f64c255dec7cde51db5b7da08909388991d0c586409a2" gracePeriod=120 Apr 20 20:09:17.405999 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:17.405949 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="77bfa676-1f7d-43e7-ac4a-280f6e507c80" containerName="kube-rbac-proxy-metric" containerID="cri-o://5a0ab59b0723d106e21c06d480057a2f61d48b11fef5a638d4775acf5e10c594" gracePeriod=120 Apr 20 20:09:17.406054 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:17.406005 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="77bfa676-1f7d-43e7-ac4a-280f6e507c80" containerName="kube-rbac-proxy" containerID="cri-o://6a65ccab9aaf326bab1421e96d037da631ffcc45897cf11c0fa14f5f148478d4" gracePeriod=120 Apr 20 20:09:17.406136 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:17.406030 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="77bfa676-1f7d-43e7-ac4a-280f6e507c80" containerName="config-reloader" containerID="cri-o://a0d01c7244004ca80dfe2d60d63a5c8c44a69aa992e72eb3bd0371e0f28adda9" gracePeriod=120 Apr 20 20:09:17.406205 ip-10-0-130-227 kubenswrapper[2573]: 
I0420 20:09:17.406014 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="77bfa676-1f7d-43e7-ac4a-280f6e507c80" containerName="prom-label-proxy" containerID="cri-o://7f39626ab2240c1782f612a7f5be4d43cbb7b60f7a5fafd86edb034555120daf" gracePeriod=120 Apr 20 20:09:17.406205 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:17.406045 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="77bfa676-1f7d-43e7-ac4a-280f6e507c80" containerName="kube-rbac-proxy-web" containerID="cri-o://9a80d73a3c076c512db1929090438926c7a76e493f8ddc56eec99020dba216b2" gracePeriod=120 Apr 20 20:09:17.909464 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:17.909434 2573 generic.go:358] "Generic (PLEG): container finished" podID="77bfa676-1f7d-43e7-ac4a-280f6e507c80" containerID="7f39626ab2240c1782f612a7f5be4d43cbb7b60f7a5fafd86edb034555120daf" exitCode=0 Apr 20 20:09:17.909464 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:17.909462 2573 generic.go:358] "Generic (PLEG): container finished" podID="77bfa676-1f7d-43e7-ac4a-280f6e507c80" containerID="5a0ab59b0723d106e21c06d480057a2f61d48b11fef5a638d4775acf5e10c594" exitCode=0 Apr 20 20:09:17.909464 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:17.909471 2573 generic.go:358] "Generic (PLEG): container finished" podID="77bfa676-1f7d-43e7-ac4a-280f6e507c80" containerID="6a65ccab9aaf326bab1421e96d037da631ffcc45897cf11c0fa14f5f148478d4" exitCode=0 Apr 20 20:09:17.909690 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:17.909478 2573 generic.go:358] "Generic (PLEG): container finished" podID="77bfa676-1f7d-43e7-ac4a-280f6e507c80" containerID="a0d01c7244004ca80dfe2d60d63a5c8c44a69aa992e72eb3bd0371e0f28adda9" exitCode=0 Apr 20 20:09:17.909690 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:17.909486 2573 generic.go:358] "Generic (PLEG): container finished" podID="77bfa676-1f7d-43e7-ac4a-280f6e507c80" 
containerID="57a58c42deef08889c0f64c255dec7cde51db5b7da08909388991d0c586409a2" exitCode=0 Apr 20 20:09:17.909690 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:17.909502 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"77bfa676-1f7d-43e7-ac4a-280f6e507c80","Type":"ContainerDied","Data":"7f39626ab2240c1782f612a7f5be4d43cbb7b60f7a5fafd86edb034555120daf"} Apr 20 20:09:17.909690 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:17.909538 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"77bfa676-1f7d-43e7-ac4a-280f6e507c80","Type":"ContainerDied","Data":"5a0ab59b0723d106e21c06d480057a2f61d48b11fef5a638d4775acf5e10c594"} Apr 20 20:09:17.909690 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:17.909548 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"77bfa676-1f7d-43e7-ac4a-280f6e507c80","Type":"ContainerDied","Data":"6a65ccab9aaf326bab1421e96d037da631ffcc45897cf11c0fa14f5f148478d4"} Apr 20 20:09:17.909690 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:17.909558 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"77bfa676-1f7d-43e7-ac4a-280f6e507c80","Type":"ContainerDied","Data":"a0d01c7244004ca80dfe2d60d63a5c8c44a69aa992e72eb3bd0371e0f28adda9"} Apr 20 20:09:17.909690 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:17.909567 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"77bfa676-1f7d-43e7-ac4a-280f6e507c80","Type":"ContainerDied","Data":"57a58c42deef08889c0f64c255dec7cde51db5b7da08909388991d0c586409a2"} Apr 20 20:09:18.652509 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.652486 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:18.694098 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.693919 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6696669cb6-cdj9t"] Apr 20 20:09:18.694407 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.694335 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77bfa676-1f7d-43e7-ac4a-280f6e507c80" containerName="prom-label-proxy" Apr 20 20:09:18.694407 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.694372 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="77bfa676-1f7d-43e7-ac4a-280f6e507c80" containerName="prom-label-proxy" Apr 20 20:09:18.694407 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.694390 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77bfa676-1f7d-43e7-ac4a-280f6e507c80" containerName="kube-rbac-proxy-web" Apr 20 20:09:18.694407 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.694400 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="77bfa676-1f7d-43e7-ac4a-280f6e507c80" containerName="kube-rbac-proxy-web" Apr 20 20:09:18.694612 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.694412 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77bfa676-1f7d-43e7-ac4a-280f6e507c80" containerName="kube-rbac-proxy" Apr 20 20:09:18.694612 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.694421 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="77bfa676-1f7d-43e7-ac4a-280f6e507c80" containerName="kube-rbac-proxy" Apr 20 20:09:18.694612 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.694435 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77bfa676-1f7d-43e7-ac4a-280f6e507c80" containerName="init-config-reloader" Apr 20 20:09:18.694612 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.694444 2573 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="77bfa676-1f7d-43e7-ac4a-280f6e507c80" containerName="init-config-reloader" Apr 20 20:09:18.694612 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.694455 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77bfa676-1f7d-43e7-ac4a-280f6e507c80" containerName="kube-rbac-proxy-metric" Apr 20 20:09:18.694612 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.694464 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="77bfa676-1f7d-43e7-ac4a-280f6e507c80" containerName="kube-rbac-proxy-metric" Apr 20 20:09:18.694612 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.694477 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77bfa676-1f7d-43e7-ac4a-280f6e507c80" containerName="alertmanager" Apr 20 20:09:18.694612 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.694485 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="77bfa676-1f7d-43e7-ac4a-280f6e507c80" containerName="alertmanager" Apr 20 20:09:18.694612 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.694497 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b50ee53b-7d97-4c27-8a3e-58ce35cc34ba" containerName="console" Apr 20 20:09:18.694612 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.694506 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b50ee53b-7d97-4c27-8a3e-58ce35cc34ba" containerName="console" Apr 20 20:09:18.694612 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.694522 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="77bfa676-1f7d-43e7-ac4a-280f6e507c80" containerName="config-reloader" Apr 20 20:09:18.694612 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.694530 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="77bfa676-1f7d-43e7-ac4a-280f6e507c80" containerName="config-reloader" Apr 20 20:09:18.694612 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.694605 2573 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="77bfa676-1f7d-43e7-ac4a-280f6e507c80" containerName="kube-rbac-proxy" Apr 20 20:09:18.694612 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.694617 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="77bfa676-1f7d-43e7-ac4a-280f6e507c80" containerName="prom-label-proxy" Apr 20 20:09:18.695159 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.694630 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="77bfa676-1f7d-43e7-ac4a-280f6e507c80" containerName="alertmanager" Apr 20 20:09:18.695159 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.694639 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="77bfa676-1f7d-43e7-ac4a-280f6e507c80" containerName="kube-rbac-proxy-web" Apr 20 20:09:18.695159 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.694649 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="77bfa676-1f7d-43e7-ac4a-280f6e507c80" containerName="kube-rbac-proxy-metric" Apr 20 20:09:18.695159 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.694660 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="77bfa676-1f7d-43e7-ac4a-280f6e507c80" containerName="config-reloader" Apr 20 20:09:18.695159 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.694671 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b50ee53b-7d97-4c27-8a3e-58ce35cc34ba" containerName="console" Apr 20 20:09:18.698145 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.698118 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6696669cb6-cdj9t" Apr 20 20:09:18.708077 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.708053 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6696669cb6-cdj9t"] Apr 20 20:09:18.788596 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.788569 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-config-volume\") pod \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " Apr 20 20:09:18.788761 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.788605 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/77bfa676-1f7d-43e7-ac4a-280f6e507c80-tls-assets\") pod \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " Apr 20 20:09:18.788761 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.788630 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-secret-alertmanager-kube-rbac-proxy\") pod \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " Apr 20 20:09:18.788761 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.788659 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77bfa676-1f7d-43e7-ac4a-280f6e507c80-alertmanager-trusted-ca-bundle\") pod \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " Apr 20 20:09:18.788761 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.788706 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" 
(UniqueName: \"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-cluster-tls-config\") pod \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " Apr 20 20:09:18.788761 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.788727 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdq5m\" (UniqueName: \"kubernetes.io/projected/77bfa676-1f7d-43e7-ac4a-280f6e507c80-kube-api-access-tdq5m\") pod \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " Apr 20 20:09:18.788761 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.788756 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-secret-alertmanager-main-tls\") pod \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " Apr 20 20:09:18.789103 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.788793 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/77bfa676-1f7d-43e7-ac4a-280f6e507c80-config-out\") pod \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " Apr 20 20:09:18.789103 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.788838 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-secret-alertmanager-kube-rbac-proxy-web\") pod \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " Apr 20 20:09:18.789103 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.788886 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/77bfa676-1f7d-43e7-ac4a-280f6e507c80-alertmanager-main-db\") pod \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " Apr 20 20:09:18.789103 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.788930 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/77bfa676-1f7d-43e7-ac4a-280f6e507c80-metrics-client-ca\") pod \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " Apr 20 20:09:18.789103 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.788959 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-secret-alertmanager-kube-rbac-proxy-metric\") pod \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " Apr 20 20:09:18.789103 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.788990 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-web-config\") pod \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\" (UID: \"77bfa676-1f7d-43e7-ac4a-280f6e507c80\") " Apr 20 20:09:18.789404 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.789116 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-trusted-ca-bundle\") pod \"console-6696669cb6-cdj9t\" (UID: \"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\") " pod="openshift-console/console-6696669cb6-cdj9t" Apr 20 20:09:18.789404 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.789135 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/77bfa676-1f7d-43e7-ac4a-280f6e507c80-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "77bfa676-1f7d-43e7-ac4a-280f6e507c80" (UID: "77bfa676-1f7d-43e7-ac4a-280f6e507c80"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:18.789404 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.789153 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-console-oauth-config\") pod \"console-6696669cb6-cdj9t\" (UID: \"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\") " pod="openshift-console/console-6696669cb6-cdj9t" Apr 20 20:09:18.789404 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.789272 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-console-config\") pod \"console-6696669cb6-cdj9t\" (UID: \"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\") " pod="openshift-console/console-6696669cb6-cdj9t" Apr 20 20:09:18.789404 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.789317 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-service-ca\") pod \"console-6696669cb6-cdj9t\" (UID: \"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\") " pod="openshift-console/console-6696669cb6-cdj9t" Apr 20 20:09:18.789404 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.789343 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dxq7\" (UniqueName: \"kubernetes.io/projected/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-kube-api-access-8dxq7\") pod \"console-6696669cb6-cdj9t\" (UID: 
\"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\") " pod="openshift-console/console-6696669cb6-cdj9t" Apr 20 20:09:18.789404 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.789387 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-console-serving-cert\") pod \"console-6696669cb6-cdj9t\" (UID: \"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\") " pod="openshift-console/console-6696669cb6-cdj9t" Apr 20 20:09:18.789759 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.789412 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-oauth-serving-cert\") pod \"console-6696669cb6-cdj9t\" (UID: \"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\") " pod="openshift-console/console-6696669cb6-cdj9t" Apr 20 20:09:18.789759 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.789495 2573 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77bfa676-1f7d-43e7-ac4a-280f6e507c80-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:09:18.789759 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.789497 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77bfa676-1f7d-43e7-ac4a-280f6e507c80-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "77bfa676-1f7d-43e7-ac4a-280f6e507c80" (UID: "77bfa676-1f7d-43e7-ac4a-280f6e507c80"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:18.790167 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.790131 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77bfa676-1f7d-43e7-ac4a-280f6e507c80-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "77bfa676-1f7d-43e7-ac4a-280f6e507c80" (UID: "77bfa676-1f7d-43e7-ac4a-280f6e507c80"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:09:18.791795 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.791759 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "77bfa676-1f7d-43e7-ac4a-280f6e507c80" (UID: "77bfa676-1f7d-43e7-ac4a-280f6e507c80"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:18.792147 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.792115 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "77bfa676-1f7d-43e7-ac4a-280f6e507c80" (UID: "77bfa676-1f7d-43e7-ac4a-280f6e507c80"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:18.792147 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.792135 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "77bfa676-1f7d-43e7-ac4a-280f6e507c80" (UID: "77bfa676-1f7d-43e7-ac4a-280f6e507c80"). 
InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:18.792426 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.792397 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77bfa676-1f7d-43e7-ac4a-280f6e507c80-config-out" (OuterVolumeSpecName: "config-out") pod "77bfa676-1f7d-43e7-ac4a-280f6e507c80" (UID: "77bfa676-1f7d-43e7-ac4a-280f6e507c80"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:09:18.792606 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.792580 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-config-volume" (OuterVolumeSpecName: "config-volume") pod "77bfa676-1f7d-43e7-ac4a-280f6e507c80" (UID: "77bfa676-1f7d-43e7-ac4a-280f6e507c80"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:18.792980 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.792956 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "77bfa676-1f7d-43e7-ac4a-280f6e507c80" (UID: "77bfa676-1f7d-43e7-ac4a-280f6e507c80"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:18.793242 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.793217 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77bfa676-1f7d-43e7-ac4a-280f6e507c80-kube-api-access-tdq5m" (OuterVolumeSpecName: "kube-api-access-tdq5m") pod "77bfa676-1f7d-43e7-ac4a-280f6e507c80" (UID: "77bfa676-1f7d-43e7-ac4a-280f6e507c80"). InnerVolumeSpecName "kube-api-access-tdq5m". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:09:18.793423 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.793399 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77bfa676-1f7d-43e7-ac4a-280f6e507c80-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "77bfa676-1f7d-43e7-ac4a-280f6e507c80" (UID: "77bfa676-1f7d-43e7-ac4a-280f6e507c80"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:09:18.795988 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.795965 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "77bfa676-1f7d-43e7-ac4a-280f6e507c80" (UID: "77bfa676-1f7d-43e7-ac4a-280f6e507c80"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:18.802795 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.802770 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-web-config" (OuterVolumeSpecName: "web-config") pod "77bfa676-1f7d-43e7-ac4a-280f6e507c80" (UID: "77bfa676-1f7d-43e7-ac4a-280f6e507c80"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:18.890367 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.890329 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-console-oauth-config\") pod \"console-6696669cb6-cdj9t\" (UID: \"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\") " pod="openshift-console/console-6696669cb6-cdj9t" Apr 20 20:09:18.890533 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.890403 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-console-config\") pod \"console-6696669cb6-cdj9t\" (UID: \"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\") " pod="openshift-console/console-6696669cb6-cdj9t" Apr 20 20:09:18.890533 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.890428 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-service-ca\") pod \"console-6696669cb6-cdj9t\" (UID: \"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\") " pod="openshift-console/console-6696669cb6-cdj9t" Apr 20 20:09:18.890533 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.890444 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8dxq7\" (UniqueName: \"kubernetes.io/projected/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-kube-api-access-8dxq7\") pod \"console-6696669cb6-cdj9t\" (UID: \"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\") " pod="openshift-console/console-6696669cb6-cdj9t" Apr 20 20:09:18.890533 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.890468 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-console-serving-cert\") pod 
\"console-6696669cb6-cdj9t\" (UID: \"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\") " pod="openshift-console/console-6696669cb6-cdj9t" Apr 20 20:09:18.890533 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.890490 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-oauth-serving-cert\") pod \"console-6696669cb6-cdj9t\" (UID: \"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\") " pod="openshift-console/console-6696669cb6-cdj9t" Apr 20 20:09:18.890789 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.890538 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-trusted-ca-bundle\") pod \"console-6696669cb6-cdj9t\" (UID: \"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\") " pod="openshift-console/console-6696669cb6-cdj9t" Apr 20 20:09:18.890789 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.890576 2573 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-config-volume\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:09:18.890789 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.890590 2573 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/77bfa676-1f7d-43e7-ac4a-280f6e507c80-tls-assets\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:09:18.890789 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.890605 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:09:18.890789 ip-10-0-130-227 kubenswrapper[2573]: I0420 
20:09:18.890621 2573 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-cluster-tls-config\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:09:18.890789 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.890634 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tdq5m\" (UniqueName: \"kubernetes.io/projected/77bfa676-1f7d-43e7-ac4a-280f6e507c80-kube-api-access-tdq5m\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:09:18.890789 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.890648 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-secret-alertmanager-main-tls\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:09:18.890789 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.890661 2573 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/77bfa676-1f7d-43e7-ac4a-280f6e507c80-config-out\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:09:18.890789 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.890675 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:09:18.890789 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.890688 2573 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/77bfa676-1f7d-43e7-ac4a-280f6e507c80-alertmanager-main-db\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:09:18.890789 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.890702 2573 
reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/77bfa676-1f7d-43e7-ac4a-280f6e507c80-metrics-client-ca\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:09:18.890789 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.890719 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:09:18.890789 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.890733 2573 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/77bfa676-1f7d-43e7-ac4a-280f6e507c80-web-config\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:09:18.891459 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.891275 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-console-config\") pod \"console-6696669cb6-cdj9t\" (UID: \"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\") " pod="openshift-console/console-6696669cb6-cdj9t" Apr 20 20:09:18.891459 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.891340 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-oauth-serving-cert\") pod \"console-6696669cb6-cdj9t\" (UID: \"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\") " pod="openshift-console/console-6696669cb6-cdj9t" Apr 20 20:09:18.891459 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.891370 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-service-ca\") pod \"console-6696669cb6-cdj9t\" 
(UID: \"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\") " pod="openshift-console/console-6696669cb6-cdj9t" Apr 20 20:09:18.891614 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.891597 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-trusted-ca-bundle\") pod \"console-6696669cb6-cdj9t\" (UID: \"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\") " pod="openshift-console/console-6696669cb6-cdj9t" Apr 20 20:09:18.892776 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.892758 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-console-oauth-config\") pod \"console-6696669cb6-cdj9t\" (UID: \"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\") " pod="openshift-console/console-6696669cb6-cdj9t" Apr 20 20:09:18.892918 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.892900 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-console-serving-cert\") pod \"console-6696669cb6-cdj9t\" (UID: \"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\") " pod="openshift-console/console-6696669cb6-cdj9t" Apr 20 20:09:18.899300 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.899280 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dxq7\" (UniqueName: \"kubernetes.io/projected/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-kube-api-access-8dxq7\") pod \"console-6696669cb6-cdj9t\" (UID: \"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\") " pod="openshift-console/console-6696669cb6-cdj9t" Apr 20 20:09:18.914807 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.914780 2573 generic.go:358] "Generic (PLEG): container finished" podID="77bfa676-1f7d-43e7-ac4a-280f6e507c80" 
containerID="9a80d73a3c076c512db1929090438926c7a76e493f8ddc56eec99020dba216b2" exitCode=0 Apr 20 20:09:18.914920 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.914824 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"77bfa676-1f7d-43e7-ac4a-280f6e507c80","Type":"ContainerDied","Data":"9a80d73a3c076c512db1929090438926c7a76e493f8ddc56eec99020dba216b2"} Apr 20 20:09:18.914920 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.914847 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"77bfa676-1f7d-43e7-ac4a-280f6e507c80","Type":"ContainerDied","Data":"7805db9d18cdbd2e6c84eeaf976c524753445bb1c2ade6672f02db83e39275f4"} Apr 20 20:09:18.914920 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.914877 2573 scope.go:117] "RemoveContainer" containerID="7f39626ab2240c1782f612a7f5be4d43cbb7b60f7a5fafd86edb034555120daf" Apr 20 20:09:18.914920 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.914894 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:18.921724 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.921709 2573 scope.go:117] "RemoveContainer" containerID="5a0ab59b0723d106e21c06d480057a2f61d48b11fef5a638d4775acf5e10c594" Apr 20 20:09:18.928379 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.928364 2573 scope.go:117] "RemoveContainer" containerID="6a65ccab9aaf326bab1421e96d037da631ffcc45897cf11c0fa14f5f148478d4" Apr 20 20:09:18.934479 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.934463 2573 scope.go:117] "RemoveContainer" containerID="9a80d73a3c076c512db1929090438926c7a76e493f8ddc56eec99020dba216b2" Apr 20 20:09:18.938074 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.937500 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 20:09:18.941716 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.941448 2573 scope.go:117] "RemoveContainer" containerID="a0d01c7244004ca80dfe2d60d63a5c8c44a69aa992e72eb3bd0371e0f28adda9" Apr 20 20:09:18.944457 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.944433 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 20:09:18.948118 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.948098 2573 scope.go:117] "RemoveContainer" containerID="57a58c42deef08889c0f64c255dec7cde51db5b7da08909388991d0c586409a2" Apr 20 20:09:18.954225 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.954211 2573 scope.go:117] "RemoveContainer" containerID="626945f52344057886b0ad39b84207866aea778c72fcd1c49e20de6d3fbcc59b" Apr 20 20:09:18.960202 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.960185 2573 scope.go:117] "RemoveContainer" containerID="7f39626ab2240c1782f612a7f5be4d43cbb7b60f7a5fafd86edb034555120daf" Apr 20 20:09:18.960465 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:09:18.960440 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"7f39626ab2240c1782f612a7f5be4d43cbb7b60f7a5fafd86edb034555120daf\": container with ID starting with 7f39626ab2240c1782f612a7f5be4d43cbb7b60f7a5fafd86edb034555120daf not found: ID does not exist" containerID="7f39626ab2240c1782f612a7f5be4d43cbb7b60f7a5fafd86edb034555120daf" Apr 20 20:09:18.960517 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.960467 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f39626ab2240c1782f612a7f5be4d43cbb7b60f7a5fafd86edb034555120daf"} err="failed to get container status \"7f39626ab2240c1782f612a7f5be4d43cbb7b60f7a5fafd86edb034555120daf\": rpc error: code = NotFound desc = could not find container \"7f39626ab2240c1782f612a7f5be4d43cbb7b60f7a5fafd86edb034555120daf\": container with ID starting with 7f39626ab2240c1782f612a7f5be4d43cbb7b60f7a5fafd86edb034555120daf not found: ID does not exist" Apr 20 20:09:18.960517 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.960487 2573 scope.go:117] "RemoveContainer" containerID="5a0ab59b0723d106e21c06d480057a2f61d48b11fef5a638d4775acf5e10c594" Apr 20 20:09:18.960697 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:09:18.960680 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a0ab59b0723d106e21c06d480057a2f61d48b11fef5a638d4775acf5e10c594\": container with ID starting with 5a0ab59b0723d106e21c06d480057a2f61d48b11fef5a638d4775acf5e10c594 not found: ID does not exist" containerID="5a0ab59b0723d106e21c06d480057a2f61d48b11fef5a638d4775acf5e10c594" Apr 20 20:09:18.960747 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.960702 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a0ab59b0723d106e21c06d480057a2f61d48b11fef5a638d4775acf5e10c594"} err="failed to get container status \"5a0ab59b0723d106e21c06d480057a2f61d48b11fef5a638d4775acf5e10c594\": rpc error: code = NotFound desc 
= could not find container \"5a0ab59b0723d106e21c06d480057a2f61d48b11fef5a638d4775acf5e10c594\": container with ID starting with 5a0ab59b0723d106e21c06d480057a2f61d48b11fef5a638d4775acf5e10c594 not found: ID does not exist" Apr 20 20:09:18.960747 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.960718 2573 scope.go:117] "RemoveContainer" containerID="6a65ccab9aaf326bab1421e96d037da631ffcc45897cf11c0fa14f5f148478d4" Apr 20 20:09:18.961167 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:09:18.961120 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a65ccab9aaf326bab1421e96d037da631ffcc45897cf11c0fa14f5f148478d4\": container with ID starting with 6a65ccab9aaf326bab1421e96d037da631ffcc45897cf11c0fa14f5f148478d4 not found: ID does not exist" containerID="6a65ccab9aaf326bab1421e96d037da631ffcc45897cf11c0fa14f5f148478d4" Apr 20 20:09:18.961267 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.961174 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a65ccab9aaf326bab1421e96d037da631ffcc45897cf11c0fa14f5f148478d4"} err="failed to get container status \"6a65ccab9aaf326bab1421e96d037da631ffcc45897cf11c0fa14f5f148478d4\": rpc error: code = NotFound desc = could not find container \"6a65ccab9aaf326bab1421e96d037da631ffcc45897cf11c0fa14f5f148478d4\": container with ID starting with 6a65ccab9aaf326bab1421e96d037da631ffcc45897cf11c0fa14f5f148478d4 not found: ID does not exist" Apr 20 20:09:18.961267 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.961201 2573 scope.go:117] "RemoveContainer" containerID="9a80d73a3c076c512db1929090438926c7a76e493f8ddc56eec99020dba216b2" Apr 20 20:09:18.961554 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:09:18.961536 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a80d73a3c076c512db1929090438926c7a76e493f8ddc56eec99020dba216b2\": 
container with ID starting with 9a80d73a3c076c512db1929090438926c7a76e493f8ddc56eec99020dba216b2 not found: ID does not exist" containerID="9a80d73a3c076c512db1929090438926c7a76e493f8ddc56eec99020dba216b2" Apr 20 20:09:18.961633 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.961559 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a80d73a3c076c512db1929090438926c7a76e493f8ddc56eec99020dba216b2"} err="failed to get container status \"9a80d73a3c076c512db1929090438926c7a76e493f8ddc56eec99020dba216b2\": rpc error: code = NotFound desc = could not find container \"9a80d73a3c076c512db1929090438926c7a76e493f8ddc56eec99020dba216b2\": container with ID starting with 9a80d73a3c076c512db1929090438926c7a76e493f8ddc56eec99020dba216b2 not found: ID does not exist" Apr 20 20:09:18.961633 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.961578 2573 scope.go:117] "RemoveContainer" containerID="a0d01c7244004ca80dfe2d60d63a5c8c44a69aa992e72eb3bd0371e0f28adda9" Apr 20 20:09:18.961936 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:09:18.961911 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0d01c7244004ca80dfe2d60d63a5c8c44a69aa992e72eb3bd0371e0f28adda9\": container with ID starting with a0d01c7244004ca80dfe2d60d63a5c8c44a69aa992e72eb3bd0371e0f28adda9 not found: ID does not exist" containerID="a0d01c7244004ca80dfe2d60d63a5c8c44a69aa992e72eb3bd0371e0f28adda9" Apr 20 20:09:18.961998 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.961942 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0d01c7244004ca80dfe2d60d63a5c8c44a69aa992e72eb3bd0371e0f28adda9"} err="failed to get container status \"a0d01c7244004ca80dfe2d60d63a5c8c44a69aa992e72eb3bd0371e0f28adda9\": rpc error: code = NotFound desc = could not find container \"a0d01c7244004ca80dfe2d60d63a5c8c44a69aa992e72eb3bd0371e0f28adda9\": container with 
ID starting with a0d01c7244004ca80dfe2d60d63a5c8c44a69aa992e72eb3bd0371e0f28adda9 not found: ID does not exist" Apr 20 20:09:18.961998 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.961961 2573 scope.go:117] "RemoveContainer" containerID="57a58c42deef08889c0f64c255dec7cde51db5b7da08909388991d0c586409a2" Apr 20 20:09:18.962218 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:09:18.962202 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57a58c42deef08889c0f64c255dec7cde51db5b7da08909388991d0c586409a2\": container with ID starting with 57a58c42deef08889c0f64c255dec7cde51db5b7da08909388991d0c586409a2 not found: ID does not exist" containerID="57a58c42deef08889c0f64c255dec7cde51db5b7da08909388991d0c586409a2" Apr 20 20:09:18.962268 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.962223 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57a58c42deef08889c0f64c255dec7cde51db5b7da08909388991d0c586409a2"} err="failed to get container status \"57a58c42deef08889c0f64c255dec7cde51db5b7da08909388991d0c586409a2\": rpc error: code = NotFound desc = could not find container \"57a58c42deef08889c0f64c255dec7cde51db5b7da08909388991d0c586409a2\": container with ID starting with 57a58c42deef08889c0f64c255dec7cde51db5b7da08909388991d0c586409a2 not found: ID does not exist" Apr 20 20:09:18.962268 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.962237 2573 scope.go:117] "RemoveContainer" containerID="626945f52344057886b0ad39b84207866aea778c72fcd1c49e20de6d3fbcc59b" Apr 20 20:09:18.962435 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:09:18.962420 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"626945f52344057886b0ad39b84207866aea778c72fcd1c49e20de6d3fbcc59b\": container with ID starting with 626945f52344057886b0ad39b84207866aea778c72fcd1c49e20de6d3fbcc59b not found: ID does 
not exist" containerID="626945f52344057886b0ad39b84207866aea778c72fcd1c49e20de6d3fbcc59b" Apr 20 20:09:18.962468 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.962437 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"626945f52344057886b0ad39b84207866aea778c72fcd1c49e20de6d3fbcc59b"} err="failed to get container status \"626945f52344057886b0ad39b84207866aea778c72fcd1c49e20de6d3fbcc59b\": rpc error: code = NotFound desc = could not find container \"626945f52344057886b0ad39b84207866aea778c72fcd1c49e20de6d3fbcc59b\": container with ID starting with 626945f52344057886b0ad39b84207866aea778c72fcd1c49e20de6d3fbcc59b not found: ID does not exist" Apr 20 20:09:18.970861 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.970831 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 20:09:18.976252 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.976237 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:18.979028 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.978993 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 20 20:09:18.979028 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.979026 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 20 20:09:18.979170 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.979085 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 20 20:09:18.979170 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.979121 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 20 20:09:18.979170 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.979130 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 20 20:09:18.979170 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.979164 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 20 20:09:18.979367 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.979121 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 20 20:09:18.979551 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.979536 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 20 20:09:18.979623 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.979581 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-slp46\"" Apr 20 20:09:18.984230 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.984212 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 20 20:09:18.994661 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:18.994639 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 20:09:19.009230 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.009210 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6696669cb6-cdj9t" Apr 20 20:09:19.092146 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.092118 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4dfd3363-cf4b-40c1-954f-e78a2323e7be-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.092314 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.092166 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ml5t\" (UniqueName: \"kubernetes.io/projected/4dfd3363-cf4b-40c1-954f-e78a2323e7be-kube-api-access-7ml5t\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.092314 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.092194 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4dfd3363-cf4b-40c1-954f-e78a2323e7be-config-volume\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.092314 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.092274 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4dfd3363-cf4b-40c1-954f-e78a2323e7be-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.092314 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.092312 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4dfd3363-cf4b-40c1-954f-e78a2323e7be-config-out\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.092534 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.092338 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4dfd3363-cf4b-40c1-954f-e78a2323e7be-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.092534 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.092372 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4dfd3363-cf4b-40c1-954f-e78a2323e7be-web-config\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.092534 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.092399 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/4dfd3363-cf4b-40c1-954f-e78a2323e7be-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.092534 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.092425 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4dfd3363-cf4b-40c1-954f-e78a2323e7be-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.092534 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.092466 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4dfd3363-cf4b-40c1-954f-e78a2323e7be-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.092534 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.092489 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4dfd3363-cf4b-40c1-954f-e78a2323e7be-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.092534 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.092528 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4dfd3363-cf4b-40c1-954f-e78a2323e7be-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.092885 ip-10-0-130-227 kubenswrapper[2573]: I0420 
20:09:19.092583 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4dfd3363-cf4b-40c1-954f-e78a2323e7be-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.140351 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.140319 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6696669cb6-cdj9t"] Apr 20 20:09:19.142529 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:09:19.142506 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b4d9fdf_d201_4c1d_a5c4_672a88db3a23.slice/crio-2d5983af5e22121062c515272dd6e8d51e5cb995b6f63c0fc0f4f4059c338bce WatchSource:0}: Error finding container 2d5983af5e22121062c515272dd6e8d51e5cb995b6f63c0fc0f4f4059c338bce: Status 404 returned error can't find the container with id 2d5983af5e22121062c515272dd6e8d51e5cb995b6f63c0fc0f4f4059c338bce Apr 20 20:09:19.193018 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.192986 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4dfd3363-cf4b-40c1-954f-e78a2323e7be-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.193110 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.193032 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4dfd3363-cf4b-40c1-954f-e78a2323e7be-config-out\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.193216 
ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.193187 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4dfd3363-cf4b-40c1-954f-e78a2323e7be-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.193269 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.193255 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4dfd3363-cf4b-40c1-954f-e78a2323e7be-web-config\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.193322 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.193287 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4dfd3363-cf4b-40c1-954f-e78a2323e7be-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.193322 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.193313 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4dfd3363-cf4b-40c1-954f-e78a2323e7be-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.193419 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.193349 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4dfd3363-cf4b-40c1-954f-e78a2323e7be-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.193419 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.193376 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4dfd3363-cf4b-40c1-954f-e78a2323e7be-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.193528 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.193417 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4dfd3363-cf4b-40c1-954f-e78a2323e7be-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.193528 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.193469 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4dfd3363-cf4b-40c1-954f-e78a2323e7be-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.193528 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.193511 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4dfd3363-cf4b-40c1-954f-e78a2323e7be-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.193691 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.193575 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7ml5t\" (UniqueName: 
\"kubernetes.io/projected/4dfd3363-cf4b-40c1-954f-e78a2323e7be-kube-api-access-7ml5t\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.193691 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.193604 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4dfd3363-cf4b-40c1-954f-e78a2323e7be-config-volume\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.194185 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.194129 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4dfd3363-cf4b-40c1-954f-e78a2323e7be-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.195094 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.194324 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4dfd3363-cf4b-40c1-954f-e78a2323e7be-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.195094 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.194832 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4dfd3363-cf4b-40c1-954f-e78a2323e7be-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.195932 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.195889 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4dfd3363-cf4b-40c1-954f-e78a2323e7be-config-out\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.196028 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.195973 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4dfd3363-cf4b-40c1-954f-e78a2323e7be-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.196627 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.196604 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4dfd3363-cf4b-40c1-954f-e78a2323e7be-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.196722 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.196700 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4dfd3363-cf4b-40c1-954f-e78a2323e7be-web-config\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.197070 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.197050 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4dfd3363-cf4b-40c1-954f-e78a2323e7be-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.197161 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.197142 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4dfd3363-cf4b-40c1-954f-e78a2323e7be-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.197522 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.197502 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4dfd3363-cf4b-40c1-954f-e78a2323e7be-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.198204 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.198180 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4dfd3363-cf4b-40c1-954f-e78a2323e7be-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.198357 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.198339 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4dfd3363-cf4b-40c1-954f-e78a2323e7be-config-volume\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.206371 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.206353 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ml5t\" (UniqueName: \"kubernetes.io/projected/4dfd3363-cf4b-40c1-954f-e78a2323e7be-kube-api-access-7ml5t\") pod \"alertmanager-main-0\" (UID: \"4dfd3363-cf4b-40c1-954f-e78a2323e7be\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.286342 ip-10-0-130-227 kubenswrapper[2573]: 
I0420 20:09:19.286304 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 20:09:19.411823 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.411797 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 20:09:19.414238 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:09:19.414215 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dfd3363_cf4b_40c1_954f_e78a2323e7be.slice/crio-5b17d1986e719ad317ded7a2916cc9f6688d536d58f5723d89a580a62e093a8c WatchSource:0}: Error finding container 5b17d1986e719ad317ded7a2916cc9f6688d536d58f5723d89a580a62e093a8c: Status 404 returned error can't find the container with id 5b17d1986e719ad317ded7a2916cc9f6688d536d58f5723d89a580a62e093a8c Apr 20 20:09:19.920126 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.920088 2573 generic.go:358] "Generic (PLEG): container finished" podID="4dfd3363-cf4b-40c1-954f-e78a2323e7be" containerID="43f337d4a8af617e3bfa7c64a2f99a94480dee8638ffa707c190607a86ce8372" exitCode=0 Apr 20 20:09:19.920569 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.920176 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4dfd3363-cf4b-40c1-954f-e78a2323e7be","Type":"ContainerDied","Data":"43f337d4a8af617e3bfa7c64a2f99a94480dee8638ffa707c190607a86ce8372"} Apr 20 20:09:19.920569 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.920210 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4dfd3363-cf4b-40c1-954f-e78a2323e7be","Type":"ContainerStarted","Data":"5b17d1986e719ad317ded7a2916cc9f6688d536d58f5723d89a580a62e093a8c"} Apr 20 20:09:19.921448 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.921427 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-6696669cb6-cdj9t" event={"ID":"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23","Type":"ContainerStarted","Data":"bde4d6e3a27f6c612681d570f7e7115f1e78206b480acce8048c1c9bb19c4aad"} Apr 20 20:09:19.921528 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.921456 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6696669cb6-cdj9t" event={"ID":"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23","Type":"ContainerStarted","Data":"2d5983af5e22121062c515272dd6e8d51e5cb995b6f63c0fc0f4f4059c338bce"} Apr 20 20:09:19.966888 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:19.966827 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6696669cb6-cdj9t" podStartSLOduration=1.9668121589999998 podStartE2EDuration="1.966812159s" podCreationTimestamp="2026-04-20 20:09:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:09:19.965966509 +0000 UTC m=+242.381259276" watchObservedRunningTime="2026-04-20 20:09:19.966812159 +0000 UTC m=+242.382104901" Apr 20 20:09:20.172812 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:20.172786 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77bfa676-1f7d-43e7-ac4a-280f6e507c80" path="/var/lib/kubelet/pods/77bfa676-1f7d-43e7-ac4a-280f6e507c80/volumes" Apr 20 20:09:20.930244 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:20.930203 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4dfd3363-cf4b-40c1-954f-e78a2323e7be","Type":"ContainerStarted","Data":"0198d15676762dbc59dde6ac266cd7149581fe86a505ae091e6e6d41bb1504af"} Apr 20 20:09:20.930244 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:20.930247 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"4dfd3363-cf4b-40c1-954f-e78a2323e7be","Type":"ContainerStarted","Data":"0c5c47f17adc83c5e4e6b3f2b02adeb6d8504c9ce9fd076007c908458d175b58"} Apr 20 20:09:20.930657 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:20.930259 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4dfd3363-cf4b-40c1-954f-e78a2323e7be","Type":"ContainerStarted","Data":"e291dac55536b79523ff42e974373d3080807d5805a2b832c983205c24522820"} Apr 20 20:09:20.930657 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:20.930268 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4dfd3363-cf4b-40c1-954f-e78a2323e7be","Type":"ContainerStarted","Data":"f143e0ddcca6c92912eb51ac09f862a669b7ff48fcc815313531ab964281e19b"} Apr 20 20:09:20.930657 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:20.930277 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4dfd3363-cf4b-40c1-954f-e78a2323e7be","Type":"ContainerStarted","Data":"1a345da4f9566448060b113b6cd356f1cdd780bc3f9f5fb1e76c2db079e36a0d"} Apr 20 20:09:20.930657 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:20.930290 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4dfd3363-cf4b-40c1-954f-e78a2323e7be","Type":"ContainerStarted","Data":"22fa2723d47b829b43a4675210f8ef78e32dc836d77030eb776d886a720361c3"} Apr 20 20:09:20.963804 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:20.963756 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.9637422 podStartE2EDuration="2.9637422s" podCreationTimestamp="2026-04-20 20:09:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:09:20.961803072 +0000 UTC m=+243.377095849" 
watchObservedRunningTime="2026-04-20 20:09:20.9637422 +0000 UTC m=+243.379034938" Apr 20 20:09:29.009913 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:29.009827 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6696669cb6-cdj9t" Apr 20 20:09:29.009913 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:29.009885 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6696669cb6-cdj9t" Apr 20 20:09:29.014357 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:29.014331 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6696669cb6-cdj9t" Apr 20 20:09:29.882629 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:29.882593 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d-metrics-certs\") pod \"network-metrics-daemon-npkgv\" (UID: \"923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d\") " pod="openshift-multus/network-metrics-daemon-npkgv" Apr 20 20:09:29.884928 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:29.884903 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d-metrics-certs\") pod \"network-metrics-daemon-npkgv\" (UID: \"923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d\") " pod="openshift-multus/network-metrics-daemon-npkgv" Apr 20 20:09:29.961652 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:29.961625 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6696669cb6-cdj9t" Apr 20 20:09:30.014406 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:30.014376 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-67877d55c5-d2cnd"] Apr 20 20:09:30.070255 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:30.070224 2573 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jt5w7\"" Apr 20 20:09:30.078443 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:30.078428 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-npkgv" Apr 20 20:09:30.398195 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:30.398166 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-npkgv"] Apr 20 20:09:30.401676 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:09:30.401613 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod923d6e1f_e27c_45f6_a24a_ccb3fc0e1c3d.slice/crio-b986a754c82a522ad941143960959546219e57253fc4f9ab1a1fa0de9b252156 WatchSource:0}: Error finding container b986a754c82a522ad941143960959546219e57253fc4f9ab1a1fa0de9b252156: Status 404 returned error can't find the container with id b986a754c82a522ad941143960959546219e57253fc4f9ab1a1fa0de9b252156 Apr 20 20:09:30.961887 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:30.961832 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-npkgv" event={"ID":"923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d","Type":"ContainerStarted","Data":"b986a754c82a522ad941143960959546219e57253fc4f9ab1a1fa0de9b252156"} Apr 20 20:09:31.966166 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:31.966136 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-npkgv" event={"ID":"923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d","Type":"ContainerStarted","Data":"637ecf50500304f7191ad67212381bc0e3041a9efe14795164e8b3e69730b74b"} Apr 20 20:09:31.966166 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:31.966168 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-npkgv" 
event={"ID":"923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d","Type":"ContainerStarted","Data":"6d9b7e3888995b4dd508318be5a82c85212daf1193c9edc65eeabfec0a48aa44"} Apr 20 20:09:31.996910 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:31.996847 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-npkgv" podStartSLOduration=252.917493907 podStartE2EDuration="4m13.996833955s" podCreationTimestamp="2026-04-20 20:05:18 +0000 UTC" firstStartedPulling="2026-04-20 20:09:30.403326697 +0000 UTC m=+252.818619418" lastFinishedPulling="2026-04-20 20:09:31.482666736 +0000 UTC m=+253.897959466" observedRunningTime="2026-04-20 20:09:31.995670339 +0000 UTC m=+254.410963082" watchObservedRunningTime="2026-04-20 20:09:31.996833955 +0000 UTC m=+254.412126757" Apr 20 20:09:55.037172 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:55.037117 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-67877d55c5-d2cnd" podUID="9c1e48be-77e7-4c56-97df-9f0139870ab8" containerName="console" containerID="cri-o://47f2c7023ba7661fbc234ee9181e44f86801292f0619b2b0f69aa0aaae3ba467" gracePeriod=15 Apr 20 20:09:55.269658 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:55.269637 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-67877d55c5-d2cnd_9c1e48be-77e7-4c56-97df-9f0139870ab8/console/0.log" Apr 20 20:09:55.269783 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:55.269711 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67877d55c5-d2cnd" Apr 20 20:09:55.368209 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:55.368177 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58wcf\" (UniqueName: \"kubernetes.io/projected/9c1e48be-77e7-4c56-97df-9f0139870ab8-kube-api-access-58wcf\") pod \"9c1e48be-77e7-4c56-97df-9f0139870ab8\" (UID: \"9c1e48be-77e7-4c56-97df-9f0139870ab8\") " Apr 20 20:09:55.368209 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:55.368224 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c1e48be-77e7-4c56-97df-9f0139870ab8-console-serving-cert\") pod \"9c1e48be-77e7-4c56-97df-9f0139870ab8\" (UID: \"9c1e48be-77e7-4c56-97df-9f0139870ab8\") " Apr 20 20:09:55.368456 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:55.368255 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c1e48be-77e7-4c56-97df-9f0139870ab8-trusted-ca-bundle\") pod \"9c1e48be-77e7-4c56-97df-9f0139870ab8\" (UID: \"9c1e48be-77e7-4c56-97df-9f0139870ab8\") " Apr 20 20:09:55.368456 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:55.368276 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9c1e48be-77e7-4c56-97df-9f0139870ab8-service-ca\") pod \"9c1e48be-77e7-4c56-97df-9f0139870ab8\" (UID: \"9c1e48be-77e7-4c56-97df-9f0139870ab8\") " Apr 20 20:09:55.368456 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:55.368301 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9c1e48be-77e7-4c56-97df-9f0139870ab8-console-config\") pod \"9c1e48be-77e7-4c56-97df-9f0139870ab8\" (UID: \"9c1e48be-77e7-4c56-97df-9f0139870ab8\") " Apr 20 20:09:55.368456 ip-10-0-130-227 
kubenswrapper[2573]: I0420 20:09:55.368329 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9c1e48be-77e7-4c56-97df-9f0139870ab8-console-oauth-config\") pod \"9c1e48be-77e7-4c56-97df-9f0139870ab8\" (UID: \"9c1e48be-77e7-4c56-97df-9f0139870ab8\") " Apr 20 20:09:55.368456 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:55.368354 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9c1e48be-77e7-4c56-97df-9f0139870ab8-oauth-serving-cert\") pod \"9c1e48be-77e7-4c56-97df-9f0139870ab8\" (UID: \"9c1e48be-77e7-4c56-97df-9f0139870ab8\") " Apr 20 20:09:55.368715 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:55.368690 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c1e48be-77e7-4c56-97df-9f0139870ab8-service-ca" (OuterVolumeSpecName: "service-ca") pod "9c1e48be-77e7-4c56-97df-9f0139870ab8" (UID: "9c1e48be-77e7-4c56-97df-9f0139870ab8"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:55.368772 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:55.368710 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c1e48be-77e7-4c56-97df-9f0139870ab8-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9c1e48be-77e7-4c56-97df-9f0139870ab8" (UID: "9c1e48be-77e7-4c56-97df-9f0139870ab8"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:55.368825 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:55.368805 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c1e48be-77e7-4c56-97df-9f0139870ab8-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9c1e48be-77e7-4c56-97df-9f0139870ab8" (UID: "9c1e48be-77e7-4c56-97df-9f0139870ab8"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:55.368916 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:55.368892 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c1e48be-77e7-4c56-97df-9f0139870ab8-console-config" (OuterVolumeSpecName: "console-config") pod "9c1e48be-77e7-4c56-97df-9f0139870ab8" (UID: "9c1e48be-77e7-4c56-97df-9f0139870ab8"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:55.370416 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:55.370388 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c1e48be-77e7-4c56-97df-9f0139870ab8-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9c1e48be-77e7-4c56-97df-9f0139870ab8" (UID: "9c1e48be-77e7-4c56-97df-9f0139870ab8"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:55.370521 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:55.370446 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c1e48be-77e7-4c56-97df-9f0139870ab8-kube-api-access-58wcf" (OuterVolumeSpecName: "kube-api-access-58wcf") pod "9c1e48be-77e7-4c56-97df-9f0139870ab8" (UID: "9c1e48be-77e7-4c56-97df-9f0139870ab8"). InnerVolumeSpecName "kube-api-access-58wcf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:09:55.370560 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:55.370521 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c1e48be-77e7-4c56-97df-9f0139870ab8-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9c1e48be-77e7-4c56-97df-9f0139870ab8" (UID: "9c1e48be-77e7-4c56-97df-9f0139870ab8"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:55.469133 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:55.469088 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9c1e48be-77e7-4c56-97df-9f0139870ab8-service-ca\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:09:55.469133 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:55.469127 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9c1e48be-77e7-4c56-97df-9f0139870ab8-console-config\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:09:55.469133 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:55.469138 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9c1e48be-77e7-4c56-97df-9f0139870ab8-console-oauth-config\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:09:55.469133 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:55.469148 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9c1e48be-77e7-4c56-97df-9f0139870ab8-oauth-serving-cert\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:09:55.469385 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:55.469158 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-58wcf\" (UniqueName: 
\"kubernetes.io/projected/9c1e48be-77e7-4c56-97df-9f0139870ab8-kube-api-access-58wcf\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:09:55.469385 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:55.469167 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c1e48be-77e7-4c56-97df-9f0139870ab8-console-serving-cert\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:09:55.469385 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:55.469175 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c1e48be-77e7-4c56-97df-9f0139870ab8-trusted-ca-bundle\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:09:56.034481 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:56.034452 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-67877d55c5-d2cnd_9c1e48be-77e7-4c56-97df-9f0139870ab8/console/0.log" Apr 20 20:09:56.034641 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:56.034491 2573 generic.go:358] "Generic (PLEG): container finished" podID="9c1e48be-77e7-4c56-97df-9f0139870ab8" containerID="47f2c7023ba7661fbc234ee9181e44f86801292f0619b2b0f69aa0aaae3ba467" exitCode=2 Apr 20 20:09:56.034641 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:56.034564 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67877d55c5-d2cnd" Apr 20 20:09:56.034641 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:56.034584 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67877d55c5-d2cnd" event={"ID":"9c1e48be-77e7-4c56-97df-9f0139870ab8","Type":"ContainerDied","Data":"47f2c7023ba7661fbc234ee9181e44f86801292f0619b2b0f69aa0aaae3ba467"} Apr 20 20:09:56.034641 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:56.034622 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67877d55c5-d2cnd" event={"ID":"9c1e48be-77e7-4c56-97df-9f0139870ab8","Type":"ContainerDied","Data":"c1730c428532539c36aa6a78bca3c985543069c49c0623e4aff2f9fa7c203c9a"} Apr 20 20:09:56.034641 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:56.034638 2573 scope.go:117] "RemoveContainer" containerID="47f2c7023ba7661fbc234ee9181e44f86801292f0619b2b0f69aa0aaae3ba467" Apr 20 20:09:56.043502 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:56.043337 2573 scope.go:117] "RemoveContainer" containerID="47f2c7023ba7661fbc234ee9181e44f86801292f0619b2b0f69aa0aaae3ba467" Apr 20 20:09:56.043727 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:09:56.043594 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47f2c7023ba7661fbc234ee9181e44f86801292f0619b2b0f69aa0aaae3ba467\": container with ID starting with 47f2c7023ba7661fbc234ee9181e44f86801292f0619b2b0f69aa0aaae3ba467 not found: ID does not exist" containerID="47f2c7023ba7661fbc234ee9181e44f86801292f0619b2b0f69aa0aaae3ba467" Apr 20 20:09:56.043727 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:56.043617 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47f2c7023ba7661fbc234ee9181e44f86801292f0619b2b0f69aa0aaae3ba467"} err="failed to get container status \"47f2c7023ba7661fbc234ee9181e44f86801292f0619b2b0f69aa0aaae3ba467\": rpc error: code = 
NotFound desc = could not find container \"47f2c7023ba7661fbc234ee9181e44f86801292f0619b2b0f69aa0aaae3ba467\": container with ID starting with 47f2c7023ba7661fbc234ee9181e44f86801292f0619b2b0f69aa0aaae3ba467 not found: ID does not exist" Apr 20 20:09:56.056237 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:56.056209 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-67877d55c5-d2cnd"] Apr 20 20:09:56.061511 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:56.061491 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-67877d55c5-d2cnd"] Apr 20 20:09:56.170201 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:09:56.170171 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c1e48be-77e7-4c56-97df-9f0139870ab8" path="/var/lib/kubelet/pods/9c1e48be-77e7-4c56-97df-9f0139870ab8/volumes" Apr 20 20:10:08.345143 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:08.345108 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7w84j"] Apr 20 20:10:08.345534 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:08.345433 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c1e48be-77e7-4c56-97df-9f0139870ab8" containerName="console" Apr 20 20:10:08.345534 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:08.345444 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c1e48be-77e7-4c56-97df-9f0139870ab8" containerName="console" Apr 20 20:10:08.345534 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:08.345495 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c1e48be-77e7-4c56-97df-9f0139870ab8" containerName="console" Apr 20 20:10:08.350955 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:08.350936 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7w84j" Apr 20 20:10:08.354165 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:08.354132 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 20:10:08.354721 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:08.354699 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 20:10:08.355306 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:08.355283 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-8r572\"" Apr 20 20:10:08.355622 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:08.355602 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7w84j"] Apr 20 20:10:08.467529 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:08.467497 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6c23079-c5be-43aa-85a7-1bdc1eb57acc-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7w84j\" (UID: \"f6c23079-c5be-43aa-85a7-1bdc1eb57acc\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7w84j" Apr 20 20:10:08.467689 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:08.467564 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6c23079-c5be-43aa-85a7-1bdc1eb57acc-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7w84j\" (UID: \"f6c23079-c5be-43aa-85a7-1bdc1eb57acc\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7w84j" Apr 20 20:10:08.467689 
ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:08.467622 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cngbp\" (UniqueName: \"kubernetes.io/projected/f6c23079-c5be-43aa-85a7-1bdc1eb57acc-kube-api-access-cngbp\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7w84j\" (UID: \"f6c23079-c5be-43aa-85a7-1bdc1eb57acc\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7w84j" Apr 20 20:10:08.568575 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:08.568543 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6c23079-c5be-43aa-85a7-1bdc1eb57acc-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7w84j\" (UID: \"f6c23079-c5be-43aa-85a7-1bdc1eb57acc\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7w84j" Apr 20 20:10:08.568700 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:08.568579 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cngbp\" (UniqueName: \"kubernetes.io/projected/f6c23079-c5be-43aa-85a7-1bdc1eb57acc-kube-api-access-cngbp\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7w84j\" (UID: \"f6c23079-c5be-43aa-85a7-1bdc1eb57acc\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7w84j" Apr 20 20:10:08.568700 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:08.568631 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6c23079-c5be-43aa-85a7-1bdc1eb57acc-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7w84j\" (UID: \"f6c23079-c5be-43aa-85a7-1bdc1eb57acc\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7w84j" Apr 20 20:10:08.569031 ip-10-0-130-227 
kubenswrapper[2573]: I0420 20:10:08.569011 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6c23079-c5be-43aa-85a7-1bdc1eb57acc-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7w84j\" (UID: \"f6c23079-c5be-43aa-85a7-1bdc1eb57acc\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7w84j" Apr 20 20:10:08.569067 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:08.569022 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6c23079-c5be-43aa-85a7-1bdc1eb57acc-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7w84j\" (UID: \"f6c23079-c5be-43aa-85a7-1bdc1eb57acc\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7w84j" Apr 20 20:10:08.577608 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:08.577579 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cngbp\" (UniqueName: \"kubernetes.io/projected/f6c23079-c5be-43aa-85a7-1bdc1eb57acc-kube-api-access-cngbp\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7w84j\" (UID: \"f6c23079-c5be-43aa-85a7-1bdc1eb57acc\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7w84j" Apr 20 20:10:08.660779 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:08.660714 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7w84j" Apr 20 20:10:08.778342 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:08.778318 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7w84j"] Apr 20 20:10:08.780653 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:10:08.780617 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6c23079_c5be_43aa_85a7_1bdc1eb57acc.slice/crio-78bc576469049665c4a5a2420665fbfc3221c877da799708c548f42d535dc8be WatchSource:0}: Error finding container 78bc576469049665c4a5a2420665fbfc3221c877da799708c548f42d535dc8be: Status 404 returned error can't find the container with id 78bc576469049665c4a5a2420665fbfc3221c877da799708c548f42d535dc8be Apr 20 20:10:09.072360 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:09.072320 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7w84j" event={"ID":"f6c23079-c5be-43aa-85a7-1bdc1eb57acc","Type":"ContainerStarted","Data":"78bc576469049665c4a5a2420665fbfc3221c877da799708c548f42d535dc8be"} Apr 20 20:10:15.092124 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:15.092092 2573 generic.go:358] "Generic (PLEG): container finished" podID="f6c23079-c5be-43aa-85a7-1bdc1eb57acc" containerID="9dcdf12e866cefda4ee2d58d535c8a9681bb2bb8e05bc2b5ab62f6494d5d8be3" exitCode=0 Apr 20 20:10:15.092486 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:15.092157 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7w84j" event={"ID":"f6c23079-c5be-43aa-85a7-1bdc1eb57acc","Type":"ContainerDied","Data":"9dcdf12e866cefda4ee2d58d535c8a9681bb2bb8e05bc2b5ab62f6494d5d8be3"} Apr 20 20:10:18.063923 ip-10-0-130-227 kubenswrapper[2573]: 
I0420 20:10:18.063901 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ks7s9_54652553-5a61-482e-b688-ce64c57b917b/ovn-acl-logging/0.log" Apr 20 20:10:18.064287 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:18.064047 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ks7s9_54652553-5a61-482e-b688-ce64c57b917b/ovn-acl-logging/0.log" Apr 20 20:10:18.066743 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:18.066726 2573 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 20:10:18.101527 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:18.101497 2573 generic.go:358] "Generic (PLEG): container finished" podID="f6c23079-c5be-43aa-85a7-1bdc1eb57acc" containerID="395419e172a91d618bea6f4440055addf4de6a826eb7cef911906ef4d1fa2534" exitCode=0 Apr 20 20:10:18.101613 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:18.101531 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7w84j" event={"ID":"f6c23079-c5be-43aa-85a7-1bdc1eb57acc","Type":"ContainerDied","Data":"395419e172a91d618bea6f4440055addf4de6a826eb7cef911906ef4d1fa2534"} Apr 20 20:10:25.126010 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:25.125974 2573 generic.go:358] "Generic (PLEG): container finished" podID="f6c23079-c5be-43aa-85a7-1bdc1eb57acc" containerID="f7944d04f1e1b390dc5bb327b628278385582a0f43f11e1fc195e760fff0c9bb" exitCode=0 Apr 20 20:10:25.126366 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:25.126049 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7w84j" event={"ID":"f6c23079-c5be-43aa-85a7-1bdc1eb57acc","Type":"ContainerDied","Data":"f7944d04f1e1b390dc5bb327b628278385582a0f43f11e1fc195e760fff0c9bb"} Apr 20 20:10:26.250481 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:26.250455 2573 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7w84j" Apr 20 20:10:26.316790 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:26.316759 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6c23079-c5be-43aa-85a7-1bdc1eb57acc-util\") pod \"f6c23079-c5be-43aa-85a7-1bdc1eb57acc\" (UID: \"f6c23079-c5be-43aa-85a7-1bdc1eb57acc\") " Apr 20 20:10:26.316995 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:26.316834 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cngbp\" (UniqueName: \"kubernetes.io/projected/f6c23079-c5be-43aa-85a7-1bdc1eb57acc-kube-api-access-cngbp\") pod \"f6c23079-c5be-43aa-85a7-1bdc1eb57acc\" (UID: \"f6c23079-c5be-43aa-85a7-1bdc1eb57acc\") " Apr 20 20:10:26.316995 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:26.316893 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6c23079-c5be-43aa-85a7-1bdc1eb57acc-bundle\") pod \"f6c23079-c5be-43aa-85a7-1bdc1eb57acc\" (UID: \"f6c23079-c5be-43aa-85a7-1bdc1eb57acc\") " Apr 20 20:10:26.317466 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:26.317440 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6c23079-c5be-43aa-85a7-1bdc1eb57acc-bundle" (OuterVolumeSpecName: "bundle") pod "f6c23079-c5be-43aa-85a7-1bdc1eb57acc" (UID: "f6c23079-c5be-43aa-85a7-1bdc1eb57acc"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:10:26.319016 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:26.318993 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6c23079-c5be-43aa-85a7-1bdc1eb57acc-kube-api-access-cngbp" (OuterVolumeSpecName: "kube-api-access-cngbp") pod "f6c23079-c5be-43aa-85a7-1bdc1eb57acc" (UID: "f6c23079-c5be-43aa-85a7-1bdc1eb57acc"). InnerVolumeSpecName "kube-api-access-cngbp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:10:26.321091 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:26.321068 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6c23079-c5be-43aa-85a7-1bdc1eb57acc-util" (OuterVolumeSpecName: "util") pod "f6c23079-c5be-43aa-85a7-1bdc1eb57acc" (UID: "f6c23079-c5be-43aa-85a7-1bdc1eb57acc"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:10:26.418492 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:26.418397 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f6c23079-c5be-43aa-85a7-1bdc1eb57acc-util\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:10:26.418492 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:26.418441 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cngbp\" (UniqueName: \"kubernetes.io/projected/f6c23079-c5be-43aa-85a7-1bdc1eb57acc-kube-api-access-cngbp\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:10:26.418492 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:26.418453 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f6c23079-c5be-43aa-85a7-1bdc1eb57acc-bundle\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:10:27.132919 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:27.132884 2573 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7w84j" event={"ID":"f6c23079-c5be-43aa-85a7-1bdc1eb57acc","Type":"ContainerDied","Data":"78bc576469049665c4a5a2420665fbfc3221c877da799708c548f42d535dc8be"} Apr 20 20:10:27.132919 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:27.132922 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7w84j" Apr 20 20:10:27.133122 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:27.132923 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78bc576469049665c4a5a2420665fbfc3221c877da799708c548f42d535dc8be" Apr 20 20:10:29.999931 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:29.999896 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5kprq"] Apr 20 20:10:30.000295 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:30.000190 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6c23079-c5be-43aa-85a7-1bdc1eb57acc" containerName="util" Apr 20 20:10:30.000295 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:30.000201 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6c23079-c5be-43aa-85a7-1bdc1eb57acc" containerName="util" Apr 20 20:10:30.000295 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:30.000209 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6c23079-c5be-43aa-85a7-1bdc1eb57acc" containerName="extract" Apr 20 20:10:30.000295 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:30.000215 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6c23079-c5be-43aa-85a7-1bdc1eb57acc" containerName="extract" Apr 20 20:10:30.000295 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:30.000226 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="f6c23079-c5be-43aa-85a7-1bdc1eb57acc" containerName="pull" Apr 20 20:10:30.000295 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:30.000231 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6c23079-c5be-43aa-85a7-1bdc1eb57acc" containerName="pull" Apr 20 20:10:30.000295 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:30.000288 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6c23079-c5be-43aa-85a7-1bdc1eb57acc" containerName="extract" Apr 20 20:10:30.052540 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:30.052512 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5kprq"] Apr 20 20:10:30.052695 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:30.052630 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5kprq" Apr 20 20:10:30.055422 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:30.055400 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 20 20:10:30.055573 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:30.055405 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 20 20:10:30.055702 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:30.055688 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-lbvcz\"" Apr 20 20:10:30.055906 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:30.055883 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 20 20:10:30.150384 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:30.150353 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxxzd\" (UniqueName: 
\"kubernetes.io/projected/33e14da8-f0bb-40fd-b74f-1c855fd565cc-kube-api-access-gxxzd\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-5kprq\" (UID: \"33e14da8-f0bb-40fd-b74f-1c855fd565cc\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5kprq" Apr 20 20:10:30.150555 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:30.150429 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/33e14da8-f0bb-40fd-b74f-1c855fd565cc-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-5kprq\" (UID: \"33e14da8-f0bb-40fd-b74f-1c855fd565cc\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5kprq" Apr 20 20:10:30.251437 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:30.251357 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxxzd\" (UniqueName: \"kubernetes.io/projected/33e14da8-f0bb-40fd-b74f-1c855fd565cc-kube-api-access-gxxzd\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-5kprq\" (UID: \"33e14da8-f0bb-40fd-b74f-1c855fd565cc\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5kprq" Apr 20 20:10:30.251437 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:30.251421 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/33e14da8-f0bb-40fd-b74f-1c855fd565cc-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-5kprq\" (UID: \"33e14da8-f0bb-40fd-b74f-1c855fd565cc\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5kprq" Apr 20 20:10:30.253706 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:30.253687 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/33e14da8-f0bb-40fd-b74f-1c855fd565cc-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-5kprq\" (UID: 
\"33e14da8-f0bb-40fd-b74f-1c855fd565cc\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5kprq" Apr 20 20:10:30.260283 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:30.260255 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxxzd\" (UniqueName: \"kubernetes.io/projected/33e14da8-f0bb-40fd-b74f-1c855fd565cc-kube-api-access-gxxzd\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-5kprq\" (UID: \"33e14da8-f0bb-40fd-b74f-1c855fd565cc\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5kprq" Apr 20 20:10:30.362995 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:30.362960 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5kprq" Apr 20 20:10:30.485649 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:30.485617 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5kprq"] Apr 20 20:10:30.489072 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:10:30.489041 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33e14da8_f0bb_40fd_b74f_1c855fd565cc.slice/crio-5d970575f6be89994faadb58e261c443239417991c52ebd2b488dcc1c84e2956 WatchSource:0}: Error finding container 5d970575f6be89994faadb58e261c443239417991c52ebd2b488dcc1c84e2956: Status 404 returned error can't find the container with id 5d970575f6be89994faadb58e261c443239417991c52ebd2b488dcc1c84e2956 Apr 20 20:10:30.491372 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:30.491350 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 20:10:31.151015 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:31.150981 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5kprq" 
event={"ID":"33e14da8-f0bb-40fd-b74f-1c855fd565cc","Type":"ContainerStarted","Data":"5d970575f6be89994faadb58e261c443239417991c52ebd2b488dcc1c84e2956"} Apr 20 20:10:34.163092 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:34.163052 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5kprq" event={"ID":"33e14da8-f0bb-40fd-b74f-1c855fd565cc","Type":"ContainerStarted","Data":"053c2e0d5d27db9e8afbeb8d1e59fd097cd21c2d0ec043ba531de5a8cd1a72a0"} Apr 20 20:10:34.163531 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:34.163169 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5kprq" Apr 20 20:10:34.182772 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:34.182727 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5kprq" podStartSLOduration=1.6179135740000001 podStartE2EDuration="5.182713877s" podCreationTimestamp="2026-04-20 20:10:29 +0000 UTC" firstStartedPulling="2026-04-20 20:10:30.491516234 +0000 UTC m=+312.906808954" lastFinishedPulling="2026-04-20 20:10:34.056316537 +0000 UTC m=+316.471609257" observedRunningTime="2026-04-20 20:10:34.182675803 +0000 UTC m=+316.597968546" watchObservedRunningTime="2026-04-20 20:10:34.182713877 +0000 UTC m=+316.598006616" Apr 20 20:10:34.519742 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:34.519658 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-ggqw5"] Apr 20 20:10:34.523114 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:34.523089 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-ggqw5" Apr 20 20:10:34.525937 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:34.525910 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-zrwkg\"" Apr 20 20:10:34.525937 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:34.525931 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 20 20:10:34.526117 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:34.525918 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 20 20:10:34.530383 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:34.530356 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-ggqw5"] Apr 20 20:10:34.690515 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:34.690480 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc5hf\" (UniqueName: \"kubernetes.io/projected/04bc208a-f36c-4d80-9611-42a4aef9a900-kube-api-access-wc5hf\") pod \"keda-operator-ffbb595cb-ggqw5\" (UID: \"04bc208a-f36c-4d80-9611-42a4aef9a900\") " pod="openshift-keda/keda-operator-ffbb595cb-ggqw5" Apr 20 20:10:34.690699 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:34.690534 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/04bc208a-f36c-4d80-9611-42a4aef9a900-cabundle0\") pod \"keda-operator-ffbb595cb-ggqw5\" (UID: \"04bc208a-f36c-4d80-9611-42a4aef9a900\") " pod="openshift-keda/keda-operator-ffbb595cb-ggqw5" Apr 20 20:10:34.690699 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:34.690607 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/04bc208a-f36c-4d80-9611-42a4aef9a900-certificates\") pod \"keda-operator-ffbb595cb-ggqw5\" (UID: \"04bc208a-f36c-4d80-9611-42a4aef9a900\") " pod="openshift-keda/keda-operator-ffbb595cb-ggqw5" Apr 20 20:10:34.791350 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:34.791314 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/04bc208a-f36c-4d80-9611-42a4aef9a900-cabundle0\") pod \"keda-operator-ffbb595cb-ggqw5\" (UID: \"04bc208a-f36c-4d80-9611-42a4aef9a900\") " pod="openshift-keda/keda-operator-ffbb595cb-ggqw5" Apr 20 20:10:34.791526 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:34.791357 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/04bc208a-f36c-4d80-9611-42a4aef9a900-certificates\") pod \"keda-operator-ffbb595cb-ggqw5\" (UID: \"04bc208a-f36c-4d80-9611-42a4aef9a900\") " pod="openshift-keda/keda-operator-ffbb595cb-ggqw5" Apr 20 20:10:34.791526 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:34.791447 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wc5hf\" (UniqueName: \"kubernetes.io/projected/04bc208a-f36c-4d80-9611-42a4aef9a900-kube-api-access-wc5hf\") pod \"keda-operator-ffbb595cb-ggqw5\" (UID: \"04bc208a-f36c-4d80-9611-42a4aef9a900\") " pod="openshift-keda/keda-operator-ffbb595cb-ggqw5" Apr 20 20:10:34.791639 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:10:34.791569 2573 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 20 20:10:34.791639 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:10:34.791595 2573 secret.go:281] references non-existent secret key: ca.crt Apr 20 20:10:34.791639 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:10:34.791606 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent 
secret key: ca.crt Apr 20 20:10:34.791639 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:10:34.791622 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-ggqw5: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 20 20:10:34.791809 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:10:34.791692 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/04bc208a-f36c-4d80-9611-42a4aef9a900-certificates podName:04bc208a-f36c-4d80-9611-42a4aef9a900 nodeName:}" failed. No retries permitted until 2026-04-20 20:10:35.291670749 +0000 UTC m=+317.706963672 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/04bc208a-f36c-4d80-9611-42a4aef9a900-certificates") pod "keda-operator-ffbb595cb-ggqw5" (UID: "04bc208a-f36c-4d80-9611-42a4aef9a900") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 20 20:10:34.792221 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:34.792142 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/04bc208a-f36c-4d80-9611-42a4aef9a900-cabundle0\") pod \"keda-operator-ffbb595cb-ggqw5\" (UID: \"04bc208a-f36c-4d80-9611-42a4aef9a900\") " pod="openshift-keda/keda-operator-ffbb595cb-ggqw5" Apr 20 20:10:34.814465 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:34.814431 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc5hf\" (UniqueName: \"kubernetes.io/projected/04bc208a-f36c-4d80-9611-42a4aef9a900-kube-api-access-wc5hf\") pod \"keda-operator-ffbb595cb-ggqw5\" (UID: \"04bc208a-f36c-4d80-9611-42a4aef9a900\") " pod="openshift-keda/keda-operator-ffbb595cb-ggqw5" Apr 20 20:10:35.108534 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:35.108461 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-keda/keda-admission-cf49989db-tdncg"] Apr 20 20:10:35.111828 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:35.111811 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-tdncg" Apr 20 20:10:35.115011 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:35.114986 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 20 20:10:35.121508 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:35.121486 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-tdncg"] Apr 20 20:10:35.195202 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:35.195171 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lx8p\" (UniqueName: \"kubernetes.io/projected/c65f5891-9982-4509-aa17-a2f9dea61584-kube-api-access-2lx8p\") pod \"keda-admission-cf49989db-tdncg\" (UID: \"c65f5891-9982-4509-aa17-a2f9dea61584\") " pod="openshift-keda/keda-admission-cf49989db-tdncg" Apr 20 20:10:35.195202 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:35.195204 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c65f5891-9982-4509-aa17-a2f9dea61584-certificates\") pod \"keda-admission-cf49989db-tdncg\" (UID: \"c65f5891-9982-4509-aa17-a2f9dea61584\") " pod="openshift-keda/keda-admission-cf49989db-tdncg" Apr 20 20:10:35.296311 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:35.296275 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2lx8p\" (UniqueName: \"kubernetes.io/projected/c65f5891-9982-4509-aa17-a2f9dea61584-kube-api-access-2lx8p\") pod \"keda-admission-cf49989db-tdncg\" (UID: \"c65f5891-9982-4509-aa17-a2f9dea61584\") " pod="openshift-keda/keda-admission-cf49989db-tdncg" Apr 20 
20:10:35.296478 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:35.296317 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c65f5891-9982-4509-aa17-a2f9dea61584-certificates\") pod \"keda-admission-cf49989db-tdncg\" (UID: \"c65f5891-9982-4509-aa17-a2f9dea61584\") " pod="openshift-keda/keda-admission-cf49989db-tdncg" Apr 20 20:10:35.296478 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:35.296376 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/04bc208a-f36c-4d80-9611-42a4aef9a900-certificates\") pod \"keda-operator-ffbb595cb-ggqw5\" (UID: \"04bc208a-f36c-4d80-9611-42a4aef9a900\") " pod="openshift-keda/keda-operator-ffbb595cb-ggqw5" Apr 20 20:10:35.299002 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:35.298963 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c65f5891-9982-4509-aa17-a2f9dea61584-certificates\") pod \"keda-admission-cf49989db-tdncg\" (UID: \"c65f5891-9982-4509-aa17-a2f9dea61584\") " pod="openshift-keda/keda-admission-cf49989db-tdncg" Apr 20 20:10:35.299002 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:35.298984 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/04bc208a-f36c-4d80-9611-42a4aef9a900-certificates\") pod \"keda-operator-ffbb595cb-ggqw5\" (UID: \"04bc208a-f36c-4d80-9611-42a4aef9a900\") " pod="openshift-keda/keda-operator-ffbb595cb-ggqw5" Apr 20 20:10:35.310161 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:35.310133 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lx8p\" (UniqueName: \"kubernetes.io/projected/c65f5891-9982-4509-aa17-a2f9dea61584-kube-api-access-2lx8p\") pod \"keda-admission-cf49989db-tdncg\" (UID: \"c65f5891-9982-4509-aa17-a2f9dea61584\") " 
pod="openshift-keda/keda-admission-cf49989db-tdncg" Apr 20 20:10:35.422601 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:35.422522 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-tdncg" Apr 20 20:10:35.434467 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:35.434430 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-ggqw5" Apr 20 20:10:35.562830 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:35.562650 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-tdncg"] Apr 20 20:10:35.577179 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:35.577069 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-ggqw5"] Apr 20 20:10:35.579666 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:10:35.579640 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04bc208a_f36c_4d80_9611_42a4aef9a900.slice/crio-2ee4510b10a26f2b714f3c9e08bd0434eef39d24cc928cd8dfd8b4d87a76d35a WatchSource:0}: Error finding container 2ee4510b10a26f2b714f3c9e08bd0434eef39d24cc928cd8dfd8b4d87a76d35a: Status 404 returned error can't find the container with id 2ee4510b10a26f2b714f3c9e08bd0434eef39d24cc928cd8dfd8b4d87a76d35a Apr 20 20:10:36.172632 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:36.172598 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-ggqw5" event={"ID":"04bc208a-f36c-4d80-9611-42a4aef9a900","Type":"ContainerStarted","Data":"2ee4510b10a26f2b714f3c9e08bd0434eef39d24cc928cd8dfd8b4d87a76d35a"} Apr 20 20:10:36.173555 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:36.173529 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-tdncg" 
event={"ID":"c65f5891-9982-4509-aa17-a2f9dea61584","Type":"ContainerStarted","Data":"4a3d09db4298b2f2cc6614871868611391d7ec742f06c6659eae9cdd9205d4ba"} Apr 20 20:10:38.181369 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:38.181336 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-tdncg" event={"ID":"c65f5891-9982-4509-aa17-a2f9dea61584","Type":"ContainerStarted","Data":"d9ecb775f977bf2aa604e9177161f9ad7ba886563f4f38d023bd2c0d3ceda7ae"} Apr 20 20:10:38.181873 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:38.181420 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-tdncg" Apr 20 20:10:38.220412 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:38.220359 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-tdncg" podStartSLOduration=1.5960585040000002 podStartE2EDuration="3.22034615s" podCreationTimestamp="2026-04-20 20:10:35 +0000 UTC" firstStartedPulling="2026-04-20 20:10:35.56476605 +0000 UTC m=+317.980058776" lastFinishedPulling="2026-04-20 20:10:37.189053685 +0000 UTC m=+319.604346422" observedRunningTime="2026-04-20 20:10:38.220056906 +0000 UTC m=+320.635349660" watchObservedRunningTime="2026-04-20 20:10:38.22034615 +0000 UTC m=+320.635638893" Apr 20 20:10:41.192891 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:41.192830 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-ggqw5" event={"ID":"04bc208a-f36c-4d80-9611-42a4aef9a900","Type":"ContainerStarted","Data":"d27ae0c04b4b530d59ff3bc75583ffd051d91bcdf2a38071407aa2cdc7761de3"} Apr 20 20:10:41.193281 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:41.192970 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-ggqw5" Apr 20 20:10:41.220070 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:41.219977 2573 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-ggqw5" podStartSLOduration=1.830440824 podStartE2EDuration="7.219962773s" podCreationTimestamp="2026-04-20 20:10:34 +0000 UTC" firstStartedPulling="2026-04-20 20:10:35.581464082 +0000 UTC m=+317.996756810" lastFinishedPulling="2026-04-20 20:10:40.970986034 +0000 UTC m=+323.386278759" observedRunningTime="2026-04-20 20:10:41.218668298 +0000 UTC m=+323.633961055" watchObservedRunningTime="2026-04-20 20:10:41.219962773 +0000 UTC m=+323.635255515" Apr 20 20:10:55.171013 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:55.170981 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5kprq" Apr 20 20:10:59.187233 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:10:59.187157 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-tdncg" Apr 20 20:11:02.197625 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:02.197596 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-ggqw5" Apr 20 20:11:42.319374 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:42.319340 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-6klw2"] Apr 20 20:11:42.322764 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:42.322741 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-6klw2" Apr 20 20:11:42.326156 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:42.326135 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 20 20:11:42.326296 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:42.326187 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 20 20:11:42.328834 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:42.327400 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-xhtp9\"" Apr 20 20:11:42.328834 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:42.327657 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 20 20:11:42.335486 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:42.335337 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-6klw2"] Apr 20 20:11:42.337525 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:42.337498 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-lnvqt"] Apr 20 20:11:42.341105 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:42.341086 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-lnvqt" Apr 20 20:11:42.344302 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:42.344282 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 20 20:11:42.347679 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:42.347658 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-d99n9\"" Apr 20 20:11:42.349313 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:42.349290 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-lnvqt"] Apr 20 20:11:42.442217 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:42.442183 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a878566e-78f7-46cc-a5ee-04fcc3b13829-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-lnvqt\" (UID: \"a878566e-78f7-46cc-a5ee-04fcc3b13829\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-lnvqt" Apr 20 20:11:42.442385 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:42.442248 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsdrd\" (UniqueName: \"kubernetes.io/projected/2283cb8c-b80b-4ef8-bc1a-442d0e02df8b-kube-api-access-nsdrd\") pod \"kserve-controller-manager-6f655776dd-6klw2\" (UID: \"2283cb8c-b80b-4ef8-bc1a-442d0e02df8b\") " pod="kserve/kserve-controller-manager-6f655776dd-6klw2" Apr 20 20:11:42.442385 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:42.442307 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2283cb8c-b80b-4ef8-bc1a-442d0e02df8b-cert\") pod \"kserve-controller-manager-6f655776dd-6klw2\" (UID: \"2283cb8c-b80b-4ef8-bc1a-442d0e02df8b\") " 
pod="kserve/kserve-controller-manager-6f655776dd-6klw2" Apr 20 20:11:42.442385 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:42.442329 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cskn4\" (UniqueName: \"kubernetes.io/projected/a878566e-78f7-46cc-a5ee-04fcc3b13829-kube-api-access-cskn4\") pod \"llmisvc-controller-manager-68cc5db7c4-lnvqt\" (UID: \"a878566e-78f7-46cc-a5ee-04fcc3b13829\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-lnvqt" Apr 20 20:11:42.542981 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:42.542947 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2283cb8c-b80b-4ef8-bc1a-442d0e02df8b-cert\") pod \"kserve-controller-manager-6f655776dd-6klw2\" (UID: \"2283cb8c-b80b-4ef8-bc1a-442d0e02df8b\") " pod="kserve/kserve-controller-manager-6f655776dd-6klw2" Apr 20 20:11:42.542981 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:42.542990 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cskn4\" (UniqueName: \"kubernetes.io/projected/a878566e-78f7-46cc-a5ee-04fcc3b13829-kube-api-access-cskn4\") pod \"llmisvc-controller-manager-68cc5db7c4-lnvqt\" (UID: \"a878566e-78f7-46cc-a5ee-04fcc3b13829\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-lnvqt" Apr 20 20:11:42.543182 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:42.543021 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a878566e-78f7-46cc-a5ee-04fcc3b13829-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-lnvqt\" (UID: \"a878566e-78f7-46cc-a5ee-04fcc3b13829\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-lnvqt" Apr 20 20:11:42.543182 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:42.543061 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsdrd\" (UniqueName: 
\"kubernetes.io/projected/2283cb8c-b80b-4ef8-bc1a-442d0e02df8b-kube-api-access-nsdrd\") pod \"kserve-controller-manager-6f655776dd-6klw2\" (UID: \"2283cb8c-b80b-4ef8-bc1a-442d0e02df8b\") " pod="kserve/kserve-controller-manager-6f655776dd-6klw2"
Apr 20 20:11:42.543182 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:11:42.543168 2573 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found
Apr 20 20:11:42.543287 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:11:42.543255 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a878566e-78f7-46cc-a5ee-04fcc3b13829-cert podName:a878566e-78f7-46cc-a5ee-04fcc3b13829 nodeName:}" failed. No retries permitted until 2026-04-20 20:11:43.043234679 +0000 UTC m=+385.458527413 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a878566e-78f7-46cc-a5ee-04fcc3b13829-cert") pod "llmisvc-controller-manager-68cc5db7c4-lnvqt" (UID: "a878566e-78f7-46cc-a5ee-04fcc3b13829") : secret "llmisvc-webhook-server-cert" not found
Apr 20 20:11:42.545378 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:42.545350 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2283cb8c-b80b-4ef8-bc1a-442d0e02df8b-cert\") pod \"kserve-controller-manager-6f655776dd-6klw2\" (UID: \"2283cb8c-b80b-4ef8-bc1a-442d0e02df8b\") " pod="kserve/kserve-controller-manager-6f655776dd-6klw2"
Apr 20 20:11:42.554967 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:42.554944 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cskn4\" (UniqueName: \"kubernetes.io/projected/a878566e-78f7-46cc-a5ee-04fcc3b13829-kube-api-access-cskn4\") pod \"llmisvc-controller-manager-68cc5db7c4-lnvqt\" (UID: \"a878566e-78f7-46cc-a5ee-04fcc3b13829\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-lnvqt"
Apr 20 20:11:42.555071 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:42.554999 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsdrd\" (UniqueName: \"kubernetes.io/projected/2283cb8c-b80b-4ef8-bc1a-442d0e02df8b-kube-api-access-nsdrd\") pod \"kserve-controller-manager-6f655776dd-6klw2\" (UID: \"2283cb8c-b80b-4ef8-bc1a-442d0e02df8b\") " pod="kserve/kserve-controller-manager-6f655776dd-6klw2"
Apr 20 20:11:42.638674 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:42.638589 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-6klw2"
Apr 20 20:11:42.763303 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:42.763277 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-6klw2"]
Apr 20 20:11:42.765711 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:11:42.765683 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2283cb8c_b80b_4ef8_bc1a_442d0e02df8b.slice/crio-31da227adf57a2189ec2db50332473b41436bdf0d0969a0b95bd91d562e63e19 WatchSource:0}: Error finding container 31da227adf57a2189ec2db50332473b41436bdf0d0969a0b95bd91d562e63e19: Status 404 returned error can't find the container with id 31da227adf57a2189ec2db50332473b41436bdf0d0969a0b95bd91d562e63e19
Apr 20 20:11:43.048260 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:43.048221 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a878566e-78f7-46cc-a5ee-04fcc3b13829-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-lnvqt\" (UID: \"a878566e-78f7-46cc-a5ee-04fcc3b13829\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-lnvqt"
Apr 20 20:11:43.050590 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:43.050570 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a878566e-78f7-46cc-a5ee-04fcc3b13829-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-lnvqt\" (UID: \"a878566e-78f7-46cc-a5ee-04fcc3b13829\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-lnvqt"
Apr 20 20:11:43.253238 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:43.253195 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-lnvqt"
Apr 20 20:11:43.371543 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:43.371516 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-lnvqt"]
Apr 20 20:11:43.373564 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:11:43.373536 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda878566e_78f7_46cc_a5ee_04fcc3b13829.slice/crio-b2deb13d4ea3f70f3dca2d601c5ed5f29002a8e535355883bfa7109f2edc3d28 WatchSource:0}: Error finding container b2deb13d4ea3f70f3dca2d601c5ed5f29002a8e535355883bfa7109f2edc3d28: Status 404 returned error can't find the container with id b2deb13d4ea3f70f3dca2d601c5ed5f29002a8e535355883bfa7109f2edc3d28
Apr 20 20:11:43.387896 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:43.387871 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-6klw2" event={"ID":"2283cb8c-b80b-4ef8-bc1a-442d0e02df8b","Type":"ContainerStarted","Data":"31da227adf57a2189ec2db50332473b41436bdf0d0969a0b95bd91d562e63e19"}
Apr 20 20:11:43.388883 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:43.388833 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-lnvqt" event={"ID":"a878566e-78f7-46cc-a5ee-04fcc3b13829","Type":"ContainerStarted","Data":"b2deb13d4ea3f70f3dca2d601c5ed5f29002a8e535355883bfa7109f2edc3d28"}
Apr 20 20:11:47.404413 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:47.404378 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-6klw2" event={"ID":"2283cb8c-b80b-4ef8-bc1a-442d0e02df8b","Type":"ContainerStarted","Data":"ea3f094e9d1be946b74beb91dc2d216fafefd99eacd9c5139262f3c8b53e8fb6"}
Apr 20 20:11:47.404835 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:47.404597 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-6f655776dd-6klw2"
Apr 20 20:11:47.405785 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:47.405766 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-lnvqt" event={"ID":"a878566e-78f7-46cc-a5ee-04fcc3b13829","Type":"ContainerStarted","Data":"e3287532a7ff890403df0ac3dd4cb3fe6ca9aa241d7846676f791a4cfef30d4f"}
Apr 20 20:11:47.405885 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:47.405875 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-lnvqt"
Apr 20 20:11:47.434645 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:47.434592 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-6f655776dd-6klw2" podStartSLOduration=1.778764436 podStartE2EDuration="5.434580602s" podCreationTimestamp="2026-04-20 20:11:42 +0000 UTC" firstStartedPulling="2026-04-20 20:11:42.766977979 +0000 UTC m=+385.182270704" lastFinishedPulling="2026-04-20 20:11:46.422794136 +0000 UTC m=+388.838086870" observedRunningTime="2026-04-20 20:11:47.432570268 +0000 UTC m=+389.847863012" watchObservedRunningTime="2026-04-20 20:11:47.434580602 +0000 UTC m=+389.849873343"
Apr 20 20:11:47.453343 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:11:47.453295 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-lnvqt" podStartSLOduration=2.411458041 podStartE2EDuration="5.453279846s" podCreationTimestamp="2026-04-20 20:11:42 +0000 UTC" firstStartedPulling="2026-04-20 20:11:43.374970669 +0000 UTC m=+385.790263389" lastFinishedPulling="2026-04-20 20:11:46.416792458 +0000 UTC m=+388.832085194" observedRunningTime="2026-04-20 20:11:47.451655997 +0000 UTC m=+389.866948752" watchObservedRunningTime="2026-04-20 20:11:47.453279846 +0000 UTC m=+389.868572588"
Apr 20 20:12:18.411288 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:18.411254 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-lnvqt"
Apr 20 20:12:18.414487 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:18.414470 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-6f655776dd-6klw2"
Apr 20 20:12:19.714037 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:19.714000 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-6klw2"]
Apr 20 20:12:19.714500 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:19.714290 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-6f655776dd-6klw2" podUID="2283cb8c-b80b-4ef8-bc1a-442d0e02df8b" containerName="manager" containerID="cri-o://ea3f094e9d1be946b74beb91dc2d216fafefd99eacd9c5139262f3c8b53e8fb6" gracePeriod=10
Apr 20 20:12:19.740130 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:19.740097 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-8rrmj"]
Apr 20 20:12:19.794096 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:19.794069 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-8rrmj"]
Apr 20 20:12:19.794197 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:19.794186 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-8rrmj"
Apr 20 20:12:19.846316 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:19.846286 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl5cb\" (UniqueName: \"kubernetes.io/projected/f5b8eb9d-98f7-4d39-be5d-103edbf66559-kube-api-access-jl5cb\") pod \"kserve-controller-manager-6f655776dd-8rrmj\" (UID: \"f5b8eb9d-98f7-4d39-be5d-103edbf66559\") " pod="kserve/kserve-controller-manager-6f655776dd-8rrmj"
Apr 20 20:12:19.846461 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:19.846336 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f5b8eb9d-98f7-4d39-be5d-103edbf66559-cert\") pod \"kserve-controller-manager-6f655776dd-8rrmj\" (UID: \"f5b8eb9d-98f7-4d39-be5d-103edbf66559\") " pod="kserve/kserve-controller-manager-6f655776dd-8rrmj"
Apr 20 20:12:19.947409 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:19.947380 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jl5cb\" (UniqueName: \"kubernetes.io/projected/f5b8eb9d-98f7-4d39-be5d-103edbf66559-kube-api-access-jl5cb\") pod \"kserve-controller-manager-6f655776dd-8rrmj\" (UID: \"f5b8eb9d-98f7-4d39-be5d-103edbf66559\") " pod="kserve/kserve-controller-manager-6f655776dd-8rrmj"
Apr 20 20:12:19.947572 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:19.947435 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f5b8eb9d-98f7-4d39-be5d-103edbf66559-cert\") pod \"kserve-controller-manager-6f655776dd-8rrmj\" (UID: \"f5b8eb9d-98f7-4d39-be5d-103edbf66559\") " pod="kserve/kserve-controller-manager-6f655776dd-8rrmj"
Apr 20 20:12:19.950098 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:19.950033 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f5b8eb9d-98f7-4d39-be5d-103edbf66559-cert\") pod \"kserve-controller-manager-6f655776dd-8rrmj\" (UID: \"f5b8eb9d-98f7-4d39-be5d-103edbf66559\") " pod="kserve/kserve-controller-manager-6f655776dd-8rrmj"
Apr 20 20:12:19.955829 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:19.955663 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl5cb\" (UniqueName: \"kubernetes.io/projected/f5b8eb9d-98f7-4d39-be5d-103edbf66559-kube-api-access-jl5cb\") pod \"kserve-controller-manager-6f655776dd-8rrmj\" (UID: \"f5b8eb9d-98f7-4d39-be5d-103edbf66559\") " pod="kserve/kserve-controller-manager-6f655776dd-8rrmj"
Apr 20 20:12:19.969914 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:19.969868 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-6klw2"
Apr 20 20:12:20.048238 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:20.048202 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2283cb8c-b80b-4ef8-bc1a-442d0e02df8b-cert\") pod \"2283cb8c-b80b-4ef8-bc1a-442d0e02df8b\" (UID: \"2283cb8c-b80b-4ef8-bc1a-442d0e02df8b\") "
Apr 20 20:12:20.048395 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:20.048241 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsdrd\" (UniqueName: \"kubernetes.io/projected/2283cb8c-b80b-4ef8-bc1a-442d0e02df8b-kube-api-access-nsdrd\") pod \"2283cb8c-b80b-4ef8-bc1a-442d0e02df8b\" (UID: \"2283cb8c-b80b-4ef8-bc1a-442d0e02df8b\") "
Apr 20 20:12:20.050424 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:20.050396 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2283cb8c-b80b-4ef8-bc1a-442d0e02df8b-kube-api-access-nsdrd" (OuterVolumeSpecName: "kube-api-access-nsdrd") pod "2283cb8c-b80b-4ef8-bc1a-442d0e02df8b" (UID: "2283cb8c-b80b-4ef8-bc1a-442d0e02df8b"). InnerVolumeSpecName "kube-api-access-nsdrd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:12:20.050506 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:20.050404 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2283cb8c-b80b-4ef8-bc1a-442d0e02df8b-cert" (OuterVolumeSpecName: "cert") pod "2283cb8c-b80b-4ef8-bc1a-442d0e02df8b" (UID: "2283cb8c-b80b-4ef8-bc1a-442d0e02df8b"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:12:20.149249 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:20.149214 2573 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2283cb8c-b80b-4ef8-bc1a-442d0e02df8b-cert\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\""
Apr 20 20:12:20.149249 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:20.149244 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nsdrd\" (UniqueName: \"kubernetes.io/projected/2283cb8c-b80b-4ef8-bc1a-442d0e02df8b-kube-api-access-nsdrd\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\""
Apr 20 20:12:20.166979 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:20.166949 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-8rrmj"
Apr 20 20:12:20.287286 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:20.287249 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-8rrmj"]
Apr 20 20:12:20.289320 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:12:20.289292 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5b8eb9d_98f7_4d39_be5d_103edbf66559.slice/crio-95c21b1fe3a69e676d31f13530f69231b70c84a0342c355af085716802ff1f80 WatchSource:0}: Error finding container 95c21b1fe3a69e676d31f13530f69231b70c84a0342c355af085716802ff1f80: Status 404 returned error can't find the container with id 95c21b1fe3a69e676d31f13530f69231b70c84a0342c355af085716802ff1f80
Apr 20 20:12:20.517764 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:20.517673 2573 generic.go:358] "Generic (PLEG): container finished" podID="2283cb8c-b80b-4ef8-bc1a-442d0e02df8b" containerID="ea3f094e9d1be946b74beb91dc2d216fafefd99eacd9c5139262f3c8b53e8fb6" exitCode=0
Apr 20 20:12:20.517764 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:20.517746 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-6klw2"
Apr 20 20:12:20.518012 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:20.517749 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-6klw2" event={"ID":"2283cb8c-b80b-4ef8-bc1a-442d0e02df8b","Type":"ContainerDied","Data":"ea3f094e9d1be946b74beb91dc2d216fafefd99eacd9c5139262f3c8b53e8fb6"}
Apr 20 20:12:20.518012 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:20.517865 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-6klw2" event={"ID":"2283cb8c-b80b-4ef8-bc1a-442d0e02df8b","Type":"ContainerDied","Data":"31da227adf57a2189ec2db50332473b41436bdf0d0969a0b95bd91d562e63e19"}
Apr 20 20:12:20.518012 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:20.517886 2573 scope.go:117] "RemoveContainer" containerID="ea3f094e9d1be946b74beb91dc2d216fafefd99eacd9c5139262f3c8b53e8fb6"
Apr 20 20:12:20.518950 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:20.518922 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-8rrmj" event={"ID":"f5b8eb9d-98f7-4d39-be5d-103edbf66559","Type":"ContainerStarted","Data":"95c21b1fe3a69e676d31f13530f69231b70c84a0342c355af085716802ff1f80"}
Apr 20 20:12:20.525885 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:20.525868 2573 scope.go:117] "RemoveContainer" containerID="ea3f094e9d1be946b74beb91dc2d216fafefd99eacd9c5139262f3c8b53e8fb6"
Apr 20 20:12:20.526113 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:12:20.526097 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea3f094e9d1be946b74beb91dc2d216fafefd99eacd9c5139262f3c8b53e8fb6\": container with ID starting with ea3f094e9d1be946b74beb91dc2d216fafefd99eacd9c5139262f3c8b53e8fb6 not found: ID does not exist" containerID="ea3f094e9d1be946b74beb91dc2d216fafefd99eacd9c5139262f3c8b53e8fb6"
Apr 20
20:12:20.526160 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:20.526120 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea3f094e9d1be946b74beb91dc2d216fafefd99eacd9c5139262f3c8b53e8fb6"} err="failed to get container status \"ea3f094e9d1be946b74beb91dc2d216fafefd99eacd9c5139262f3c8b53e8fb6\": rpc error: code = NotFound desc = could not find container \"ea3f094e9d1be946b74beb91dc2d216fafefd99eacd9c5139262f3c8b53e8fb6\": container with ID starting with ea3f094e9d1be946b74beb91dc2d216fafefd99eacd9c5139262f3c8b53e8fb6 not found: ID does not exist"
Apr 20 20:12:20.535343 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:20.535320 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-6klw2"]
Apr 20 20:12:20.539084 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:20.539065 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-6klw2"]
Apr 20 20:12:21.522872 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:21.522819 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-8rrmj" event={"ID":"f5b8eb9d-98f7-4d39-be5d-103edbf66559","Type":"ContainerStarted","Data":"25837383b7384bdb7e3a18d1d3f3545d972bc4373e274ead39b2df3acde3c443"}
Apr 20 20:12:21.523316 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:21.522891 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-6f655776dd-8rrmj"
Apr 20 20:12:21.540648 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:21.540603 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-6f655776dd-8rrmj" podStartSLOduration=2.10957946 podStartE2EDuration="2.540588601s" podCreationTimestamp="2026-04-20 20:12:19 +0000 UTC" firstStartedPulling="2026-04-20 20:12:20.290697075 +0000 UTC m=+422.705989795" lastFinishedPulling="2026-04-20 20:12:20.721706216 +0000 UTC m=+423.136998936" observedRunningTime="2026-04-20 20:12:21.540180413 +0000 UTC m=+423.955473169" watchObservedRunningTime="2026-04-20 20:12:21.540588601 +0000 UTC m=+423.955881343"
Apr 20 20:12:22.170511 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:22.170474 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2283cb8c-b80b-4ef8-bc1a-442d0e02df8b" path="/var/lib/kubelet/pods/2283cb8c-b80b-4ef8-bc1a-442d0e02df8b/volumes"
Apr 20 20:12:52.531039 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:12:52.530964 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-6f655776dd-8rrmj"
Apr 20 20:13:18.220759 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:18.220733 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-56b9cc76b8-4d2sk"]
Apr 20 20:13:18.221093 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:18.221077 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2283cb8c-b80b-4ef8-bc1a-442d0e02df8b" containerName="manager"
Apr 20 20:13:18.221093 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:18.221089 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2283cb8c-b80b-4ef8-bc1a-442d0e02df8b" containerName="manager"
Apr 20 20:13:18.221172 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:18.221166 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="2283cb8c-b80b-4ef8-bc1a-442d0e02df8b" containerName="manager"
Apr 20 20:13:18.224576 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:18.224557 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56b9cc76b8-4d2sk"
Apr 20 20:13:18.245620 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:18.245599 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56b9cc76b8-4d2sk"]
Apr 20 20:13:18.291017 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:18.290995 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4265a32c-5867-48f2-b24f-1e561f30967e-console-serving-cert\") pod \"console-56b9cc76b8-4d2sk\" (UID: \"4265a32c-5867-48f2-b24f-1e561f30967e\") " pod="openshift-console/console-56b9cc76b8-4d2sk"
Apr 20 20:13:18.291131 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:18.291029 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4265a32c-5867-48f2-b24f-1e561f30967e-oauth-serving-cert\") pod \"console-56b9cc76b8-4d2sk\" (UID: \"4265a32c-5867-48f2-b24f-1e561f30967e\") " pod="openshift-console/console-56b9cc76b8-4d2sk"
Apr 20 20:13:18.291131 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:18.291049 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4265a32c-5867-48f2-b24f-1e561f30967e-console-config\") pod \"console-56b9cc76b8-4d2sk\" (UID: \"4265a32c-5867-48f2-b24f-1e561f30967e\") " pod="openshift-console/console-56b9cc76b8-4d2sk"
Apr 20 20:13:18.291131 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:18.291107 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wgpg\" (UniqueName: \"kubernetes.io/projected/4265a32c-5867-48f2-b24f-1e561f30967e-kube-api-access-4wgpg\") pod \"console-56b9cc76b8-4d2sk\" (UID: \"4265a32c-5867-48f2-b24f-1e561f30967e\") " pod="openshift-console/console-56b9cc76b8-4d2sk"
Apr 20 20:13:18.291257 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:18.291202 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4265a32c-5867-48f2-b24f-1e561f30967e-console-oauth-config\") pod \"console-56b9cc76b8-4d2sk\" (UID: \"4265a32c-5867-48f2-b24f-1e561f30967e\") " pod="openshift-console/console-56b9cc76b8-4d2sk"
Apr 20 20:13:18.291257 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:18.291230 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4265a32c-5867-48f2-b24f-1e561f30967e-service-ca\") pod \"console-56b9cc76b8-4d2sk\" (UID: \"4265a32c-5867-48f2-b24f-1e561f30967e\") " pod="openshift-console/console-56b9cc76b8-4d2sk"
Apr 20 20:13:18.291329 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:18.291258 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4265a32c-5867-48f2-b24f-1e561f30967e-trusted-ca-bundle\") pod \"console-56b9cc76b8-4d2sk\" (UID: \"4265a32c-5867-48f2-b24f-1e561f30967e\") " pod="openshift-console/console-56b9cc76b8-4d2sk"
Apr 20 20:13:18.392060 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:18.391982 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4265a32c-5867-48f2-b24f-1e561f30967e-service-ca\") pod \"console-56b9cc76b8-4d2sk\" (UID: \"4265a32c-5867-48f2-b24f-1e561f30967e\") " pod="openshift-console/console-56b9cc76b8-4d2sk"
Apr 20 20:13:18.392060 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:18.392026 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4265a32c-5867-48f2-b24f-1e561f30967e-trusted-ca-bundle\") pod \"console-56b9cc76b8-4d2sk\" (UID: \"4265a32c-5867-48f2-b24f-1e561f30967e\") " pod="openshift-console/console-56b9cc76b8-4d2sk"
Apr 20 20:13:18.392248 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:18.392074 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4265a32c-5867-48f2-b24f-1e561f30967e-console-serving-cert\") pod \"console-56b9cc76b8-4d2sk\" (UID: \"4265a32c-5867-48f2-b24f-1e561f30967e\") " pod="openshift-console/console-56b9cc76b8-4d2sk"
Apr 20 20:13:18.392248 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:18.392113 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4265a32c-5867-48f2-b24f-1e561f30967e-oauth-serving-cert\") pod \"console-56b9cc76b8-4d2sk\" (UID: \"4265a32c-5867-48f2-b24f-1e561f30967e\") " pod="openshift-console/console-56b9cc76b8-4d2sk"
Apr 20 20:13:18.392248 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:18.392140 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4265a32c-5867-48f2-b24f-1e561f30967e-console-config\") pod \"console-56b9cc76b8-4d2sk\" (UID: \"4265a32c-5867-48f2-b24f-1e561f30967e\") " pod="openshift-console/console-56b9cc76b8-4d2sk"
Apr 20 20:13:18.392248 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:18.392169 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wgpg\" (UniqueName: \"kubernetes.io/projected/4265a32c-5867-48f2-b24f-1e561f30967e-kube-api-access-4wgpg\") pod \"console-56b9cc76b8-4d2sk\" (UID: \"4265a32c-5867-48f2-b24f-1e561f30967e\") " pod="openshift-console/console-56b9cc76b8-4d2sk"
Apr 20 20:13:18.392445 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:18.392246 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName:
\"kubernetes.io/secret/4265a32c-5867-48f2-b24f-1e561f30967e-console-oauth-config\") pod \"console-56b9cc76b8-4d2sk\" (UID: \"4265a32c-5867-48f2-b24f-1e561f30967e\") " pod="openshift-console/console-56b9cc76b8-4d2sk"
Apr 20 20:13:18.392878 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:18.392819 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4265a32c-5867-48f2-b24f-1e561f30967e-service-ca\") pod \"console-56b9cc76b8-4d2sk\" (UID: \"4265a32c-5867-48f2-b24f-1e561f30967e\") " pod="openshift-console/console-56b9cc76b8-4d2sk"
Apr 20 20:13:18.392878 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:18.392819 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4265a32c-5867-48f2-b24f-1e561f30967e-oauth-serving-cert\") pod \"console-56b9cc76b8-4d2sk\" (UID: \"4265a32c-5867-48f2-b24f-1e561f30967e\") " pod="openshift-console/console-56b9cc76b8-4d2sk"
Apr 20 20:13:18.393055 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:18.392918 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4265a32c-5867-48f2-b24f-1e561f30967e-console-config\") pod \"console-56b9cc76b8-4d2sk\" (UID: \"4265a32c-5867-48f2-b24f-1e561f30967e\") " pod="openshift-console/console-56b9cc76b8-4d2sk"
Apr 20 20:13:18.393113 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:18.393084 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4265a32c-5867-48f2-b24f-1e561f30967e-trusted-ca-bundle\") pod \"console-56b9cc76b8-4d2sk\" (UID: \"4265a32c-5867-48f2-b24f-1e561f30967e\") " pod="openshift-console/console-56b9cc76b8-4d2sk"
Apr 20 20:13:18.394486 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:18.394459 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4265a32c-5867-48f2-b24f-1e561f30967e-console-serving-cert\") pod \"console-56b9cc76b8-4d2sk\" (UID: \"4265a32c-5867-48f2-b24f-1e561f30967e\") " pod="openshift-console/console-56b9cc76b8-4d2sk"
Apr 20 20:13:18.394598 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:18.394554 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4265a32c-5867-48f2-b24f-1e561f30967e-console-oauth-config\") pod \"console-56b9cc76b8-4d2sk\" (UID: \"4265a32c-5867-48f2-b24f-1e561f30967e\") " pod="openshift-console/console-56b9cc76b8-4d2sk"
Apr 20 20:13:18.401398 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:18.401376 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wgpg\" (UniqueName: \"kubernetes.io/projected/4265a32c-5867-48f2-b24f-1e561f30967e-kube-api-access-4wgpg\") pod \"console-56b9cc76b8-4d2sk\" (UID: \"4265a32c-5867-48f2-b24f-1e561f30967e\") " pod="openshift-console/console-56b9cc76b8-4d2sk"
Apr 20 20:13:18.532903 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:18.532880 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56b9cc76b8-4d2sk"
Apr 20 20:13:18.655753 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:18.655720 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56b9cc76b8-4d2sk"]
Apr 20 20:13:18.662371 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:13:18.662325 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4265a32c_5867_48f2_b24f_1e561f30967e.slice/crio-0c017c9346698df4c42ac3278ba4941ee6177c2b35c2f3694239b966e637470f WatchSource:0}: Error finding container 0c017c9346698df4c42ac3278ba4941ee6177c2b35c2f3694239b966e637470f: Status 404 returned error can't find the container with id 0c017c9346698df4c42ac3278ba4941ee6177c2b35c2f3694239b966e637470f
Apr 20 20:13:18.714115 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:18.714083 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56b9cc76b8-4d2sk" event={"ID":"4265a32c-5867-48f2-b24f-1e561f30967e","Type":"ContainerStarted","Data":"0c017c9346698df4c42ac3278ba4941ee6177c2b35c2f3694239b966e637470f"}
Apr 20 20:13:19.719185 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:19.719144 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56b9cc76b8-4d2sk" event={"ID":"4265a32c-5867-48f2-b24f-1e561f30967e","Type":"ContainerStarted","Data":"704ac353955cf3967157ec499351827311a6341dad1fffc2066ebfd2a7ddcd80"}
Apr 20 20:13:19.738464 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:19.738412 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-56b9cc76b8-4d2sk" podStartSLOduration=1.738395731 podStartE2EDuration="1.738395731s" podCreationTimestamp="2026-04-20 20:13:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:13:19.737015233 +0000 UTC m=+482.152307981" watchObservedRunningTime="2026-04-20 20:13:19.738395731 +0000 UTC m=+482.153688474"
Apr 20 20:13:28.533359 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:28.533316 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-56b9cc76b8-4d2sk"
Apr 20 20:13:28.533359 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:28.533366 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-56b9cc76b8-4d2sk"
Apr 20 20:13:28.537714 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:28.537690 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-56b9cc76b8-4d2sk"
Apr 20 20:13:28.752294 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:28.752270 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-56b9cc76b8-4d2sk"
Apr 20 20:13:28.807201 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:28.807132 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6696669cb6-cdj9t"]
Apr 20 20:13:30.412535 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:30.412503 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd"]
Apr 20 20:13:30.415742 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:30.415722 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd"
Apr 20 20:13:30.418496 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:30.418479 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 20 20:13:30.418496 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:30.418494 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 20 20:13:30.418640 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:30.418505 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-127f1-predictor-serving-cert\""
Apr 20 20:13:30.419607 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:30.419580 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-127f1-kube-rbac-proxy-sar-config\""
Apr 20 20:13:30.419607 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:30.419581 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-lqk4h\""
Apr 20 20:13:30.423345 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:30.423083 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd"]
Apr 20 20:13:30.482991 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:30.482970 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f42e68c8-47bf-47a6-8400-a14c37e81640-proxy-tls\") pod \"error-404-isvc-127f1-predictor-548d5f46c9-cb6dd\" (UID: \"f42e68c8-47bf-47a6-8400-a14c37e81640\") " pod="kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd"
Apr 20 20:13:30.483110 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:30.483008 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-127f1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f42e68c8-47bf-47a6-8400-a14c37e81640-error-404-isvc-127f1-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-127f1-predictor-548d5f46c9-cb6dd\" (UID: \"f42e68c8-47bf-47a6-8400-a14c37e81640\") " pod="kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd"
Apr 20 20:13:30.483110 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:30.483033 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpx8s\" (UniqueName: \"kubernetes.io/projected/f42e68c8-47bf-47a6-8400-a14c37e81640-kube-api-access-hpx8s\") pod \"error-404-isvc-127f1-predictor-548d5f46c9-cb6dd\" (UID: \"f42e68c8-47bf-47a6-8400-a14c37e81640\") " pod="kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd"
Apr 20 20:13:30.584187 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:30.584161 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f42e68c8-47bf-47a6-8400-a14c37e81640-proxy-tls\") pod \"error-404-isvc-127f1-predictor-548d5f46c9-cb6dd\" (UID: \"f42e68c8-47bf-47a6-8400-a14c37e81640\") " pod="kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd"
Apr 20 20:13:30.584295 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:30.584207 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-127f1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f42e68c8-47bf-47a6-8400-a14c37e81640-error-404-isvc-127f1-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-127f1-predictor-548d5f46c9-cb6dd\" (UID: \"f42e68c8-47bf-47a6-8400-a14c37e81640\") " pod="kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd"
Apr 20 20:13:30.584295 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:30.584233 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hpx8s\" (UniqueName: \"kubernetes.io/projected/f42e68c8-47bf-47a6-8400-a14c37e81640-kube-api-access-hpx8s\") pod \"error-404-isvc-127f1-predictor-548d5f46c9-cb6dd\" (UID: \"f42e68c8-47bf-47a6-8400-a14c37e81640\") " pod="kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd"
Apr 20 20:13:30.584954 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:30.584932 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-127f1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f42e68c8-47bf-47a6-8400-a14c37e81640-error-404-isvc-127f1-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-127f1-predictor-548d5f46c9-cb6dd\" (UID: \"f42e68c8-47bf-47a6-8400-a14c37e81640\") " pod="kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd"
Apr 20 20:13:30.586528 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:30.586510 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f42e68c8-47bf-47a6-8400-a14c37e81640-proxy-tls\") pod \"error-404-isvc-127f1-predictor-548d5f46c9-cb6dd\" (UID: \"f42e68c8-47bf-47a6-8400-a14c37e81640\") " pod="kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd"
Apr 20 20:13:30.594143 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:30.594122 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpx8s\" (UniqueName: \"kubernetes.io/projected/f42e68c8-47bf-47a6-8400-a14c37e81640-kube-api-access-hpx8s\") pod \"error-404-isvc-127f1-predictor-548d5f46c9-cb6dd\" (UID: \"f42e68c8-47bf-47a6-8400-a14c37e81640\") " pod="kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd"
Apr 20 20:13:30.727147 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:30.727084 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd" Apr 20 20:13:31.056316 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:31.056241 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd"] Apr 20 20:13:31.058584 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:13:31.058556 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf42e68c8_47bf_47a6_8400_a14c37e81640.slice/crio-7c3b30addf49468cc0c169c48bdbffbc77e50bcd9d69ac13f6f6db906d634717 WatchSource:0}: Error finding container 7c3b30addf49468cc0c169c48bdbffbc77e50bcd9d69ac13f6f6db906d634717: Status 404 returned error can't find the container with id 7c3b30addf49468cc0c169c48bdbffbc77e50bcd9d69ac13f6f6db906d634717 Apr 20 20:13:31.761326 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:31.761266 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd" event={"ID":"f42e68c8-47bf-47a6-8400-a14c37e81640","Type":"ContainerStarted","Data":"7c3b30addf49468cc0c169c48bdbffbc77e50bcd9d69ac13f6f6db906d634717"} Apr 20 20:13:44.812742 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:44.812710 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd" event={"ID":"f42e68c8-47bf-47a6-8400-a14c37e81640","Type":"ContainerStarted","Data":"6005aa23a2f6eeeb0ead6df82344d7c29dd011d32f95858b7ce2c4c97649ba73"} Apr 20 20:13:47.824520 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:47.824483 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd" event={"ID":"f42e68c8-47bf-47a6-8400-a14c37e81640","Type":"ContainerStarted","Data":"439be062f38e02dad63837a04f7c5776d8062797dd6f29b72327efcad1a6afdb"} Apr 20 20:13:47.824937 ip-10-0-130-227 
kubenswrapper[2573]: I0420 20:13:47.824576 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd" Apr 20 20:13:47.852376 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:47.852331 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd" podStartSLOduration=1.455783577 podStartE2EDuration="17.852318115s" podCreationTimestamp="2026-04-20 20:13:30 +0000 UTC" firstStartedPulling="2026-04-20 20:13:31.060430914 +0000 UTC m=+493.475723648" lastFinishedPulling="2026-04-20 20:13:47.456965465 +0000 UTC m=+509.872258186" observedRunningTime="2026-04-20 20:13:47.849929588 +0000 UTC m=+510.265222330" watchObservedRunningTime="2026-04-20 20:13:47.852318115 +0000 UTC m=+510.267610934" Apr 20 20:13:48.828683 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:48.828651 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd" Apr 20 20:13:48.829869 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:48.829828 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd" podUID="f42e68c8-47bf-47a6-8400-a14c37e81640" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 20 20:13:49.832248 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:49.832203 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd" podUID="f42e68c8-47bf-47a6-8400-a14c37e81640" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 20 20:13:53.833017 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:53.832975 2573 kuberuntime_container.go:864] "Killing container with a grace 
period" pod="openshift-console/console-6696669cb6-cdj9t" podUID="6b4d9fdf-d201-4c1d-a5c4-672a88db3a23" containerName="console" containerID="cri-o://bde4d6e3a27f6c612681d570f7e7115f1e78206b480acce8048c1c9bb19c4aad" gracePeriod=15 Apr 20 20:13:54.075227 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:54.075205 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6696669cb6-cdj9t_6b4d9fdf-d201-4c1d-a5c4-672a88db3a23/console/0.log" Apr 20 20:13:54.075332 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:54.075265 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6696669cb6-cdj9t" Apr 20 20:13:54.191737 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:54.191655 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dxq7\" (UniqueName: \"kubernetes.io/projected/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-kube-api-access-8dxq7\") pod \"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\" (UID: \"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\") " Apr 20 20:13:54.191737 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:54.191732 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-service-ca\") pod \"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\" (UID: \"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\") " Apr 20 20:13:54.191988 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:54.191754 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-oauth-serving-cert\") pod \"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\" (UID: \"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\") " Apr 20 20:13:54.191988 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:54.191779 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-console-serving-cert\") pod \"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\" (UID: \"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\") " Apr 20 20:13:54.191988 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:54.191809 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-console-config\") pod \"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\" (UID: \"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\") " Apr 20 20:13:54.191988 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:54.191839 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-trusted-ca-bundle\") pod \"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\" (UID: \"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\") " Apr 20 20:13:54.191988 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:54.191895 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-console-oauth-config\") pod \"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\" (UID: \"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23\") " Apr 20 20:13:54.192241 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:54.192166 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6b4d9fdf-d201-4c1d-a5c4-672a88db3a23" (UID: "6b4d9fdf-d201-4c1d-a5c4-672a88db3a23"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:13:54.192298 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:54.192237 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-service-ca" (OuterVolumeSpecName: "service-ca") pod "6b4d9fdf-d201-4c1d-a5c4-672a88db3a23" (UID: "6b4d9fdf-d201-4c1d-a5c4-672a88db3a23"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:13:54.192298 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:54.192239 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-console-config" (OuterVolumeSpecName: "console-config") pod "6b4d9fdf-d201-4c1d-a5c4-672a88db3a23" (UID: "6b4d9fdf-d201-4c1d-a5c4-672a88db3a23"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:13:54.192375 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:54.192333 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6b4d9fdf-d201-4c1d-a5c4-672a88db3a23" (UID: "6b4d9fdf-d201-4c1d-a5c4-672a88db3a23"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:13:54.193920 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:54.193899 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6b4d9fdf-d201-4c1d-a5c4-672a88db3a23" (UID: "6b4d9fdf-d201-4c1d-a5c4-672a88db3a23"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:13:54.194194 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:54.194177 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6b4d9fdf-d201-4c1d-a5c4-672a88db3a23" (UID: "6b4d9fdf-d201-4c1d-a5c4-672a88db3a23"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:13:54.194249 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:54.194215 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-kube-api-access-8dxq7" (OuterVolumeSpecName: "kube-api-access-8dxq7") pod "6b4d9fdf-d201-4c1d-a5c4-672a88db3a23" (UID: "6b4d9fdf-d201-4c1d-a5c4-672a88db3a23"). InnerVolumeSpecName "kube-api-access-8dxq7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:13:54.292938 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:54.292901 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8dxq7\" (UniqueName: \"kubernetes.io/projected/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-kube-api-access-8dxq7\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:13:54.292938 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:54.292937 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-service-ca\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:13:54.292938 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:54.292948 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-oauth-serving-cert\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:13:54.293147 ip-10-0-130-227 
kubenswrapper[2573]: I0420 20:13:54.292957 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-console-serving-cert\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:13:54.293147 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:54.292966 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-console-config\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:13:54.293147 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:54.292975 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-trusted-ca-bundle\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:13:54.293147 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:54.292984 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23-console-oauth-config\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:13:54.837466 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:54.837438 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd" Apr 20 20:13:54.838056 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:54.838020 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd" podUID="f42e68c8-47bf-47a6-8400-a14c37e81640" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 20 20:13:54.850318 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:54.850299 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-6696669cb6-cdj9t_6b4d9fdf-d201-4c1d-a5c4-672a88db3a23/console/0.log" Apr 20 20:13:54.850430 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:54.850335 2573 generic.go:358] "Generic (PLEG): container finished" podID="6b4d9fdf-d201-4c1d-a5c4-672a88db3a23" containerID="bde4d6e3a27f6c612681d570f7e7115f1e78206b480acce8048c1c9bb19c4aad" exitCode=2 Apr 20 20:13:54.850430 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:54.850393 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6696669cb6-cdj9t" Apr 20 20:13:54.850503 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:54.850422 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6696669cb6-cdj9t" event={"ID":"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23","Type":"ContainerDied","Data":"bde4d6e3a27f6c612681d570f7e7115f1e78206b480acce8048c1c9bb19c4aad"} Apr 20 20:13:54.850503 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:54.850460 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6696669cb6-cdj9t" event={"ID":"6b4d9fdf-d201-4c1d-a5c4-672a88db3a23","Type":"ContainerDied","Data":"2d5983af5e22121062c515272dd6e8d51e5cb995b6f63c0fc0f4f4059c338bce"} Apr 20 20:13:54.850503 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:54.850475 2573 scope.go:117] "RemoveContainer" containerID="bde4d6e3a27f6c612681d570f7e7115f1e78206b480acce8048c1c9bb19c4aad" Apr 20 20:13:54.859630 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:54.859612 2573 scope.go:117] "RemoveContainer" containerID="bde4d6e3a27f6c612681d570f7e7115f1e78206b480acce8048c1c9bb19c4aad" Apr 20 20:13:54.859907 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:13:54.859889 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bde4d6e3a27f6c612681d570f7e7115f1e78206b480acce8048c1c9bb19c4aad\": container with ID starting with 
bde4d6e3a27f6c612681d570f7e7115f1e78206b480acce8048c1c9bb19c4aad not found: ID does not exist" containerID="bde4d6e3a27f6c612681d570f7e7115f1e78206b480acce8048c1c9bb19c4aad" Apr 20 20:13:54.859969 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:54.859919 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bde4d6e3a27f6c612681d570f7e7115f1e78206b480acce8048c1c9bb19c4aad"} err="failed to get container status \"bde4d6e3a27f6c612681d570f7e7115f1e78206b480acce8048c1c9bb19c4aad\": rpc error: code = NotFound desc = could not find container \"bde4d6e3a27f6c612681d570f7e7115f1e78206b480acce8048c1c9bb19c4aad\": container with ID starting with bde4d6e3a27f6c612681d570f7e7115f1e78206b480acce8048c1c9bb19c4aad not found: ID does not exist" Apr 20 20:13:54.876250 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:54.876221 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6696669cb6-cdj9t"] Apr 20 20:13:54.880775 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:54.880756 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6696669cb6-cdj9t"] Apr 20 20:13:56.171263 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:13:56.171192 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b4d9fdf-d201-4c1d-a5c4-672a88db3a23" path="/var/lib/kubelet/pods/6b4d9fdf-d201-4c1d-a5c4-672a88db3a23/volumes" Apr 20 20:14:04.837987 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:14:04.837937 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd" podUID="f42e68c8-47bf-47a6-8400-a14c37e81640" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 20 20:14:14.838551 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:14:14.838502 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd" podUID="f42e68c8-47bf-47a6-8400-a14c37e81640" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 20 20:14:24.839058 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:14:24.839004 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd" podUID="f42e68c8-47bf-47a6-8400-a14c37e81640" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 20 20:14:34.838533 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:14:34.838501 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd" Apr 20 20:15:00.249163 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:00.249124 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd"] Apr 20 20:15:00.249653 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:00.249500 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd" podUID="f42e68c8-47bf-47a6-8400-a14c37e81640" containerName="kserve-container" containerID="cri-o://6005aa23a2f6eeeb0ead6df82344d7c29dd011d32f95858b7ce2c4c97649ba73" gracePeriod=30 Apr 20 20:15:00.249653 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:00.249565 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd" podUID="f42e68c8-47bf-47a6-8400-a14c37e81640" containerName="kube-rbac-proxy" containerID="cri-o://439be062f38e02dad63837a04f7c5776d8062797dd6f29b72327efcad1a6afdb" gracePeriod=30 Apr 20 20:15:00.521045 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:00.520973 2573 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9"] Apr 20 20:15:00.521370 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:00.521355 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b4d9fdf-d201-4c1d-a5c4-672a88db3a23" containerName="console" Apr 20 20:15:00.521424 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:00.521372 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b4d9fdf-d201-4c1d-a5c4-672a88db3a23" containerName="console" Apr 20 20:15:00.521464 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:00.521425 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b4d9fdf-d201-4c1d-a5c4-672a88db3a23" containerName="console" Apr 20 20:15:00.524375 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:00.524353 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9" Apr 20 20:15:00.526797 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:00.526774 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-9290e-kube-rbac-proxy-sar-config\"" Apr 20 20:15:00.526887 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:00.526809 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-9290e-predictor-serving-cert\"" Apr 20 20:15:00.533565 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:00.533538 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9"] Apr 20 20:15:00.617510 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:00.617473 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-9290e-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/e3e06a0f-33be-4b0a-995f-5b9df5292665-error-404-isvc-9290e-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-9290e-predictor-df657ff87-2cvh9\" (UID: \"e3e06a0f-33be-4b0a-995f-5b9df5292665\") " pod="kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9" Apr 20 20:15:00.617510 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:00.617511 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlxgg\" (UniqueName: \"kubernetes.io/projected/e3e06a0f-33be-4b0a-995f-5b9df5292665-kube-api-access-nlxgg\") pod \"error-404-isvc-9290e-predictor-df657ff87-2cvh9\" (UID: \"e3e06a0f-33be-4b0a-995f-5b9df5292665\") " pod="kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9" Apr 20 20:15:00.617698 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:00.617604 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3e06a0f-33be-4b0a-995f-5b9df5292665-proxy-tls\") pod \"error-404-isvc-9290e-predictor-df657ff87-2cvh9\" (UID: \"e3e06a0f-33be-4b0a-995f-5b9df5292665\") " pod="kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9" Apr 20 20:15:00.719077 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:00.719045 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-9290e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e3e06a0f-33be-4b0a-995f-5b9df5292665-error-404-isvc-9290e-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-9290e-predictor-df657ff87-2cvh9\" (UID: \"e3e06a0f-33be-4b0a-995f-5b9df5292665\") " pod="kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9" Apr 20 20:15:00.719077 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:00.719078 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nlxgg\" (UniqueName: 
\"kubernetes.io/projected/e3e06a0f-33be-4b0a-995f-5b9df5292665-kube-api-access-nlxgg\") pod \"error-404-isvc-9290e-predictor-df657ff87-2cvh9\" (UID: \"e3e06a0f-33be-4b0a-995f-5b9df5292665\") " pod="kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9" Apr 20 20:15:00.719275 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:00.719137 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3e06a0f-33be-4b0a-995f-5b9df5292665-proxy-tls\") pod \"error-404-isvc-9290e-predictor-df657ff87-2cvh9\" (UID: \"e3e06a0f-33be-4b0a-995f-5b9df5292665\") " pod="kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9" Apr 20 20:15:00.719777 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:00.719753 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-9290e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e3e06a0f-33be-4b0a-995f-5b9df5292665-error-404-isvc-9290e-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-9290e-predictor-df657ff87-2cvh9\" (UID: \"e3e06a0f-33be-4b0a-995f-5b9df5292665\") " pod="kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9" Apr 20 20:15:00.721596 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:00.721577 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3e06a0f-33be-4b0a-995f-5b9df5292665-proxy-tls\") pod \"error-404-isvc-9290e-predictor-df657ff87-2cvh9\" (UID: \"e3e06a0f-33be-4b0a-995f-5b9df5292665\") " pod="kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9" Apr 20 20:15:00.728135 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:00.728116 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlxgg\" (UniqueName: \"kubernetes.io/projected/e3e06a0f-33be-4b0a-995f-5b9df5292665-kube-api-access-nlxgg\") pod \"error-404-isvc-9290e-predictor-df657ff87-2cvh9\" (UID: 
\"e3e06a0f-33be-4b0a-995f-5b9df5292665\") " pod="kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9" Apr 20 20:15:00.835516 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:00.835493 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9" Apr 20 20:15:00.957175 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:00.957151 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9"] Apr 20 20:15:00.959715 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:15:00.959686 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3e06a0f_33be_4b0a_995f_5b9df5292665.slice/crio-5890a1da3900bd75605b1c12cd8b3bc0fb0482bd64755812bcbb08e20ca50d1d WatchSource:0}: Error finding container 5890a1da3900bd75605b1c12cd8b3bc0fb0482bd64755812bcbb08e20ca50d1d: Status 404 returned error can't find the container with id 5890a1da3900bd75605b1c12cd8b3bc0fb0482bd64755812bcbb08e20ca50d1d Apr 20 20:15:01.063548 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:01.063524 2573 generic.go:358] "Generic (PLEG): container finished" podID="f42e68c8-47bf-47a6-8400-a14c37e81640" containerID="439be062f38e02dad63837a04f7c5776d8062797dd6f29b72327efcad1a6afdb" exitCode=2 Apr 20 20:15:01.063645 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:01.063585 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd" event={"ID":"f42e68c8-47bf-47a6-8400-a14c37e81640","Type":"ContainerDied","Data":"439be062f38e02dad63837a04f7c5776d8062797dd6f29b72327efcad1a6afdb"} Apr 20 20:15:01.065120 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:01.065098 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9" 
event={"ID":"e3e06a0f-33be-4b0a-995f-5b9df5292665","Type":"ContainerStarted","Data":"1d637fce5752f79072e2b506a1c6b6e180ce92a95923bb195eea04a0abce1a4a"} Apr 20 20:15:01.065212 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:01.065133 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9" event={"ID":"e3e06a0f-33be-4b0a-995f-5b9df5292665","Type":"ContainerStarted","Data":"5890a1da3900bd75605b1c12cd8b3bc0fb0482bd64755812bcbb08e20ca50d1d"} Apr 20 20:15:02.070415 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:02.070378 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9" event={"ID":"e3e06a0f-33be-4b0a-995f-5b9df5292665","Type":"ContainerStarted","Data":"392edf79032c7cb47737552043b562007ce32abd7c4dcfbba1ea1d070a992889"} Apr 20 20:15:02.070772 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:02.070629 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9" Apr 20 20:15:02.070772 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:02.070735 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9" Apr 20 20:15:02.072131 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:02.072108 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9" podUID="e3e06a0f-33be-4b0a-995f-5b9df5292665" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 20 20:15:02.086961 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:02.086914 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9" podStartSLOduration=2.08690057 
podStartE2EDuration="2.08690057s" podCreationTimestamp="2026-04-20 20:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:15:02.085911504 +0000 UTC m=+584.501204247" watchObservedRunningTime="2026-04-20 20:15:02.08690057 +0000 UTC m=+584.502193313" Apr 20 20:15:03.076316 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:03.076286 2573 generic.go:358] "Generic (PLEG): container finished" podID="f42e68c8-47bf-47a6-8400-a14c37e81640" containerID="6005aa23a2f6eeeb0ead6df82344d7c29dd011d32f95858b7ce2c4c97649ba73" exitCode=0 Apr 20 20:15:03.076676 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:03.076350 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd" event={"ID":"f42e68c8-47bf-47a6-8400-a14c37e81640","Type":"ContainerDied","Data":"6005aa23a2f6eeeb0ead6df82344d7c29dd011d32f95858b7ce2c4c97649ba73"} Apr 20 20:15:03.076789 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:03.076762 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9" podUID="e3e06a0f-33be-4b0a-995f-5b9df5292665" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 20 20:15:03.092881 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:03.092843 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd" Apr 20 20:15:03.241807 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:03.241727 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f42e68c8-47bf-47a6-8400-a14c37e81640-proxy-tls\") pod \"f42e68c8-47bf-47a6-8400-a14c37e81640\" (UID: \"f42e68c8-47bf-47a6-8400-a14c37e81640\") " Apr 20 20:15:03.241807 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:03.241802 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpx8s\" (UniqueName: \"kubernetes.io/projected/f42e68c8-47bf-47a6-8400-a14c37e81640-kube-api-access-hpx8s\") pod \"f42e68c8-47bf-47a6-8400-a14c37e81640\" (UID: \"f42e68c8-47bf-47a6-8400-a14c37e81640\") " Apr 20 20:15:03.242015 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:03.241842 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-127f1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f42e68c8-47bf-47a6-8400-a14c37e81640-error-404-isvc-127f1-kube-rbac-proxy-sar-config\") pod \"f42e68c8-47bf-47a6-8400-a14c37e81640\" (UID: \"f42e68c8-47bf-47a6-8400-a14c37e81640\") " Apr 20 20:15:03.242313 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:03.242275 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f42e68c8-47bf-47a6-8400-a14c37e81640-error-404-isvc-127f1-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-127f1-kube-rbac-proxy-sar-config") pod "f42e68c8-47bf-47a6-8400-a14c37e81640" (UID: "f42e68c8-47bf-47a6-8400-a14c37e81640"). InnerVolumeSpecName "error-404-isvc-127f1-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:15:03.244069 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:03.244045 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f42e68c8-47bf-47a6-8400-a14c37e81640-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f42e68c8-47bf-47a6-8400-a14c37e81640" (UID: "f42e68c8-47bf-47a6-8400-a14c37e81640"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:15:03.244176 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:03.244124 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f42e68c8-47bf-47a6-8400-a14c37e81640-kube-api-access-hpx8s" (OuterVolumeSpecName: "kube-api-access-hpx8s") pod "f42e68c8-47bf-47a6-8400-a14c37e81640" (UID: "f42e68c8-47bf-47a6-8400-a14c37e81640"). InnerVolumeSpecName "kube-api-access-hpx8s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:15:03.343560 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:03.343521 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f42e68c8-47bf-47a6-8400-a14c37e81640-proxy-tls\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:15:03.343716 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:03.343565 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hpx8s\" (UniqueName: \"kubernetes.io/projected/f42e68c8-47bf-47a6-8400-a14c37e81640-kube-api-access-hpx8s\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:15:03.343716 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:03.343584 2573 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-127f1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f42e68c8-47bf-47a6-8400-a14c37e81640-error-404-isvc-127f1-kube-rbac-proxy-sar-config\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 
20:15:04.081391 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:04.081362 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd" Apr 20 20:15:04.081781 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:04.081361 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd" event={"ID":"f42e68c8-47bf-47a6-8400-a14c37e81640","Type":"ContainerDied","Data":"7c3b30addf49468cc0c169c48bdbffbc77e50bcd9d69ac13f6f6db906d634717"} Apr 20 20:15:04.081781 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:04.081489 2573 scope.go:117] "RemoveContainer" containerID="439be062f38e02dad63837a04f7c5776d8062797dd6f29b72327efcad1a6afdb" Apr 20 20:15:04.091205 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:04.091186 2573 scope.go:117] "RemoveContainer" containerID="6005aa23a2f6eeeb0ead6df82344d7c29dd011d32f95858b7ce2c4c97649ba73" Apr 20 20:15:04.104270 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:04.104248 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd"] Apr 20 20:15:04.107652 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:04.107631 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-127f1-predictor-548d5f46c9-cb6dd"] Apr 20 20:15:04.171430 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:04.171396 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f42e68c8-47bf-47a6-8400-a14c37e81640" path="/var/lib/kubelet/pods/f42e68c8-47bf-47a6-8400-a14c37e81640/volumes" Apr 20 20:15:08.080595 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:08.080564 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9" Apr 20 20:15:08.081110 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:08.081085 2573 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9" podUID="e3e06a0f-33be-4b0a-995f-5b9df5292665" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 20 20:15:18.081144 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:18.081087 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9" podUID="e3e06a0f-33be-4b0a-995f-5b9df5292665" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 20 20:15:18.089888 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:18.089837 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ks7s9_54652553-5a61-482e-b688-ce64c57b917b/ovn-acl-logging/0.log" Apr 20 20:15:18.090043 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:18.089844 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ks7s9_54652553-5a61-482e-b688-ce64c57b917b/ovn-acl-logging/0.log" Apr 20 20:15:28.081613 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:28.081521 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9" podUID="e3e06a0f-33be-4b0a-995f-5b9df5292665" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 20 20:15:38.081811 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:38.081773 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9" podUID="e3e06a0f-33be-4b0a-995f-5b9df5292665" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 20 20:15:40.590484 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:40.590446 
2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw"] Apr 20 20:15:40.590840 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:40.590813 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f42e68c8-47bf-47a6-8400-a14c37e81640" containerName="kserve-container" Apr 20 20:15:40.590840 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:40.590825 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f42e68c8-47bf-47a6-8400-a14c37e81640" containerName="kserve-container" Apr 20 20:15:40.590840 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:40.590833 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f42e68c8-47bf-47a6-8400-a14c37e81640" containerName="kube-rbac-proxy" Apr 20 20:15:40.590840 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:40.590839 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f42e68c8-47bf-47a6-8400-a14c37e81640" containerName="kube-rbac-proxy" Apr 20 20:15:40.591081 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:40.590933 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f42e68c8-47bf-47a6-8400-a14c37e81640" containerName="kube-rbac-proxy" Apr 20 20:15:40.591081 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:40.590951 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f42e68c8-47bf-47a6-8400-a14c37e81640" containerName="kserve-container" Apr 20 20:15:40.594665 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:40.594635 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw" Apr 20 20:15:40.598663 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:40.598636 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-e6f34-kube-rbac-proxy-sar-config\"" Apr 20 20:15:40.599111 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:40.599091 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-e6f34-predictor-serving-cert\"" Apr 20 20:15:40.611377 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:40.611348 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw"] Apr 20 20:15:40.642582 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:40.642552 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/171726d5-6680-4524-b824-23696042dc39-proxy-tls\") pod \"error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw\" (UID: \"171726d5-6680-4524-b824-23696042dc39\") " pod="kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw" Apr 20 20:15:40.642758 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:40.642591 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wvpw\" (UniqueName: \"kubernetes.io/projected/171726d5-6680-4524-b824-23696042dc39-kube-api-access-8wvpw\") pod \"error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw\" (UID: \"171726d5-6680-4524-b824-23696042dc39\") " pod="kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw" Apr 20 20:15:40.642758 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:40.642698 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-e6f34-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/171726d5-6680-4524-b824-23696042dc39-error-404-isvc-e6f34-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw\" (UID: \"171726d5-6680-4524-b824-23696042dc39\") " pod="kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw" Apr 20 20:15:40.743784 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:40.743750 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/171726d5-6680-4524-b824-23696042dc39-proxy-tls\") pod \"error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw\" (UID: \"171726d5-6680-4524-b824-23696042dc39\") " pod="kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw" Apr 20 20:15:40.743784 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:40.743793 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8wvpw\" (UniqueName: \"kubernetes.io/projected/171726d5-6680-4524-b824-23696042dc39-kube-api-access-8wvpw\") pod \"error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw\" (UID: \"171726d5-6680-4524-b824-23696042dc39\") " pod="kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw" Apr 20 20:15:40.744070 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:40.743842 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-e6f34-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/171726d5-6680-4524-b824-23696042dc39-error-404-isvc-e6f34-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw\" (UID: \"171726d5-6680-4524-b824-23696042dc39\") " pod="kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw" Apr 20 20:15:40.744563 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:40.744540 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-e6f34-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/171726d5-6680-4524-b824-23696042dc39-error-404-isvc-e6f34-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw\" (UID: \"171726d5-6680-4524-b824-23696042dc39\") " pod="kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw" Apr 20 20:15:40.746372 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:40.746352 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/171726d5-6680-4524-b824-23696042dc39-proxy-tls\") pod \"error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw\" (UID: \"171726d5-6680-4524-b824-23696042dc39\") " pod="kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw" Apr 20 20:15:40.753613 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:40.753589 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wvpw\" (UniqueName: \"kubernetes.io/projected/171726d5-6680-4524-b824-23696042dc39-kube-api-access-8wvpw\") pod \"error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw\" (UID: \"171726d5-6680-4524-b824-23696042dc39\") " pod="kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw" Apr 20 20:15:40.911749 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:40.911665 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw" Apr 20 20:15:41.047147 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:41.047120 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw"] Apr 20 20:15:41.049031 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:15:41.048995 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod171726d5_6680_4524_b824_23696042dc39.slice/crio-0c2dd862773d03590e35ac0723d8607b814b345f5d2c4f72c89dd142e0b39b42 WatchSource:0}: Error finding container 0c2dd862773d03590e35ac0723d8607b814b345f5d2c4f72c89dd142e0b39b42: Status 404 returned error can't find the container with id 0c2dd862773d03590e35ac0723d8607b814b345f5d2c4f72c89dd142e0b39b42 Apr 20 20:15:41.050641 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:41.050624 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 20:15:41.201489 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:41.201453 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw" event={"ID":"171726d5-6680-4524-b824-23696042dc39","Type":"ContainerStarted","Data":"3d90903e94fa58c7e0e7c4b45f295c418e56e9cbb6650ee31458ec0d370e3bd8"} Apr 20 20:15:41.201489 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:41.201491 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw" event={"ID":"171726d5-6680-4524-b824-23696042dc39","Type":"ContainerStarted","Data":"5c48b8c91a89ce2b4be8191a39c5f6b6879b1cf6940fbcebfdae76637be634cf"} Apr 20 20:15:41.201682 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:41.201505 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw" 
event={"ID":"171726d5-6680-4524-b824-23696042dc39","Type":"ContainerStarted","Data":"0c2dd862773d03590e35ac0723d8607b814b345f5d2c4f72c89dd142e0b39b42"} Apr 20 20:15:41.201682 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:41.201580 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw" Apr 20 20:15:41.223660 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:41.223604 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw" podStartSLOduration=1.223586669 podStartE2EDuration="1.223586669s" podCreationTimestamp="2026-04-20 20:15:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:15:41.221262114 +0000 UTC m=+623.636554858" watchObservedRunningTime="2026-04-20 20:15:41.223586669 +0000 UTC m=+623.638879410" Apr 20 20:15:42.204505 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:42.204477 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw" Apr 20 20:15:42.205795 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:42.205770 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw" podUID="171726d5-6680-4524-b824-23696042dc39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 20 20:15:43.207689 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:43.207649 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw" podUID="171726d5-6680-4524-b824-23696042dc39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 20 
20:15:48.081635 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:48.081605 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9" Apr 20 20:15:48.212093 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:48.212064 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw" Apr 20 20:15:48.212582 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:48.212558 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw" podUID="171726d5-6680-4524-b824-23696042dc39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 20 20:15:58.212484 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:15:58.212441 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw" podUID="171726d5-6680-4524-b824-23696042dc39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 20 20:16:08.213378 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:16:08.213336 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw" podUID="171726d5-6680-4524-b824-23696042dc39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 20 20:16:18.212465 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:16:18.212427 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw" podUID="171726d5-6680-4524-b824-23696042dc39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 20 20:16:28.213057 ip-10-0-130-227 
kubenswrapper[2573]: I0420 20:16:28.213025 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw" Apr 20 20:20:18.122714 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:20:18.122633 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ks7s9_54652553-5a61-482e-b688-ce64c57b917b/ovn-acl-logging/0.log" Apr 20 20:20:18.124475 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:20:18.124456 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ks7s9_54652553-5a61-482e-b688-ce64c57b917b/ovn-acl-logging/0.log" Apr 20 20:24:15.349870 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:15.349814 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9"] Apr 20 20:24:15.352405 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:15.350199 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9" podUID="e3e06a0f-33be-4b0a-995f-5b9df5292665" containerName="kserve-container" containerID="cri-o://1d637fce5752f79072e2b506a1c6b6e180ce92a95923bb195eea04a0abce1a4a" gracePeriod=30 Apr 20 20:24:15.352405 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:15.350255 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9" podUID="e3e06a0f-33be-4b0a-995f-5b9df5292665" containerName="kube-rbac-proxy" containerID="cri-o://392edf79032c7cb47737552043b562007ce32abd7c4dcfbba1ea1d070a992889" gracePeriod=30 Apr 20 20:24:15.467672 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:15.467643 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s"] Apr 20 20:24:15.477095 ip-10-0-130-227 kubenswrapper[2573]: I0420 
20:24:15.477071 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s" Apr 20 20:24:15.479729 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:15.479684 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-7541f-predictor-serving-cert\"" Apr 20 20:24:15.479887 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:15.479740 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-7541f-kube-rbac-proxy-sar-config\"" Apr 20 20:24:15.481548 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:15.481524 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s"] Apr 20 20:24:15.530871 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:15.530824 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/55817dee-9562-415a-aa34-340f14b7b88f-proxy-tls\") pod \"error-404-isvc-7541f-predictor-6b4d6f7547-62k8s\" (UID: \"55817dee-9562-415a-aa34-340f14b7b88f\") " pod="kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s" Apr 20 20:24:15.530979 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:15.530903 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf65n\" (UniqueName: \"kubernetes.io/projected/55817dee-9562-415a-aa34-340f14b7b88f-kube-api-access-wf65n\") pod \"error-404-isvc-7541f-predictor-6b4d6f7547-62k8s\" (UID: \"55817dee-9562-415a-aa34-340f14b7b88f\") " pod="kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s" Apr 20 20:24:15.530979 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:15.530932 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"error-404-isvc-7541f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/55817dee-9562-415a-aa34-340f14b7b88f-error-404-isvc-7541f-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-7541f-predictor-6b4d6f7547-62k8s\" (UID: \"55817dee-9562-415a-aa34-340f14b7b88f\") " pod="kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s" Apr 20 20:24:15.631617 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:15.631545 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/55817dee-9562-415a-aa34-340f14b7b88f-proxy-tls\") pod \"error-404-isvc-7541f-predictor-6b4d6f7547-62k8s\" (UID: \"55817dee-9562-415a-aa34-340f14b7b88f\") " pod="kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s" Apr 20 20:24:15.631753 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:15.631634 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wf65n\" (UniqueName: \"kubernetes.io/projected/55817dee-9562-415a-aa34-340f14b7b88f-kube-api-access-wf65n\") pod \"error-404-isvc-7541f-predictor-6b4d6f7547-62k8s\" (UID: \"55817dee-9562-415a-aa34-340f14b7b88f\") " pod="kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s" Apr 20 20:24:15.631753 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:15.631663 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-7541f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/55817dee-9562-415a-aa34-340f14b7b88f-error-404-isvc-7541f-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-7541f-predictor-6b4d6f7547-62k8s\" (UID: \"55817dee-9562-415a-aa34-340f14b7b88f\") " pod="kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s" Apr 20 20:24:15.632294 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:15.632270 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-7541f-kube-rbac-proxy-sar-config\" 
(UniqueName: \"kubernetes.io/configmap/55817dee-9562-415a-aa34-340f14b7b88f-error-404-isvc-7541f-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-7541f-predictor-6b4d6f7547-62k8s\" (UID: \"55817dee-9562-415a-aa34-340f14b7b88f\") " pod="kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s" Apr 20 20:24:15.633904 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:15.633877 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/55817dee-9562-415a-aa34-340f14b7b88f-proxy-tls\") pod \"error-404-isvc-7541f-predictor-6b4d6f7547-62k8s\" (UID: \"55817dee-9562-415a-aa34-340f14b7b88f\") " pod="kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s" Apr 20 20:24:15.639736 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:15.639716 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf65n\" (UniqueName: \"kubernetes.io/projected/55817dee-9562-415a-aa34-340f14b7b88f-kube-api-access-wf65n\") pod \"error-404-isvc-7541f-predictor-6b4d6f7547-62k8s\" (UID: \"55817dee-9562-415a-aa34-340f14b7b88f\") " pod="kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s" Apr 20 20:24:15.788626 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:15.788595 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s" Apr 20 20:24:15.906833 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:15.906809 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s"] Apr 20 20:24:15.908772 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:24:15.908744 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55817dee_9562_415a_aa34_340f14b7b88f.slice/crio-cbe2370295d004d449094ff1084d062a30892c1004daac9365497f1f068a82fa WatchSource:0}: Error finding container cbe2370295d004d449094ff1084d062a30892c1004daac9365497f1f068a82fa: Status 404 returned error can't find the container with id cbe2370295d004d449094ff1084d062a30892c1004daac9365497f1f068a82fa Apr 20 20:24:15.910503 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:15.910488 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 20:24:15.945658 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:15.945631 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s" event={"ID":"55817dee-9562-415a-aa34-340f14b7b88f","Type":"ContainerStarted","Data":"cbe2370295d004d449094ff1084d062a30892c1004daac9365497f1f068a82fa"} Apr 20 20:24:15.947392 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:15.947368 2573 generic.go:358] "Generic (PLEG): container finished" podID="e3e06a0f-33be-4b0a-995f-5b9df5292665" containerID="392edf79032c7cb47737552043b562007ce32abd7c4dcfbba1ea1d070a992889" exitCode=2 Apr 20 20:24:15.947488 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:15.947436 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9" 
event={"ID":"e3e06a0f-33be-4b0a-995f-5b9df5292665","Type":"ContainerDied","Data":"392edf79032c7cb47737552043b562007ce32abd7c4dcfbba1ea1d070a992889"} Apr 20 20:24:16.952787 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:16.952753 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s" event={"ID":"55817dee-9562-415a-aa34-340f14b7b88f","Type":"ContainerStarted","Data":"ee441bd50befe93e88e26d461a5190d4d9155993d2debf84efb963d5e3daff62"} Apr 20 20:24:16.953226 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:16.952796 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s" event={"ID":"55817dee-9562-415a-aa34-340f14b7b88f","Type":"ContainerStarted","Data":"1127c20d7419d41fe4ec86aa892dc69e65b8e8410c22436237f417bc9421eb68"} Apr 20 20:24:16.953226 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:16.952936 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s" Apr 20 20:24:16.971783 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:16.971742 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s" podStartSLOduration=1.971730405 podStartE2EDuration="1.971730405s" podCreationTimestamp="2026-04-20 20:24:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:24:16.969264347 +0000 UTC m=+1139.384557090" watchObservedRunningTime="2026-04-20 20:24:16.971730405 +0000 UTC m=+1139.387023146" Apr 20 20:24:17.955922 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:17.955886 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s" Apr 20 20:24:17.957005 ip-10-0-130-227 
kubenswrapper[2573]: I0420 20:24:17.956973 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s" podUID="55817dee-9562-415a-aa34-340f14b7b88f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 20 20:24:18.077824 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:18.077791 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9" podUID="e3e06a0f-33be-4b0a-995f-5b9df5292665" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.34:8643/healthz\": dial tcp 10.133.0.34:8643: connect: connection refused" Apr 20 20:24:18.081117 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:18.081080 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9" podUID="e3e06a0f-33be-4b0a-995f-5b9df5292665" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 20 20:24:18.199601 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:18.199573 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9" Apr 20 20:24:18.254842 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:18.254780 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3e06a0f-33be-4b0a-995f-5b9df5292665-proxy-tls\") pod \"e3e06a0f-33be-4b0a-995f-5b9df5292665\" (UID: \"e3e06a0f-33be-4b0a-995f-5b9df5292665\") " Apr 20 20:24:18.254985 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:18.254897 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlxgg\" (UniqueName: \"kubernetes.io/projected/e3e06a0f-33be-4b0a-995f-5b9df5292665-kube-api-access-nlxgg\") pod \"e3e06a0f-33be-4b0a-995f-5b9df5292665\" (UID: \"e3e06a0f-33be-4b0a-995f-5b9df5292665\") " Apr 20 20:24:18.254985 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:18.254947 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-9290e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e3e06a0f-33be-4b0a-995f-5b9df5292665-error-404-isvc-9290e-kube-rbac-proxy-sar-config\") pod \"e3e06a0f-33be-4b0a-995f-5b9df5292665\" (UID: \"e3e06a0f-33be-4b0a-995f-5b9df5292665\") " Apr 20 20:24:18.255305 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:18.255277 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3e06a0f-33be-4b0a-995f-5b9df5292665-error-404-isvc-9290e-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-9290e-kube-rbac-proxy-sar-config") pod "e3e06a0f-33be-4b0a-995f-5b9df5292665" (UID: "e3e06a0f-33be-4b0a-995f-5b9df5292665"). InnerVolumeSpecName "error-404-isvc-9290e-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:24:18.256751 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:18.256722 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3e06a0f-33be-4b0a-995f-5b9df5292665-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e3e06a0f-33be-4b0a-995f-5b9df5292665" (UID: "e3e06a0f-33be-4b0a-995f-5b9df5292665"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:24:18.256910 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:18.256809 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3e06a0f-33be-4b0a-995f-5b9df5292665-kube-api-access-nlxgg" (OuterVolumeSpecName: "kube-api-access-nlxgg") pod "e3e06a0f-33be-4b0a-995f-5b9df5292665" (UID: "e3e06a0f-33be-4b0a-995f-5b9df5292665"). InnerVolumeSpecName "kube-api-access-nlxgg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:24:18.356141 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:18.356110 2573 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-9290e-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e3e06a0f-33be-4b0a-995f-5b9df5292665-error-404-isvc-9290e-kube-rbac-proxy-sar-config\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:24:18.356141 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:18.356135 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e3e06a0f-33be-4b0a-995f-5b9df5292665-proxy-tls\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:24:18.356141 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:18.356145 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nlxgg\" (UniqueName: \"kubernetes.io/projected/e3e06a0f-33be-4b0a-995f-5b9df5292665-kube-api-access-nlxgg\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 
20:24:18.959986 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:18.959951 2573 generic.go:358] "Generic (PLEG): container finished" podID="e3e06a0f-33be-4b0a-995f-5b9df5292665" containerID="1d637fce5752f79072e2b506a1c6b6e180ce92a95923bb195eea04a0abce1a4a" exitCode=0 Apr 20 20:24:18.960374 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:18.960029 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9" Apr 20 20:24:18.960374 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:18.960035 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9" event={"ID":"e3e06a0f-33be-4b0a-995f-5b9df5292665","Type":"ContainerDied","Data":"1d637fce5752f79072e2b506a1c6b6e180ce92a95923bb195eea04a0abce1a4a"} Apr 20 20:24:18.960374 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:18.960072 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9" event={"ID":"e3e06a0f-33be-4b0a-995f-5b9df5292665","Type":"ContainerDied","Data":"5890a1da3900bd75605b1c12cd8b3bc0fb0482bd64755812bcbb08e20ca50d1d"} Apr 20 20:24:18.960374 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:18.960089 2573 scope.go:117] "RemoveContainer" containerID="392edf79032c7cb47737552043b562007ce32abd7c4dcfbba1ea1d070a992889" Apr 20 20:24:18.960682 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:18.960652 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s" podUID="55817dee-9562-415a-aa34-340f14b7b88f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 20 20:24:18.968937 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:18.968919 2573 scope.go:117] "RemoveContainer" containerID="1d637fce5752f79072e2b506a1c6b6e180ce92a95923bb195eea04a0abce1a4a" Apr 20 
20:24:18.975719 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:18.975703 2573 scope.go:117] "RemoveContainer" containerID="392edf79032c7cb47737552043b562007ce32abd7c4dcfbba1ea1d070a992889" Apr 20 20:24:18.975985 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:24:18.975959 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"392edf79032c7cb47737552043b562007ce32abd7c4dcfbba1ea1d070a992889\": container with ID starting with 392edf79032c7cb47737552043b562007ce32abd7c4dcfbba1ea1d070a992889 not found: ID does not exist" containerID="392edf79032c7cb47737552043b562007ce32abd7c4dcfbba1ea1d070a992889" Apr 20 20:24:18.976124 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:18.975991 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"392edf79032c7cb47737552043b562007ce32abd7c4dcfbba1ea1d070a992889"} err="failed to get container status \"392edf79032c7cb47737552043b562007ce32abd7c4dcfbba1ea1d070a992889\": rpc error: code = NotFound desc = could not find container \"392edf79032c7cb47737552043b562007ce32abd7c4dcfbba1ea1d070a992889\": container with ID starting with 392edf79032c7cb47737552043b562007ce32abd7c4dcfbba1ea1d070a992889 not found: ID does not exist" Apr 20 20:24:18.976124 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:18.976008 2573 scope.go:117] "RemoveContainer" containerID="1d637fce5752f79072e2b506a1c6b6e180ce92a95923bb195eea04a0abce1a4a" Apr 20 20:24:18.976227 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:24:18.976210 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d637fce5752f79072e2b506a1c6b6e180ce92a95923bb195eea04a0abce1a4a\": container with ID starting with 1d637fce5752f79072e2b506a1c6b6e180ce92a95923bb195eea04a0abce1a4a not found: ID does not exist" containerID="1d637fce5752f79072e2b506a1c6b6e180ce92a95923bb195eea04a0abce1a4a" Apr 20 20:24:18.976264 
ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:18.976234 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d637fce5752f79072e2b506a1c6b6e180ce92a95923bb195eea04a0abce1a4a"} err="failed to get container status \"1d637fce5752f79072e2b506a1c6b6e180ce92a95923bb195eea04a0abce1a4a\": rpc error: code = NotFound desc = could not find container \"1d637fce5752f79072e2b506a1c6b6e180ce92a95923bb195eea04a0abce1a4a\": container with ID starting with 1d637fce5752f79072e2b506a1c6b6e180ce92a95923bb195eea04a0abce1a4a not found: ID does not exist" Apr 20 20:24:18.982044 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:18.982025 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9"] Apr 20 20:24:18.986108 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:18.986088 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-9290e-predictor-df657ff87-2cvh9"] Apr 20 20:24:20.171487 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:20.171449 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3e06a0f-33be-4b0a-995f-5b9df5292665" path="/var/lib/kubelet/pods/e3e06a0f-33be-4b0a-995f-5b9df5292665/volumes" Apr 20 20:24:23.965809 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:23.965658 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s" Apr 20 20:24:23.966305 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:23.966255 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s" podUID="55817dee-9562-415a-aa34-340f14b7b88f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 20 20:24:33.966206 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:33.966158 2573 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s" podUID="55817dee-9562-415a-aa34-340f14b7b88f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 20 20:24:43.966790 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:43.966742 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s" podUID="55817dee-9562-415a-aa34-340f14b7b88f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 20 20:24:53.966411 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:53.966368 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s" podUID="55817dee-9562-415a-aa34-340f14b7b88f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 20 20:24:55.071269 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:55.071236 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw"] Apr 20 20:24:55.071819 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:55.071512 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw" podUID="171726d5-6680-4524-b824-23696042dc39" containerName="kserve-container" containerID="cri-o://5c48b8c91a89ce2b4be8191a39c5f6b6879b1cf6940fbcebfdae76637be634cf" gracePeriod=30 Apr 20 20:24:55.071819 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:55.071568 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw" podUID="171726d5-6680-4524-b824-23696042dc39" containerName="kube-rbac-proxy" 
containerID="cri-o://3d90903e94fa58c7e0e7c4b45f295c418e56e9cbb6650ee31458ec0d370e3bd8" gracePeriod=30 Apr 20 20:24:55.244634 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:55.244600 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj"] Apr 20 20:24:55.245096 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:55.245077 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e3e06a0f-33be-4b0a-995f-5b9df5292665" containerName="kserve-container" Apr 20 20:24:55.245096 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:55.245097 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3e06a0f-33be-4b0a-995f-5b9df5292665" containerName="kserve-container" Apr 20 20:24:55.245263 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:55.245126 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e3e06a0f-33be-4b0a-995f-5b9df5292665" containerName="kube-rbac-proxy" Apr 20 20:24:55.245263 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:55.245134 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3e06a0f-33be-4b0a-995f-5b9df5292665" containerName="kube-rbac-proxy" Apr 20 20:24:55.245263 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:55.245240 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e3e06a0f-33be-4b0a-995f-5b9df5292665" containerName="kube-rbac-proxy" Apr 20 20:24:55.245263 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:55.245258 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e3e06a0f-33be-4b0a-995f-5b9df5292665" containerName="kserve-container" Apr 20 20:24:55.248382 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:55.248361 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj" Apr 20 20:24:55.250739 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:55.250716 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-63de8-kube-rbac-proxy-sar-config\"" Apr 20 20:24:55.250879 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:55.250768 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-63de8-predictor-serving-cert\"" Apr 20 20:24:55.254305 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:55.254285 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj"] Apr 20 20:24:55.267640 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:55.267617 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-63de8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2e558e2f-8541-47ab-bd38-141408b5dbcb-error-404-isvc-63de8-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-63de8-predictor-544568fdd5-wfwsj\" (UID: \"2e558e2f-8541-47ab-bd38-141408b5dbcb\") " pod="kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj" Apr 20 20:24:55.267733 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:55.267650 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e558e2f-8541-47ab-bd38-141408b5dbcb-proxy-tls\") pod \"error-404-isvc-63de8-predictor-544568fdd5-wfwsj\" (UID: \"2e558e2f-8541-47ab-bd38-141408b5dbcb\") " pod="kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj" Apr 20 20:24:55.267733 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:55.267679 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2hgw\" 
(UniqueName: \"kubernetes.io/projected/2e558e2f-8541-47ab-bd38-141408b5dbcb-kube-api-access-f2hgw\") pod \"error-404-isvc-63de8-predictor-544568fdd5-wfwsj\" (UID: \"2e558e2f-8541-47ab-bd38-141408b5dbcb\") " pod="kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj" Apr 20 20:24:55.369158 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:55.369066 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e558e2f-8541-47ab-bd38-141408b5dbcb-proxy-tls\") pod \"error-404-isvc-63de8-predictor-544568fdd5-wfwsj\" (UID: \"2e558e2f-8541-47ab-bd38-141408b5dbcb\") " pod="kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj" Apr 20 20:24:55.369158 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:55.369124 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f2hgw\" (UniqueName: \"kubernetes.io/projected/2e558e2f-8541-47ab-bd38-141408b5dbcb-kube-api-access-f2hgw\") pod \"error-404-isvc-63de8-predictor-544568fdd5-wfwsj\" (UID: \"2e558e2f-8541-47ab-bd38-141408b5dbcb\") " pod="kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj" Apr 20 20:24:55.369371 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:55.369191 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-63de8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2e558e2f-8541-47ab-bd38-141408b5dbcb-error-404-isvc-63de8-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-63de8-predictor-544568fdd5-wfwsj\" (UID: \"2e558e2f-8541-47ab-bd38-141408b5dbcb\") " pod="kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj" Apr 20 20:24:55.369820 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:55.369796 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-63de8-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/2e558e2f-8541-47ab-bd38-141408b5dbcb-error-404-isvc-63de8-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-63de8-predictor-544568fdd5-wfwsj\" (UID: \"2e558e2f-8541-47ab-bd38-141408b5dbcb\") " pod="kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj" Apr 20 20:24:55.371706 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:55.371686 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e558e2f-8541-47ab-bd38-141408b5dbcb-proxy-tls\") pod \"error-404-isvc-63de8-predictor-544568fdd5-wfwsj\" (UID: \"2e558e2f-8541-47ab-bd38-141408b5dbcb\") " pod="kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj" Apr 20 20:24:55.376655 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:55.376631 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2hgw\" (UniqueName: \"kubernetes.io/projected/2e558e2f-8541-47ab-bd38-141408b5dbcb-kube-api-access-f2hgw\") pod \"error-404-isvc-63de8-predictor-544568fdd5-wfwsj\" (UID: \"2e558e2f-8541-47ab-bd38-141408b5dbcb\") " pod="kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj" Apr 20 20:24:55.560286 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:55.560247 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj" Apr 20 20:24:55.679433 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:55.679394 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj"] Apr 20 20:24:55.682322 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:24:55.682293 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e558e2f_8541_47ab_bd38_141408b5dbcb.slice/crio-be24177f794dce7bdb9846c383c37e3d76c0f7dec950fd2cc2c808b92fadd594 WatchSource:0}: Error finding container be24177f794dce7bdb9846c383c37e3d76c0f7dec950fd2cc2c808b92fadd594: Status 404 returned error can't find the container with id be24177f794dce7bdb9846c383c37e3d76c0f7dec950fd2cc2c808b92fadd594 Apr 20 20:24:56.090405 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:56.090373 2573 generic.go:358] "Generic (PLEG): container finished" podID="171726d5-6680-4524-b824-23696042dc39" containerID="3d90903e94fa58c7e0e7c4b45f295c418e56e9cbb6650ee31458ec0d370e3bd8" exitCode=2 Apr 20 20:24:56.090797 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:56.090407 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw" event={"ID":"171726d5-6680-4524-b824-23696042dc39","Type":"ContainerDied","Data":"3d90903e94fa58c7e0e7c4b45f295c418e56e9cbb6650ee31458ec0d370e3bd8"} Apr 20 20:24:56.092013 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:56.091990 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj" event={"ID":"2e558e2f-8541-47ab-bd38-141408b5dbcb","Type":"ContainerStarted","Data":"4f45416b9c53d308df574d6b50bebd3861b3518839929077f5939a253eb4fefb"} Apr 20 20:24:56.092133 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:56.092018 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj" event={"ID":"2e558e2f-8541-47ab-bd38-141408b5dbcb","Type":"ContainerStarted","Data":"1c380e6a2c47b78d8357f924b4c72e191f6c6d6e00fd37615f9a8979244277ac"} Apr 20 20:24:56.092133 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:56.092027 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj" event={"ID":"2e558e2f-8541-47ab-bd38-141408b5dbcb","Type":"ContainerStarted","Data":"be24177f794dce7bdb9846c383c37e3d76c0f7dec950fd2cc2c808b92fadd594"} Apr 20 20:24:56.092133 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:56.092100 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj" Apr 20 20:24:56.109412 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:56.109375 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj" podStartSLOduration=1.109363092 podStartE2EDuration="1.109363092s" podCreationTimestamp="2026-04-20 20:24:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:24:56.10752647 +0000 UTC m=+1178.522819214" watchObservedRunningTime="2026-04-20 20:24:56.109363092 +0000 UTC m=+1178.524655831" Apr 20 20:24:57.095758 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:57.095730 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj" Apr 20 20:24:57.097103 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:57.097070 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj" podUID="2e558e2f-8541-47ab-bd38-141408b5dbcb" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.133.0.37:8080: connect: connection refused" Apr 20 20:24:58.099632 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:58.099592 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj" podUID="2e558e2f-8541-47ab-bd38-141408b5dbcb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 20 20:24:58.208079 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:58.208034 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw" podUID="171726d5-6680-4524-b824-23696042dc39" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.35:8643/healthz\": dial tcp 10.133.0.35:8643: connect: connection refused" Apr 20 20:24:58.212795 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:58.212760 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw" podUID="171726d5-6680-4524-b824-23696042dc39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 20 20:24:58.326392 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:58.326369 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw" Apr 20 20:24:58.397587 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:58.397521 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-e6f34-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/171726d5-6680-4524-b824-23696042dc39-error-404-isvc-e6f34-kube-rbac-proxy-sar-config\") pod \"171726d5-6680-4524-b824-23696042dc39\" (UID: \"171726d5-6680-4524-b824-23696042dc39\") " Apr 20 20:24:58.397587 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:58.397566 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wvpw\" (UniqueName: \"kubernetes.io/projected/171726d5-6680-4524-b824-23696042dc39-kube-api-access-8wvpw\") pod \"171726d5-6680-4524-b824-23696042dc39\" (UID: \"171726d5-6680-4524-b824-23696042dc39\") " Apr 20 20:24:58.397747 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:58.397612 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/171726d5-6680-4524-b824-23696042dc39-proxy-tls\") pod \"171726d5-6680-4524-b824-23696042dc39\" (UID: \"171726d5-6680-4524-b824-23696042dc39\") " Apr 20 20:24:58.397977 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:58.397947 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/171726d5-6680-4524-b824-23696042dc39-error-404-isvc-e6f34-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-e6f34-kube-rbac-proxy-sar-config") pod "171726d5-6680-4524-b824-23696042dc39" (UID: "171726d5-6680-4524-b824-23696042dc39"). InnerVolumeSpecName "error-404-isvc-e6f34-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:24:58.399706 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:58.399681 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/171726d5-6680-4524-b824-23696042dc39-kube-api-access-8wvpw" (OuterVolumeSpecName: "kube-api-access-8wvpw") pod "171726d5-6680-4524-b824-23696042dc39" (UID: "171726d5-6680-4524-b824-23696042dc39"). InnerVolumeSpecName "kube-api-access-8wvpw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:24:58.399814 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:58.399681 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/171726d5-6680-4524-b824-23696042dc39-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "171726d5-6680-4524-b824-23696042dc39" (UID: "171726d5-6680-4524-b824-23696042dc39"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:24:58.498516 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:58.498477 2573 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-e6f34-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/171726d5-6680-4524-b824-23696042dc39-error-404-isvc-e6f34-kube-rbac-proxy-sar-config\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:24:58.498516 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:58.498511 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8wvpw\" (UniqueName: \"kubernetes.io/projected/171726d5-6680-4524-b824-23696042dc39-kube-api-access-8wvpw\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:24:58.498697 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:58.498524 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/171726d5-6680-4524-b824-23696042dc39-proxy-tls\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 
20:24:59.103762 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:59.103727 2573 generic.go:358] "Generic (PLEG): container finished" podID="171726d5-6680-4524-b824-23696042dc39" containerID="5c48b8c91a89ce2b4be8191a39c5f6b6879b1cf6940fbcebfdae76637be634cf" exitCode=0 Apr 20 20:24:59.104161 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:59.103806 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw" Apr 20 20:24:59.104161 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:59.103806 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw" event={"ID":"171726d5-6680-4524-b824-23696042dc39","Type":"ContainerDied","Data":"5c48b8c91a89ce2b4be8191a39c5f6b6879b1cf6940fbcebfdae76637be634cf"} Apr 20 20:24:59.104161 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:59.103847 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw" event={"ID":"171726d5-6680-4524-b824-23696042dc39","Type":"ContainerDied","Data":"0c2dd862773d03590e35ac0723d8607b814b345f5d2c4f72c89dd142e0b39b42"} Apr 20 20:24:59.104161 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:59.103876 2573 scope.go:117] "RemoveContainer" containerID="3d90903e94fa58c7e0e7c4b45f295c418e56e9cbb6650ee31458ec0d370e3bd8" Apr 20 20:24:59.112565 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:59.112550 2573 scope.go:117] "RemoveContainer" containerID="5c48b8c91a89ce2b4be8191a39c5f6b6879b1cf6940fbcebfdae76637be634cf" Apr 20 20:24:59.119447 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:59.119431 2573 scope.go:117] "RemoveContainer" containerID="3d90903e94fa58c7e0e7c4b45f295c418e56e9cbb6650ee31458ec0d370e3bd8" Apr 20 20:24:59.119693 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:24:59.119673 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"3d90903e94fa58c7e0e7c4b45f295c418e56e9cbb6650ee31458ec0d370e3bd8\": container with ID starting with 3d90903e94fa58c7e0e7c4b45f295c418e56e9cbb6650ee31458ec0d370e3bd8 not found: ID does not exist" containerID="3d90903e94fa58c7e0e7c4b45f295c418e56e9cbb6650ee31458ec0d370e3bd8" Apr 20 20:24:59.119749 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:59.119701 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d90903e94fa58c7e0e7c4b45f295c418e56e9cbb6650ee31458ec0d370e3bd8"} err="failed to get container status \"3d90903e94fa58c7e0e7c4b45f295c418e56e9cbb6650ee31458ec0d370e3bd8\": rpc error: code = NotFound desc = could not find container \"3d90903e94fa58c7e0e7c4b45f295c418e56e9cbb6650ee31458ec0d370e3bd8\": container with ID starting with 3d90903e94fa58c7e0e7c4b45f295c418e56e9cbb6650ee31458ec0d370e3bd8 not found: ID does not exist" Apr 20 20:24:59.119749 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:59.119719 2573 scope.go:117] "RemoveContainer" containerID="5c48b8c91a89ce2b4be8191a39c5f6b6879b1cf6940fbcebfdae76637be634cf" Apr 20 20:24:59.119976 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:24:59.119958 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c48b8c91a89ce2b4be8191a39c5f6b6879b1cf6940fbcebfdae76637be634cf\": container with ID starting with 5c48b8c91a89ce2b4be8191a39c5f6b6879b1cf6940fbcebfdae76637be634cf not found: ID does not exist" containerID="5c48b8c91a89ce2b4be8191a39c5f6b6879b1cf6940fbcebfdae76637be634cf" Apr 20 20:24:59.120032 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:59.119982 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c48b8c91a89ce2b4be8191a39c5f6b6879b1cf6940fbcebfdae76637be634cf"} err="failed to get container status \"5c48b8c91a89ce2b4be8191a39c5f6b6879b1cf6940fbcebfdae76637be634cf\": rpc error: code = NotFound desc = could not find 
container \"5c48b8c91a89ce2b4be8191a39c5f6b6879b1cf6940fbcebfdae76637be634cf\": container with ID starting with 5c48b8c91a89ce2b4be8191a39c5f6b6879b1cf6940fbcebfdae76637be634cf not found: ID does not exist" Apr 20 20:24:59.124526 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:59.124503 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw"] Apr 20 20:24:59.127634 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:24:59.127615 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-e6f34-predictor-595bdccbdf-lf8lw"] Apr 20 20:25:00.171738 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:00.171698 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="171726d5-6680-4524-b824-23696042dc39" path="/var/lib/kubelet/pods/171726d5-6680-4524-b824-23696042dc39/volumes" Apr 20 20:25:03.105288 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:03.105257 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj" Apr 20 20:25:03.105761 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:03.105737 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj" podUID="2e558e2f-8541-47ab-bd38-141408b5dbcb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 20 20:25:03.967043 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:03.967010 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s" Apr 20 20:25:13.106810 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:13.106770 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj" podUID="2e558e2f-8541-47ab-bd38-141408b5dbcb" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 20 20:25:18.147815 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:18.147787 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ks7s9_54652553-5a61-482e-b688-ce64c57b917b/ovn-acl-logging/0.log" Apr 20 20:25:18.151116 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:18.151095 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ks7s9_54652553-5a61-482e-b688-ce64c57b917b/ovn-acl-logging/0.log" Apr 20 20:25:23.106232 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:23.106192 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj" podUID="2e558e2f-8541-47ab-bd38-141408b5dbcb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 20 20:25:25.713307 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:25.713275 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s"] Apr 20 20:25:25.713693 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:25.713558 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s" podUID="55817dee-9562-415a-aa34-340f14b7b88f" containerName="kserve-container" containerID="cri-o://1127c20d7419d41fe4ec86aa892dc69e65b8e8410c22436237f417bc9421eb68" gracePeriod=30 Apr 20 20:25:25.713693 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:25.713586 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s" podUID="55817dee-9562-415a-aa34-340f14b7b88f" containerName="kube-rbac-proxy" 
containerID="cri-o://ee441bd50befe93e88e26d461a5190d4d9155993d2debf84efb963d5e3daff62" gracePeriod=30 Apr 20 20:25:25.767775 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:25.767741 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s"] Apr 20 20:25:25.768186 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:25.768171 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="171726d5-6680-4524-b824-23696042dc39" containerName="kube-rbac-proxy" Apr 20 20:25:25.768234 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:25.768188 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="171726d5-6680-4524-b824-23696042dc39" containerName="kube-rbac-proxy" Apr 20 20:25:25.768234 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:25.768207 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="171726d5-6680-4524-b824-23696042dc39" containerName="kserve-container" Apr 20 20:25:25.768234 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:25.768213 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="171726d5-6680-4524-b824-23696042dc39" containerName="kserve-container" Apr 20 20:25:25.768327 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:25.768282 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="171726d5-6680-4524-b824-23696042dc39" containerName="kserve-container" Apr 20 20:25:25.768327 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:25.768293 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="171726d5-6680-4524-b824-23696042dc39" containerName="kube-rbac-proxy" Apr 20 20:25:25.772685 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:25.772661 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s" Apr 20 20:25:25.776214 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:25.775404 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-38d71-predictor-serving-cert\"" Apr 20 20:25:25.776214 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:25.775419 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-38d71-kube-rbac-proxy-sar-config\"" Apr 20 20:25:25.778088 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:25.778067 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s"] Apr 20 20:25:25.941612 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:25.941568 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-38d71-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4864b38c-ba48-4aa5-aabb-7a5a221ab350-error-404-isvc-38d71-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-38d71-predictor-5cd74bff75-4bl6s\" (UID: \"4864b38c-ba48-4aa5-aabb-7a5a221ab350\") " pod="kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s" Apr 20 20:25:25.941801 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:25.941709 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l572\" (UniqueName: \"kubernetes.io/projected/4864b38c-ba48-4aa5-aabb-7a5a221ab350-kube-api-access-9l572\") pod \"error-404-isvc-38d71-predictor-5cd74bff75-4bl6s\" (UID: \"4864b38c-ba48-4aa5-aabb-7a5a221ab350\") " pod="kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s" Apr 20 20:25:25.941801 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:25.941774 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4864b38c-ba48-4aa5-aabb-7a5a221ab350-proxy-tls\") pod \"error-404-isvc-38d71-predictor-5cd74bff75-4bl6s\" (UID: \"4864b38c-ba48-4aa5-aabb-7a5a221ab350\") " pod="kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s" Apr 20 20:25:26.043058 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:26.043028 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9l572\" (UniqueName: \"kubernetes.io/projected/4864b38c-ba48-4aa5-aabb-7a5a221ab350-kube-api-access-9l572\") pod \"error-404-isvc-38d71-predictor-5cd74bff75-4bl6s\" (UID: \"4864b38c-ba48-4aa5-aabb-7a5a221ab350\") " pod="kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s" Apr 20 20:25:26.043363 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:26.043068 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4864b38c-ba48-4aa5-aabb-7a5a221ab350-proxy-tls\") pod \"error-404-isvc-38d71-predictor-5cd74bff75-4bl6s\" (UID: \"4864b38c-ba48-4aa5-aabb-7a5a221ab350\") " pod="kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s" Apr 20 20:25:26.043363 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:26.043124 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-38d71-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4864b38c-ba48-4aa5-aabb-7a5a221ab350-error-404-isvc-38d71-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-38d71-predictor-5cd74bff75-4bl6s\" (UID: \"4864b38c-ba48-4aa5-aabb-7a5a221ab350\") " pod="kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s" Apr 20 20:25:26.043754 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:26.043729 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-38d71-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/4864b38c-ba48-4aa5-aabb-7a5a221ab350-error-404-isvc-38d71-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-38d71-predictor-5cd74bff75-4bl6s\" (UID: \"4864b38c-ba48-4aa5-aabb-7a5a221ab350\") " pod="kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s" Apr 20 20:25:26.045491 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:26.045472 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4864b38c-ba48-4aa5-aabb-7a5a221ab350-proxy-tls\") pod \"error-404-isvc-38d71-predictor-5cd74bff75-4bl6s\" (UID: \"4864b38c-ba48-4aa5-aabb-7a5a221ab350\") " pod="kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s" Apr 20 20:25:26.051011 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:26.050993 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l572\" (UniqueName: \"kubernetes.io/projected/4864b38c-ba48-4aa5-aabb-7a5a221ab350-kube-api-access-9l572\") pod \"error-404-isvc-38d71-predictor-5cd74bff75-4bl6s\" (UID: \"4864b38c-ba48-4aa5-aabb-7a5a221ab350\") " pod="kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s" Apr 20 20:25:26.084966 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:26.084933 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s" Apr 20 20:25:26.196843 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:26.196801 2573 generic.go:358] "Generic (PLEG): container finished" podID="55817dee-9562-415a-aa34-340f14b7b88f" containerID="ee441bd50befe93e88e26d461a5190d4d9155993d2debf84efb963d5e3daff62" exitCode=2 Apr 20 20:25:26.197001 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:26.196879 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s" event={"ID":"55817dee-9562-415a-aa34-340f14b7b88f","Type":"ContainerDied","Data":"ee441bd50befe93e88e26d461a5190d4d9155993d2debf84efb963d5e3daff62"} Apr 20 20:25:26.212266 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:26.212244 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s"] Apr 20 20:25:26.214097 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:25:26.214070 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4864b38c_ba48_4aa5_aabb_7a5a221ab350.slice/crio-cc28c94b94a3a88d0ad8e539a3770602c02a00e7937143e540f0b6229fb1fbe0 WatchSource:0}: Error finding container cc28c94b94a3a88d0ad8e539a3770602c02a00e7937143e540f0b6229fb1fbe0: Status 404 returned error can't find the container with id cc28c94b94a3a88d0ad8e539a3770602c02a00e7937143e540f0b6229fb1fbe0 Apr 20 20:25:27.201906 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:27.201864 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s" event={"ID":"4864b38c-ba48-4aa5-aabb-7a5a221ab350","Type":"ContainerStarted","Data":"49ccc3d8c06a8602b51387174d76be56b631b941ace326c14f794525615c5507"} Apr 20 20:25:27.201906 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:27.201904 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s" event={"ID":"4864b38c-ba48-4aa5-aabb-7a5a221ab350","Type":"ContainerStarted","Data":"148df6786df90fc3ce87616a56240a9ec68414492f8b7ef7d9596a749a3179b1"} Apr 20 20:25:27.201906 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:27.201915 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s" event={"ID":"4864b38c-ba48-4aa5-aabb-7a5a221ab350","Type":"ContainerStarted","Data":"cc28c94b94a3a88d0ad8e539a3770602c02a00e7937143e540f0b6229fb1fbe0"} Apr 20 20:25:27.202437 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:27.202113 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s" Apr 20 20:25:27.202437 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:27.202214 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s" Apr 20 20:25:27.203417 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:27.203394 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s" podUID="4864b38c-ba48-4aa5-aabb-7a5a221ab350" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 20 20:25:27.220427 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:27.220383 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s" podStartSLOduration=2.220371742 podStartE2EDuration="2.220371742s" podCreationTimestamp="2026-04-20 20:25:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:25:27.217940138 +0000 UTC m=+1209.633232881" watchObservedRunningTime="2026-04-20 
20:25:27.220371742 +0000 UTC m=+1209.635664483" Apr 20 20:25:28.205676 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:28.205633 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s" podUID="4864b38c-ba48-4aa5-aabb-7a5a221ab350" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 20 20:25:28.960936 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:28.960885 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s" podUID="55817dee-9562-415a-aa34-340f14b7b88f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.36:8643/healthz\": dial tcp 10.133.0.36:8643: connect: connection refused" Apr 20 20:25:29.066276 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:29.066254 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s" Apr 20 20:25:29.178485 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:29.178416 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-7541f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/55817dee-9562-415a-aa34-340f14b7b88f-error-404-isvc-7541f-kube-rbac-proxy-sar-config\") pod \"55817dee-9562-415a-aa34-340f14b7b88f\" (UID: \"55817dee-9562-415a-aa34-340f14b7b88f\") " Apr 20 20:25:29.178601 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:29.178519 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wf65n\" (UniqueName: \"kubernetes.io/projected/55817dee-9562-415a-aa34-340f14b7b88f-kube-api-access-wf65n\") pod \"55817dee-9562-415a-aa34-340f14b7b88f\" (UID: \"55817dee-9562-415a-aa34-340f14b7b88f\") " Apr 20 20:25:29.178601 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:29.178569 2573 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/55817dee-9562-415a-aa34-340f14b7b88f-proxy-tls\") pod \"55817dee-9562-415a-aa34-340f14b7b88f\" (UID: \"55817dee-9562-415a-aa34-340f14b7b88f\") " Apr 20 20:25:29.178822 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:29.178803 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55817dee-9562-415a-aa34-340f14b7b88f-error-404-isvc-7541f-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-7541f-kube-rbac-proxy-sar-config") pod "55817dee-9562-415a-aa34-340f14b7b88f" (UID: "55817dee-9562-415a-aa34-340f14b7b88f"). InnerVolumeSpecName "error-404-isvc-7541f-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:25:29.180632 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:29.180600 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55817dee-9562-415a-aa34-340f14b7b88f-kube-api-access-wf65n" (OuterVolumeSpecName: "kube-api-access-wf65n") pod "55817dee-9562-415a-aa34-340f14b7b88f" (UID: "55817dee-9562-415a-aa34-340f14b7b88f"). InnerVolumeSpecName "kube-api-access-wf65n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:25:29.180732 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:29.180690 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55817dee-9562-415a-aa34-340f14b7b88f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "55817dee-9562-415a-aa34-340f14b7b88f" (UID: "55817dee-9562-415a-aa34-340f14b7b88f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:25:29.209692 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:29.209655 2573 generic.go:358] "Generic (PLEG): container finished" podID="55817dee-9562-415a-aa34-340f14b7b88f" containerID="1127c20d7419d41fe4ec86aa892dc69e65b8e8410c22436237f417bc9421eb68" exitCode=0 Apr 20 20:25:29.210068 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:29.209734 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s" Apr 20 20:25:29.210068 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:29.209741 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s" event={"ID":"55817dee-9562-415a-aa34-340f14b7b88f","Type":"ContainerDied","Data":"1127c20d7419d41fe4ec86aa892dc69e65b8e8410c22436237f417bc9421eb68"} Apr 20 20:25:29.210068 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:29.209791 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s" event={"ID":"55817dee-9562-415a-aa34-340f14b7b88f","Type":"ContainerDied","Data":"cbe2370295d004d449094ff1084d062a30892c1004daac9365497f1f068a82fa"} Apr 20 20:25:29.210068 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:29.209814 2573 scope.go:117] "RemoveContainer" containerID="ee441bd50befe93e88e26d461a5190d4d9155993d2debf84efb963d5e3daff62" Apr 20 20:25:29.217731 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:29.217710 2573 scope.go:117] "RemoveContainer" containerID="1127c20d7419d41fe4ec86aa892dc69e65b8e8410c22436237f417bc9421eb68" Apr 20 20:25:29.224568 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:29.224549 2573 scope.go:117] "RemoveContainer" containerID="ee441bd50befe93e88e26d461a5190d4d9155993d2debf84efb963d5e3daff62" Apr 20 20:25:29.224799 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:25:29.224782 2573 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee441bd50befe93e88e26d461a5190d4d9155993d2debf84efb963d5e3daff62\": container with ID starting with ee441bd50befe93e88e26d461a5190d4d9155993d2debf84efb963d5e3daff62 not found: ID does not exist" containerID="ee441bd50befe93e88e26d461a5190d4d9155993d2debf84efb963d5e3daff62" Apr 20 20:25:29.224846 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:29.224808 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee441bd50befe93e88e26d461a5190d4d9155993d2debf84efb963d5e3daff62"} err="failed to get container status \"ee441bd50befe93e88e26d461a5190d4d9155993d2debf84efb963d5e3daff62\": rpc error: code = NotFound desc = could not find container \"ee441bd50befe93e88e26d461a5190d4d9155993d2debf84efb963d5e3daff62\": container with ID starting with ee441bd50befe93e88e26d461a5190d4d9155993d2debf84efb963d5e3daff62 not found: ID does not exist" Apr 20 20:25:29.224846 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:29.224825 2573 scope.go:117] "RemoveContainer" containerID="1127c20d7419d41fe4ec86aa892dc69e65b8e8410c22436237f417bc9421eb68" Apr 20 20:25:29.225095 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:25:29.225079 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1127c20d7419d41fe4ec86aa892dc69e65b8e8410c22436237f417bc9421eb68\": container with ID starting with 1127c20d7419d41fe4ec86aa892dc69e65b8e8410c22436237f417bc9421eb68 not found: ID does not exist" containerID="1127c20d7419d41fe4ec86aa892dc69e65b8e8410c22436237f417bc9421eb68" Apr 20 20:25:29.225151 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:29.225097 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1127c20d7419d41fe4ec86aa892dc69e65b8e8410c22436237f417bc9421eb68"} err="failed to get container status 
\"1127c20d7419d41fe4ec86aa892dc69e65b8e8410c22436237f417bc9421eb68\": rpc error: code = NotFound desc = could not find container \"1127c20d7419d41fe4ec86aa892dc69e65b8e8410c22436237f417bc9421eb68\": container with ID starting with 1127c20d7419d41fe4ec86aa892dc69e65b8e8410c22436237f417bc9421eb68 not found: ID does not exist" Apr 20 20:25:29.231960 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:29.231940 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s"] Apr 20 20:25:29.235302 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:29.235278 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-7541f-predictor-6b4d6f7547-62k8s"] Apr 20 20:25:29.279839 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:29.279818 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wf65n\" (UniqueName: \"kubernetes.io/projected/55817dee-9562-415a-aa34-340f14b7b88f-kube-api-access-wf65n\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:25:29.279839 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:29.279838 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/55817dee-9562-415a-aa34-340f14b7b88f-proxy-tls\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:25:29.279975 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:29.279876 2573 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-7541f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/55817dee-9562-415a-aa34-340f14b7b88f-error-404-isvc-7541f-kube-rbac-proxy-sar-config\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:25:30.171920 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:30.171891 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55817dee-9562-415a-aa34-340f14b7b88f" 
path="/var/lib/kubelet/pods/55817dee-9562-415a-aa34-340f14b7b88f/volumes" Apr 20 20:25:33.106407 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:33.106367 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj" podUID="2e558e2f-8541-47ab-bd38-141408b5dbcb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 20 20:25:33.210276 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:33.210247 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s" Apr 20 20:25:33.210892 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:33.210840 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s" podUID="4864b38c-ba48-4aa5-aabb-7a5a221ab350" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 20 20:25:43.107015 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:43.106984 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj" Apr 20 20:25:43.211651 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:43.211614 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s" podUID="4864b38c-ba48-4aa5-aabb-7a5a221ab350" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 20 20:25:53.211644 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:25:53.211606 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s" podUID="4864b38c-ba48-4aa5-aabb-7a5a221ab350" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.133.0.38:8080: connect: connection refused" Apr 20 20:26:03.211569 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:03.211485 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s" podUID="4864b38c-ba48-4aa5-aabb-7a5a221ab350" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 20 20:26:05.442383 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:05.442347 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj"] Apr 20 20:26:05.442825 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:05.442715 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj" podUID="2e558e2f-8541-47ab-bd38-141408b5dbcb" containerName="kserve-container" containerID="cri-o://1c380e6a2c47b78d8357f924b4c72e191f6c6d6e00fd37615f9a8979244277ac" gracePeriod=30 Apr 20 20:26:05.442825 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:05.442722 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj" podUID="2e558e2f-8541-47ab-bd38-141408b5dbcb" containerName="kube-rbac-proxy" containerID="cri-o://4f45416b9c53d308df574d6b50bebd3861b3518839929077f5939a253eb4fefb" gracePeriod=30 Apr 20 20:26:05.501678 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:05.501645 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp"] Apr 20 20:26:05.502039 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:05.502026 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55817dee-9562-415a-aa34-340f14b7b88f" containerName="kserve-container" Apr 20 20:26:05.502090 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:05.502041 2573 
state_mem.go:107] "Deleted CPUSet assignment" podUID="55817dee-9562-415a-aa34-340f14b7b88f" containerName="kserve-container" Apr 20 20:26:05.502090 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:05.502062 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55817dee-9562-415a-aa34-340f14b7b88f" containerName="kube-rbac-proxy" Apr 20 20:26:05.502090 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:05.502068 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="55817dee-9562-415a-aa34-340f14b7b88f" containerName="kube-rbac-proxy" Apr 20 20:26:05.502184 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:05.502119 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="55817dee-9562-415a-aa34-340f14b7b88f" containerName="kserve-container" Apr 20 20:26:05.502184 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:05.502127 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="55817dee-9562-415a-aa34-340f14b7b88f" containerName="kube-rbac-proxy" Apr 20 20:26:05.504576 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:05.504561 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp" Apr 20 20:26:05.506811 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:05.506787 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-fe410-predictor-serving-cert\"" Apr 20 20:26:05.506949 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:05.506904 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-fe410-kube-rbac-proxy-sar-config\"" Apr 20 20:26:05.513503 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:05.513480 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp"] Apr 20 20:26:05.591019 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:05.590987 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d52aa60-7277-4328-b3f1-c93fbd5c55a8-proxy-tls\") pod \"error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp\" (UID: \"9d52aa60-7277-4328-b3f1-c93fbd5c55a8\") " pod="kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp" Apr 20 20:26:05.591185 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:05.591051 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-fe410-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9d52aa60-7277-4328-b3f1-c93fbd5c55a8-error-404-isvc-fe410-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp\" (UID: \"9d52aa60-7277-4328-b3f1-c93fbd5c55a8\") " pod="kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp" Apr 20 20:26:05.591185 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:05.591157 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltlqf\" 
(UniqueName: \"kubernetes.io/projected/9d52aa60-7277-4328-b3f1-c93fbd5c55a8-kube-api-access-ltlqf\") pod \"error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp\" (UID: \"9d52aa60-7277-4328-b3f1-c93fbd5c55a8\") " pod="kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp" Apr 20 20:26:05.692566 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:05.692474 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d52aa60-7277-4328-b3f1-c93fbd5c55a8-proxy-tls\") pod \"error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp\" (UID: \"9d52aa60-7277-4328-b3f1-c93fbd5c55a8\") " pod="kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp" Apr 20 20:26:05.692566 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:05.692523 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-fe410-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9d52aa60-7277-4328-b3f1-c93fbd5c55a8-error-404-isvc-fe410-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp\" (UID: \"9d52aa60-7277-4328-b3f1-c93fbd5c55a8\") " pod="kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp" Apr 20 20:26:05.692566 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:05.692562 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ltlqf\" (UniqueName: \"kubernetes.io/projected/9d52aa60-7277-4328-b3f1-c93fbd5c55a8-kube-api-access-ltlqf\") pod \"error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp\" (UID: \"9d52aa60-7277-4328-b3f1-c93fbd5c55a8\") " pod="kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp" Apr 20 20:26:05.692815 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:26:05.692642 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-fe410-predictor-serving-cert: secret "error-404-isvc-fe410-predictor-serving-cert" not found Apr 20 20:26:05.692815 ip-10-0-130-227 
kubenswrapper[2573]: E0420 20:26:05.692713 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9d52aa60-7277-4328-b3f1-c93fbd5c55a8-proxy-tls podName:9d52aa60-7277-4328-b3f1-c93fbd5c55a8 nodeName:}" failed. No retries permitted until 2026-04-20 20:26:06.192692516 +0000 UTC m=+1248.607985251 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9d52aa60-7277-4328-b3f1-c93fbd5c55a8-proxy-tls") pod "error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp" (UID: "9d52aa60-7277-4328-b3f1-c93fbd5c55a8") : secret "error-404-isvc-fe410-predictor-serving-cert" not found Apr 20 20:26:05.693279 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:05.693260 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-fe410-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9d52aa60-7277-4328-b3f1-c93fbd5c55a8-error-404-isvc-fe410-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp\" (UID: \"9d52aa60-7277-4328-b3f1-c93fbd5c55a8\") " pod="kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp" Apr 20 20:26:05.702916 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:05.702893 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltlqf\" (UniqueName: \"kubernetes.io/projected/9d52aa60-7277-4328-b3f1-c93fbd5c55a8-kube-api-access-ltlqf\") pod \"error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp\" (UID: \"9d52aa60-7277-4328-b3f1-c93fbd5c55a8\") " pod="kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp" Apr 20 20:26:06.197550 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:06.197500 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d52aa60-7277-4328-b3f1-c93fbd5c55a8-proxy-tls\") pod \"error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp\" (UID: \"9d52aa60-7277-4328-b3f1-c93fbd5c55a8\") 
" pod="kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp" Apr 20 20:26:06.200074 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:06.200047 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d52aa60-7277-4328-b3f1-c93fbd5c55a8-proxy-tls\") pod \"error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp\" (UID: \"9d52aa60-7277-4328-b3f1-c93fbd5c55a8\") " pod="kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp" Apr 20 20:26:06.343347 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:06.343313 2573 generic.go:358] "Generic (PLEG): container finished" podID="2e558e2f-8541-47ab-bd38-141408b5dbcb" containerID="4f45416b9c53d308df574d6b50bebd3861b3518839929077f5939a253eb4fefb" exitCode=2 Apr 20 20:26:06.343511 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:06.343392 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj" event={"ID":"2e558e2f-8541-47ab-bd38-141408b5dbcb","Type":"ContainerDied","Data":"4f45416b9c53d308df574d6b50bebd3861b3518839929077f5939a253eb4fefb"} Apr 20 20:26:06.415514 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:06.415481 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp" Apr 20 20:26:06.541623 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:06.541593 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp"] Apr 20 20:26:06.542993 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:26:06.542955 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d52aa60_7277_4328_b3f1_c93fbd5c55a8.slice/crio-eeaecb27ee7931f9e0ab06284101ffd574d3d64e35685e28b2059b9ccffe080e WatchSource:0}: Error finding container eeaecb27ee7931f9e0ab06284101ffd574d3d64e35685e28b2059b9ccffe080e: Status 404 returned error can't find the container with id eeaecb27ee7931f9e0ab06284101ffd574d3d64e35685e28b2059b9ccffe080e Apr 20 20:26:07.348982 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:07.348949 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp" event={"ID":"9d52aa60-7277-4328-b3f1-c93fbd5c55a8","Type":"ContainerStarted","Data":"cfa5247a541048b58c8d2d21530e8d9419b18f182a1298c892295378b7357602"} Apr 20 20:26:07.348982 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:07.348983 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp" event={"ID":"9d52aa60-7277-4328-b3f1-c93fbd5c55a8","Type":"ContainerStarted","Data":"bcc11de94a51e69f700153890469943eb7226198adc3bab5541891e7c6b3fbcb"} Apr 20 20:26:07.349183 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:07.348995 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp" event={"ID":"9d52aa60-7277-4328-b3f1-c93fbd5c55a8","Type":"ContainerStarted","Data":"eeaecb27ee7931f9e0ab06284101ffd574d3d64e35685e28b2059b9ccffe080e"} Apr 20 20:26:07.349183 ip-10-0-130-227 
kubenswrapper[2573]: I0420 20:26:07.349079 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp" Apr 20 20:26:07.372068 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:07.372022 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp" podStartSLOduration=2.372008982 podStartE2EDuration="2.372008982s" podCreationTimestamp="2026-04-20 20:26:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:26:07.370288328 +0000 UTC m=+1249.785581071" watchObservedRunningTime="2026-04-20 20:26:07.372008982 +0000 UTC m=+1249.787301724" Apr 20 20:26:08.100731 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:08.100683 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj" podUID="2e558e2f-8541-47ab-bd38-141408b5dbcb" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.37:8643/healthz\": dial tcp 10.133.0.37:8643: connect: connection refused" Apr 20 20:26:08.353444 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:08.353374 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp" Apr 20 20:26:08.354534 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:08.354512 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp" podUID="9d52aa60-7277-4328-b3f1-c93fbd5c55a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 20 20:26:08.594574 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:08.594549 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj" Apr 20 20:26:08.720357 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:08.720278 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e558e2f-8541-47ab-bd38-141408b5dbcb-proxy-tls\") pod \"2e558e2f-8541-47ab-bd38-141408b5dbcb\" (UID: \"2e558e2f-8541-47ab-bd38-141408b5dbcb\") " Apr 20 20:26:08.720357 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:08.720316 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2hgw\" (UniqueName: \"kubernetes.io/projected/2e558e2f-8541-47ab-bd38-141408b5dbcb-kube-api-access-f2hgw\") pod \"2e558e2f-8541-47ab-bd38-141408b5dbcb\" (UID: \"2e558e2f-8541-47ab-bd38-141408b5dbcb\") " Apr 20 20:26:08.720535 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:08.720384 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-63de8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2e558e2f-8541-47ab-bd38-141408b5dbcb-error-404-isvc-63de8-kube-rbac-proxy-sar-config\") pod \"2e558e2f-8541-47ab-bd38-141408b5dbcb\" (UID: \"2e558e2f-8541-47ab-bd38-141408b5dbcb\") " Apr 20 20:26:08.720787 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:08.720761 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e558e2f-8541-47ab-bd38-141408b5dbcb-error-404-isvc-63de8-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-63de8-kube-rbac-proxy-sar-config") pod "2e558e2f-8541-47ab-bd38-141408b5dbcb" (UID: "2e558e2f-8541-47ab-bd38-141408b5dbcb"). InnerVolumeSpecName "error-404-isvc-63de8-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:26:08.722477 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:08.722450 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e558e2f-8541-47ab-bd38-141408b5dbcb-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2e558e2f-8541-47ab-bd38-141408b5dbcb" (UID: "2e558e2f-8541-47ab-bd38-141408b5dbcb"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:26:08.722578 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:08.722473 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e558e2f-8541-47ab-bd38-141408b5dbcb-kube-api-access-f2hgw" (OuterVolumeSpecName: "kube-api-access-f2hgw") pod "2e558e2f-8541-47ab-bd38-141408b5dbcb" (UID: "2e558e2f-8541-47ab-bd38-141408b5dbcb"). InnerVolumeSpecName "kube-api-access-f2hgw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:26:08.821243 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:08.821208 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e558e2f-8541-47ab-bd38-141408b5dbcb-proxy-tls\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:26:08.821243 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:08.821242 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f2hgw\" (UniqueName: \"kubernetes.io/projected/2e558e2f-8541-47ab-bd38-141408b5dbcb-kube-api-access-f2hgw\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:26:08.821479 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:08.821254 2573 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-63de8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2e558e2f-8541-47ab-bd38-141408b5dbcb-error-404-isvc-63de8-kube-rbac-proxy-sar-config\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 
20:26:09.357964 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:09.357929 2573 generic.go:358] "Generic (PLEG): container finished" podID="2e558e2f-8541-47ab-bd38-141408b5dbcb" containerID="1c380e6a2c47b78d8357f924b4c72e191f6c6d6e00fd37615f9a8979244277ac" exitCode=0 Apr 20 20:26:09.358423 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:09.357993 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj" event={"ID":"2e558e2f-8541-47ab-bd38-141408b5dbcb","Type":"ContainerDied","Data":"1c380e6a2c47b78d8357f924b4c72e191f6c6d6e00fd37615f9a8979244277ac"} Apr 20 20:26:09.358423 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:09.358008 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj" Apr 20 20:26:09.358423 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:09.358032 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj" event={"ID":"2e558e2f-8541-47ab-bd38-141408b5dbcb","Type":"ContainerDied","Data":"be24177f794dce7bdb9846c383c37e3d76c0f7dec950fd2cc2c808b92fadd594"} Apr 20 20:26:09.358423 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:09.358048 2573 scope.go:117] "RemoveContainer" containerID="4f45416b9c53d308df574d6b50bebd3861b3518839929077f5939a253eb4fefb" Apr 20 20:26:09.358746 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:09.358710 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp" podUID="9d52aa60-7277-4328-b3f1-c93fbd5c55a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 20 20:26:09.369125 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:09.369077 2573 scope.go:117] "RemoveContainer" containerID="1c380e6a2c47b78d8357f924b4c72e191f6c6d6e00fd37615f9a8979244277ac" Apr 
20 20:26:09.376486 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:09.376470 2573 scope.go:117] "RemoveContainer" containerID="4f45416b9c53d308df574d6b50bebd3861b3518839929077f5939a253eb4fefb" Apr 20 20:26:09.376727 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:26:09.376708 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f45416b9c53d308df574d6b50bebd3861b3518839929077f5939a253eb4fefb\": container with ID starting with 4f45416b9c53d308df574d6b50bebd3861b3518839929077f5939a253eb4fefb not found: ID does not exist" containerID="4f45416b9c53d308df574d6b50bebd3861b3518839929077f5939a253eb4fefb" Apr 20 20:26:09.376778 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:09.376735 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f45416b9c53d308df574d6b50bebd3861b3518839929077f5939a253eb4fefb"} err="failed to get container status \"4f45416b9c53d308df574d6b50bebd3861b3518839929077f5939a253eb4fefb\": rpc error: code = NotFound desc = could not find container \"4f45416b9c53d308df574d6b50bebd3861b3518839929077f5939a253eb4fefb\": container with ID starting with 4f45416b9c53d308df574d6b50bebd3861b3518839929077f5939a253eb4fefb not found: ID does not exist" Apr 20 20:26:09.376778 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:09.376751 2573 scope.go:117] "RemoveContainer" containerID="1c380e6a2c47b78d8357f924b4c72e191f6c6d6e00fd37615f9a8979244277ac" Apr 20 20:26:09.377053 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:26:09.377038 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c380e6a2c47b78d8357f924b4c72e191f6c6d6e00fd37615f9a8979244277ac\": container with ID starting with 1c380e6a2c47b78d8357f924b4c72e191f6c6d6e00fd37615f9a8979244277ac not found: ID does not exist" containerID="1c380e6a2c47b78d8357f924b4c72e191f6c6d6e00fd37615f9a8979244277ac" Apr 20 20:26:09.377095 
ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:09.377059 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c380e6a2c47b78d8357f924b4c72e191f6c6d6e00fd37615f9a8979244277ac"} err="failed to get container status \"1c380e6a2c47b78d8357f924b4c72e191f6c6d6e00fd37615f9a8979244277ac\": rpc error: code = NotFound desc = could not find container \"1c380e6a2c47b78d8357f924b4c72e191f6c6d6e00fd37615f9a8979244277ac\": container with ID starting with 1c380e6a2c47b78d8357f924b4c72e191f6c6d6e00fd37615f9a8979244277ac not found: ID does not exist" Apr 20 20:26:09.382275 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:09.382255 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj"] Apr 20 20:26:09.386020 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:09.385999 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-63de8-predictor-544568fdd5-wfwsj"] Apr 20 20:26:10.172026 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:10.171996 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e558e2f-8541-47ab-bd38-141408b5dbcb" path="/var/lib/kubelet/pods/2e558e2f-8541-47ab-bd38-141408b5dbcb/volumes" Apr 20 20:26:13.211839 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:13.211807 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s" Apr 20 20:26:14.363948 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:14.363920 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp" Apr 20 20:26:14.364480 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:14.364455 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp" podUID="9d52aa60-7277-4328-b3f1-c93fbd5c55a8" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 20 20:26:24.364943 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:24.364899 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp" podUID="9d52aa60-7277-4328-b3f1-c93fbd5c55a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 20 20:26:34.365493 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:34.365448 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp" podUID="9d52aa60-7277-4328-b3f1-c93fbd5c55a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 20 20:26:44.364918 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:44.364875 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp" podUID="9d52aa60-7277-4328-b3f1-c93fbd5c55a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 20 20:26:54.365504 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:26:54.365475 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp" Apr 20 20:30:18.173617 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:30:18.173590 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ks7s9_54652553-5a61-482e-b688-ce64c57b917b/ovn-acl-logging/0.log" Apr 20 20:30:18.178206 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:30:18.178188 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ks7s9_54652553-5a61-482e-b688-ce64c57b917b/ovn-acl-logging/0.log" Apr 20 
20:34:40.523274 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:40.523240 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s"] Apr 20 20:34:40.523740 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:40.523599 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s" podUID="4864b38c-ba48-4aa5-aabb-7a5a221ab350" containerName="kserve-container" containerID="cri-o://148df6786df90fc3ce87616a56240a9ec68414492f8b7ef7d9596a749a3179b1" gracePeriod=30 Apr 20 20:34:40.523740 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:40.523687 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s" podUID="4864b38c-ba48-4aa5-aabb-7a5a221ab350" containerName="kube-rbac-proxy" containerID="cri-o://49ccc3d8c06a8602b51387174d76be56b631b941ace326c14f794525615c5507" gracePeriod=30 Apr 20 20:34:40.605130 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:40.605098 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g"] Apr 20 20:34:40.605608 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:40.605586 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e558e2f-8541-47ab-bd38-141408b5dbcb" containerName="kube-rbac-proxy" Apr 20 20:34:40.605608 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:40.605608 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e558e2f-8541-47ab-bd38-141408b5dbcb" containerName="kube-rbac-proxy" Apr 20 20:34:40.605798 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:40.605625 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e558e2f-8541-47ab-bd38-141408b5dbcb" containerName="kserve-container" Apr 20 20:34:40.605798 ip-10-0-130-227 kubenswrapper[2573]: I0420 
20:34:40.605632 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e558e2f-8541-47ab-bd38-141408b5dbcb" containerName="kserve-container" Apr 20 20:34:40.605798 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:40.605689 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="2e558e2f-8541-47ab-bd38-141408b5dbcb" containerName="kserve-container" Apr 20 20:34:40.605798 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:40.605698 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="2e558e2f-8541-47ab-bd38-141408b5dbcb" containerName="kube-rbac-proxy" Apr 20 20:34:40.608837 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:40.608821 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" Apr 20 20:34:40.611233 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:40.611213 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-ab6b8-predictor-serving-cert\"" Apr 20 20:34:40.611340 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:40.611257 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-ab6b8-kube-rbac-proxy-sar-config\"" Apr 20 20:34:40.616229 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:40.616018 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g"] Apr 20 20:34:40.711364 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:40.711326 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf9f6842-25d2-484c-8a66-9010dadf1cb9-proxy-tls\") pod \"error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g\" (UID: \"bf9f6842-25d2-484c-8a66-9010dadf1cb9\") " pod="kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" Apr 20 20:34:40.711523 
ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:40.711384 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-ab6b8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bf9f6842-25d2-484c-8a66-9010dadf1cb9-error-404-isvc-ab6b8-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g\" (UID: \"bf9f6842-25d2-484c-8a66-9010dadf1cb9\") " pod="kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" Apr 20 20:34:40.711523 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:40.711421 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl9m4\" (UniqueName: \"kubernetes.io/projected/bf9f6842-25d2-484c-8a66-9010dadf1cb9-kube-api-access-xl9m4\") pod \"error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g\" (UID: \"bf9f6842-25d2-484c-8a66-9010dadf1cb9\") " pod="kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" Apr 20 20:34:40.812888 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:40.812833 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf9f6842-25d2-484c-8a66-9010dadf1cb9-proxy-tls\") pod \"error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g\" (UID: \"bf9f6842-25d2-484c-8a66-9010dadf1cb9\") " pod="kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" Apr 20 20:34:40.813061 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:40.812895 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-ab6b8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bf9f6842-25d2-484c-8a66-9010dadf1cb9-error-404-isvc-ab6b8-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g\" (UID: \"bf9f6842-25d2-484c-8a66-9010dadf1cb9\") " pod="kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" Apr 20 20:34:40.813061 
ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:40.812918 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xl9m4\" (UniqueName: \"kubernetes.io/projected/bf9f6842-25d2-484c-8a66-9010dadf1cb9-kube-api-access-xl9m4\") pod \"error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g\" (UID: \"bf9f6842-25d2-484c-8a66-9010dadf1cb9\") " pod="kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" Apr 20 20:34:40.813061 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:34:40.812980 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-serving-cert: secret "error-404-isvc-ab6b8-predictor-serving-cert" not found Apr 20 20:34:40.813061 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:34:40.813052 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf9f6842-25d2-484c-8a66-9010dadf1cb9-proxy-tls podName:bf9f6842-25d2-484c-8a66-9010dadf1cb9 nodeName:}" failed. No retries permitted until 2026-04-20 20:34:41.313034075 +0000 UTC m=+1763.728326800 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/bf9f6842-25d2-484c-8a66-9010dadf1cb9-proxy-tls") pod "error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" (UID: "bf9f6842-25d2-484c-8a66-9010dadf1cb9") : secret "error-404-isvc-ab6b8-predictor-serving-cert" not found Apr 20 20:34:40.813567 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:40.813547 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-ab6b8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bf9f6842-25d2-484c-8a66-9010dadf1cb9-error-404-isvc-ab6b8-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g\" (UID: \"bf9f6842-25d2-484c-8a66-9010dadf1cb9\") " pod="kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" Apr 20 20:34:40.821539 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:40.821514 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl9m4\" (UniqueName: \"kubernetes.io/projected/bf9f6842-25d2-484c-8a66-9010dadf1cb9-kube-api-access-xl9m4\") pod \"error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g\" (UID: \"bf9f6842-25d2-484c-8a66-9010dadf1cb9\") " pod="kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" Apr 20 20:34:41.098533 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:41.098442 2573 generic.go:358] "Generic (PLEG): container finished" podID="4864b38c-ba48-4aa5-aabb-7a5a221ab350" containerID="49ccc3d8c06a8602b51387174d76be56b631b941ace326c14f794525615c5507" exitCode=2 Apr 20 20:34:41.098667 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:41.098524 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s" event={"ID":"4864b38c-ba48-4aa5-aabb-7a5a221ab350","Type":"ContainerDied","Data":"49ccc3d8c06a8602b51387174d76be56b631b941ace326c14f794525615c5507"} Apr 20 20:34:41.318201 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:41.318162 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf9f6842-25d2-484c-8a66-9010dadf1cb9-proxy-tls\") pod \"error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g\" (UID: \"bf9f6842-25d2-484c-8a66-9010dadf1cb9\") " pod="kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" Apr 20 20:34:41.320723 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:41.320697 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf9f6842-25d2-484c-8a66-9010dadf1cb9-proxy-tls\") pod \"error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g\" (UID: \"bf9f6842-25d2-484c-8a66-9010dadf1cb9\") " pod="kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" Apr 20 20:34:41.520441 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:41.520346 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" Apr 20 20:34:41.644664 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:41.644636 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g"] Apr 20 20:34:41.647234 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:34:41.647203 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf9f6842_25d2_484c_8a66_9010dadf1cb9.slice/crio-56dc015684510dc0f8216a82b88808fab9aa4454e03ba31dfe22634710c62db5 WatchSource:0}: Error finding container 56dc015684510dc0f8216a82b88808fab9aa4454e03ba31dfe22634710c62db5: Status 404 returned error can't find the container with id 56dc015684510dc0f8216a82b88808fab9aa4454e03ba31dfe22634710c62db5 Apr 20 20:34:41.648892 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:41.648875 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 20:34:42.103329 
ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:42.103241 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" event={"ID":"bf9f6842-25d2-484c-8a66-9010dadf1cb9","Type":"ContainerStarted","Data":"cbd8abf5396a8b960686b90d9edc683b628a3557b57f7badeaf85bd3c10c04e4"} Apr 20 20:34:42.103329 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:42.103283 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" event={"ID":"bf9f6842-25d2-484c-8a66-9010dadf1cb9","Type":"ContainerStarted","Data":"24fd6207382ad6cbb70d14c777e471766331ac6599a450f17f65d6cfc63d61ee"} Apr 20 20:34:42.103329 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:42.103293 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" event={"ID":"bf9f6842-25d2-484c-8a66-9010dadf1cb9","Type":"ContainerStarted","Data":"56dc015684510dc0f8216a82b88808fab9aa4454e03ba31dfe22634710c62db5"} Apr 20 20:34:42.103560 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:42.103358 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" Apr 20 20:34:42.125238 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:42.125179 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" podStartSLOduration=2.125161023 podStartE2EDuration="2.125161023s" podCreationTimestamp="2026-04-20 20:34:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:34:42.121875332 +0000 UTC m=+1764.537168065" watchObservedRunningTime="2026-04-20 20:34:42.125161023 +0000 UTC m=+1764.540453766" Apr 20 20:34:43.107087 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:43.107059 2573 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" Apr 20 20:34:43.108457 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:43.108426 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" podUID="bf9f6842-25d2-484c-8a66-9010dadf1cb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 20 20:34:43.206287 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:43.206246 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s" podUID="4864b38c-ba48-4aa5-aabb-7a5a221ab350" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.38:8643/healthz\": dial tcp 10.133.0.38:8643: connect: connection refused" Apr 20 20:34:43.211636 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:43.211609 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s" podUID="4864b38c-ba48-4aa5-aabb-7a5a221ab350" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 20 20:34:43.770731 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:43.770701 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s" Apr 20 20:34:43.839325 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:43.838454 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9l572\" (UniqueName: \"kubernetes.io/projected/4864b38c-ba48-4aa5-aabb-7a5a221ab350-kube-api-access-9l572\") pod \"4864b38c-ba48-4aa5-aabb-7a5a221ab350\" (UID: \"4864b38c-ba48-4aa5-aabb-7a5a221ab350\") " Apr 20 20:34:43.839325 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:43.838521 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4864b38c-ba48-4aa5-aabb-7a5a221ab350-proxy-tls\") pod \"4864b38c-ba48-4aa5-aabb-7a5a221ab350\" (UID: \"4864b38c-ba48-4aa5-aabb-7a5a221ab350\") " Apr 20 20:34:43.845981 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:43.845941 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4864b38c-ba48-4aa5-aabb-7a5a221ab350-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4864b38c-ba48-4aa5-aabb-7a5a221ab350" (UID: "4864b38c-ba48-4aa5-aabb-7a5a221ab350"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:34:43.846088 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:43.846064 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4864b38c-ba48-4aa5-aabb-7a5a221ab350-kube-api-access-9l572" (OuterVolumeSpecName: "kube-api-access-9l572") pod "4864b38c-ba48-4aa5-aabb-7a5a221ab350" (UID: "4864b38c-ba48-4aa5-aabb-7a5a221ab350"). InnerVolumeSpecName "kube-api-access-9l572". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:34:43.939437 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:43.939396 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-38d71-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4864b38c-ba48-4aa5-aabb-7a5a221ab350-error-404-isvc-38d71-kube-rbac-proxy-sar-config\") pod \"4864b38c-ba48-4aa5-aabb-7a5a221ab350\" (UID: \"4864b38c-ba48-4aa5-aabb-7a5a221ab350\") " Apr 20 20:34:43.939587 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:43.939545 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9l572\" (UniqueName: \"kubernetes.io/projected/4864b38c-ba48-4aa5-aabb-7a5a221ab350-kube-api-access-9l572\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:34:43.939587 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:43.939559 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4864b38c-ba48-4aa5-aabb-7a5a221ab350-proxy-tls\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:34:43.939769 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:43.939743 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4864b38c-ba48-4aa5-aabb-7a5a221ab350-error-404-isvc-38d71-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-38d71-kube-rbac-proxy-sar-config") pod "4864b38c-ba48-4aa5-aabb-7a5a221ab350" (UID: "4864b38c-ba48-4aa5-aabb-7a5a221ab350"). InnerVolumeSpecName "error-404-isvc-38d71-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:34:44.040684 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:44.040639 2573 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-38d71-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4864b38c-ba48-4aa5-aabb-7a5a221ab350-error-404-isvc-38d71-kube-rbac-proxy-sar-config\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:34:44.111976 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:44.111884 2573 generic.go:358] "Generic (PLEG): container finished" podID="4864b38c-ba48-4aa5-aabb-7a5a221ab350" containerID="148df6786df90fc3ce87616a56240a9ec68414492f8b7ef7d9596a749a3179b1" exitCode=0 Apr 20 20:34:44.111976 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:44.111973 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s" Apr 20 20:34:44.112463 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:44.111973 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s" event={"ID":"4864b38c-ba48-4aa5-aabb-7a5a221ab350","Type":"ContainerDied","Data":"148df6786df90fc3ce87616a56240a9ec68414492f8b7ef7d9596a749a3179b1"} Apr 20 20:34:44.112463 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:44.112010 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s" event={"ID":"4864b38c-ba48-4aa5-aabb-7a5a221ab350","Type":"ContainerDied","Data":"cc28c94b94a3a88d0ad8e539a3770602c02a00e7937143e540f0b6229fb1fbe0"} Apr 20 20:34:44.112463 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:44.112027 2573 scope.go:117] "RemoveContainer" containerID="49ccc3d8c06a8602b51387174d76be56b631b941ace326c14f794525615c5507" Apr 20 20:34:44.112604 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:44.112534 2573 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" podUID="bf9f6842-25d2-484c-8a66-9010dadf1cb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 20 20:34:44.121641 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:44.121619 2573 scope.go:117] "RemoveContainer" containerID="148df6786df90fc3ce87616a56240a9ec68414492f8b7ef7d9596a749a3179b1" Apr 20 20:34:44.129080 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:44.129064 2573 scope.go:117] "RemoveContainer" containerID="49ccc3d8c06a8602b51387174d76be56b631b941ace326c14f794525615c5507" Apr 20 20:34:44.129311 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:34:44.129290 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49ccc3d8c06a8602b51387174d76be56b631b941ace326c14f794525615c5507\": container with ID starting with 49ccc3d8c06a8602b51387174d76be56b631b941ace326c14f794525615c5507 not found: ID does not exist" containerID="49ccc3d8c06a8602b51387174d76be56b631b941ace326c14f794525615c5507" Apr 20 20:34:44.129362 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:44.129320 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49ccc3d8c06a8602b51387174d76be56b631b941ace326c14f794525615c5507"} err="failed to get container status \"49ccc3d8c06a8602b51387174d76be56b631b941ace326c14f794525615c5507\": rpc error: code = NotFound desc = could not find container \"49ccc3d8c06a8602b51387174d76be56b631b941ace326c14f794525615c5507\": container with ID starting with 49ccc3d8c06a8602b51387174d76be56b631b941ace326c14f794525615c5507 not found: ID does not exist" Apr 20 20:34:44.129362 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:44.129342 2573 scope.go:117] "RemoveContainer" containerID="148df6786df90fc3ce87616a56240a9ec68414492f8b7ef7d9596a749a3179b1" Apr 20 20:34:44.129577 ip-10-0-130-227 
kubenswrapper[2573]: E0420 20:34:44.129557 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"148df6786df90fc3ce87616a56240a9ec68414492f8b7ef7d9596a749a3179b1\": container with ID starting with 148df6786df90fc3ce87616a56240a9ec68414492f8b7ef7d9596a749a3179b1 not found: ID does not exist" containerID="148df6786df90fc3ce87616a56240a9ec68414492f8b7ef7d9596a749a3179b1" Apr 20 20:34:44.129614 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:44.129584 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"148df6786df90fc3ce87616a56240a9ec68414492f8b7ef7d9596a749a3179b1"} err="failed to get container status \"148df6786df90fc3ce87616a56240a9ec68414492f8b7ef7d9596a749a3179b1\": rpc error: code = NotFound desc = could not find container \"148df6786df90fc3ce87616a56240a9ec68414492f8b7ef7d9596a749a3179b1\": container with ID starting with 148df6786df90fc3ce87616a56240a9ec68414492f8b7ef7d9596a749a3179b1 not found: ID does not exist" Apr 20 20:34:44.134971 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:44.134947 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s"] Apr 20 20:34:44.138770 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:44.138744 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-38d71-predictor-5cd74bff75-4bl6s"] Apr 20 20:34:44.171708 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:44.171681 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4864b38c-ba48-4aa5-aabb-7a5a221ab350" path="/var/lib/kubelet/pods/4864b38c-ba48-4aa5-aabb-7a5a221ab350/volumes" Apr 20 20:34:49.117382 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:49.117348 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" Apr 20 20:34:49.117898 
ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:49.117829 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" podUID="bf9f6842-25d2-484c-8a66-9010dadf1cb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 20 20:34:59.118538 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:34:59.118450 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" podUID="bf9f6842-25d2-484c-8a66-9010dadf1cb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 20 20:35:09.118495 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:09.118455 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" podUID="bf9f6842-25d2-484c-8a66-9010dadf1cb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 20 20:35:18.196870 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:18.196745 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ks7s9_54652553-5a61-482e-b688-ce64c57b917b/ovn-acl-logging/0.log" Apr 20 20:35:18.202639 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:18.202617 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ks7s9_54652553-5a61-482e-b688-ce64c57b917b/ovn-acl-logging/0.log" Apr 20 20:35:19.118332 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:19.118287 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" podUID="bf9f6842-25d2-484c-8a66-9010dadf1cb9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 20 
20:35:20.379654 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:20.379608 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp"] Apr 20 20:35:20.380198 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:20.380142 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp" podUID="9d52aa60-7277-4328-b3f1-c93fbd5c55a8" containerName="kserve-container" containerID="cri-o://bcc11de94a51e69f700153890469943eb7226198adc3bab5541891e7c6b3fbcb" gracePeriod=30 Apr 20 20:35:20.380311 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:20.380197 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp" podUID="9d52aa60-7277-4328-b3f1-c93fbd5c55a8" containerName="kube-rbac-proxy" containerID="cri-o://cfa5247a541048b58c8d2d21530e8d9419b18f182a1298c892295378b7357602" gracePeriod=30 Apr 20 20:35:20.429409 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:20.429373 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8"] Apr 20 20:35:20.429754 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:20.429742 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4864b38c-ba48-4aa5-aabb-7a5a221ab350" containerName="kserve-container" Apr 20 20:35:20.429800 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:20.429755 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="4864b38c-ba48-4aa5-aabb-7a5a221ab350" containerName="kserve-container" Apr 20 20:35:20.429800 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:20.429775 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4864b38c-ba48-4aa5-aabb-7a5a221ab350" containerName="kube-rbac-proxy" Apr 20 20:35:20.429800 ip-10-0-130-227 kubenswrapper[2573]: I0420 
20:35:20.429781 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="4864b38c-ba48-4aa5-aabb-7a5a221ab350" containerName="kube-rbac-proxy" Apr 20 20:35:20.429928 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:20.429834 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="4864b38c-ba48-4aa5-aabb-7a5a221ab350" containerName="kserve-container" Apr 20 20:35:20.429928 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:20.429846 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="4864b38c-ba48-4aa5-aabb-7a5a221ab350" containerName="kube-rbac-proxy" Apr 20 20:35:20.434255 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:20.434232 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8" Apr 20 20:35:20.436565 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:20.436543 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-60a5c-predictor-serving-cert\"" Apr 20 20:35:20.436676 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:20.436571 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-60a5c-kube-rbac-proxy-sar-config\"" Apr 20 20:35:20.443404 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:20.443374 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8"] Apr 20 20:35:20.565174 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:20.565129 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtqfr\" (UniqueName: \"kubernetes.io/projected/d712806c-4d72-49b1-a918-2a2aff1ec86f-kube-api-access-vtqfr\") pod \"error-404-isvc-60a5c-predictor-74955c648c-g4rt8\" (UID: \"d712806c-4d72-49b1-a918-2a2aff1ec86f\") " pod="kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8" Apr 20 
20:35:20.565374 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:20.565196 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-60a5c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d712806c-4d72-49b1-a918-2a2aff1ec86f-error-404-isvc-60a5c-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-60a5c-predictor-74955c648c-g4rt8\" (UID: \"d712806c-4d72-49b1-a918-2a2aff1ec86f\") " pod="kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8" Apr 20 20:35:20.565374 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:20.565335 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d712806c-4d72-49b1-a918-2a2aff1ec86f-proxy-tls\") pod \"error-404-isvc-60a5c-predictor-74955c648c-g4rt8\" (UID: \"d712806c-4d72-49b1-a918-2a2aff1ec86f\") " pod="kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8" Apr 20 20:35:20.665829 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:20.665737 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d712806c-4d72-49b1-a918-2a2aff1ec86f-proxy-tls\") pod \"error-404-isvc-60a5c-predictor-74955c648c-g4rt8\" (UID: \"d712806c-4d72-49b1-a918-2a2aff1ec86f\") " pod="kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8" Apr 20 20:35:20.665829 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:20.665802 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtqfr\" (UniqueName: \"kubernetes.io/projected/d712806c-4d72-49b1-a918-2a2aff1ec86f-kube-api-access-vtqfr\") pod \"error-404-isvc-60a5c-predictor-74955c648c-g4rt8\" (UID: \"d712806c-4d72-49b1-a918-2a2aff1ec86f\") " pod="kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8" Apr 20 20:35:20.666110 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:20.665840 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-60a5c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d712806c-4d72-49b1-a918-2a2aff1ec86f-error-404-isvc-60a5c-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-60a5c-predictor-74955c648c-g4rt8\" (UID: \"d712806c-4d72-49b1-a918-2a2aff1ec86f\") " pod="kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8" Apr 20 20:35:20.666110 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:35:20.665895 2573 secret.go:189] Couldn't get secret kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-serving-cert: secret "error-404-isvc-60a5c-predictor-serving-cert" not found Apr 20 20:35:20.666110 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:35:20.665978 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d712806c-4d72-49b1-a918-2a2aff1ec86f-proxy-tls podName:d712806c-4d72-49b1-a918-2a2aff1ec86f nodeName:}" failed. No retries permitted until 2026-04-20 20:35:21.165956037 +0000 UTC m=+1803.581248756 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/d712806c-4d72-49b1-a918-2a2aff1ec86f-proxy-tls") pod "error-404-isvc-60a5c-predictor-74955c648c-g4rt8" (UID: "d712806c-4d72-49b1-a918-2a2aff1ec86f") : secret "error-404-isvc-60a5c-predictor-serving-cert" not found Apr 20 20:35:20.666538 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:20.666519 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-60a5c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d712806c-4d72-49b1-a918-2a2aff1ec86f-error-404-isvc-60a5c-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-60a5c-predictor-74955c648c-g4rt8\" (UID: \"d712806c-4d72-49b1-a918-2a2aff1ec86f\") " pod="kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8" Apr 20 20:35:20.674279 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:20.674251 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtqfr\" (UniqueName: \"kubernetes.io/projected/d712806c-4d72-49b1-a918-2a2aff1ec86f-kube-api-access-vtqfr\") pod \"error-404-isvc-60a5c-predictor-74955c648c-g4rt8\" (UID: \"d712806c-4d72-49b1-a918-2a2aff1ec86f\") " pod="kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8" Apr 20 20:35:21.169055 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:21.169014 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d712806c-4d72-49b1-a918-2a2aff1ec86f-proxy-tls\") pod \"error-404-isvc-60a5c-predictor-74955c648c-g4rt8\" (UID: \"d712806c-4d72-49b1-a918-2a2aff1ec86f\") " pod="kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8" Apr 20 20:35:21.171382 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:21.171356 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d712806c-4d72-49b1-a918-2a2aff1ec86f-proxy-tls\") pod 
\"error-404-isvc-60a5c-predictor-74955c648c-g4rt8\" (UID: \"d712806c-4d72-49b1-a918-2a2aff1ec86f\") " pod="kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8" Apr 20 20:35:21.244278 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:21.244243 2573 generic.go:358] "Generic (PLEG): container finished" podID="9d52aa60-7277-4328-b3f1-c93fbd5c55a8" containerID="cfa5247a541048b58c8d2d21530e8d9419b18f182a1298c892295378b7357602" exitCode=2 Apr 20 20:35:21.244447 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:21.244317 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp" event={"ID":"9d52aa60-7277-4328-b3f1-c93fbd5c55a8","Type":"ContainerDied","Data":"cfa5247a541048b58c8d2d21530e8d9419b18f182a1298c892295378b7357602"} Apr 20 20:35:21.346508 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:21.346461 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8" Apr 20 20:35:21.473750 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:21.473712 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8"] Apr 20 20:35:21.476564 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:35:21.476533 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd712806c_4d72_49b1_a918_2a2aff1ec86f.slice/crio-2b2394c7ae75ba8a75fd581af4e3b0766a5db9b3009b39943a2747c9b74af72a WatchSource:0}: Error finding container 2b2394c7ae75ba8a75fd581af4e3b0766a5db9b3009b39943a2747c9b74af72a: Status 404 returned error can't find the container with id 2b2394c7ae75ba8a75fd581af4e3b0766a5db9b3009b39943a2747c9b74af72a Apr 20 20:35:22.249172 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:22.249135 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8" event={"ID":"d712806c-4d72-49b1-a918-2a2aff1ec86f","Type":"ContainerStarted","Data":"cb266e1cee2c7aa3a42cd74ab2a364ade5c9a69835fcd8d1d157a773e22fdaa3"} Apr 20 20:35:22.249172 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:22.249177 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8" event={"ID":"d712806c-4d72-49b1-a918-2a2aff1ec86f","Type":"ContainerStarted","Data":"01310f7c51a109b165ef8fca8fab385b23ff31eb7740037b05059ecfae1f3b24"} Apr 20 20:35:22.249425 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:22.249196 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8" event={"ID":"d712806c-4d72-49b1-a918-2a2aff1ec86f","Type":"ContainerStarted","Data":"2b2394c7ae75ba8a75fd581af4e3b0766a5db9b3009b39943a2747c9b74af72a"} Apr 20 20:35:22.249425 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:22.249412 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8" Apr 20 20:35:22.249569 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:22.249552 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8" Apr 20 20:35:22.250787 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:22.250765 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8" podUID="d712806c-4d72-49b1-a918-2a2aff1ec86f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 20 20:35:22.265659 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:22.265619 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8" podStartSLOduration=2.265606223 podStartE2EDuration="2.265606223s" podCreationTimestamp="2026-04-20 20:35:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:35:22.264458143 +0000 UTC m=+1804.679750884" watchObservedRunningTime="2026-04-20 20:35:22.265606223 +0000 UTC m=+1804.680898964" Apr 20 20:35:23.252212 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:23.252176 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8" podUID="d712806c-4d72-49b1-a918-2a2aff1ec86f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 20 20:35:23.534244 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:23.534215 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp" Apr 20 20:35:23.694103 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:23.694066 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d52aa60-7277-4328-b3f1-c93fbd5c55a8-proxy-tls\") pod \"9d52aa60-7277-4328-b3f1-c93fbd5c55a8\" (UID: \"9d52aa60-7277-4328-b3f1-c93fbd5c55a8\") " Apr 20 20:35:23.694103 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:23.694107 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-fe410-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9d52aa60-7277-4328-b3f1-c93fbd5c55a8-error-404-isvc-fe410-kube-rbac-proxy-sar-config\") pod \"9d52aa60-7277-4328-b3f1-c93fbd5c55a8\" (UID: \"9d52aa60-7277-4328-b3f1-c93fbd5c55a8\") " Apr 20 20:35:23.694336 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:23.694165 2573 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-ltlqf\" (UniqueName: \"kubernetes.io/projected/9d52aa60-7277-4328-b3f1-c93fbd5c55a8-kube-api-access-ltlqf\") pod \"9d52aa60-7277-4328-b3f1-c93fbd5c55a8\" (UID: \"9d52aa60-7277-4328-b3f1-c93fbd5c55a8\") " Apr 20 20:35:23.694505 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:23.694475 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d52aa60-7277-4328-b3f1-c93fbd5c55a8-error-404-isvc-fe410-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-fe410-kube-rbac-proxy-sar-config") pod "9d52aa60-7277-4328-b3f1-c93fbd5c55a8" (UID: "9d52aa60-7277-4328-b3f1-c93fbd5c55a8"). InnerVolumeSpecName "error-404-isvc-fe410-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:35:23.696145 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:23.696124 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d52aa60-7277-4328-b3f1-c93fbd5c55a8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9d52aa60-7277-4328-b3f1-c93fbd5c55a8" (UID: "9d52aa60-7277-4328-b3f1-c93fbd5c55a8"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:35:23.696335 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:23.696310 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d52aa60-7277-4328-b3f1-c93fbd5c55a8-kube-api-access-ltlqf" (OuterVolumeSpecName: "kube-api-access-ltlqf") pod "9d52aa60-7277-4328-b3f1-c93fbd5c55a8" (UID: "9d52aa60-7277-4328-b3f1-c93fbd5c55a8"). InnerVolumeSpecName "kube-api-access-ltlqf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:35:23.794983 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:23.794946 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ltlqf\" (UniqueName: \"kubernetes.io/projected/9d52aa60-7277-4328-b3f1-c93fbd5c55a8-kube-api-access-ltlqf\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:35:23.794983 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:23.794979 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d52aa60-7277-4328-b3f1-c93fbd5c55a8-proxy-tls\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:35:23.795179 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:23.794994 2573 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-fe410-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9d52aa60-7277-4328-b3f1-c93fbd5c55a8-error-404-isvc-fe410-kube-rbac-proxy-sar-config\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:35:24.256126 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:24.256092 2573 generic.go:358] "Generic (PLEG): container finished" podID="9d52aa60-7277-4328-b3f1-c93fbd5c55a8" containerID="bcc11de94a51e69f700153890469943eb7226198adc3bab5541891e7c6b3fbcb" exitCode=0 Apr 20 20:35:24.256528 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:24.256153 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp" event={"ID":"9d52aa60-7277-4328-b3f1-c93fbd5c55a8","Type":"ContainerDied","Data":"bcc11de94a51e69f700153890469943eb7226198adc3bab5541891e7c6b3fbcb"} Apr 20 20:35:24.256528 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:24.256175 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp" Apr 20 20:35:24.256528 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:24.256186 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp" event={"ID":"9d52aa60-7277-4328-b3f1-c93fbd5c55a8","Type":"ContainerDied","Data":"eeaecb27ee7931f9e0ab06284101ffd574d3d64e35685e28b2059b9ccffe080e"} Apr 20 20:35:24.256528 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:24.256206 2573 scope.go:117] "RemoveContainer" containerID="cfa5247a541048b58c8d2d21530e8d9419b18f182a1298c892295378b7357602" Apr 20 20:35:24.264684 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:24.264662 2573 scope.go:117] "RemoveContainer" containerID="bcc11de94a51e69f700153890469943eb7226198adc3bab5541891e7c6b3fbcb" Apr 20 20:35:24.271963 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:24.271832 2573 scope.go:117] "RemoveContainer" containerID="cfa5247a541048b58c8d2d21530e8d9419b18f182a1298c892295378b7357602" Apr 20 20:35:24.272407 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:35:24.272377 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfa5247a541048b58c8d2d21530e8d9419b18f182a1298c892295378b7357602\": container with ID starting with cfa5247a541048b58c8d2d21530e8d9419b18f182a1298c892295378b7357602 not found: ID does not exist" containerID="cfa5247a541048b58c8d2d21530e8d9419b18f182a1298c892295378b7357602" Apr 20 20:35:24.272514 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:24.272420 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfa5247a541048b58c8d2d21530e8d9419b18f182a1298c892295378b7357602"} err="failed to get container status \"cfa5247a541048b58c8d2d21530e8d9419b18f182a1298c892295378b7357602\": rpc error: code = NotFound desc = could not find container 
\"cfa5247a541048b58c8d2d21530e8d9419b18f182a1298c892295378b7357602\": container with ID starting with cfa5247a541048b58c8d2d21530e8d9419b18f182a1298c892295378b7357602 not found: ID does not exist" Apr 20 20:35:24.272514 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:24.272444 2573 scope.go:117] "RemoveContainer" containerID="bcc11de94a51e69f700153890469943eb7226198adc3bab5541891e7c6b3fbcb" Apr 20 20:35:24.272740 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:35:24.272715 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcc11de94a51e69f700153890469943eb7226198adc3bab5541891e7c6b3fbcb\": container with ID starting with bcc11de94a51e69f700153890469943eb7226198adc3bab5541891e7c6b3fbcb not found: ID does not exist" containerID="bcc11de94a51e69f700153890469943eb7226198adc3bab5541891e7c6b3fbcb" Apr 20 20:35:24.272831 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:24.272748 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcc11de94a51e69f700153890469943eb7226198adc3bab5541891e7c6b3fbcb"} err="failed to get container status \"bcc11de94a51e69f700153890469943eb7226198adc3bab5541891e7c6b3fbcb\": rpc error: code = NotFound desc = could not find container \"bcc11de94a51e69f700153890469943eb7226198adc3bab5541891e7c6b3fbcb\": container with ID starting with bcc11de94a51e69f700153890469943eb7226198adc3bab5541891e7c6b3fbcb not found: ID does not exist" Apr 20 20:35:24.274415 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:24.274395 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp"] Apr 20 20:35:24.278130 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:24.278109 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-fe410-predictor-6b76d4c7f7-bwcdp"] Apr 20 20:35:26.173222 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:26.173176 2573 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d52aa60-7277-4328-b3f1-c93fbd5c55a8" path="/var/lib/kubelet/pods/9d52aa60-7277-4328-b3f1-c93fbd5c55a8/volumes" Apr 20 20:35:28.257454 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:28.257424 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8" Apr 20 20:35:28.258074 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:28.257973 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8" podUID="d712806c-4d72-49b1-a918-2a2aff1ec86f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 20 20:35:29.119024 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:29.118995 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" Apr 20 20:35:38.258376 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:38.258337 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8" podUID="d712806c-4d72-49b1-a918-2a2aff1ec86f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 20 20:35:48.258766 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:48.258718 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8" podUID="d712806c-4d72-49b1-a918-2a2aff1ec86f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 20 20:35:50.745486 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:50.745402 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g"] Apr 20 
20:35:50.745922 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:50.745767 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" podUID="bf9f6842-25d2-484c-8a66-9010dadf1cb9" containerName="kserve-container" containerID="cri-o://24fd6207382ad6cbb70d14c777e471766331ac6599a450f17f65d6cfc63d61ee" gracePeriod=30 Apr 20 20:35:50.745922 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:50.745835 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" podUID="bf9f6842-25d2-484c-8a66-9010dadf1cb9" containerName="kube-rbac-proxy" containerID="cri-o://cbd8abf5396a8b960686b90d9edc683b628a3557b57f7badeaf85bd3c10c04e4" gracePeriod=30 Apr 20 20:35:50.921266 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:50.921232 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv"] Apr 20 20:35:50.921604 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:50.921592 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d52aa60-7277-4328-b3f1-c93fbd5c55a8" containerName="kube-rbac-proxy" Apr 20 20:35:50.921691 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:50.921605 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d52aa60-7277-4328-b3f1-c93fbd5c55a8" containerName="kube-rbac-proxy" Apr 20 20:35:50.921691 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:50.921627 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d52aa60-7277-4328-b3f1-c93fbd5c55a8" containerName="kserve-container" Apr 20 20:35:50.921691 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:50.921633 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d52aa60-7277-4328-b3f1-c93fbd5c55a8" containerName="kserve-container" Apr 20 20:35:50.921691 ip-10-0-130-227 kubenswrapper[2573]: I0420 
20:35:50.921691 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d52aa60-7277-4328-b3f1-c93fbd5c55a8" containerName="kube-rbac-proxy" Apr 20 20:35:50.921831 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:50.921705 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d52aa60-7277-4328-b3f1-c93fbd5c55a8" containerName="kserve-container" Apr 20 20:35:50.924807 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:50.924790 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv" Apr 20 20:35:50.927275 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:50.927255 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-57557-predictor-serving-cert\"" Apr 20 20:35:50.927275 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:50.927264 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"error-404-isvc-57557-kube-rbac-proxy-sar-config\"" Apr 20 20:35:50.932237 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:50.932198 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv"] Apr 20 20:35:51.036261 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:51.036227 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3e37579e-e9e4-4880-b2c5-355442b475b1-proxy-tls\") pod \"error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv\" (UID: \"3e37579e-e9e4-4880-b2c5-355442b475b1\") " pod="kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv" Apr 20 20:35:51.036468 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:51.036274 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"error-404-isvc-57557-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/3e37579e-e9e4-4880-b2c5-355442b475b1-error-404-isvc-57557-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv\" (UID: \"3e37579e-e9e4-4880-b2c5-355442b475b1\") " pod="kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv" Apr 20 20:35:51.036468 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:51.036324 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nmtw\" (UniqueName: \"kubernetes.io/projected/3e37579e-e9e4-4880-b2c5-355442b475b1-kube-api-access-2nmtw\") pod \"error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv\" (UID: \"3e37579e-e9e4-4880-b2c5-355442b475b1\") " pod="kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv" Apr 20 20:35:51.137481 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:51.137452 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3e37579e-e9e4-4880-b2c5-355442b475b1-proxy-tls\") pod \"error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv\" (UID: \"3e37579e-e9e4-4880-b2c5-355442b475b1\") " pod="kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv" Apr 20 20:35:51.137682 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:51.137494 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"error-404-isvc-57557-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3e37579e-e9e4-4880-b2c5-355442b475b1-error-404-isvc-57557-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv\" (UID: \"3e37579e-e9e4-4880-b2c5-355442b475b1\") " pod="kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv" Apr 20 20:35:51.137682 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:51.137583 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2nmtw\" (UniqueName: 
\"kubernetes.io/projected/3e37579e-e9e4-4880-b2c5-355442b475b1-kube-api-access-2nmtw\") pod \"error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv\" (UID: \"3e37579e-e9e4-4880-b2c5-355442b475b1\") " pod="kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv" Apr 20 20:35:51.138187 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:51.138162 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"error-404-isvc-57557-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3e37579e-e9e4-4880-b2c5-355442b475b1-error-404-isvc-57557-kube-rbac-proxy-sar-config\") pod \"error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv\" (UID: \"3e37579e-e9e4-4880-b2c5-355442b475b1\") " pod="kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv" Apr 20 20:35:51.140052 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:51.140033 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3e37579e-e9e4-4880-b2c5-355442b475b1-proxy-tls\") pod \"error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv\" (UID: \"3e37579e-e9e4-4880-b2c5-355442b475b1\") " pod="kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv" Apr 20 20:35:51.145319 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:51.145299 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nmtw\" (UniqueName: \"kubernetes.io/projected/3e37579e-e9e4-4880-b2c5-355442b475b1-kube-api-access-2nmtw\") pod \"error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv\" (UID: \"3e37579e-e9e4-4880-b2c5-355442b475b1\") " pod="kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv" Apr 20 20:35:51.237365 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:51.237312 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv" Apr 20 20:35:51.360412 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:51.360386 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv"] Apr 20 20:35:51.362314 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:35:51.362283 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e37579e_e9e4_4880_b2c5_355442b475b1.slice/crio-ade2460069cd9c2cbbbf3d7ea8ceb2c830e947821da5db45a6694db47deb2f40 WatchSource:0}: Error finding container ade2460069cd9c2cbbbf3d7ea8ceb2c830e947821da5db45a6694db47deb2f40: Status 404 returned error can't find the container with id ade2460069cd9c2cbbbf3d7ea8ceb2c830e947821da5db45a6694db47deb2f40 Apr 20 20:35:51.362450 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:51.362307 2573 generic.go:358] "Generic (PLEG): container finished" podID="bf9f6842-25d2-484c-8a66-9010dadf1cb9" containerID="cbd8abf5396a8b960686b90d9edc683b628a3557b57f7badeaf85bd3c10c04e4" exitCode=2 Apr 20 20:35:51.362450 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:51.362377 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" event={"ID":"bf9f6842-25d2-484c-8a66-9010dadf1cb9","Type":"ContainerDied","Data":"cbd8abf5396a8b960686b90d9edc683b628a3557b57f7badeaf85bd3c10c04e4"} Apr 20 20:35:52.367085 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:52.367047 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv" event={"ID":"3e37579e-e9e4-4880-b2c5-355442b475b1","Type":"ContainerStarted","Data":"d15bb897a6772c6ca9ad11cd3a2fd5641b99e8a265597925fca63364623ee8c7"} Apr 20 20:35:52.367085 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:52.367088 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv" event={"ID":"3e37579e-e9e4-4880-b2c5-355442b475b1","Type":"ContainerStarted","Data":"51749c24c7b2f7514107175adb124b21e6b0b2de62f6ebf684007a8d892703ca"} Apr 20 20:35:52.367511 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:52.367098 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv" event={"ID":"3e37579e-e9e4-4880-b2c5-355442b475b1","Type":"ContainerStarted","Data":"ade2460069cd9c2cbbbf3d7ea8ceb2c830e947821da5db45a6694db47deb2f40"} Apr 20 20:35:52.367511 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:52.367130 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv" Apr 20 20:35:52.384302 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:52.384245 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv" podStartSLOduration=2.384231718 podStartE2EDuration="2.384231718s" podCreationTimestamp="2026-04-20 20:35:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:35:52.382764316 +0000 UTC m=+1834.798057051" watchObservedRunningTime="2026-04-20 20:35:52.384231718 +0000 UTC m=+1834.799524457" Apr 20 20:35:53.370517 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:53.370482 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv" Apr 20 20:35:53.371873 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:53.371830 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv" podUID="3e37579e-e9e4-4880-b2c5-355442b475b1" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.133.0.42:8080: connect: connection refused" Apr 20 20:35:54.113729 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:54.113690 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" podUID="bf9f6842-25d2-484c-8a66-9010dadf1cb9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.40:8643/healthz\": dial tcp 10.133.0.40:8643: connect: connection refused" Apr 20 20:35:54.375663 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:54.375628 2573 generic.go:358] "Generic (PLEG): container finished" podID="bf9f6842-25d2-484c-8a66-9010dadf1cb9" containerID="24fd6207382ad6cbb70d14c777e471766331ac6599a450f17f65d6cfc63d61ee" exitCode=0 Apr 20 20:35:54.376024 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:54.375698 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" event={"ID":"bf9f6842-25d2-484c-8a66-9010dadf1cb9","Type":"ContainerDied","Data":"24fd6207382ad6cbb70d14c777e471766331ac6599a450f17f65d6cfc63d61ee"} Apr 20 20:35:54.376155 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:54.376131 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv" podUID="3e37579e-e9e4-4880-b2c5-355442b475b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 20 20:35:54.392469 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:54.392441 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" Apr 20 20:35:54.470390 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:54.470353 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-ab6b8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bf9f6842-25d2-484c-8a66-9010dadf1cb9-error-404-isvc-ab6b8-kube-rbac-proxy-sar-config\") pod \"bf9f6842-25d2-484c-8a66-9010dadf1cb9\" (UID: \"bf9f6842-25d2-484c-8a66-9010dadf1cb9\") " Apr 20 20:35:54.470390 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:54.470396 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf9f6842-25d2-484c-8a66-9010dadf1cb9-proxy-tls\") pod \"bf9f6842-25d2-484c-8a66-9010dadf1cb9\" (UID: \"bf9f6842-25d2-484c-8a66-9010dadf1cb9\") " Apr 20 20:35:54.470590 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:54.470448 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl9m4\" (UniqueName: \"kubernetes.io/projected/bf9f6842-25d2-484c-8a66-9010dadf1cb9-kube-api-access-xl9m4\") pod \"bf9f6842-25d2-484c-8a66-9010dadf1cb9\" (UID: \"bf9f6842-25d2-484c-8a66-9010dadf1cb9\") " Apr 20 20:35:54.470762 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:54.470734 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf9f6842-25d2-484c-8a66-9010dadf1cb9-error-404-isvc-ab6b8-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-ab6b8-kube-rbac-proxy-sar-config") pod "bf9f6842-25d2-484c-8a66-9010dadf1cb9" (UID: "bf9f6842-25d2-484c-8a66-9010dadf1cb9"). InnerVolumeSpecName "error-404-isvc-ab6b8-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:35:54.472449 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:54.472425 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf9f6842-25d2-484c-8a66-9010dadf1cb9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "bf9f6842-25d2-484c-8a66-9010dadf1cb9" (UID: "bf9f6842-25d2-484c-8a66-9010dadf1cb9"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:35:54.472543 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:54.472511 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf9f6842-25d2-484c-8a66-9010dadf1cb9-kube-api-access-xl9m4" (OuterVolumeSpecName: "kube-api-access-xl9m4") pod "bf9f6842-25d2-484c-8a66-9010dadf1cb9" (UID: "bf9f6842-25d2-484c-8a66-9010dadf1cb9"). InnerVolumeSpecName "kube-api-access-xl9m4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:35:54.572021 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:54.571974 2573 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-ab6b8-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bf9f6842-25d2-484c-8a66-9010dadf1cb9-error-404-isvc-ab6b8-kube-rbac-proxy-sar-config\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:35:54.572021 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:54.572014 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bf9f6842-25d2-484c-8a66-9010dadf1cb9-proxy-tls\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:35:54.572021 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:54.572025 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xl9m4\" (UniqueName: \"kubernetes.io/projected/bf9f6842-25d2-484c-8a66-9010dadf1cb9-kube-api-access-xl9m4\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 
20:35:55.380375 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:55.380335 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" event={"ID":"bf9f6842-25d2-484c-8a66-9010dadf1cb9","Type":"ContainerDied","Data":"56dc015684510dc0f8216a82b88808fab9aa4454e03ba31dfe22634710c62db5"} Apr 20 20:35:55.380375 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:55.380382 2573 scope.go:117] "RemoveContainer" containerID="cbd8abf5396a8b960686b90d9edc683b628a3557b57f7badeaf85bd3c10c04e4" Apr 20 20:35:55.380927 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:55.380387 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g" Apr 20 20:35:55.389039 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:55.389021 2573 scope.go:117] "RemoveContainer" containerID="24fd6207382ad6cbb70d14c777e471766331ac6599a450f17f65d6cfc63d61ee" Apr 20 20:35:55.402222 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:55.402195 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g"] Apr 20 20:35:55.405020 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:55.404985 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-ab6b8-predictor-7fb5d66599-zms7g"] Apr 20 20:35:56.171378 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:56.171336 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf9f6842-25d2-484c-8a66-9010dadf1cb9" path="/var/lib/kubelet/pods/bf9f6842-25d2-484c-8a66-9010dadf1cb9/volumes" Apr 20 20:35:58.258506 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:58.258466 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8" podUID="d712806c-4d72-49b1-a918-2a2aff1ec86f" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 20 20:35:59.380078 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:59.380046 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv" Apr 20 20:35:59.380551 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:35:59.380524 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv" podUID="3e37579e-e9e4-4880-b2c5-355442b475b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 20 20:36:08.259022 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:36:08.258988 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8" Apr 20 20:36:09.381346 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:36:09.381310 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv" podUID="3e37579e-e9e4-4880-b2c5-355442b475b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 20 20:36:19.380866 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:36:19.380820 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv" podUID="3e37579e-e9e4-4880-b2c5-355442b475b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 20 20:36:29.381528 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:36:29.381440 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv" podUID="3e37579e-e9e4-4880-b2c5-355442b475b1" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.133.0.42:8080: connect: connection refused" Apr 20 20:36:39.382041 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:36:39.382009 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv" Apr 20 20:40:18.221306 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:40:18.221206 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ks7s9_54652553-5a61-482e-b688-ce64c57b917b/ovn-acl-logging/0.log" Apr 20 20:40:18.228063 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:40:18.228038 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ks7s9_54652553-5a61-482e-b688-ce64c57b917b/ovn-acl-logging/0.log" Apr 20 20:45:05.602894 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:45:05.602846 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv"] Apr 20 20:45:05.603412 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:45:05.603175 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv" podUID="3e37579e-e9e4-4880-b2c5-355442b475b1" containerName="kserve-container" containerID="cri-o://51749c24c7b2f7514107175adb124b21e6b0b2de62f6ebf684007a8d892703ca" gracePeriod=30 Apr 20 20:45:05.603412 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:45:05.603340 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv" podUID="3e37579e-e9e4-4880-b2c5-355442b475b1" containerName="kube-rbac-proxy" containerID="cri-o://d15bb897a6772c6ca9ad11cd3a2fd5641b99e8a265597925fca63364623ee8c7" gracePeriod=30 Apr 20 20:45:06.251201 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:45:06.251169 2573 generic.go:358] "Generic (PLEG): container finished" podID="3e37579e-e9e4-4880-b2c5-355442b475b1" 
containerID="d15bb897a6772c6ca9ad11cd3a2fd5641b99e8a265597925fca63364623ee8c7" exitCode=2 Apr 20 20:45:06.251368 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:45:06.251215 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv" event={"ID":"3e37579e-e9e4-4880-b2c5-355442b475b1","Type":"ContainerDied","Data":"d15bb897a6772c6ca9ad11cd3a2fd5641b99e8a265597925fca63364623ee8c7"} Apr 20 20:45:08.654751 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:45:08.654723 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv" Apr 20 20:45:08.752728 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:45:08.752658 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nmtw\" (UniqueName: \"kubernetes.io/projected/3e37579e-e9e4-4880-b2c5-355442b475b1-kube-api-access-2nmtw\") pod \"3e37579e-e9e4-4880-b2c5-355442b475b1\" (UID: \"3e37579e-e9e4-4880-b2c5-355442b475b1\") " Apr 20 20:45:08.752728 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:45:08.752701 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3e37579e-e9e4-4880-b2c5-355442b475b1-proxy-tls\") pod \"3e37579e-e9e4-4880-b2c5-355442b475b1\" (UID: \"3e37579e-e9e4-4880-b2c5-355442b475b1\") " Apr 20 20:45:08.752945 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:45:08.752744 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-57557-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3e37579e-e9e4-4880-b2c5-355442b475b1-error-404-isvc-57557-kube-rbac-proxy-sar-config\") pod \"3e37579e-e9e4-4880-b2c5-355442b475b1\" (UID: \"3e37579e-e9e4-4880-b2c5-355442b475b1\") " Apr 20 20:45:08.753171 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:45:08.753142 2573 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e37579e-e9e4-4880-b2c5-355442b475b1-error-404-isvc-57557-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-57557-kube-rbac-proxy-sar-config") pod "3e37579e-e9e4-4880-b2c5-355442b475b1" (UID: "3e37579e-e9e4-4880-b2c5-355442b475b1"). InnerVolumeSpecName "error-404-isvc-57557-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:45:08.754760 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:45:08.754736 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e37579e-e9e4-4880-b2c5-355442b475b1-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3e37579e-e9e4-4880-b2c5-355442b475b1" (UID: "3e37579e-e9e4-4880-b2c5-355442b475b1"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:45:08.754977 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:45:08.754958 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e37579e-e9e4-4880-b2c5-355442b475b1-kube-api-access-2nmtw" (OuterVolumeSpecName: "kube-api-access-2nmtw") pod "3e37579e-e9e4-4880-b2c5-355442b475b1" (UID: "3e37579e-e9e4-4880-b2c5-355442b475b1"). InnerVolumeSpecName "kube-api-access-2nmtw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:45:08.853510 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:45:08.853479 2573 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-57557-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3e37579e-e9e4-4880-b2c5-355442b475b1-error-404-isvc-57557-kube-rbac-proxy-sar-config\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:45:08.853510 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:45:08.853505 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2nmtw\" (UniqueName: \"kubernetes.io/projected/3e37579e-e9e4-4880-b2c5-355442b475b1-kube-api-access-2nmtw\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:45:08.853510 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:45:08.853516 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3e37579e-e9e4-4880-b2c5-355442b475b1-proxy-tls\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:45:09.262227 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:45:09.262191 2573 generic.go:358] "Generic (PLEG): container finished" podID="3e37579e-e9e4-4880-b2c5-355442b475b1" containerID="51749c24c7b2f7514107175adb124b21e6b0b2de62f6ebf684007a8d892703ca" exitCode=0 Apr 20 20:45:09.262391 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:45:09.262277 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv" Apr 20 20:45:09.262391 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:45:09.262274 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv" event={"ID":"3e37579e-e9e4-4880-b2c5-355442b475b1","Type":"ContainerDied","Data":"51749c24c7b2f7514107175adb124b21e6b0b2de62f6ebf684007a8d892703ca"} Apr 20 20:45:09.262391 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:45:09.262382 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv" event={"ID":"3e37579e-e9e4-4880-b2c5-355442b475b1","Type":"ContainerDied","Data":"ade2460069cd9c2cbbbf3d7ea8ceb2c830e947821da5db45a6694db47deb2f40"} Apr 20 20:45:09.262498 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:45:09.262398 2573 scope.go:117] "RemoveContainer" containerID="d15bb897a6772c6ca9ad11cd3a2fd5641b99e8a265597925fca63364623ee8c7" Apr 20 20:45:09.271319 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:45:09.271300 2573 scope.go:117] "RemoveContainer" containerID="51749c24c7b2f7514107175adb124b21e6b0b2de62f6ebf684007a8d892703ca" Apr 20 20:45:09.278345 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:45:09.278329 2573 scope.go:117] "RemoveContainer" containerID="d15bb897a6772c6ca9ad11cd3a2fd5641b99e8a265597925fca63364623ee8c7" Apr 20 20:45:09.278589 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:45:09.278567 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d15bb897a6772c6ca9ad11cd3a2fd5641b99e8a265597925fca63364623ee8c7\": container with ID starting with d15bb897a6772c6ca9ad11cd3a2fd5641b99e8a265597925fca63364623ee8c7 not found: ID does not exist" containerID="d15bb897a6772c6ca9ad11cd3a2fd5641b99e8a265597925fca63364623ee8c7" Apr 20 20:45:09.278641 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:45:09.278596 2573 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d15bb897a6772c6ca9ad11cd3a2fd5641b99e8a265597925fca63364623ee8c7"} err="failed to get container status \"d15bb897a6772c6ca9ad11cd3a2fd5641b99e8a265597925fca63364623ee8c7\": rpc error: code = NotFound desc = could not find container \"d15bb897a6772c6ca9ad11cd3a2fd5641b99e8a265597925fca63364623ee8c7\": container with ID starting with d15bb897a6772c6ca9ad11cd3a2fd5641b99e8a265597925fca63364623ee8c7 not found: ID does not exist" Apr 20 20:45:09.278641 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:45:09.278615 2573 scope.go:117] "RemoveContainer" containerID="51749c24c7b2f7514107175adb124b21e6b0b2de62f6ebf684007a8d892703ca" Apr 20 20:45:09.278836 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:45:09.278818 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51749c24c7b2f7514107175adb124b21e6b0b2de62f6ebf684007a8d892703ca\": container with ID starting with 51749c24c7b2f7514107175adb124b21e6b0b2de62f6ebf684007a8d892703ca not found: ID does not exist" containerID="51749c24c7b2f7514107175adb124b21e6b0b2de62f6ebf684007a8d892703ca" Apr 20 20:45:09.278925 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:45:09.278844 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51749c24c7b2f7514107175adb124b21e6b0b2de62f6ebf684007a8d892703ca"} err="failed to get container status \"51749c24c7b2f7514107175adb124b21e6b0b2de62f6ebf684007a8d892703ca\": rpc error: code = NotFound desc = could not find container \"51749c24c7b2f7514107175adb124b21e6b0b2de62f6ebf684007a8d892703ca\": container with ID starting with 51749c24c7b2f7514107175adb124b21e6b0b2de62f6ebf684007a8d892703ca not found: ID does not exist" Apr 20 20:45:09.282442 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:45:09.282418 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv"] Apr 20 20:45:09.286101 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:45:09.286079 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-57557-predictor-7b7cb58b6c-hbbsv"] Apr 20 20:45:10.171294 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:45:10.171264 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e37579e-e9e4-4880-b2c5-355442b475b1" path="/var/lib/kubelet/pods/3e37579e-e9e4-4880-b2c5-355442b475b1/volumes" Apr 20 20:45:18.246899 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:45:18.246788 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ks7s9_54652553-5a61-482e-b688-ce64c57b917b/ovn-acl-logging/0.log" Apr 20 20:45:18.262804 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:45:18.255281 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ks7s9_54652553-5a61-482e-b688-ce64c57b917b/ovn-acl-logging/0.log" Apr 20 20:47:53.619706 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:47:53.619649 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-227.ec2.internal" podUID="1e12d5376c1eb684628dd9cf17dac4a9" containerName="haproxy" probeResult="failure" output="Get \"https://172.20.0.1:6443/version\": context deadline exceeded" Apr 20 20:50:18.270052 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:50:18.269943 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ks7s9_54652553-5a61-482e-b688-ce64c57b917b/ovn-acl-logging/0.log" Apr 20 20:50:18.282305 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:50:18.282284 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ks7s9_54652553-5a61-482e-b688-ce64c57b917b/ovn-acl-logging/0.log" Apr 20 20:52:40.086414 ip-10-0-130-227 kubenswrapper[2573]: I0420 
20:52:40.086382 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8"] Apr 20 20:52:40.086964 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:52:40.086647 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8" podUID="d712806c-4d72-49b1-a918-2a2aff1ec86f" containerName="kserve-container" containerID="cri-o://01310f7c51a109b165ef8fca8fab385b23ff31eb7740037b05059ecfae1f3b24" gracePeriod=30 Apr 20 20:52:40.086964 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:52:40.086715 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8" podUID="d712806c-4d72-49b1-a918-2a2aff1ec86f" containerName="kube-rbac-proxy" containerID="cri-o://cb266e1cee2c7aa3a42cd74ab2a364ade5c9a69835fcd8d1d157a773e22fdaa3" gracePeriod=30 Apr 20 20:52:40.761611 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:52:40.761574 2573 generic.go:358] "Generic (PLEG): container finished" podID="d712806c-4d72-49b1-a918-2a2aff1ec86f" containerID="cb266e1cee2c7aa3a42cd74ab2a364ade5c9a69835fcd8d1d157a773e22fdaa3" exitCode=2 Apr 20 20:52:40.761787 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:52:40.761639 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8" event={"ID":"d712806c-4d72-49b1-a918-2a2aff1ec86f","Type":"ContainerDied","Data":"cb266e1cee2c7aa3a42cd74ab2a364ade5c9a69835fcd8d1d157a773e22fdaa3"} Apr 20 20:52:43.141787 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:52:43.141765 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8" Apr 20 20:52:43.250522 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:52:43.250455 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"error-404-isvc-60a5c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d712806c-4d72-49b1-a918-2a2aff1ec86f-error-404-isvc-60a5c-kube-rbac-proxy-sar-config\") pod \"d712806c-4d72-49b1-a918-2a2aff1ec86f\" (UID: \"d712806c-4d72-49b1-a918-2a2aff1ec86f\") " Apr 20 20:52:43.250522 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:52:43.250513 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtqfr\" (UniqueName: \"kubernetes.io/projected/d712806c-4d72-49b1-a918-2a2aff1ec86f-kube-api-access-vtqfr\") pod \"d712806c-4d72-49b1-a918-2a2aff1ec86f\" (UID: \"d712806c-4d72-49b1-a918-2a2aff1ec86f\") " Apr 20 20:52:43.250683 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:52:43.250550 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d712806c-4d72-49b1-a918-2a2aff1ec86f-proxy-tls\") pod \"d712806c-4d72-49b1-a918-2a2aff1ec86f\" (UID: \"d712806c-4d72-49b1-a918-2a2aff1ec86f\") " Apr 20 20:52:43.250876 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:52:43.250836 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d712806c-4d72-49b1-a918-2a2aff1ec86f-error-404-isvc-60a5c-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "error-404-isvc-60a5c-kube-rbac-proxy-sar-config") pod "d712806c-4d72-49b1-a918-2a2aff1ec86f" (UID: "d712806c-4d72-49b1-a918-2a2aff1ec86f"). InnerVolumeSpecName "error-404-isvc-60a5c-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:52:43.252547 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:52:43.252522 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d712806c-4d72-49b1-a918-2a2aff1ec86f-kube-api-access-vtqfr" (OuterVolumeSpecName: "kube-api-access-vtqfr") pod "d712806c-4d72-49b1-a918-2a2aff1ec86f" (UID: "d712806c-4d72-49b1-a918-2a2aff1ec86f"). InnerVolumeSpecName "kube-api-access-vtqfr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:52:43.252634 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:52:43.252584 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d712806c-4d72-49b1-a918-2a2aff1ec86f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d712806c-4d72-49b1-a918-2a2aff1ec86f" (UID: "d712806c-4d72-49b1-a918-2a2aff1ec86f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:52:43.352000 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:52:43.351974 2573 reconciler_common.go:299] "Volume detached for volume \"error-404-isvc-60a5c-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d712806c-4d72-49b1-a918-2a2aff1ec86f-error-404-isvc-60a5c-kube-rbac-proxy-sar-config\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:52:43.352000 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:52:43.351999 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vtqfr\" (UniqueName: \"kubernetes.io/projected/d712806c-4d72-49b1-a918-2a2aff1ec86f-kube-api-access-vtqfr\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 20:52:43.352127 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:52:43.352009 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d712806c-4d72-49b1-a918-2a2aff1ec86f-proxy-tls\") on node \"ip-10-0-130-227.ec2.internal\" DevicePath \"\"" Apr 20 
20:52:43.773412 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:52:43.773377 2573 generic.go:358] "Generic (PLEG): container finished" podID="d712806c-4d72-49b1-a918-2a2aff1ec86f" containerID="01310f7c51a109b165ef8fca8fab385b23ff31eb7740037b05059ecfae1f3b24" exitCode=0 Apr 20 20:52:43.773592 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:52:43.773449 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8" Apr 20 20:52:43.773592 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:52:43.773453 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8" event={"ID":"d712806c-4d72-49b1-a918-2a2aff1ec86f","Type":"ContainerDied","Data":"01310f7c51a109b165ef8fca8fab385b23ff31eb7740037b05059ecfae1f3b24"} Apr 20 20:52:43.773592 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:52:43.773487 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8" event={"ID":"d712806c-4d72-49b1-a918-2a2aff1ec86f","Type":"ContainerDied","Data":"2b2394c7ae75ba8a75fd581af4e3b0766a5db9b3009b39943a2747c9b74af72a"} Apr 20 20:52:43.773592 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:52:43.773503 2573 scope.go:117] "RemoveContainer" containerID="cb266e1cee2c7aa3a42cd74ab2a364ade5c9a69835fcd8d1d157a773e22fdaa3" Apr 20 20:52:43.781958 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:52:43.781941 2573 scope.go:117] "RemoveContainer" containerID="01310f7c51a109b165ef8fca8fab385b23ff31eb7740037b05059ecfae1f3b24" Apr 20 20:52:43.789334 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:52:43.789320 2573 scope.go:117] "RemoveContainer" containerID="cb266e1cee2c7aa3a42cd74ab2a364ade5c9a69835fcd8d1d157a773e22fdaa3" Apr 20 20:52:43.789580 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:52:43.789562 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"cb266e1cee2c7aa3a42cd74ab2a364ade5c9a69835fcd8d1d157a773e22fdaa3\": container with ID starting with cb266e1cee2c7aa3a42cd74ab2a364ade5c9a69835fcd8d1d157a773e22fdaa3 not found: ID does not exist" containerID="cb266e1cee2c7aa3a42cd74ab2a364ade5c9a69835fcd8d1d157a773e22fdaa3" Apr 20 20:52:43.789627 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:52:43.789587 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb266e1cee2c7aa3a42cd74ab2a364ade5c9a69835fcd8d1d157a773e22fdaa3"} err="failed to get container status \"cb266e1cee2c7aa3a42cd74ab2a364ade5c9a69835fcd8d1d157a773e22fdaa3\": rpc error: code = NotFound desc = could not find container \"cb266e1cee2c7aa3a42cd74ab2a364ade5c9a69835fcd8d1d157a773e22fdaa3\": container with ID starting with cb266e1cee2c7aa3a42cd74ab2a364ade5c9a69835fcd8d1d157a773e22fdaa3 not found: ID does not exist" Apr 20 20:52:43.789627 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:52:43.789603 2573 scope.go:117] "RemoveContainer" containerID="01310f7c51a109b165ef8fca8fab385b23ff31eb7740037b05059ecfae1f3b24" Apr 20 20:52:43.789803 ip-10-0-130-227 kubenswrapper[2573]: E0420 20:52:43.789788 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01310f7c51a109b165ef8fca8fab385b23ff31eb7740037b05059ecfae1f3b24\": container with ID starting with 01310f7c51a109b165ef8fca8fab385b23ff31eb7740037b05059ecfae1f3b24 not found: ID does not exist" containerID="01310f7c51a109b165ef8fca8fab385b23ff31eb7740037b05059ecfae1f3b24" Apr 20 20:52:43.789864 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:52:43.789809 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01310f7c51a109b165ef8fca8fab385b23ff31eb7740037b05059ecfae1f3b24"} err="failed to get container status \"01310f7c51a109b165ef8fca8fab385b23ff31eb7740037b05059ecfae1f3b24\": rpc error: code = NotFound desc = could not find 
container \"01310f7c51a109b165ef8fca8fab385b23ff31eb7740037b05059ecfae1f3b24\": container with ID starting with 01310f7c51a109b165ef8fca8fab385b23ff31eb7740037b05059ecfae1f3b24 not found: ID does not exist" Apr 20 20:52:43.794417 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:52:43.794392 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8"] Apr 20 20:52:43.798088 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:52:43.798067 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/error-404-isvc-60a5c-predictor-74955c648c-g4rt8"] Apr 20 20:52:44.171159 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:52:44.171130 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d712806c-4d72-49b1-a918-2a2aff1ec86f" path="/var/lib/kubelet/pods/d712806c-4d72-49b1-a918-2a2aff1ec86f/volumes" Apr 20 20:53:07.930714 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:07.930636 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-k96mb_d1323090-2026-43df-829f-115ae2bc0438/global-pull-secret-syncer/0.log" Apr 20 20:53:08.008100 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:08.008065 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-lvg7k_09577d69-c26a-4291-a3bd-a9d8b123245a/konnectivity-agent/0.log" Apr 20 20:53:08.092645 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:08.092622 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-227.ec2.internal_1e12d5376c1eb684628dd9cf17dac4a9/haproxy/0.log" Apr 20 20:53:11.542825 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:11.542796 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4dfd3363-cf4b-40c1-954f-e78a2323e7be/alertmanager/0.log" Apr 20 20:53:11.565250 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:11.565228 2573 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4dfd3363-cf4b-40c1-954f-e78a2323e7be/config-reloader/0.log" Apr 20 20:53:11.584231 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:11.584207 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4dfd3363-cf4b-40c1-954f-e78a2323e7be/kube-rbac-proxy-web/0.log" Apr 20 20:53:11.603555 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:11.603528 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4dfd3363-cf4b-40c1-954f-e78a2323e7be/kube-rbac-proxy/0.log" Apr 20 20:53:11.626014 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:11.625993 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4dfd3363-cf4b-40c1-954f-e78a2323e7be/kube-rbac-proxy-metric/0.log" Apr 20 20:53:11.646842 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:11.646820 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4dfd3363-cf4b-40c1-954f-e78a2323e7be/prom-label-proxy/0.log" Apr 20 20:53:11.666148 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:11.666127 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4dfd3363-cf4b-40c1-954f-e78a2323e7be/init-config-reloader/0.log" Apr 20 20:53:11.730556 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:11.730525 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-szw4p_b659d2f4-02dc-4e48-8c03-8ad876c8b7d9/kube-state-metrics/0.log" Apr 20 20:53:11.754385 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:11.754353 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-szw4p_b659d2f4-02dc-4e48-8c03-8ad876c8b7d9/kube-rbac-proxy-main/0.log" Apr 20 20:53:11.777388 ip-10-0-130-227 
kubenswrapper[2573]: I0420 20:53:11.777369 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-szw4p_b659d2f4-02dc-4e48-8c03-8ad876c8b7d9/kube-rbac-proxy-self/0.log" Apr 20 20:53:11.866372 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:11.866301 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cxwts_69b6688a-37f6-4688-a5fd-d2ef9e639109/node-exporter/0.log" Apr 20 20:53:11.885954 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:11.885927 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cxwts_69b6688a-37f6-4688-a5fd-d2ef9e639109/kube-rbac-proxy/0.log" Apr 20 20:53:11.904185 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:11.904160 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cxwts_69b6688a-37f6-4688-a5fd-d2ef9e639109/init-textfile/0.log" Apr 20 20:53:12.074678 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:12.074648 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-djt8k_a39b1ed6-ec22-43b1-83cd-adbacbe16fdd/kube-rbac-proxy-main/0.log" Apr 20 20:53:12.094356 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:12.094326 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-djt8k_a39b1ed6-ec22-43b1-83cd-adbacbe16fdd/kube-rbac-proxy-self/0.log" Apr 20 20:53:12.113685 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:12.113655 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-djt8k_a39b1ed6-ec22-43b1-83cd-adbacbe16fdd/openshift-state-metrics/0.log" Apr 20 20:53:12.355420 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:12.355374 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-rc42b_f056ed27-f10a-4fac-bf71-741c0d02ce55/prometheus-operator-admission-webhook/0.log" Apr 20 20:53:13.718203 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:13.718172 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-qv9dl_14f2d422-2dc1-4c64-8da4-e881d350c667/networking-console-plugin/0.log" Apr 20 20:53:14.483380 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:14.483357 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56b9cc76b8-4d2sk_4265a32c-5867-48f2-b24f-1e561f30967e/console/0.log" Apr 20 20:53:14.520328 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:14.520305 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-77c6w_f0e0866b-98b5-43bb-811b-a80d0fe3e428/download-server/0.log" Apr 20 20:53:14.986364 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:14.986335 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2h9np/perf-node-gather-daemonset-78x9j"] Apr 20 20:53:14.986703 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:14.986658 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf9f6842-25d2-484c-8a66-9010dadf1cb9" containerName="kserve-container" Apr 20 20:53:14.986703 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:14.986669 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf9f6842-25d2-484c-8a66-9010dadf1cb9" containerName="kserve-container" Apr 20 20:53:14.986703 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:14.986680 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d712806c-4d72-49b1-a918-2a2aff1ec86f" containerName="kube-rbac-proxy" Apr 20 20:53:14.986703 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:14.986686 2573 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d712806c-4d72-49b1-a918-2a2aff1ec86f" containerName="kube-rbac-proxy" Apr 20 20:53:14.986703 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:14.986694 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf9f6842-25d2-484c-8a66-9010dadf1cb9" containerName="kube-rbac-proxy" Apr 20 20:53:14.986703 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:14.986701 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf9f6842-25d2-484c-8a66-9010dadf1cb9" containerName="kube-rbac-proxy" Apr 20 20:53:14.986923 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:14.986714 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d712806c-4d72-49b1-a918-2a2aff1ec86f" containerName="kserve-container" Apr 20 20:53:14.986923 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:14.986719 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d712806c-4d72-49b1-a918-2a2aff1ec86f" containerName="kserve-container" Apr 20 20:53:14.986923 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:14.986729 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3e37579e-e9e4-4880-b2c5-355442b475b1" containerName="kserve-container" Apr 20 20:53:14.986923 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:14.986734 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e37579e-e9e4-4880-b2c5-355442b475b1" containerName="kserve-container" Apr 20 20:53:14.986923 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:14.986747 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3e37579e-e9e4-4880-b2c5-355442b475b1" containerName="kube-rbac-proxy" Apr 20 20:53:14.986923 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:14.986752 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e37579e-e9e4-4880-b2c5-355442b475b1" containerName="kube-rbac-proxy" Apr 20 20:53:14.986923 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:14.986814 2573 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="bf9f6842-25d2-484c-8a66-9010dadf1cb9" containerName="kube-rbac-proxy" Apr 20 20:53:14.986923 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:14.986822 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d712806c-4d72-49b1-a918-2a2aff1ec86f" containerName="kserve-container" Apr 20 20:53:14.986923 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:14.986829 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="3e37579e-e9e4-4880-b2c5-355442b475b1" containerName="kserve-container" Apr 20 20:53:14.986923 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:14.986836 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="3e37579e-e9e4-4880-b2c5-355442b475b1" containerName="kube-rbac-proxy" Apr 20 20:53:14.986923 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:14.986843 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d712806c-4d72-49b1-a918-2a2aff1ec86f" containerName="kube-rbac-proxy" Apr 20 20:53:14.986923 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:14.986866 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf9f6842-25d2-484c-8a66-9010dadf1cb9" containerName="kserve-container" Apr 20 20:53:14.989813 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:14.989792 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-78x9j" Apr 20 20:53:14.992630 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:14.992608 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-2h9np\"/\"default-dockercfg-9jbc7\"" Apr 20 20:53:14.992630 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:14.992621 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2h9np\"/\"openshift-service-ca.crt\"" Apr 20 20:53:14.993567 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:14.993539 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2h9np\"/\"kube-root-ca.crt\"" Apr 20 20:53:15.000121 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:15.000093 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2h9np/perf-node-gather-daemonset-78x9j"] Apr 20 20:53:15.088962 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:15.088930 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc6rf\" (UniqueName: \"kubernetes.io/projected/cf3dea46-49ce-47b8-b5c3-40d079baed5d-kube-api-access-tc6rf\") pod \"perf-node-gather-daemonset-78x9j\" (UID: \"cf3dea46-49ce-47b8-b5c3-40d079baed5d\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-78x9j" Apr 20 20:53:15.089108 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:15.088967 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cf3dea46-49ce-47b8-b5c3-40d079baed5d-podres\") pod \"perf-node-gather-daemonset-78x9j\" (UID: \"cf3dea46-49ce-47b8-b5c3-40d079baed5d\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-78x9j" Apr 20 20:53:15.089108 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:15.089043 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cf3dea46-49ce-47b8-b5c3-40d079baed5d-lib-modules\") pod \"perf-node-gather-daemonset-78x9j\" (UID: \"cf3dea46-49ce-47b8-b5c3-40d079baed5d\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-78x9j" Apr 20 20:53:15.089186 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:15.089121 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cf3dea46-49ce-47b8-b5c3-40d079baed5d-proc\") pod \"perf-node-gather-daemonset-78x9j\" (UID: \"cf3dea46-49ce-47b8-b5c3-40d079baed5d\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-78x9j" Apr 20 20:53:15.089186 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:15.089144 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cf3dea46-49ce-47b8-b5c3-40d079baed5d-sys\") pod \"perf-node-gather-daemonset-78x9j\" (UID: \"cf3dea46-49ce-47b8-b5c3-40d079baed5d\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-78x9j" Apr 20 20:53:15.189991 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:15.189963 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cf3dea46-49ce-47b8-b5c3-40d079baed5d-proc\") pod \"perf-node-gather-daemonset-78x9j\" (UID: \"cf3dea46-49ce-47b8-b5c3-40d079baed5d\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-78x9j" Apr 20 20:53:15.190106 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:15.189996 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cf3dea46-49ce-47b8-b5c3-40d079baed5d-sys\") pod \"perf-node-gather-daemonset-78x9j\" (UID: \"cf3dea46-49ce-47b8-b5c3-40d079baed5d\") " 
pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-78x9j"
Apr 20 20:53:15.190106 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:15.190021 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tc6rf\" (UniqueName: \"kubernetes.io/projected/cf3dea46-49ce-47b8-b5c3-40d079baed5d-kube-api-access-tc6rf\") pod \"perf-node-gather-daemonset-78x9j\" (UID: \"cf3dea46-49ce-47b8-b5c3-40d079baed5d\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-78x9j"
Apr 20 20:53:15.190106 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:15.190080 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cf3dea46-49ce-47b8-b5c3-40d079baed5d-podres\") pod \"perf-node-gather-daemonset-78x9j\" (UID: \"cf3dea46-49ce-47b8-b5c3-40d079baed5d\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-78x9j"
Apr 20 20:53:15.190227 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:15.190152 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cf3dea46-49ce-47b8-b5c3-40d079baed5d-lib-modules\") pod \"perf-node-gather-daemonset-78x9j\" (UID: \"cf3dea46-49ce-47b8-b5c3-40d079baed5d\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-78x9j"
Apr 20 20:53:15.190227 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:15.190089 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cf3dea46-49ce-47b8-b5c3-40d079baed5d-proc\") pod \"perf-node-gather-daemonset-78x9j\" (UID: \"cf3dea46-49ce-47b8-b5c3-40d079baed5d\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-78x9j"
Apr 20 20:53:15.190227 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:15.190190 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cf3dea46-49ce-47b8-b5c3-40d079baed5d-podres\") pod \"perf-node-gather-daemonset-78x9j\" (UID: \"cf3dea46-49ce-47b8-b5c3-40d079baed5d\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-78x9j"
Apr 20 20:53:15.190334 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:15.190090 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cf3dea46-49ce-47b8-b5c3-40d079baed5d-sys\") pod \"perf-node-gather-daemonset-78x9j\" (UID: \"cf3dea46-49ce-47b8-b5c3-40d079baed5d\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-78x9j"
Apr 20 20:53:15.190334 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:15.190271 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cf3dea46-49ce-47b8-b5c3-40d079baed5d-lib-modules\") pod \"perf-node-gather-daemonset-78x9j\" (UID: \"cf3dea46-49ce-47b8-b5c3-40d079baed5d\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-78x9j"
Apr 20 20:53:15.198420 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:15.198395 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc6rf\" (UniqueName: \"kubernetes.io/projected/cf3dea46-49ce-47b8-b5c3-40d079baed5d-kube-api-access-tc6rf\") pod \"perf-node-gather-daemonset-78x9j\" (UID: \"cf3dea46-49ce-47b8-b5c3-40d079baed5d\") " pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-78x9j"
Apr 20 20:53:15.301114 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:15.301089 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-78x9j"
Apr 20 20:53:15.417592 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:15.417560 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2h9np/perf-node-gather-daemonset-78x9j"]
Apr 20 20:53:15.420015 ip-10-0-130-227 kubenswrapper[2573]: W0420 20:53:15.419987 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podcf3dea46_49ce_47b8_b5c3_40d079baed5d.slice/crio-0c858ea9e3d527cba705019cd651a891c95348af7d2099aefee554bfdea2f922 WatchSource:0}: Error finding container 0c858ea9e3d527cba705019cd651a891c95348af7d2099aefee554bfdea2f922: Status 404 returned error can't find the container with id 0c858ea9e3d527cba705019cd651a891c95348af7d2099aefee554bfdea2f922
Apr 20 20:53:15.421635 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:15.421617 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 20:53:15.627463 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:15.627400 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-x7nqw_ddb92e25-31c0-49bc-9084-b1a08aad3877/dns/0.log"
Apr 20 20:53:15.645361 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:15.645337 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-x7nqw_ddb92e25-31c0-49bc-9084-b1a08aad3877/kube-rbac-proxy/0.log"
Apr 20 20:53:15.692134 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:15.692114 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-pgbf5_d645d5ae-1405-421d-8f27-65e056976e28/dns-node-resolver/0.log"
Apr 20 20:53:15.883044 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:15.882975 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-78x9j" event={"ID":"cf3dea46-49ce-47b8-b5c3-40d079baed5d","Type":"ContainerStarted","Data":"2c2040580f4a8571134a64613db33d2132f0734b52d0d9a7d88d6ff5a6375966"}
Apr 20 20:53:15.883044 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:15.883008 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-78x9j" event={"ID":"cf3dea46-49ce-47b8-b5c3-40d079baed5d","Type":"ContainerStarted","Data":"0c858ea9e3d527cba705019cd651a891c95348af7d2099aefee554bfdea2f922"}
Apr 20 20:53:15.883196 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:15.883066 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-78x9j"
Apr 20 20:53:15.898033 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:15.897975 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-78x9j" podStartSLOduration=1.897962622 podStartE2EDuration="1.897962622s" podCreationTimestamp="2026-04-20 20:53:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:53:15.896828334 +0000 UTC m=+2878.312121076" watchObservedRunningTime="2026-04-20 20:53:15.897962622 +0000 UTC m=+2878.313255364"
Apr 20 20:53:16.092729 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:16.092699 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6754dfd45f-tfgml_84920716-a237-4371-8e3b-ecc46291eb90/registry/0.log"
Apr 20 20:53:16.112969 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:16.112945 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-knsf9_7f6b4fe9-415b-4c1d-91f4-70456b92ec7e/node-ca/0.log"
Apr 20 20:53:16.860216 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:16.860186 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-59745ff96d-mll2m_1da8a235-923c-4d64-a78a-ee3ef677d15d/router/0.log"
Apr 20 20:53:17.192537 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:17.192450 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-67td9_9104f378-d15a-480e-aae0-cb20f3c35f2c/serve-healthcheck-canary/0.log"
Apr 20 20:53:17.700961 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:17.700934 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-zzlvj_4cc09a9f-b574-4629-92a5-1121adb73396/kube-rbac-proxy/0.log"
Apr 20 20:53:17.720131 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:17.720105 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-zzlvj_4cc09a9f-b574-4629-92a5-1121adb73396/exporter/0.log"
Apr 20 20:53:17.739727 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:17.739700 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-zzlvj_4cc09a9f-b574-4629-92a5-1121adb73396/extractor/0.log"
Apr 20 20:53:19.588297 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:19.588265 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-6f655776dd-8rrmj_f5b8eb9d-98f7-4d39-be5d-103edbf66559/manager/0.log"
Apr 20 20:53:19.606209 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:19.606182 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-lnvqt_a878566e-78f7-46cc-a5ee-04fcc3b13829/manager/0.log"
Apr 20 20:53:21.896540 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:21.896511 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-2h9np/perf-node-gather-daemonset-78x9j"
Apr 20 20:53:23.699421 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:23.699380 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-mpjng_06eaab54-dbb4-4d67-8fc9-d22d03b1e5a8/kube-storage-version-migrator-operator/1.log"
Apr 20 20:53:23.701092 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:23.701061 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-mpjng_06eaab54-dbb4-4d67-8fc9-d22d03b1e5a8/kube-storage-version-migrator-operator/0.log"
Apr 20 20:53:24.577466 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:24.577435 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6ksh7_f81027c8-2ac8-4e5e-b754-45c3af3ec095/kube-multus/0.log"
Apr 20 20:53:24.965794 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:24.965769 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rcsnj_ba60b2b3-08e4-40aa-842f-6be514920597/kube-multus-additional-cni-plugins/0.log"
Apr 20 20:53:24.990011 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:24.989983 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rcsnj_ba60b2b3-08e4-40aa-842f-6be514920597/egress-router-binary-copy/0.log"
Apr 20 20:53:25.013007 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:25.012982 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rcsnj_ba60b2b3-08e4-40aa-842f-6be514920597/cni-plugins/0.log"
Apr 20 20:53:25.038873 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:25.038831 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rcsnj_ba60b2b3-08e4-40aa-842f-6be514920597/bond-cni-plugin/0.log"
Apr 20 20:53:25.059941 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:25.059881 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rcsnj_ba60b2b3-08e4-40aa-842f-6be514920597/routeoverride-cni/0.log"
Apr 20 20:53:25.081521 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:25.081500 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rcsnj_ba60b2b3-08e4-40aa-842f-6be514920597/whereabouts-cni-bincopy/0.log"
Apr 20 20:53:25.103729 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:25.103710 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rcsnj_ba60b2b3-08e4-40aa-842f-6be514920597/whereabouts-cni/0.log"
Apr 20 20:53:25.179472 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:25.179448 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-npkgv_923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d/network-metrics-daemon/0.log"
Apr 20 20:53:25.197384 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:25.197354 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-npkgv_923d6e1f-e27c-45f6-a24a-ccb3fc0e1c3d/kube-rbac-proxy/0.log"
Apr 20 20:53:26.318370 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:26.318347 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ks7s9_54652553-5a61-482e-b688-ce64c57b917b/ovn-controller/0.log"
Apr 20 20:53:26.338225 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:26.338202 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ks7s9_54652553-5a61-482e-b688-ce64c57b917b/ovn-acl-logging/0.log"
Apr 20 20:53:26.362014 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:26.361981 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ks7s9_54652553-5a61-482e-b688-ce64c57b917b/ovn-acl-logging/1.log"
Apr 20 20:53:26.384265 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:26.384239 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ks7s9_54652553-5a61-482e-b688-ce64c57b917b/kube-rbac-proxy-node/0.log"
Apr 20 20:53:26.403844 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:26.403819 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ks7s9_54652553-5a61-482e-b688-ce64c57b917b/kube-rbac-proxy-ovn-metrics/0.log"
Apr 20 20:53:26.419231 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:26.419207 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ks7s9_54652553-5a61-482e-b688-ce64c57b917b/northd/0.log"
Apr 20 20:53:26.438307 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:26.438282 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ks7s9_54652553-5a61-482e-b688-ce64c57b917b/nbdb/0.log"
Apr 20 20:53:26.457446 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:26.457420 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ks7s9_54652553-5a61-482e-b688-ce64c57b917b/sbdb/0.log"
Apr 20 20:53:26.632168 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:26.632095 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ks7s9_54652553-5a61-482e-b688-ce64c57b917b/ovnkube-controller/0.log"
Apr 20 20:53:27.864449 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:27.864418 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-6zzrt_a2d2805c-7734-4363-99a7-f1fc0f7b91a5/check-endpoints/0.log"
Apr 20 20:53:27.905830 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:27.905806 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-hwpzm_983cba91-1490-41d1-acd9-67e8ffb4ce55/network-check-target-container/0.log"
Apr 20 20:53:28.764487 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:28.764459 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-n68c4_c02e57e8-2b76-4827-a61a-dac826a87aa2/iptables-alerter/0.log"
Apr 20 20:53:29.373527 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:29.373495 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-qb6xj_26b28667-df5c-4e73-a021-1d3b5430daaf/tuned/0.log"
Apr 20 20:53:30.959529 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:30.959491 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-qwgxr_dfd612b2-5ec7-4a68-9cd6-29a94ae37e78/cluster-samples-operator/0.log"
Apr 20 20:53:30.974255 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:30.974227 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-qwgxr_dfd612b2-5ec7-4a68-9cd6-29a94ae37e78/cluster-samples-operator-watch/0.log"
Apr 20 20:53:31.833170 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:31.833137 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-zl54z_51239262-468e-4240-9144-dfb1b3010a21/service-ca-operator/1.log"
Apr 20 20:53:31.834891 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:31.834864 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-zl54z_51239262-468e-4240-9144-dfb1b3010a21/service-ca-operator/0.log"
Apr 20 20:53:32.540949 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:32.540924 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-gmvq6_6e906053-dbcb-4d63-8f1f-4eb6a911e9e3/csi-driver/0.log"
Apr 20 20:53:32.562257 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:32.562228 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-gmvq6_6e906053-dbcb-4d63-8f1f-4eb6a911e9e3/csi-node-driver-registrar/0.log"
Apr 20 20:53:32.580450 ip-10-0-130-227 kubenswrapper[2573]: I0420 20:53:32.580421 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-gmvq6_6e906053-dbcb-4d63-8f1f-4eb6a911e9e3/csi-liveness-probe/0.log"