Apr 16 17:40:05.644596 ip-10-0-128-241 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 17:40:06.175286 ip-10-0-128-241 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 17:40:06.175286 ip-10-0-128-241 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 17:40:06.175286 ip-10-0-128-241 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 17:40:06.175286 ip-10-0-128-241 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 17:40:06.175286 ip-10-0-128-241 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 17:40:06.177291 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.177138 2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 17:40:06.181958 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.181936 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 17:40:06.181958 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.181954 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 17:40:06.181958 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.181957 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 17:40:06.181958 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.181960 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 17:40:06.181958 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.181963 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 17:40:06.181958 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.181966 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 17:40:06.182183 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.181969 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 17:40:06.182183 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.181972 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 17:40:06.182183 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.181975 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 17:40:06.182183 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.181977 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 17:40:06.182183 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.181980 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 17:40:06.182183 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.181983 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 17:40:06.182183 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.181985 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 17:40:06.182183 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.181988 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 17:40:06.182183 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.181990 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 17:40:06.182183 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.181994 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 17:40:06.182183 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.181996 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 17:40:06.182183 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.181999 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 17:40:06.182183 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182001 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 17:40:06.182183 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182004 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 17:40:06.182183 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182007 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 17:40:06.182183 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182010 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 17:40:06.182183 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182012 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 17:40:06.182183 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182016 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 17:40:06.182183 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182019 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 17:40:06.182183 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182022 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 17:40:06.182712 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182024 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 17:40:06.182712 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182027 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 17:40:06.182712 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182030 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 17:40:06.182712 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182032 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 17:40:06.182712 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182035 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 17:40:06.182712 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182038 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 17:40:06.182712 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182040 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 17:40:06.182712 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182043 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 17:40:06.182712 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182045 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 17:40:06.182712 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182048 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 17:40:06.182712 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182050 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 17:40:06.182712 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182053 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 17:40:06.182712 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182056 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 17:40:06.182712 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182059 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 17:40:06.182712 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182062 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 17:40:06.182712 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182067 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 17:40:06.182712 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182071 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 17:40:06.182712 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182074 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 17:40:06.182712 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182076 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 17:40:06.183195 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182080 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 17:40:06.183195 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182083 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 17:40:06.183195 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182086 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 17:40:06.183195 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182089 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 17:40:06.183195 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182091 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 17:40:06.183195 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182095 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 17:40:06.183195 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182098 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 17:40:06.183195 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182101 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 17:40:06.183195 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182103 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 17:40:06.183195 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182106 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 17:40:06.183195 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182109 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 17:40:06.183195 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182111 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 17:40:06.183195 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182114 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 17:40:06.183195 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182117 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 17:40:06.183195 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182120 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 17:40:06.183195 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182122 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 17:40:06.183195 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182125 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 17:40:06.183195 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182127 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 17:40:06.183195 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182130 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 17:40:06.183195 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182133 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 17:40:06.183774 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182135 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 17:40:06.183774 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182138 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 17:40:06.183774 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182141 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 17:40:06.183774 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182143 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 17:40:06.183774 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182146 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 17:40:06.183774 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182149 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 17:40:06.183774 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182152 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 17:40:06.183774 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182157 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 17:40:06.183774 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182160 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 17:40:06.183774 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182164 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 17:40:06.183774 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182167 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 17:40:06.183774 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182170 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 17:40:06.183774 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182173 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 17:40:06.183774 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182177 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 17:40:06.183774 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182180 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 17:40:06.183774 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182183 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 17:40:06.183774 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182187 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 17:40:06.183774 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182190 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 17:40:06.183774 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182193 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 17:40:06.183774 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182196 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 17:40:06.184297 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.182199 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 17:40:06.184297 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184045 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 17:40:06.184297 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184053 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 17:40:06.184297 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184057 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 17:40:06.184297 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184061 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 17:40:06.184297 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184064 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 17:40:06.184297 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184066 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 17:40:06.184297 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184069 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 17:40:06.184297 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184072 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 17:40:06.184297 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184075 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 17:40:06.184297 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184078 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 17:40:06.184297 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184080 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 17:40:06.184297 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184083 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 17:40:06.184297 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184086 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 17:40:06.184297 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184089 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 17:40:06.184297 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184092 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 17:40:06.184297 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184094 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 17:40:06.184297 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184097 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 17:40:06.184297 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184100 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 17:40:06.184758 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184103 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 17:40:06.184758 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184105 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 17:40:06.184758 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184108 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 17:40:06.184758 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184110 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 17:40:06.184758 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184114 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 17:40:06.184758 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184118 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 17:40:06.184758 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184122 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 17:40:06.184758 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184125 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 17:40:06.184758 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184137 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 17:40:06.184758 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184140 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 17:40:06.184758 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184143 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 17:40:06.184758 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184146 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 17:40:06.184758 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184149 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 17:40:06.184758 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184151 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 17:40:06.184758 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184154 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 17:40:06.184758 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184157 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 17:40:06.184758 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184160 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 17:40:06.184758 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184163 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 17:40:06.184758 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184165 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 17:40:06.185277 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184168 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 17:40:06.185277 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184171 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 17:40:06.185277 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184173 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 17:40:06.185277 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184176 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 17:40:06.185277 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184179 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 17:40:06.185277 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184181 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 17:40:06.185277 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184184 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 17:40:06.185277 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184186 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 17:40:06.185277 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184189 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 17:40:06.185277 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184191 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 17:40:06.185277 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184199 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 17:40:06.185277 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184202 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 17:40:06.185277 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184206 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 17:40:06.185277 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184208 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 17:40:06.185277 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184211 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 17:40:06.185277 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184229 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 17:40:06.185277 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184232 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 17:40:06.185277 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184234 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 17:40:06.185277 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184238 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 17:40:06.185277 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184241 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 17:40:06.185781 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184244 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 17:40:06.185781 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184247 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 17:40:06.185781 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184249 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 17:40:06.185781 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184252 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 17:40:06.185781 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184254 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 17:40:06.185781 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184258 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 17:40:06.185781 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184260 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 17:40:06.185781 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184263 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 17:40:06.185781 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184266 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 17:40:06.185781 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184268 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 17:40:06.185781 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184271 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 17:40:06.185781 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184274 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 17:40:06.185781 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184276 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 17:40:06.185781 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184279 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 17:40:06.185781 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184282 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 17:40:06.185781 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184284 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 17:40:06.185781 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184287 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 17:40:06.185781 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184289 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 17:40:06.185781 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184292 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 17:40:06.185781 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184295 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 17:40:06.186297 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184300 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 17:40:06.186297 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184304 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 17:40:06.186297 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184307 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 17:40:06.186297 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184310 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 17:40:06.186297 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184314 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 17:40:06.186297 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184316 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 17:40:06.186297 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184319 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 17:40:06.186297 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184321 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 17:40:06.186297 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.184324 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 17:40:06.186297 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184402 2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 17:40:06.186297 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184410 2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 17:40:06.186297 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184416 2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 17:40:06.186297 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184420 2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 17:40:06.186297 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184426 2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 17:40:06.186297 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184429 2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 17:40:06.186297 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184437 2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 17:40:06.186297 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184441 2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 17:40:06.186297 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184444 2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 17:40:06.186297 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184448 2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 17:40:06.186297 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184451 2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 17:40:06.186297 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184454 2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 17:40:06.186808 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184458 2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 17:40:06.186808 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184461 2573 flags.go:64] FLAG: --cgroup-root=""
Apr 16 17:40:06.186808 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184464 2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 17:40:06.186808 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184467 2573 flags.go:64] FLAG: --client-ca-file=""
Apr 16 17:40:06.186808 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184470 2573 flags.go:64] FLAG: --cloud-config=""
Apr 16 17:40:06.186808 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184473 2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 17:40:06.186808 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184476 2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 17:40:06.186808 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184480 2573 flags.go:64] FLAG: --cluster-domain=""
Apr 16 17:40:06.186808 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184483 2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 17:40:06.186808 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184487 2573 flags.go:64] FLAG: --config-dir=""
Apr 16 17:40:06.186808 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184490 2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 17:40:06.186808 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184493 2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 17:40:06.186808 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184497 2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 17:40:06.186808 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184500 2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 17:40:06.186808 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184503 2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 17:40:06.186808 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184508 2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 17:40:06.186808 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184511 2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 17:40:06.186808 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184514 2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 17:40:06.186808 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184517 2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 17:40:06.186808 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184520 2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 17:40:06.186808 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184523 2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 17:40:06.186808 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184528 2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 17:40:06.186808 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184530 2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 17:40:06.186808 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184533 2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 17:40:06.186808 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184536 2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 17:40:06.187454 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184541 2573 flags.go:64] FLAG: --enable-server="true"
Apr 16 17:40:06.187454 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184545 2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 17:40:06.187454 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184550 2573 flags.go:64] FLAG: --event-burst="100"
Apr 16 17:40:06.187454 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184553 2573 flags.go:64] FLAG: --event-qps="50"
Apr 16 17:40:06.187454 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184556 2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 17:40:06.187454 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184560 2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 17:40:06.187454 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184563 2573 flags.go:64] FLAG: --eviction-hard=""
Apr 16 17:40:06.187454 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184566 2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 17:40:06.187454 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184569 2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 17:40:06.187454 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184573 2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 17:40:06.187454 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184576 2573 flags.go:64] FLAG: --eviction-soft=""
Apr 16 17:40:06.187454 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184579 2573 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 17:40:06.187454 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184582 2573 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 17:40:06.187454 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184585 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 17:40:06.187454 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184588 2573 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 17:40:06.187454 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184591 2573 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 17:40:06.187454 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184594 2573 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 17:40:06.187454 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184597 2573 flags.go:64] FLAG: --feature-gates=""
Apr 16 17:40:06.187454 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184601 2573 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 17:40:06.187454 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184604 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 17:40:06.187454 ip-10-0-128-241
kubenswrapper[2573]: I0416 17:40:06.184607 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 17:40:06.187454 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184610 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 17:40:06.187454 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184618 2573 flags.go:64] FLAG: --healthz-port="10248" Apr 16 17:40:06.187454 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184621 2573 flags.go:64] FLAG: --help="false" Apr 16 17:40:06.187454 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184624 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-128-241.ec2.internal" Apr 16 17:40:06.188046 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184627 2573 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 17:40:06.188046 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184630 2573 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 17:40:06.188046 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184633 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 17:40:06.188046 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184636 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 17:40:06.188046 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184640 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 17:40:06.188046 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184644 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 17:40:06.188046 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184646 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 17:40:06.188046 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184649 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 17:40:06.188046 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184654 2573 flags.go:64] FLAG: 
--kube-api-burst="100" Apr 16 17:40:06.188046 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184657 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 17:40:06.188046 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184660 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 17:40:06.188046 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184663 2573 flags.go:64] FLAG: --kube-reserved="" Apr 16 17:40:06.188046 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184666 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 17:40:06.188046 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184669 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 17:40:06.188046 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184672 2573 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 17:40:06.188046 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184675 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 17:40:06.188046 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184678 2573 flags.go:64] FLAG: --lock-file="" Apr 16 17:40:06.188046 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184681 2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 17:40:06.188046 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184684 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 17:40:06.188046 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184687 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 17:40:06.188046 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184693 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 17:40:06.188046 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184696 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 17:40:06.188046 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184699 2573 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 17:40:06.188046 ip-10-0-128-241 kubenswrapper[2573]: 
I0416 17:40:06.184702 2573 flags.go:64] FLAG: --logging-format="text" Apr 16 17:40:06.188650 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184704 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 17:40:06.188650 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184708 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 17:40:06.188650 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184711 2573 flags.go:64] FLAG: --manifest-url="" Apr 16 17:40:06.188650 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184714 2573 flags.go:64] FLAG: --manifest-url-header="" Apr 16 17:40:06.188650 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184719 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 17:40:06.188650 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184723 2573 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 17:40:06.188650 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184727 2573 flags.go:64] FLAG: --max-pods="110" Apr 16 17:40:06.188650 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184730 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 17:40:06.188650 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184733 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 17:40:06.188650 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184736 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 17:40:06.188650 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184739 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 17:40:06.188650 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184742 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 17:40:06.188650 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184745 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 17:40:06.188650 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184748 2573 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 17:40:06.188650 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184756 2573 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 17:40:06.188650 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184760 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 17:40:06.188650 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184764 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 17:40:06.188650 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184768 2573 flags.go:64] FLAG: --pod-cidr="" Apr 16 17:40:06.188650 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184771 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 17:40:06.188650 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184777 2573 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 17:40:06.188650 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184780 2573 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 17:40:06.188650 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184783 2573 flags.go:64] FLAG: --pods-per-core="0" Apr 16 17:40:06.188650 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184786 2573 flags.go:64] FLAG: --port="10250" Apr 16 17:40:06.188650 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184789 2573 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 17:40:06.189231 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184792 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0bd641af7676965a9" Apr 16 17:40:06.189231 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184796 2573 flags.go:64] FLAG: --qos-reserved="" Apr 16 17:40:06.189231 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184799 2573 flags.go:64] FLAG: --read-only-port="10255" Apr 16 17:40:06.189231 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184802 
2573 flags.go:64] FLAG: --register-node="true" Apr 16 17:40:06.189231 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184805 2573 flags.go:64] FLAG: --register-schedulable="true" Apr 16 17:40:06.189231 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184807 2573 flags.go:64] FLAG: --register-with-taints="" Apr 16 17:40:06.189231 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184811 2573 flags.go:64] FLAG: --registry-burst="10" Apr 16 17:40:06.189231 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184814 2573 flags.go:64] FLAG: --registry-qps="5" Apr 16 17:40:06.189231 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184817 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 16 17:40:06.189231 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184820 2573 flags.go:64] FLAG: --reserved-memory="" Apr 16 17:40:06.189231 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184824 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 17:40:06.189231 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184827 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 17:40:06.189231 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184829 2573 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 17:40:06.189231 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184834 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 17:40:06.189231 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184837 2573 flags.go:64] FLAG: --runonce="false" Apr 16 17:40:06.189231 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184840 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 17:40:06.189231 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184843 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 17:40:06.189231 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184846 2573 flags.go:64] FLAG: --seccomp-default="false" Apr 16 17:40:06.189231 ip-10-0-128-241 kubenswrapper[2573]: I0416 
17:40:06.184849 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 17:40:06.189231 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184852 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 17:40:06.189231 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184855 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 17:40:06.189231 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184858 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 17:40:06.189231 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184861 2573 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 17:40:06.189231 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184864 2573 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 17:40:06.189231 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184867 2573 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 17:40:06.189231 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184870 2573 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 17:40:06.189852 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184874 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 17:40:06.189852 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184877 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 17:40:06.189852 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184880 2573 flags.go:64] FLAG: --system-cgroups="" Apr 16 17:40:06.189852 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184883 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 17:40:06.189852 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184888 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 17:40:06.189852 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184891 2573 flags.go:64] FLAG: --tls-cert-file="" Apr 16 17:40:06.189852 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184894 2573 flags.go:64] FLAG: 
--tls-cipher-suites="[]" Apr 16 17:40:06.189852 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184899 2573 flags.go:64] FLAG: --tls-min-version="" Apr 16 17:40:06.189852 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184902 2573 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 17:40:06.189852 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184904 2573 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 17:40:06.189852 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184907 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 17:40:06.189852 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184910 2573 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 17:40:06.189852 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184913 2573 flags.go:64] FLAG: --v="2" Apr 16 17:40:06.189852 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184918 2573 flags.go:64] FLAG: --version="false" Apr 16 17:40:06.189852 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184921 2573 flags.go:64] FLAG: --vmodule="" Apr 16 17:40:06.189852 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184926 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 17:40:06.189852 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.184929 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 17:40:06.189852 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185023 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 17:40:06.189852 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185027 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 17:40:06.189852 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185031 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 17:40:06.189852 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185033 2573 feature_gate.go:328] unrecognized feature gate: 
MultiArchInstallAzure Apr 16 17:40:06.189852 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185036 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 17:40:06.189852 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185039 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 17:40:06.189852 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185042 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 17:40:06.190472 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185044 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 17:40:06.190472 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185047 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 17:40:06.190472 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185050 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 17:40:06.190472 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185052 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 17:40:06.190472 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185055 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 17:40:06.190472 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185057 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 17:40:06.190472 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185060 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 17:40:06.190472 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185063 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 17:40:06.190472 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185066 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 17:40:06.190472 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185069 2573 feature_gate.go:328] unrecognized feature gate: 
MultiDiskSetup Apr 16 17:40:06.190472 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185072 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 17:40:06.190472 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185075 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 17:40:06.190472 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185078 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 17:40:06.190472 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185080 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 17:40:06.190472 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185083 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 17:40:06.190472 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185085 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 17:40:06.190472 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185088 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 17:40:06.190472 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185091 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 17:40:06.190472 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185093 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 17:40:06.190472 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185096 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 17:40:06.190983 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185098 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 17:40:06.190983 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185101 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 17:40:06.190983 ip-10-0-128-241 
kubenswrapper[2573]: W0416 17:40:06.185103 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 17:40:06.190983 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185106 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 17:40:06.190983 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185109 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 17:40:06.190983 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185111 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 17:40:06.190983 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185114 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 17:40:06.190983 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185117 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 17:40:06.190983 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185120 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 17:40:06.190983 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185123 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 17:40:06.190983 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185125 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 17:40:06.190983 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185128 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 17:40:06.190983 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185130 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 17:40:06.190983 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185133 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 17:40:06.190983 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185135 2573 feature_gate.go:328] unrecognized feature gate: 
OpenShiftPodSecurityAdmission Apr 16 17:40:06.190983 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185138 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 17:40:06.190983 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185141 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 17:40:06.190983 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185144 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 17:40:06.190983 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185146 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 17:40:06.190983 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185149 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 17:40:06.191491 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185152 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 17:40:06.191491 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185154 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 17:40:06.191491 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185158 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 17:40:06.191491 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185161 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 17:40:06.191491 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185163 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 17:40:06.191491 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185166 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 17:40:06.191491 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185169 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 17:40:06.191491 ip-10-0-128-241 
kubenswrapper[2573]: W0416 17:40:06.185171 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 17:40:06.191491 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185174 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 17:40:06.191491 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185176 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 17:40:06.191491 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185179 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 17:40:06.191491 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185181 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 17:40:06.191491 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185184 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 17:40:06.191491 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185187 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 17:40:06.191491 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185191 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 17:40:06.191491 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185194 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 17:40:06.191491 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185197 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 17:40:06.191491 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185200 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 17:40:06.191491 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185202 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 17:40:06.191994 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185206 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 17:40:06.191994 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185209 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 17:40:06.191994 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185225 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 17:40:06.191994 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185230 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 17:40:06.191994 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185233 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 17:40:06.191994 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185236 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 17:40:06.191994 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185239 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 17:40:06.191994 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185241 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 17:40:06.191994 ip-10-0-128-241 kubenswrapper[2573]: W0416 
17:40:06.185244 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 17:40:06.191994 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185247 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 17:40:06.191994 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185249 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 17:40:06.191994 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185252 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 17:40:06.191994 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185255 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 17:40:06.191994 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185259 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 17:40:06.191994 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185262 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 17:40:06.191994 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185266 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 17:40:06.191994 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185270 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 16 17:40:06.191994 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185273 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 17:40:06.191994 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185276 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 17:40:06.191994 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.185279 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 17:40:06.192517 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.186074 2573 feature_gate.go:384] feature gates: 
{map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 17:40:06.193243 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.193212 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 17:40:06.193275 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.193244 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 17:40:06.193307 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193293 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 17:40:06.193307 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193298 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 17:40:06.193307 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193301 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 17:40:06.193307 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193305 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 17:40:06.193307 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193308 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 17:40:06.193438 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193312 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 17:40:06.193438 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193315 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 17:40:06.193438 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193318 2573 
feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 17:40:06.193438 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193321 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 17:40:06.193438 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193324 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 17:40:06.193438 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193327 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 17:40:06.193438 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193330 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 17:40:06.193438 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193333 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 17:40:06.193438 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193336 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 17:40:06.193438 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193339 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 17:40:06.193438 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193342 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 17:40:06.193438 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193345 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 17:40:06.193438 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193347 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 17:40:06.193438 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193350 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 17:40:06.193438 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193353 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 17:40:06.193438 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193357 2573 
feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 17:40:06.193438 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193362 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 17:40:06.193438 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193365 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 17:40:06.193438 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193368 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 17:40:06.193438 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193371 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 17:40:06.193931 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193374 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 17:40:06.193931 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193376 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 17:40:06.193931 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193379 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 17:40:06.193931 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193382 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 17:40:06.193931 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193384 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 17:40:06.193931 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193387 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 17:40:06.193931 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193390 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 17:40:06.193931 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193393 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 
17:40:06.193931 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193396 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 17:40:06.193931 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193399 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 17:40:06.193931 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193401 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 17:40:06.193931 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193404 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 17:40:06.193931 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193407 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 17:40:06.193931 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193410 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 17:40:06.193931 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193413 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 17:40:06.193931 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193416 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 17:40:06.193931 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193419 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 17:40:06.193931 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193422 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 17:40:06.193931 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193424 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 17:40:06.193931 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193427 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 17:40:06.194446 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193430 2573 
feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 17:40:06.194446 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193432 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 17:40:06.194446 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193435 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 17:40:06.194446 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193438 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 17:40:06.194446 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193440 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 17:40:06.194446 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193443 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 17:40:06.194446 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193446 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 17:40:06.194446 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193449 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 17:40:06.194446 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193451 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 17:40:06.194446 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193454 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 17:40:06.194446 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193457 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 17:40:06.194446 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193459 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 17:40:06.194446 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193462 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings 
Apr 16 17:40:06.194446 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193465 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 17:40:06.194446 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193467 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 16 17:40:06.194446 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193470 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 17:40:06.194446 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193472 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 17:40:06.194446 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193475 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 17:40:06.194446 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193478 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 17:40:06.194941 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193481 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 17:40:06.194941 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193484 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 17:40:06.194941 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193486 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 17:40:06.194941 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193489 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 17:40:06.194941 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193492 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 17:40:06.194941 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193495 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 17:40:06.194941 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193499 2573 
feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 17:40:06.194941 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193501 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 17:40:06.194941 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193504 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 17:40:06.194941 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193507 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 17:40:06.194941 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193509 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 17:40:06.194941 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193512 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 17:40:06.194941 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193517 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 17:40:06.194941 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193521 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 17:40:06.194941 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193524 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 17:40:06.194941 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193527 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 17:40:06.194941 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193530 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 17:40:06.194941 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193533 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 17:40:06.194941 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193536 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 17:40:06.195426 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193539 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 17:40:06.195426 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193541 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 17:40:06.195426 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193544 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 17:40:06.195426 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.193549 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true 
VolumeAttributesClass:false]} Apr 16 17:40:06.195426 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193648 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 17:40:06.195426 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193653 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 17:40:06.195426 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193656 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 17:40:06.195426 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193660 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 17:40:06.195426 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193662 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 17:40:06.195426 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193665 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 17:40:06.195426 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193668 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 17:40:06.195426 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193671 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 17:40:06.195426 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193673 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 17:40:06.195426 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193676 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 17:40:06.195426 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193679 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 17:40:06.195803 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193683 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 17:40:06.195803 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193688 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 17:40:06.195803 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193691 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 17:40:06.195803 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193694 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 17:40:06.195803 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193697 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 17:40:06.195803 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193700 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 17:40:06.195803 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193702 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 17:40:06.195803 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193705 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 17:40:06.195803 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193709 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 17:40:06.195803 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193713 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 17:40:06.195803 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193716 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 17:40:06.195803 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193719 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 17:40:06.195803 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193722 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 17:40:06.195803 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193724 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 17:40:06.195803 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193727 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 17:40:06.195803 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193730 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 17:40:06.195803 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193733 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 17:40:06.195803 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193736 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 17:40:06.195803 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193739 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 17:40:06.196286 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193742 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 17:40:06.196286 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193744 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 17:40:06.196286 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193747 2573 
feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 17:40:06.196286 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193750 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 17:40:06.196286 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193752 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 17:40:06.196286 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193755 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 17:40:06.196286 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193757 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 17:40:06.196286 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193760 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 17:40:06.196286 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193763 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 16 17:40:06.196286 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193766 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 17:40:06.196286 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193768 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 17:40:06.196286 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193771 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 17:40:06.196286 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193773 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 17:40:06.196286 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193777 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 17:40:06.196286 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193779 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 17:40:06.196286 ip-10-0-128-241 kubenswrapper[2573]: W0416 
17:40:06.193782 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 17:40:06.196286 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193784 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 17:40:06.196286 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193787 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 17:40:06.196286 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193790 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 17:40:06.196286 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193793 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 17:40:06.196759 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193795 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 17:40:06.196759 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193798 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 17:40:06.196759 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193800 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 17:40:06.196759 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193803 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 17:40:06.196759 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193805 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 17:40:06.196759 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193808 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 17:40:06.196759 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193810 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 17:40:06.196759 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193813 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 17:40:06.196759 
ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193816 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 17:40:06.196759 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193818 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 17:40:06.196759 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193821 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 17:40:06.196759 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193824 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 17:40:06.196759 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193826 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 17:40:06.196759 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193828 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 17:40:06.196759 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193831 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 17:40:06.196759 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193834 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 17:40:06.196759 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193837 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 17:40:06.196759 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193839 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 17:40:06.196759 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193842 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 17:40:06.196759 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193845 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 17:40:06.197264 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193847 2573 feature_gate.go:328] unrecognized feature gate: 
NetworkLiveMigration Apr 16 17:40:06.197264 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193850 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 17:40:06.197264 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193853 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 17:40:06.197264 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193855 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 17:40:06.197264 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193858 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 17:40:06.197264 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193860 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 17:40:06.197264 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193864 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 17:40:06.197264 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193867 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 17:40:06.197264 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193869 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 17:40:06.197264 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193872 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 17:40:06.197264 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193874 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 17:40:06.197264 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193877 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 17:40:06.197264 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193879 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 
17:40:06.197264 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193882 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 17:40:06.197264 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193884 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 17:40:06.197264 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:06.193887 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 17:40:06.197656 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.193892 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 17:40:06.197656 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.194700 2573 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 17:40:06.197656 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.197247 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 17:40:06.198342 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.198330 2573 server.go:1019] "Starting client certificate rotation" Apr 16 17:40:06.198446 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.198429 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 17:40:06.198484 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.198475 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 17:40:06.225668 ip-10-0-128-241 
kubenswrapper[2573]: I0416 17:40:06.225651 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 17:40:06.228278 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.228257 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 17:40:06.248284 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.248260 2573 log.go:25] "Validated CRI v1 runtime API"
Apr 16 17:40:06.254702 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.254684 2573 log.go:25] "Validated CRI v1 image API"
Apr 16 17:40:06.256813 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.256795 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 17:40:06.260997 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.260973 2573 fs.go:135] Filesystem UUIDs: map[614529fe-350f-4ded-bca2-c834f41eca3a:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 b835c2a2-ac70-41b9-ba98-edb4301fe6ec:/dev/nvme0n1p3]
Apr 16 17:40:06.261073 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.260995 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 17:40:06.264164 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.264145 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 17:40:06.266914 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.266805 2573 manager.go:217] Machine: {Timestamp:2026-04-16 17:40:06.264737877 +0000 UTC m=+0.477771016 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099683 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2d54870ae6b83ddc35706052cb222e SystemUUID:ec2d5487-0ae6-b83d-dc35-706052cb222e BootID:f41eae7d-6cff-460f-a8b5-fc8f640cd7fe Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:e7:58:ee:76:33 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:e7:58:ee:76:33 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:3e:28:14:2a:40:62 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 17:40:06.267661 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.267651 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 17:40:06.267746 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.267734 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 17:40:06.268063 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.268037 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 17:40:06.268258 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.268064 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-241.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 17:40:06.268349 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.268271 2573 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 17:40:06.268349 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.268284 2573 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 17:40:06.268349 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.268302
2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 17:40:06.269226 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.269198 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 17:40:06.271597 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.271583 2573 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 17:40:06.271729 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.271717 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 17:40:06.274507 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.274494 2573 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 17:40:06.274574 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.274518 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 17:40:06.274574 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.274536 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 17:40:06.274574 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.274550 2573 kubelet.go:397] "Adding apiserver pod source"
Apr 16 17:40:06.274574 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.274562 2573 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 17:40:06.275805 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.275791 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 17:40:06.275871 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.275815 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 17:40:06.279272 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.279253 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 17:40:06.281489 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.281476 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 17:40:06.283165 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.283152 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 17:40:06.283227 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.283169 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 17:40:06.283227 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.283175 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 17:40:06.283227 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.283181 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 17:40:06.283227 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.283186 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 17:40:06.283227 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.283192 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 17:40:06.283227 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.283198 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 17:40:06.283227 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.283204 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 17:40:06.283227 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.283226 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 17:40:06.283462 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.283234 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 17:40:06.283462 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.283242 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 17:40:06.283462 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.283250 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 17:40:06.284321 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.284307 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 17:40:06.284358 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.284323 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 17:40:06.287990 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.287972 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-241.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 17:40:06.287990 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:06.287961 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 17:40:06.288093 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:06.287994 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-241.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 17:40:06.288093 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.288081 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 17:40:06.288176 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.288122 2573 server.go:1295] "Started kubelet"
Apr 16 17:40:06.288176 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.288163 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 17:40:06.288295 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.288243 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 17:40:06.288348 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.288320 2573 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 17:40:06.288875 ip-10-0-128-241 systemd[1]: Started Kubernetes Kubelet.
Apr 16 17:40:06.290984 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.290973 2573 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 17:40:06.292743 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.292724 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 17:40:06.295409 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:06.294465 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-241.ec2.internal.18a6e71b059601e7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-241.ec2.internal,UID:ip-10-0-128-241.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-128-241.ec2.internal,},FirstTimestamp:2026-04-16 17:40:06.288089575 +0000 UTC m=+0.501122714,LastTimestamp:2026-04-16 17:40:06.288089575 +0000 UTC m=+0.501122714,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-241.ec2.internal,}"
Apr 16 17:40:06.298971 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.298954 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 17:40:06.299649 ip-10-0-128-241
kubenswrapper[2573]: I0416 17:40:06.299632 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 17:40:06.302654 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.302593 2573 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 17:40:06.302757 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.302738 2573 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 17:40:06.302966 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:06.302890 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-241.ec2.internal\" not found"
Apr 16 17:40:06.303091 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.303070 2573 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 17:40:06.303091 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.303091 2573 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 17:40:06.303506 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.303487 2573 factory.go:55] Registering systemd factory
Apr 16 17:40:06.303574 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.303510 2573 factory.go:223] Registration of the systemd container factory successfully
Apr 16 17:40:06.303735 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.303697 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 17:40:06.303844 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.303831 2573 factory.go:153] Registering CRI-O factory
Apr 16 17:40:06.303942 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.303928 2573 factory.go:223] Registration of the crio container factory successfully
Apr 16 17:40:06.303942 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.303930 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-st2nv"
Apr 16 17:40:06.304079 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.303998 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 17:40:06.304079 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:06.303998 2573 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 17:40:06.304079 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.304024 2573 factory.go:103] Registering Raw factory
Apr 16 17:40:06.304079 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.304048 2573 manager.go:1196] Started watching for new ooms in manager
Apr 16 17:40:06.304848 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.304832 2573 manager.go:319] Starting recovery of all containers
Apr 16 17:40:06.310769 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.310741 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-st2nv"
Apr 16 17:40:06.313405 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:06.312951 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 16 17:40:06.313405 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:06.312972 2573 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-128-241.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 16 17:40:06.316786 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.316623 2573 manager.go:324] Recovery completed
Apr 16 17:40:06.320865 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.320849 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 17:40:06.323642 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.323625 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-241.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 17:40:06.323707 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.323652 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-241.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 17:40:06.323707 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.323667 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-241.ec2.internal" event="NodeHasSufficientPID"
Apr 16 17:40:06.324097 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.324085 2573 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 17:40:06.324097 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.324096 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 17:40:06.324172 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.324117 2573 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 17:40:06.328254 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.328240 2573 policy_none.go:49] "None policy: Start"
Apr 16 17:40:06.328254 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.328256 2573 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 17:40:06.328365 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.328275 2573 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 17:40:06.389016 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.371722 2573 manager.go:341] "Starting Device Plugin manager"
Apr 16 17:40:06.389016 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:06.371759 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 17:40:06.389016 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.371773 2573 server.go:85] "Starting device plugin registration server"
Apr 16 17:40:06.389016 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.372031 2573 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 17:40:06.389016 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.372046 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 17:40:06.389016 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.372131 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 17:40:06.389016 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.372205 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 17:40:06.389016 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.372240 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 17:40:06.389016 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:06.372785 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 17:40:06.389016 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:06.372825 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-241.ec2.internal\" not found"
Apr 16 17:40:06.436374 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.436300 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 17:40:06.437558 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.437543 2573 kubelet_network_linux.go:49] "Initialized iptables rules."
protocol="IPv6"
Apr 16 17:40:06.437610 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.437570 2573 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 17:40:06.437610 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.437588 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 17:40:06.437610 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.437594 2573 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 17:40:06.437733 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:06.437628 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 17:40:06.443570 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.443549 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 17:40:06.472808 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.472786 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 17:40:06.473668 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.473649 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-241.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 17:40:06.473752 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.473686 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-241.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 17:40:06.473752 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.473709 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-241.ec2.internal" event="NodeHasSufficientPID"
Apr 16 17:40:06.473752 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.473731 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-241.ec2.internal"
Apr 16 17:40:06.483173 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.483157 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-241.ec2.internal"
Apr 16 17:40:06.483252 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:06.483176 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-241.ec2.internal\": node \"ip-10-0-128-241.ec2.internal\" not found"
Apr 16 17:40:06.495384 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:06.495358 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-241.ec2.internal\" not found"
Apr 16 17:40:06.537891 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.537871 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-241.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-241.ec2.internal"]
Apr 16 17:40:06.537961 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.537929 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 17:40:06.539572 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.539558 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-241.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 17:40:06.539628 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.539585 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-241.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 17:40:06.539628 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.539594 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-241.ec2.internal" event="NodeHasSufficientPID"
Apr 16 17:40:06.541896 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.541884 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 17:40:06.542033 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.542021 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-241.ec2.internal"
Apr 16 17:40:06.542073 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.542046 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 17:40:06.542533 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.542521 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-241.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 17:40:06.542608 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.542542 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-241.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 17:40:06.542608 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.542552 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-241.ec2.internal" event="NodeHasSufficientPID"
Apr 16 17:40:06.542954 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.542936 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-241.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 17:40:06.543066 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.542965 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-241.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 17:40:06.543066 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.542978 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-241.ec2.internal" event="NodeHasSufficientPID"
Apr 16 17:40:06.545613 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.545599 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-241.ec2.internal"
Apr 16 17:40:06.545687 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.545629 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 17:40:06.546318 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.546295 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-241.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 17:40:06.546402 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.546329 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-241.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 17:40:06.546402 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.546340 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-241.ec2.internal" event="NodeHasSufficientPID"
Apr 16 17:40:06.566799 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:06.566776 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-241.ec2.internal\" not found" node="ip-10-0-128-241.ec2.internal"
Apr 16 17:40:06.572308 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:06.572293 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-241.ec2.internal\" not found" node="ip-10-0-128-241.ec2.internal"
Apr 16 17:40:06.595620 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:06.595600 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-241.ec2.internal\" not found"
Apr 16 17:40:06.604899 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.604882 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/59def4faf8392fa6f55a2e9f2ba8a155-etc-kube\") pod
\"kube-rbac-proxy-crio-ip-10-0-128-241.ec2.internal\" (UID: \"59def4faf8392fa6f55a2e9f2ba8a155\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-241.ec2.internal"
Apr 16 17:40:06.604958 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.604911 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/59def4faf8392fa6f55a2e9f2ba8a155-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-241.ec2.internal\" (UID: \"59def4faf8392fa6f55a2e9f2ba8a155\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-241.ec2.internal"
Apr 16 17:40:06.604958 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.604929 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cba24747f92b6b0c65ecaea92412a09c-config\") pod \"kube-apiserver-proxy-ip-10-0-128-241.ec2.internal\" (UID: \"cba24747f92b6b0c65ecaea92412a09c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-241.ec2.internal"
Apr 16 17:40:06.696597 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:06.696536 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-241.ec2.internal\" not found"
Apr 16 17:40:06.706003 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.705983 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/59def4faf8392fa6f55a2e9f2ba8a155-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-241.ec2.internal\" (UID: \"59def4faf8392fa6f55a2e9f2ba8a155\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-241.ec2.internal"
Apr 16 17:40:06.706060 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.706009 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/59def4faf8392fa6f55a2e9f2ba8a155-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-241.ec2.internal\" (UID: \"59def4faf8392fa6f55a2e9f2ba8a155\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-241.ec2.internal"
Apr 16 17:40:06.706060 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.706026 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cba24747f92b6b0c65ecaea92412a09c-config\") pod \"kube-apiserver-proxy-ip-10-0-128-241.ec2.internal\" (UID: \"cba24747f92b6b0c65ecaea92412a09c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-241.ec2.internal"
Apr 16 17:40:06.706135 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.706088 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/59def4faf8392fa6f55a2e9f2ba8a155-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-241.ec2.internal\" (UID: \"59def4faf8392fa6f55a2e9f2ba8a155\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-241.ec2.internal"
Apr 16 17:40:06.706135 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.706094 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/59def4faf8392fa6f55a2e9f2ba8a155-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-241.ec2.internal\" (UID: \"59def4faf8392fa6f55a2e9f2ba8a155\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-241.ec2.internal"
Apr 16 17:40:06.706135 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.706088 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cba24747f92b6b0c65ecaea92412a09c-config\") pod \"kube-apiserver-proxy-ip-10-0-128-241.ec2.internal\" (UID: \"cba24747f92b6b0c65ecaea92412a09c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-241.ec2.internal"
Apr 16 17:40:06.797404 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:06.797377 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-241.ec2.internal\" not found"
Apr 16 17:40:06.870040 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.870006 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-241.ec2.internal"
Apr 16 17:40:06.874415 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:06.874397 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-241.ec2.internal"
Apr 16 17:40:06.898508 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:06.898482 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-241.ec2.internal\" not found"
Apr 16 17:40:06.999144 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:06.999074 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-241.ec2.internal\" not found"
Apr 16 17:40:07.099756 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:07.099727 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-241.ec2.internal\" not found"
Apr 16 17:40:07.199301 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:07.199267 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 17:40:07.199848 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:07.199445 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 17:40:07.200371 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:07.200354 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-241.ec2.internal\" not found"
Apr 16 17:40:07.227872 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:07.227846 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 17:40:07.299962 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:07.299932 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 17:40:07.301049 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:07.301020 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-241.ec2.internal\" not found"
Apr 16 17:40:07.310468 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:07.310440 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 17:40:07.313478 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:07.313450 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 17:35:06 +0000 UTC" deadline="2028-01-13 13:18:00.044777058 +0000 UTC"
Apr 16 17:40:07.313478 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:07.313472 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15283h37m52.731307579s"
Apr 16 17:40:07.336473 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:07.336438 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-sdfl5"
Apr 16 17:40:07.346428 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:07.346407 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-sdfl5"
Apr 16 17:40:07.402084 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:07.402050 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-241.ec2.internal\" not found"
Apr 16 17:40:07.412783 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:07.412747 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcba24747f92b6b0c65ecaea92412a09c.slice/crio-f65495144de4c912d7669fbbbcc42fde5a69d4c6bcee7a482611d49136b876f6 WatchSource:0}: Error finding container f65495144de4c912d7669fbbbcc42fde5a69d4c6bcee7a482611d49136b876f6: Status 404 returned error can't find the container with id f65495144de4c912d7669fbbbcc42fde5a69d4c6bcee7a482611d49136b876f6
Apr 16 17:40:07.413077 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:07.413053 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59def4faf8392fa6f55a2e9f2ba8a155.slice/crio-9b1839cd514830c37f60c636e160bb79d20c9488f4d844443358b7dcba20c330 WatchSource:0}: Error finding container 9b1839cd514830c37f60c636e160bb79d20c9488f4d844443358b7dcba20c330: Status 404 returned error can't find the container with id 9b1839cd514830c37f60c636e160bb79d20c9488f4d844443358b7dcba20c330
Apr 16 17:40:07.416975 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:07.416960 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 17:40:07.440688 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:07.440644 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-241.ec2.internal" event={"ID":"cba24747f92b6b0c65ecaea92412a09c","Type":"ContainerStarted","Data":"f65495144de4c912d7669fbbbcc42fde5a69d4c6bcee7a482611d49136b876f6"}
Apr 16 17:40:07.441635 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:07.441608 2573 kubelet.go:2569] "SyncLoop (PLEG):
event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-241.ec2.internal" event={"ID":"59def4faf8392fa6f55a2e9f2ba8a155","Type":"ContainerStarted","Data":"9b1839cd514830c37f60c636e160bb79d20c9488f4d844443358b7dcba20c330"} Apr 16 17:40:07.470675 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:07.470654 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 17:40:07.502338 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:07.502317 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-241.ec2.internal\" not found" Apr 16 17:40:07.602859 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:07.602800 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-241.ec2.internal\" not found" Apr 16 17:40:07.703262 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:07.703212 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-241.ec2.internal\" not found" Apr 16 17:40:07.803874 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:07.803843 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-241.ec2.internal\" not found" Apr 16 17:40:07.806420 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:07.806402 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 17:40:07.903683 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:07.903602 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-241.ec2.internal" Apr 16 17:40:07.913245 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:07.913207 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 17:40:07.914025 
ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:07.913999 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-241.ec2.internal" Apr 16 17:40:07.930229 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:07.930196 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 17:40:08.250009 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.249910 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 17:40:08.275324 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.275287 2573 apiserver.go:52] "Watching apiserver" Apr 16 17:40:08.286083 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.286058 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 17:40:08.287268 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.287238 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-tbmqq","kube-system/kube-apiserver-proxy-ip-10-0-128-241.ec2.internal","openshift-cluster-node-tuning-operator/tuned-xz9gn","openshift-dns/node-resolver-clpc9","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-241.ec2.internal","openshift-multus/multus-additional-cni-plugins-6fl9q","openshift-multus/network-metrics-daemon-n4qhr","openshift-ovn-kubernetes/ovnkube-node-9m58b","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8nzc","openshift-image-registry/node-ca-l2szv","openshift-multus/multus-wrnk9","openshift-network-diagnostics/network-check-target-769fj","openshift-network-operator/iptables-alerter-jm645"] Apr 16 17:40:08.290377 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.290299 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n4qhr" Apr 16 17:40:08.290505 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:08.290390 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n4qhr" podUID="7db52a98-86b8-46da-a83e-8f6ee99d696d" Apr 16 17:40:08.294742 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.294704 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xz9gn" Apr 16 17:40:08.294867 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.294790 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-clpc9" Apr 16 17:40:08.297379 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.297354 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 17:40:08.297545 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.297531 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 17:40:08.297618 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.297552 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 17:40:08.297618 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.297586 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-lwjrt\"" Apr 16 17:40:08.297721 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.297678 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8m9jf\"" Apr 16 17:40:08.297800 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.297784 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 17:40:08.299519 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.299233 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6fl9q" Apr 16 17:40:08.301492 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.301457 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-tbmqq" Apr 16 17:40:08.302072 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.301934 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 17:40:08.302072 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.301950 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 17:40:08.302072 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.301972 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 17:40:08.302281 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.302203 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 17:40:08.302435 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.302405 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-l7cw4\"" Apr 16 17:40:08.302598 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.302581 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 17:40:08.305325 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.304889 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8nzc" Apr 16 17:40:08.305455 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.305421 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-xm98m\"" Apr 16 17:40:08.306197 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.305899 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 17:40:08.306197 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.305988 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9m58b" Apr 16 17:40:08.307668 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.307593 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 17:40:08.308180 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.308158 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 17:40:08.308530 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.308166 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-4n2dn\"" Apr 16 17:40:08.309163 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.308761 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-l2szv" Apr 16 17:40:08.309163 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.308851 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 17:40:08.310201 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.310000 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 17:40:08.310201 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.310174 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 17:40:08.310362 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.310269 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 17:40:08.310362 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.310283 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 17:40:08.310362 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.310338 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-9zc72\"" Apr 16 17:40:08.310510 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.310388 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 17:40:08.310510 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.310435 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 17:40:08.310842 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.310799 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 17:40:08.312429 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.312395 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 17:40:08.312518 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.312491 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 17:40:08.312862 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.312840 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-dqnd6\"" Apr 16 17:40:08.313465 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.313302 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 17:40:08.315852 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.314308 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0068bf9d-29ac-42bf-95e1-3f57493e68f0-registration-dir\") pod \"aws-ebs-csi-driver-node-q8nzc\" (UID: \"0068bf9d-29ac-42bf-95e1-3f57493e68f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8nzc" Apr 16 17:40:08.315852 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.314342 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/76b37f14-34f7-4661-ad91-459fb138a436-host\") pod \"node-ca-l2szv\" (UID: \"76b37f14-34f7-4661-ad91-459fb138a436\") " pod="openshift-image-registry/node-ca-l2szv" Apr 16 17:40:08.315852 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.314368 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" 
(UniqueName: \"kubernetes.io/host-path/efbec2a2-d66f-4338-b932-105a8b5ba652-etc-systemd\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn" Apr 16 17:40:08.315852 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.314399 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/efbec2a2-d66f-4338-b932-105a8b5ba652-run\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn" Apr 16 17:40:08.315852 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.314425 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3af705fc-ec69-4117-8797-2dacaf0f64e4-os-release\") pod \"multus-additional-cni-plugins-6fl9q\" (UID: \"3af705fc-ec69-4117-8797-2dacaf0f64e4\") " pod="openshift-multus/multus-additional-cni-plugins-6fl9q" Apr 16 17:40:08.315852 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.314451 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0068bf9d-29ac-42bf-95e1-3f57493e68f0-etc-selinux\") pod \"aws-ebs-csi-driver-node-q8nzc\" (UID: \"0068bf9d-29ac-42bf-95e1-3f57493e68f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8nzc" Apr 16 17:40:08.315852 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.314475 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/efbec2a2-d66f-4338-b932-105a8b5ba652-etc-modprobe-d\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn" Apr 16 17:40:08.315852 ip-10-0-128-241 kubenswrapper[2573]: I0416 
17:40:08.314498 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/efbec2a2-d66f-4338-b932-105a8b5ba652-lib-modules\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn" Apr 16 17:40:08.315852 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.314526 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-host-run-ovn-kubernetes\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b" Apr 16 17:40:08.315852 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.314552 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrq4g\" (UniqueName: \"kubernetes.io/projected/7db52a98-86b8-46da-a83e-8f6ee99d696d-kube-api-access-wrq4g\") pod \"network-metrics-daemon-n4qhr\" (UID: \"7db52a98-86b8-46da-a83e-8f6ee99d696d\") " pod="openshift-multus/network-metrics-daemon-n4qhr" Apr 16 17:40:08.315852 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.314605 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-wrnk9" Apr 16 17:40:08.315852 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.314618 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0068bf9d-29ac-42bf-95e1-3f57493e68f0-socket-dir\") pod \"aws-ebs-csi-driver-node-q8nzc\" (UID: \"0068bf9d-29ac-42bf-95e1-3f57493e68f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8nzc" Apr 16 17:40:08.315852 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.314646 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3af705fc-ec69-4117-8797-2dacaf0f64e4-cni-binary-copy\") pod \"multus-additional-cni-plugins-6fl9q\" (UID: \"3af705fc-ec69-4117-8797-2dacaf0f64e4\") " pod="openshift-multus/multus-additional-cni-plugins-6fl9q" Apr 16 17:40:08.315852 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.314674 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-host-run-netns\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b" Apr 16 17:40:08.315852 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.314700 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-run-openvswitch\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b" Apr 16 17:40:08.315852 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.314750 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" 
(UniqueName: \"kubernetes.io/empty-dir/5626e70b-1b0e-424a-af3b-d0dba055fd1b-tmp-dir\") pod \"node-resolver-clpc9\" (UID: \"5626e70b-1b0e-424a-af3b-d0dba055fd1b\") " pod="openshift-dns/node-resolver-clpc9" Apr 16 17:40:08.315852 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.314778 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmddh\" (UniqueName: \"kubernetes.io/projected/0068bf9d-29ac-42bf-95e1-3f57493e68f0-kube-api-access-nmddh\") pod \"aws-ebs-csi-driver-node-q8nzc\" (UID: \"0068bf9d-29ac-42bf-95e1-3f57493e68f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8nzc" Apr 16 17:40:08.316903 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.314801 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/efbec2a2-d66f-4338-b932-105a8b5ba652-sys\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn" Apr 16 17:40:08.316903 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.314823 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlzdl\" (UniqueName: \"kubernetes.io/projected/3af705fc-ec69-4117-8797-2dacaf0f64e4-kube-api-access-hlzdl\") pod \"multus-additional-cni-plugins-6fl9q\" (UID: \"3af705fc-ec69-4117-8797-2dacaf0f64e4\") " pod="openshift-multus/multus-additional-cni-plugins-6fl9q" Apr 16 17:40:08.316903 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.314846 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-host-cni-netd\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b" Apr 16 17:40:08.316903 ip-10-0-128-241 
kubenswrapper[2573]: I0416 17:40:08.314877 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3af705fc-ec69-4117-8797-2dacaf0f64e4-system-cni-dir\") pod \"multus-additional-cni-plugins-6fl9q\" (UID: \"3af705fc-ec69-4117-8797-2dacaf0f64e4\") " pod="openshift-multus/multus-additional-cni-plugins-6fl9q" Apr 16 17:40:08.316903 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.314903 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c07dd47a-0eea-4a37-908a-194889b059cd-agent-certs\") pod \"konnectivity-agent-tbmqq\" (UID: \"c07dd47a-0eea-4a37-908a-194889b059cd\") " pod="kube-system/konnectivity-agent-tbmqq" Apr 16 17:40:08.316903 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.314927 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c07dd47a-0eea-4a37-908a-194889b059cd-konnectivity-ca\") pod \"konnectivity-agent-tbmqq\" (UID: \"c07dd47a-0eea-4a37-908a-194889b059cd\") " pod="kube-system/konnectivity-agent-tbmqq" Apr 16 17:40:08.316903 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.314995 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-host-kubelet\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b" Apr 16 17:40:08.316903 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.315039 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0068bf9d-29ac-42bf-95e1-3f57493e68f0-sys-fs\") pod \"aws-ebs-csi-driver-node-q8nzc\" (UID: 
\"0068bf9d-29ac-42bf-95e1-3f57493e68f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8nzc" Apr 16 17:40:08.316903 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.315071 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/efbec2a2-d66f-4338-b932-105a8b5ba652-etc-sysctl-conf\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn" Apr 16 17:40:08.316903 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.315110 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3af705fc-ec69-4117-8797-2dacaf0f64e4-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6fl9q\" (UID: \"3af705fc-ec69-4117-8797-2dacaf0f64e4\") " pod="openshift-multus/multus-additional-cni-plugins-6fl9q" Apr 16 17:40:08.316903 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.315139 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0068bf9d-29ac-42bf-95e1-3f57493e68f0-device-dir\") pod \"aws-ebs-csi-driver-node-q8nzc\" (UID: \"0068bf9d-29ac-42bf-95e1-3f57493e68f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8nzc" Apr 16 17:40:08.316903 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.315177 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws2qj\" (UniqueName: \"kubernetes.io/projected/efbec2a2-d66f-4338-b932-105a8b5ba652-kube-api-access-ws2qj\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn" Apr 16 17:40:08.316903 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.315203 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-run-ovn\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.316903 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.315261 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-host-cni-bin\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.316903 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.315328 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/af1b1635-1312-477a-9354-2b356990c171-ovnkube-script-lib\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.316903 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.315363 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.317703 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.315394 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/af1b1635-1312-477a-9354-2b356990c171-env-overrides\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.317703 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.315421 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/efbec2a2-d66f-4338-b932-105a8b5ba652-host\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn"
Apr 16 17:40:08.317703 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.315444 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-log-socket\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.317703 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.315494 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0068bf9d-29ac-42bf-95e1-3f57493e68f0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-q8nzc\" (UID: \"0068bf9d-29ac-42bf-95e1-3f57493e68f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8nzc"
Apr 16 17:40:08.317703 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.315517 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/efbec2a2-d66f-4338-b932-105a8b5ba652-etc-sysctl-d\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn"
Apr 16 17:40:08.317703 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.315541 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-systemd-units\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.317703 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.315592 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-host-slash\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.317703 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.315624 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/af1b1635-1312-477a-9354-2b356990c171-ovnkube-config\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.317703 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.315650 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/af1b1635-1312-477a-9354-2b356990c171-ovn-node-metrics-cert\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.317703 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.315677 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr27w\" (UniqueName: \"kubernetes.io/projected/af1b1635-1312-477a-9354-2b356990c171-kube-api-access-lr27w\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.317703 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.315704 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9vz2\" (UniqueName: \"kubernetes.io/projected/5626e70b-1b0e-424a-af3b-d0dba055fd1b-kube-api-access-w9vz2\") pod \"node-resolver-clpc9\" (UID: \"5626e70b-1b0e-424a-af3b-d0dba055fd1b\") " pod="openshift-dns/node-resolver-clpc9"
Apr 16 17:40:08.317703 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.315726 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/efbec2a2-d66f-4338-b932-105a8b5ba652-tmp\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn"
Apr 16 17:40:08.317703 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.315752 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3af705fc-ec69-4117-8797-2dacaf0f64e4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6fl9q\" (UID: \"3af705fc-ec69-4117-8797-2dacaf0f64e4\") " pod="openshift-multus/multus-additional-cni-plugins-6fl9q"
Apr 16 17:40:08.317703 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.315779 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-run-systemd\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.317703 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.315804 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-var-lib-openvswitch\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.317703 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.315827 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/efbec2a2-d66f-4338-b932-105a8b5ba652-etc-sysconfig\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn"
Apr 16 17:40:08.318464 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.315851 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/efbec2a2-d66f-4338-b932-105a8b5ba652-etc-kubernetes\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn"
Apr 16 17:40:08.318464 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.315873 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-node-log\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.318464 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.315908 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7db52a98-86b8-46da-a83e-8f6ee99d696d-metrics-certs\") pod \"network-metrics-daemon-n4qhr\" (UID: \"7db52a98-86b8-46da-a83e-8f6ee99d696d\") " pod="openshift-multus/network-metrics-daemon-n4qhr"
Apr 16 17:40:08.318464 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.315956 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/76b37f14-34f7-4661-ad91-459fb138a436-serviceca\") pod \"node-ca-l2szv\" (UID: \"76b37f14-34f7-4661-ad91-459fb138a436\") " pod="openshift-image-registry/node-ca-l2szv"
Apr 16 17:40:08.318464 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.315986 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-294rl\" (UniqueName: \"kubernetes.io/projected/76b37f14-34f7-4661-ad91-459fb138a436-kube-api-access-294rl\") pod \"node-ca-l2szv\" (UID: \"76b37f14-34f7-4661-ad91-459fb138a436\") " pod="openshift-image-registry/node-ca-l2szv"
Apr 16 17:40:08.318464 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.316019 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3af705fc-ec69-4117-8797-2dacaf0f64e4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6fl9q\" (UID: \"3af705fc-ec69-4117-8797-2dacaf0f64e4\") " pod="openshift-multus/multus-additional-cni-plugins-6fl9q"
Apr 16 17:40:08.318464 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.316050 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5626e70b-1b0e-424a-af3b-d0dba055fd1b-hosts-file\") pod \"node-resolver-clpc9\" (UID: \"5626e70b-1b0e-424a-af3b-d0dba055fd1b\") " pod="openshift-dns/node-resolver-clpc9"
Apr 16 17:40:08.318464 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.316084 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/efbec2a2-d66f-4338-b932-105a8b5ba652-var-lib-kubelet\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn"
Apr 16 17:40:08.318464 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.316244 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-etc-openvswitch\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.318464 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.316303 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/efbec2a2-d66f-4338-b932-105a8b5ba652-etc-tuned\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn"
Apr 16 17:40:08.318464 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.316338 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3af705fc-ec69-4117-8797-2dacaf0f64e4-cnibin\") pod \"multus-additional-cni-plugins-6fl9q\" (UID: \"3af705fc-ec69-4117-8797-2dacaf0f64e4\") " pod="openshift-multus/multus-additional-cni-plugins-6fl9q"
Apr 16 17:40:08.318464 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.316905 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-769fj"
Apr 16 17:40:08.318464 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:08.316959 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-769fj" podUID="cac5b68b-21bc-4998-8cf4-855cf71cdc45"
Apr 16 17:40:08.318464 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.317012 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-jm645"
Apr 16 17:40:08.319274 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.318644 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-dx7fm\""
Apr 16 17:40:08.319274 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.318857 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 17:40:08.319382 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.319363 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 17:40:08.319508 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.319492 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 17:40:08.319916 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.319882 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 17:40:08.320041 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.319996 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-k5ft7\""
Apr 16 17:40:08.347786 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.347726 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 17:35:07 +0000 UTC" deadline="2027-11-06 08:57:15.608784274 +0000 UTC"
Apr 16 17:40:08.347786 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.347757 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13647h17m7.261032713s"
Apr 16 17:40:08.405749 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.405715 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 17:40:08.416934 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.416904 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-host-run-multus-certs\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.417111 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.416950 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5626e70b-1b0e-424a-af3b-d0dba055fd1b-tmp-dir\") pod \"node-resolver-clpc9\" (UID: \"5626e70b-1b0e-424a-af3b-d0dba055fd1b\") " pod="openshift-dns/node-resolver-clpc9"
Apr 16 17:40:08.417111 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.416980 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nmddh\" (UniqueName: \"kubernetes.io/projected/0068bf9d-29ac-42bf-95e1-3f57493e68f0-kube-api-access-nmddh\") pod \"aws-ebs-csi-driver-node-q8nzc\" (UID: \"0068bf9d-29ac-42bf-95e1-3f57493e68f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8nzc"
Apr 16 17:40:08.417111 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417073 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/efbec2a2-d66f-4338-b932-105a8b5ba652-sys\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn"
Apr 16 17:40:08.417299 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417126 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hlzdl\" (UniqueName: \"kubernetes.io/projected/3af705fc-ec69-4117-8797-2dacaf0f64e4-kube-api-access-hlzdl\") pod \"multus-additional-cni-plugins-6fl9q\" (UID: \"3af705fc-ec69-4117-8797-2dacaf0f64e4\") " pod="openshift-multus/multus-additional-cni-plugins-6fl9q"
Apr 16 17:40:08.417299 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417150 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-host-cni-netd\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.417299 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417173 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3af705fc-ec69-4117-8797-2dacaf0f64e4-system-cni-dir\") pod \"multus-additional-cni-plugins-6fl9q\" (UID: \"3af705fc-ec69-4117-8797-2dacaf0f64e4\") " pod="openshift-multus/multus-additional-cni-plugins-6fl9q"
Apr 16 17:40:08.417299 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417196 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c07dd47a-0eea-4a37-908a-194889b059cd-agent-certs\") pod \"konnectivity-agent-tbmqq\" (UID: \"c07dd47a-0eea-4a37-908a-194889b059cd\") " pod="kube-system/konnectivity-agent-tbmqq"
Apr 16 17:40:08.417299 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417236 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c07dd47a-0eea-4a37-908a-194889b059cd-konnectivity-ca\") pod \"konnectivity-agent-tbmqq\" (UID: \"c07dd47a-0eea-4a37-908a-194889b059cd\") " pod="kube-system/konnectivity-agent-tbmqq"
Apr 16 17:40:08.417299 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417260 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-host-kubelet\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.417299 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417287 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-host-cni-netd\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.417299 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417294 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-multus-socket-dir-parent\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.417683 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417320 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-host-var-lib-cni-bin\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.417683 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417349 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0068bf9d-29ac-42bf-95e1-3f57493e68f0-sys-fs\") pod \"aws-ebs-csi-driver-node-q8nzc\" (UID: \"0068bf9d-29ac-42bf-95e1-3f57493e68f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8nzc"
Apr 16 17:40:08.417683 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417351 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/efbec2a2-d66f-4338-b932-105a8b5ba652-sys\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn"
Apr 16 17:40:08.417683 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417363 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5626e70b-1b0e-424a-af3b-d0dba055fd1b-tmp-dir\") pod \"node-resolver-clpc9\" (UID: \"5626e70b-1b0e-424a-af3b-d0dba055fd1b\") " pod="openshift-dns/node-resolver-clpc9"
Apr 16 17:40:08.417683 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417373 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/efbec2a2-d66f-4338-b932-105a8b5ba652-etc-sysctl-conf\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn"
Apr 16 17:40:08.417683 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417397 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3af705fc-ec69-4117-8797-2dacaf0f64e4-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6fl9q\" (UID: \"3af705fc-ec69-4117-8797-2dacaf0f64e4\") " pod="openshift-multus/multus-additional-cni-plugins-6fl9q"
Apr 16 17:40:08.417683 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417423 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-multus-conf-dir\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.417683 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417438 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-host-kubelet\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.417683 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417449 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0068bf9d-29ac-42bf-95e1-3f57493e68f0-device-dir\") pod \"aws-ebs-csi-driver-node-q8nzc\" (UID: \"0068bf9d-29ac-42bf-95e1-3f57493e68f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8nzc"
Apr 16 17:40:08.417683 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417486 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3af705fc-ec69-4117-8797-2dacaf0f64e4-system-cni-dir\") pod \"multus-additional-cni-plugins-6fl9q\" (UID: \"3af705fc-ec69-4117-8797-2dacaf0f64e4\") " pod="openshift-multus/multus-additional-cni-plugins-6fl9q"
Apr 16 17:40:08.417683 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417504 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0068bf9d-29ac-42bf-95e1-3f57493e68f0-device-dir\") pod \"aws-ebs-csi-driver-node-q8nzc\" (UID: \"0068bf9d-29ac-42bf-95e1-3f57493e68f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8nzc"
Apr 16 17:40:08.417683 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417516 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ws2qj\" (UniqueName: \"kubernetes.io/projected/efbec2a2-d66f-4338-b932-105a8b5ba652-kube-api-access-ws2qj\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn"
Apr 16 17:40:08.417683 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417542 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-run-ovn\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.417683 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417568 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-host-cni-bin\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.417683 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417576 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0068bf9d-29ac-42bf-95e1-3f57493e68f0-sys-fs\") pod \"aws-ebs-csi-driver-node-q8nzc\" (UID: \"0068bf9d-29ac-42bf-95e1-3f57493e68f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8nzc"
Apr 16 17:40:08.417683 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417628 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/af1b1635-1312-477a-9354-2b356990c171-ovnkube-script-lib\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.417683 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417661 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7559\" (UniqueName: \"kubernetes.io/projected/cac5b68b-21bc-4998-8cf4-855cf71cdc45-kube-api-access-l7559\") pod \"network-check-target-769fj\" (UID: \"cac5b68b-21bc-4998-8cf4-855cf71cdc45\") " pod="openshift-network-diagnostics/network-check-target-769fj"
Apr 16 17:40:08.418414 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417690 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.418414 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417715 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/af1b1635-1312-477a-9354-2b356990c171-env-overrides\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.418414 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417741 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b1fc7afb-d9c4-43bd-8d20-fbefd9221162-host-slash\") pod \"iptables-alerter-jm645\" (UID: \"b1fc7afb-d9c4-43bd-8d20-fbefd9221162\") " pod="openshift-network-operator/iptables-alerter-jm645"
Apr 16 17:40:08.418414 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417766 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/efbec2a2-d66f-4338-b932-105a8b5ba652-host\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn"
Apr 16 17:40:08.418414 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417792 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-log-socket\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.418414 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417818 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-host-run-netns\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.418414 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417846 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0068bf9d-29ac-42bf-95e1-3f57493e68f0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-q8nzc\" (UID: \"0068bf9d-29ac-42bf-95e1-3f57493e68f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8nzc"
Apr 16 17:40:08.418414 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417869 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/efbec2a2-d66f-4338-b932-105a8b5ba652-etc-sysctl-d\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn"
Apr 16 17:40:08.418414 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417891 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-systemd-units\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.418414 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417917 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-host-slash\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.418414 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417946 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/af1b1635-1312-477a-9354-2b356990c171-ovnkube-config\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.418414 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417969 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/af1b1635-1312-477a-9354-2b356990c171-ovn-node-metrics-cert\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.418414 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.417994 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lr27w\" (UniqueName: \"kubernetes.io/projected/af1b1635-1312-477a-9354-2b356990c171-kube-api-access-lr27w\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.418414 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418027 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9vz2\" (UniqueName: \"kubernetes.io/projected/5626e70b-1b0e-424a-af3b-d0dba055fd1b-kube-api-access-w9vz2\") pod \"node-resolver-clpc9\" (UID: \"5626e70b-1b0e-424a-af3b-d0dba055fd1b\") " pod="openshift-dns/node-resolver-clpc9"
Apr 16 17:40:08.418414 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418051 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/efbec2a2-d66f-4338-b932-105a8b5ba652-tmp\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn"
Apr 16 17:40:08.418414 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418075 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3af705fc-ec69-4117-8797-2dacaf0f64e4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6fl9q\" (UID: \"3af705fc-ec69-4117-8797-2dacaf0f64e4\") " pod="openshift-multus/multus-additional-cni-plugins-6fl9q"
Apr 16 17:40:08.418414 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418099 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-run-systemd\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.419184 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418100 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3af705fc-ec69-4117-8797-2dacaf0f64e4-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6fl9q\" (UID: \"3af705fc-ec69-4117-8797-2dacaf0f64e4\") " pod="openshift-multus/multus-additional-cni-plugins-6fl9q"
Apr 16 17:40:08.419184 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418127 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-var-lib-openvswitch\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.419184 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418156 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-os-release\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.419184 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418196 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b1fc7afb-d9c4-43bd-8d20-fbefd9221162-iptables-alerter-script\") pod \"iptables-alerter-jm645\" (UID: \"b1fc7afb-d9c4-43bd-8d20-fbefd9221162\") " pod="openshift-network-operator/iptables-alerter-jm645"
Apr 16 17:40:08.419184 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418241 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/efbec2a2-d66f-4338-b932-105a8b5ba652-etc-sysconfig\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn"
Apr 16 17:40:08.419184 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418271 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/efbec2a2-d66f-4338-b932-105a8b5ba652-etc-kubernetes\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn"
Apr 16 17:40:08.419184 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418286 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/efbec2a2-d66f-4338-b932-105a8b5ba652-etc-sysctl-conf\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn"
Apr 16 17:40:08.419184 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418287 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c07dd47a-0eea-4a37-908a-194889b059cd-konnectivity-ca\") pod \"konnectivity-agent-tbmqq\" (UID: \"c07dd47a-0eea-4a37-908a-194889b059cd\") " pod="kube-system/konnectivity-agent-tbmqq"
Apr 16 17:40:08.419184 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418297 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-etc-kubernetes\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.419184 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418321 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-node-log\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.419184 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418338 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-systemd-units\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.419184 ip-10-0-128-241 kubenswrapper[2573]: 
I0416 17:40:08.418348 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7db52a98-86b8-46da-a83e-8f6ee99d696d-metrics-certs\") pod \"network-metrics-daemon-n4qhr\" (UID: \"7db52a98-86b8-46da-a83e-8f6ee99d696d\") " pod="openshift-multus/network-metrics-daemon-n4qhr" Apr 16 17:40:08.419184 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418358 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-run-ovn\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b" Apr 16 17:40:08.419184 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418376 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/76b37f14-34f7-4661-ad91-459fb138a436-serviceca\") pod \"node-ca-l2szv\" (UID: \"76b37f14-34f7-4661-ad91-459fb138a436\") " pod="openshift-image-registry/node-ca-l2szv" Apr 16 17:40:08.419184 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418386 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/efbec2a2-d66f-4338-b932-105a8b5ba652-host\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn" Apr 16 17:40:08.419184 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418396 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-host-cni-bin\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b" Apr 16 17:40:08.419184 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418402 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-294rl\" (UniqueName: \"kubernetes.io/projected/76b37f14-34f7-4661-ad91-459fb138a436-kube-api-access-294rl\") pod \"node-ca-l2szv\" (UID: \"76b37f14-34f7-4661-ad91-459fb138a436\") " pod="openshift-image-registry/node-ca-l2szv" Apr 16 17:40:08.419935 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418429 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3af705fc-ec69-4117-8797-2dacaf0f64e4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6fl9q\" (UID: \"3af705fc-ec69-4117-8797-2dacaf0f64e4\") " pod="openshift-multus/multus-additional-cni-plugins-6fl9q" Apr 16 17:40:08.419935 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418455 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c185fd76-c69c-433e-9b66-55227ea35aa0-multus-daemon-config\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9" Apr 16 17:40:08.419935 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418482 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5626e70b-1b0e-424a-af3b-d0dba055fd1b-hosts-file\") pod \"node-resolver-clpc9\" (UID: \"5626e70b-1b0e-424a-af3b-d0dba055fd1b\") " pod="openshift-dns/node-resolver-clpc9" Apr 16 17:40:08.419935 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418508 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/efbec2a2-d66f-4338-b932-105a8b5ba652-var-lib-kubelet\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn" Apr 16 17:40:08.419935 ip-10-0-128-241 kubenswrapper[2573]: I0416 
17:40:08.418537 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-etc-openvswitch\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b" Apr 16 17:40:08.419935 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418565 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-system-cni-dir\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9" Apr 16 17:40:08.419935 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418589 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-cnibin\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9" Apr 16 17:40:08.419935 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418614 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-host-var-lib-kubelet\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9" Apr 16 17:40:08.419935 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418651 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbqcq\" (UniqueName: \"kubernetes.io/projected/c185fd76-c69c-433e-9b66-55227ea35aa0-kube-api-access-jbqcq\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9" Apr 16 17:40:08.419935 ip-10-0-128-241 
kubenswrapper[2573]: I0416 17:40:08.418679 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/efbec2a2-d66f-4338-b932-105a8b5ba652-etc-tuned\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn" Apr 16 17:40:08.419935 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418684 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 17:40:08.419935 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418703 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3af705fc-ec69-4117-8797-2dacaf0f64e4-cnibin\") pod \"multus-additional-cni-plugins-6fl9q\" (UID: \"3af705fc-ec69-4117-8797-2dacaf0f64e4\") " pod="openshift-multus/multus-additional-cni-plugins-6fl9q" Apr 16 17:40:08.419935 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418728 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c185fd76-c69c-433e-9b66-55227ea35aa0-cni-binary-copy\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9" Apr 16 17:40:08.419935 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418756 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-host-run-k8s-cni-cncf-io\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9" Apr 16 17:40:08.419935 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418782 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdhw4\" (UniqueName: \"kubernetes.io/projected/b1fc7afb-d9c4-43bd-8d20-fbefd9221162-kube-api-access-xdhw4\") pod \"iptables-alerter-jm645\" (UID: \"b1fc7afb-d9c4-43bd-8d20-fbefd9221162\") " pod="openshift-network-operator/iptables-alerter-jm645" Apr 16 17:40:08.419935 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418815 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0068bf9d-29ac-42bf-95e1-3f57493e68f0-registration-dir\") pod \"aws-ebs-csi-driver-node-q8nzc\" (UID: \"0068bf9d-29ac-42bf-95e1-3f57493e68f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8nzc" Apr 16 17:40:08.419935 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418842 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/76b37f14-34f7-4661-ad91-459fb138a436-host\") pod \"node-ca-l2szv\" (UID: \"76b37f14-34f7-4661-ad91-459fb138a436\") " pod="openshift-image-registry/node-ca-l2szv" Apr 16 17:40:08.420629 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418870 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/efbec2a2-d66f-4338-b932-105a8b5ba652-etc-systemd\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn" Apr 16 17:40:08.420629 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418895 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/efbec2a2-d66f-4338-b932-105a8b5ba652-run\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn" Apr 16 17:40:08.420629 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418911 
2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/af1b1635-1312-477a-9354-2b356990c171-ovnkube-script-lib\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b" Apr 16 17:40:08.420629 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418929 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3af705fc-ec69-4117-8797-2dacaf0f64e4-os-release\") pod \"multus-additional-cni-plugins-6fl9q\" (UID: \"3af705fc-ec69-4117-8797-2dacaf0f64e4\") " pod="openshift-multus/multus-additional-cni-plugins-6fl9q" Apr 16 17:40:08.420629 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418955 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0068bf9d-29ac-42bf-95e1-3f57493e68f0-etc-selinux\") pod \"aws-ebs-csi-driver-node-q8nzc\" (UID: \"0068bf9d-29ac-42bf-95e1-3f57493e68f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8nzc" Apr 16 17:40:08.420629 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418982 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/efbec2a2-d66f-4338-b932-105a8b5ba652-etc-modprobe-d\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn" Apr 16 17:40:08.420629 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.419012 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/efbec2a2-d66f-4338-b932-105a8b5ba652-lib-modules\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn" Apr 16 17:40:08.420629 
ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.419039 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-host-run-ovn-kubernetes\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b" Apr 16 17:40:08.420629 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.419066 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-multus-cni-dir\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9" Apr 16 17:40:08.420629 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.419075 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5626e70b-1b0e-424a-af3b-d0dba055fd1b-hosts-file\") pod \"node-resolver-clpc9\" (UID: \"5626e70b-1b0e-424a-af3b-d0dba055fd1b\") " pod="openshift-dns/node-resolver-clpc9" Apr 16 17:40:08.420629 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.419096 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wrq4g\" (UniqueName: \"kubernetes.io/projected/7db52a98-86b8-46da-a83e-8f6ee99d696d-kube-api-access-wrq4g\") pod \"network-metrics-daemon-n4qhr\" (UID: \"7db52a98-86b8-46da-a83e-8f6ee99d696d\") " pod="openshift-multus/network-metrics-daemon-n4qhr" Apr 16 17:40:08.420629 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.419129 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-host-slash\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b" Apr 
16 17:40:08.420629 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.419155 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0068bf9d-29ac-42bf-95e1-3f57493e68f0-socket-dir\") pod \"aws-ebs-csi-driver-node-q8nzc\" (UID: \"0068bf9d-29ac-42bf-95e1-3f57493e68f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8nzc" Apr 16 17:40:08.420629 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.419186 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3af705fc-ec69-4117-8797-2dacaf0f64e4-cni-binary-copy\") pod \"multus-additional-cni-plugins-6fl9q\" (UID: \"3af705fc-ec69-4117-8797-2dacaf0f64e4\") " pod="openshift-multus/multus-additional-cni-plugins-6fl9q" Apr 16 17:40:08.420629 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.419231 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-host-run-netns\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b" Apr 16 17:40:08.420629 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.419258 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-run-openvswitch\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b" Apr 16 17:40:08.420629 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.419289 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-host-var-lib-cni-multus\") pod \"multus-wrnk9\" (UID: 
\"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9" Apr 16 17:40:08.421402 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.419321 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-hostroot\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9" Apr 16 17:40:08.421402 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.419329 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/af1b1635-1312-477a-9354-2b356990c171-env-overrides\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b" Apr 16 17:40:08.421402 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.418981 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b" Apr 16 17:40:08.421402 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.419614 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/af1b1635-1312-477a-9354-2b356990c171-ovnkube-config\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b" Apr 16 17:40:08.421402 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.419725 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-log-socket\") pod \"ovnkube-node-9m58b\" (UID: 
\"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b" Apr 16 17:40:08.421402 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.419844 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/efbec2a2-d66f-4338-b932-105a8b5ba652-var-lib-kubelet\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn" Apr 16 17:40:08.421402 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.419950 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0068bf9d-29ac-42bf-95e1-3f57493e68f0-socket-dir\") pod \"aws-ebs-csi-driver-node-q8nzc\" (UID: \"0068bf9d-29ac-42bf-95e1-3f57493e68f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8nzc" Apr 16 17:40:08.421402 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.420302 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-etc-openvswitch\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b" Apr 16 17:40:08.421402 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.420358 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3af705fc-ec69-4117-8797-2dacaf0f64e4-cni-binary-copy\") pod \"multus-additional-cni-plugins-6fl9q\" (UID: \"3af705fc-ec69-4117-8797-2dacaf0f64e4\") " pod="openshift-multus/multus-additional-cni-plugins-6fl9q" Apr 16 17:40:08.421402 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.420409 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-host-run-netns\") pod 
\"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b" Apr 16 17:40:08.421402 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.420443 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-run-openvswitch\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b" Apr 16 17:40:08.421402 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:08.420535 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:40:08.421402 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:08.420608 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7db52a98-86b8-46da-a83e-8f6ee99d696d-metrics-certs podName:7db52a98-86b8-46da-a83e-8f6ee99d696d nodeName:}" failed. No retries permitted until 2026-04-16 17:40:08.920587044 +0000 UTC m=+3.133620196 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7db52a98-86b8-46da-a83e-8f6ee99d696d-metrics-certs") pod "network-metrics-daemon-n4qhr" (UID: "7db52a98-86b8-46da-a83e-8f6ee99d696d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:40:08.421402 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.420816 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/efbec2a2-d66f-4338-b932-105a8b5ba652-etc-sysconfig\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn" Apr 16 17:40:08.421402 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.420863 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/efbec2a2-d66f-4338-b932-105a8b5ba652-etc-kubernetes\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn" Apr 16 17:40:08.421402 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.420901 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-node-log\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b" Apr 16 17:40:08.421402 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.421230 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/efbec2a2-d66f-4338-b932-105a8b5ba652-etc-systemd\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn" Apr 16 17:40:08.422143 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.421276 2573 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3af705fc-ec69-4117-8797-2dacaf0f64e4-cnibin\") pod \"multus-additional-cni-plugins-6fl9q\" (UID: \"3af705fc-ec69-4117-8797-2dacaf0f64e4\") " pod="openshift-multus/multus-additional-cni-plugins-6fl9q" Apr 16 17:40:08.422143 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.421292 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/76b37f14-34f7-4661-ad91-459fb138a436-serviceca\") pod \"node-ca-l2szv\" (UID: \"76b37f14-34f7-4661-ad91-459fb138a436\") " pod="openshift-image-registry/node-ca-l2szv" Apr 16 17:40:08.422143 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.421325 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0068bf9d-29ac-42bf-95e1-3f57493e68f0-registration-dir\") pod \"aws-ebs-csi-driver-node-q8nzc\" (UID: \"0068bf9d-29ac-42bf-95e1-3f57493e68f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8nzc" Apr 16 17:40:08.422143 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.421358 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/76b37f14-34f7-4661-ad91-459fb138a436-host\") pod \"node-ca-l2szv\" (UID: \"76b37f14-34f7-4661-ad91-459fb138a436\") " pod="openshift-image-registry/node-ca-l2szv" Apr 16 17:40:08.422143 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.421513 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-run-systemd\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b" Apr 16 17:40:08.422143 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.421595 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/0068bf9d-29ac-42bf-95e1-3f57493e68f0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-q8nzc\" (UID: \"0068bf9d-29ac-42bf-95e1-3f57493e68f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8nzc" Apr 16 17:40:08.422143 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.421618 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/efbec2a2-d66f-4338-b932-105a8b5ba652-etc-sysctl-d\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn" Apr 16 17:40:08.422143 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.421672 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/efbec2a2-d66f-4338-b932-105a8b5ba652-etc-modprobe-d\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn" Apr 16 17:40:08.422143 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.421714 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/efbec2a2-d66f-4338-b932-105a8b5ba652-run\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn" Apr 16 17:40:08.422143 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.421768 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3af705fc-ec69-4117-8797-2dacaf0f64e4-os-release\") pod \"multus-additional-cni-plugins-6fl9q\" (UID: \"3af705fc-ec69-4117-8797-2dacaf0f64e4\") " pod="openshift-multus/multus-additional-cni-plugins-6fl9q" Apr 16 17:40:08.422143 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.421829 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/0068bf9d-29ac-42bf-95e1-3f57493e68f0-etc-selinux\") pod \"aws-ebs-csi-driver-node-q8nzc\" (UID: \"0068bf9d-29ac-42bf-95e1-3f57493e68f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8nzc"
Apr 16 17:40:08.422143 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.421887 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-host-run-ovn-kubernetes\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.422143 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.421894 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/af1b1635-1312-477a-9354-2b356990c171-var-lib-openvswitch\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.422143 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.421963 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/efbec2a2-d66f-4338-b932-105a8b5ba652-lib-modules\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn"
Apr 16 17:40:08.423563 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.422279 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3af705fc-ec69-4117-8797-2dacaf0f64e4-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6fl9q\" (UID: \"3af705fc-ec69-4117-8797-2dacaf0f64e4\") " pod="openshift-multus/multus-additional-cni-plugins-6fl9q"
Apr 16 17:40:08.423563 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.422387 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3af705fc-ec69-4117-8797-2dacaf0f64e4-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6fl9q\" (UID: \"3af705fc-ec69-4117-8797-2dacaf0f64e4\") " pod="openshift-multus/multus-additional-cni-plugins-6fl9q"
Apr 16 17:40:08.423563 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.422975 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/af1b1635-1312-477a-9354-2b356990c171-ovn-node-metrics-cert\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.423687 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.423599 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c07dd47a-0eea-4a37-908a-194889b059cd-agent-certs\") pod \"konnectivity-agent-tbmqq\" (UID: \"c07dd47a-0eea-4a37-908a-194889b059cd\") " pod="kube-system/konnectivity-agent-tbmqq"
Apr 16 17:40:08.424725 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.424700 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/efbec2a2-d66f-4338-b932-105a8b5ba652-etc-tuned\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn"
Apr 16 17:40:08.424839 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.424754 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/efbec2a2-d66f-4338-b932-105a8b5ba652-tmp\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn"
Apr 16 17:40:08.429920 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.429834 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrq4g\" (UniqueName: \"kubernetes.io/projected/7db52a98-86b8-46da-a83e-8f6ee99d696d-kube-api-access-wrq4g\") pod \"network-metrics-daemon-n4qhr\" (UID: \"7db52a98-86b8-46da-a83e-8f6ee99d696d\") " pod="openshift-multus/network-metrics-daemon-n4qhr"
Apr 16 17:40:08.430828 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.430804 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlzdl\" (UniqueName: \"kubernetes.io/projected/3af705fc-ec69-4117-8797-2dacaf0f64e4-kube-api-access-hlzdl\") pod \"multus-additional-cni-plugins-6fl9q\" (UID: \"3af705fc-ec69-4117-8797-2dacaf0f64e4\") " pod="openshift-multus/multus-additional-cni-plugins-6fl9q"
Apr 16 17:40:08.431926 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.431893 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-294rl\" (UniqueName: \"kubernetes.io/projected/76b37f14-34f7-4661-ad91-459fb138a436-kube-api-access-294rl\") pod \"node-ca-l2szv\" (UID: \"76b37f14-34f7-4661-ad91-459fb138a436\") " pod="openshift-image-registry/node-ca-l2szv"
Apr 16 17:40:08.432560 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.432536 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9vz2\" (UniqueName: \"kubernetes.io/projected/5626e70b-1b0e-424a-af3b-d0dba055fd1b-kube-api-access-w9vz2\") pod \"node-resolver-clpc9\" (UID: \"5626e70b-1b0e-424a-af3b-d0dba055fd1b\") " pod="openshift-dns/node-resolver-clpc9"
Apr 16 17:40:08.433040 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.432996 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmddh\" (UniqueName: \"kubernetes.io/projected/0068bf9d-29ac-42bf-95e1-3f57493e68f0-kube-api-access-nmddh\") pod \"aws-ebs-csi-driver-node-q8nzc\" (UID: \"0068bf9d-29ac-42bf-95e1-3f57493e68f0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8nzc"
Apr 16 17:40:08.433040 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.433008 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr27w\" (UniqueName: \"kubernetes.io/projected/af1b1635-1312-477a-9354-2b356990c171-kube-api-access-lr27w\") pod \"ovnkube-node-9m58b\" (UID: \"af1b1635-1312-477a-9354-2b356990c171\") " pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.435027 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.435003 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws2qj\" (UniqueName: \"kubernetes.io/projected/efbec2a2-d66f-4338-b932-105a8b5ba652-kube-api-access-ws2qj\") pod \"tuned-xz9gn\" (UID: \"efbec2a2-d66f-4338-b932-105a8b5ba652\") " pod="openshift-cluster-node-tuning-operator/tuned-xz9gn"
Apr 16 17:40:08.520112 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520042 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-multus-socket-dir-parent\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.520112 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520075 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-host-var-lib-cni-bin\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.520112 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520101 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-multus-conf-dir\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.520364 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520129 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7559\" (UniqueName: \"kubernetes.io/projected/cac5b68b-21bc-4998-8cf4-855cf71cdc45-kube-api-access-l7559\") pod \"network-check-target-769fj\" (UID: \"cac5b68b-21bc-4998-8cf4-855cf71cdc45\") " pod="openshift-network-diagnostics/network-check-target-769fj"
Apr 16 17:40:08.520364 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520136 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-multus-socket-dir-parent\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.520364 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520155 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b1fc7afb-d9c4-43bd-8d20-fbefd9221162-host-slash\") pod \"iptables-alerter-jm645\" (UID: \"b1fc7afb-d9c4-43bd-8d20-fbefd9221162\") " pod="openshift-network-operator/iptables-alerter-jm645"
Apr 16 17:40:08.520364 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520179 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-host-run-netns\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.520364 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520182 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-host-var-lib-cni-bin\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.520364 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520196 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-multus-conf-dir\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.520364 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520209 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-os-release\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.520364 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520244 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b1fc7afb-d9c4-43bd-8d20-fbefd9221162-host-slash\") pod \"iptables-alerter-jm645\" (UID: \"b1fc7afb-d9c4-43bd-8d20-fbefd9221162\") " pod="openshift-network-operator/iptables-alerter-jm645"
Apr 16 17:40:08.520364 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520250 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b1fc7afb-d9c4-43bd-8d20-fbefd9221162-iptables-alerter-script\") pod \"iptables-alerter-jm645\" (UID: \"b1fc7afb-d9c4-43bd-8d20-fbefd9221162\") " pod="openshift-network-operator/iptables-alerter-jm645"
Apr 16 17:40:08.520364 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520276 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-etc-kubernetes\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.520364 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520312 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-os-release\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.520364 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520316 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c185fd76-c69c-433e-9b66-55227ea35aa0-multus-daemon-config\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.520364 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520330 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-etc-kubernetes\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.520364 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520275 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-host-run-netns\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.520364 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520364 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-system-cni-dir\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.520977 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520389 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-cnibin\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.520977 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520412 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-host-var-lib-kubelet\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.520977 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520439 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jbqcq\" (UniqueName: \"kubernetes.io/projected/c185fd76-c69c-433e-9b66-55227ea35aa0-kube-api-access-jbqcq\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.520977 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520465 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c185fd76-c69c-433e-9b66-55227ea35aa0-cni-binary-copy\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.520977 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520469 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-system-cni-dir\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.520977 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520490 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-host-run-k8s-cni-cncf-io\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.520977 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520489 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-cnibin\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.520977 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520532 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xdhw4\" (UniqueName: \"kubernetes.io/projected/b1fc7afb-d9c4-43bd-8d20-fbefd9221162-kube-api-access-xdhw4\") pod \"iptables-alerter-jm645\" (UID: \"b1fc7afb-d9c4-43bd-8d20-fbefd9221162\") " pod="openshift-network-operator/iptables-alerter-jm645"
Apr 16 17:40:08.520977 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520569 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-multus-cni-dir\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.520977 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520599 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-host-var-lib-cni-multus\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.520977 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520627 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-hostroot\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.520977 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520653 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-host-run-multus-certs\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.520977 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520698 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-host-var-lib-kubelet\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.520977 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520720 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-host-run-multus-certs\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.520977 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520730 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-host-var-lib-cni-multus\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.520977 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520743 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-host-run-k8s-cni-cncf-io\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.520977 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520766 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-hostroot\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.520977 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520802 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c185fd76-c69c-433e-9b66-55227ea35aa0-multus-cni-dir\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.521672 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.520886 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c185fd76-c69c-433e-9b66-55227ea35aa0-multus-daemon-config\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.521672 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.521354 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b1fc7afb-d9c4-43bd-8d20-fbefd9221162-iptables-alerter-script\") pod \"iptables-alerter-jm645\" (UID: \"b1fc7afb-d9c4-43bd-8d20-fbefd9221162\") " pod="openshift-network-operator/iptables-alerter-jm645"
Apr 16 17:40:08.521672 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.521381 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c185fd76-c69c-433e-9b66-55227ea35aa0-cni-binary-copy\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.528115 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:08.528083 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 17:40:08.528115 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:08.528107 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 17:40:08.528297 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:08.528120 2573 projected.go:194] Error preparing data for projected volume kube-api-access-l7559 for pod openshift-network-diagnostics/network-check-target-769fj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:40:08.528297 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:08.528196 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cac5b68b-21bc-4998-8cf4-855cf71cdc45-kube-api-access-l7559 podName:cac5b68b-21bc-4998-8cf4-855cf71cdc45 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:09.028166892 +0000 UTC m=+3.241200035 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-l7559" (UniqueName: "kubernetes.io/projected/cac5b68b-21bc-4998-8cf4-855cf71cdc45-kube-api-access-l7559") pod "network-check-target-769fj" (UID: "cac5b68b-21bc-4998-8cf4-855cf71cdc45") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:40:08.530416 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.530399 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdhw4\" (UniqueName: \"kubernetes.io/projected/b1fc7afb-d9c4-43bd-8d20-fbefd9221162-kube-api-access-xdhw4\") pod \"iptables-alerter-jm645\" (UID: \"b1fc7afb-d9c4-43bd-8d20-fbefd9221162\") " pod="openshift-network-operator/iptables-alerter-jm645"
Apr 16 17:40:08.530509 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.530437 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbqcq\" (UniqueName: \"kubernetes.io/projected/c185fd76-c69c-433e-9b66-55227ea35aa0-kube-api-access-jbqcq\") pod \"multus-wrnk9\" (UID: \"c185fd76-c69c-433e-9b66-55227ea35aa0\") " pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.608636 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.608601 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xz9gn"
Apr 16 17:40:08.619022 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.618996 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-clpc9"
Apr 16 17:40:08.627665 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.627642 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6fl9q"
Apr 16 17:40:08.632322 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.632301 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-tbmqq"
Apr 16 17:40:08.639884 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.639857 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8nzc"
Apr 16 17:40:08.646523 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.646495 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:08.653107 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.653079 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-l2szv"
Apr 16 17:40:08.660688 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.660670 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wrnk9"
Apr 16 17:40:08.667183 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.667168 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-jm645"
Apr 16 17:40:08.922992 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:08.922907 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7db52a98-86b8-46da-a83e-8f6ee99d696d-metrics-certs\") pod \"network-metrics-daemon-n4qhr\" (UID: \"7db52a98-86b8-46da-a83e-8f6ee99d696d\") " pod="openshift-multus/network-metrics-daemon-n4qhr"
Apr 16 17:40:08.923143 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:08.923070 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:40:08.923200 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:08.923145 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7db52a98-86b8-46da-a83e-8f6ee99d696d-metrics-certs podName:7db52a98-86b8-46da-a83e-8f6ee99d696d nodeName:}" failed. No retries permitted until 2026-04-16 17:40:09.923123058 +0000 UTC m=+4.136156188 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7db52a98-86b8-46da-a83e-8f6ee99d696d-metrics-certs") pod "network-metrics-daemon-n4qhr" (UID: "7db52a98-86b8-46da-a83e-8f6ee99d696d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 17:40:09.123614 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:09.123577 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7559\" (UniqueName: \"kubernetes.io/projected/cac5b68b-21bc-4998-8cf4-855cf71cdc45-kube-api-access-l7559\") pod \"network-check-target-769fj\" (UID: \"cac5b68b-21bc-4998-8cf4-855cf71cdc45\") " pod="openshift-network-diagnostics/network-check-target-769fj"
Apr 16 17:40:09.123786 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:09.123751 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 17:40:09.123786 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:09.123775 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 17:40:09.123786 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:09.123788 2573 projected.go:194] Error preparing data for projected volume kube-api-access-l7559 for pod openshift-network-diagnostics/network-check-target-769fj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:40:09.123931 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:09.123848 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cac5b68b-21bc-4998-8cf4-855cf71cdc45-kube-api-access-l7559 podName:cac5b68b-21bc-4998-8cf4-855cf71cdc45 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:10.123828105 +0000 UTC m=+4.336861232 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-l7559" (UniqueName: "kubernetes.io/projected/cac5b68b-21bc-4998-8cf4-855cf71cdc45-kube-api-access-l7559") pod "network-check-target-769fj" (UID: "cac5b68b-21bc-4998-8cf4-855cf71cdc45") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 17:40:09.193489 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:09.193465 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1fc7afb_d9c4_43bd_8d20_fbefd9221162.slice/crio-81709c6d632ad248f37257e22e824371c8b59001f7bb455f7d7562f6b00cfb1d WatchSource:0}: Error finding container 81709c6d632ad248f37257e22e824371c8b59001f7bb455f7d7562f6b00cfb1d: Status 404 returned error can't find the container with id 81709c6d632ad248f37257e22e824371c8b59001f7bb455f7d7562f6b00cfb1d
Apr 16 17:40:09.194570 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:09.194519 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3af705fc_ec69_4117_8797_2dacaf0f64e4.slice/crio-9e0b3a6c3c772876ca154e22fda15ca3fc41ac431ff478ab1330e689a7eb7cd5 WatchSource:0}: Error finding container 9e0b3a6c3c772876ca154e22fda15ca3fc41ac431ff478ab1330e689a7eb7cd5: Status 404 returned error can't find the container with id 9e0b3a6c3c772876ca154e22fda15ca3fc41ac431ff478ab1330e689a7eb7cd5
Apr 16 17:40:09.195346 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:09.195164 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefbec2a2_d66f_4338_b932_105a8b5ba652.slice/crio-82d2eb288d2ff7eeec1f5f0fc572d599d62daea5d1c739be29afcf92c24430d5 WatchSource:0}: Error finding container 82d2eb288d2ff7eeec1f5f0fc572d599d62daea5d1c739be29afcf92c24430d5: Status 404 returned error can't find the container with id 82d2eb288d2ff7eeec1f5f0fc572d599d62daea5d1c739be29afcf92c24430d5
Apr 16 17:40:09.198557 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:09.198535 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0068bf9d_29ac_42bf_95e1_3f57493e68f0.slice/crio-cbc617fc1e4b5d04a7b0762af328fc444330c0301339d41da533862bfd0b908f WatchSource:0}: Error finding container cbc617fc1e4b5d04a7b0762af328fc444330c0301339d41da533862bfd0b908f: Status 404 returned error can't find the container with id cbc617fc1e4b5d04a7b0762af328fc444330c0301339d41da533862bfd0b908f
Apr 16 17:40:09.199330 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:09.199306 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc07dd47a_0eea_4a37_908a_194889b059cd.slice/crio-87672c7c6d4e0483066d52070f7c6798c9f14daba906c72cb73a3bacf7b240ea WatchSource:0}: Error finding container 87672c7c6d4e0483066d52070f7c6798c9f14daba906c72cb73a3bacf7b240ea: Status 404 returned error can't find the container with id 87672c7c6d4e0483066d52070f7c6798c9f14daba906c72cb73a3bacf7b240ea
Apr 16 17:40:09.200117 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:09.200096 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf1b1635_1312_477a_9354_2b356990c171.slice/crio-848fd812da4c713906723be1e1e1abd369cdef3b98a7ce06c963c4627dbc9507 WatchSource:0}: Error finding container 848fd812da4c713906723be1e1e1abd369cdef3b98a7ce06c963c4627dbc9507: Status 404 returned error can't find the container with id 848fd812da4c713906723be1e1e1abd369cdef3b98a7ce06c963c4627dbc9507
Apr 16 17:40:09.201017 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:09.200932 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5626e70b_1b0e_424a_af3b_d0dba055fd1b.slice/crio-0155511c3275b6c6cd648fc3a6e83a7ccea59ade08a1ae58b1bfbd8bb23e60af WatchSource:0}: Error finding container 0155511c3275b6c6cd648fc3a6e83a7ccea59ade08a1ae58b1bfbd8bb23e60af: Status 404 returned error can't find the container with id 0155511c3275b6c6cd648fc3a6e83a7ccea59ade08a1ae58b1bfbd8bb23e60af
Apr 16 17:40:09.202614 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:09.202490 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76b37f14_34f7_4661_ad91_459fb138a436.slice/crio-d5a7f79203725dbd4819b8a0ba4ce262f9a33be6ab43cc7df956e0f27172a7e4 WatchSource:0}: Error finding container d5a7f79203725dbd4819b8a0ba4ce262f9a33be6ab43cc7df956e0f27172a7e4: Status 404 returned error can't find the container with id d5a7f79203725dbd4819b8a0ba4ce262f9a33be6ab43cc7df956e0f27172a7e4
Apr 16 17:40:09.203030 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:09.202919 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc185fd76_c69c_433e_9b66_55227ea35aa0.slice/crio-b9324da83ca6d3a8801b9baaaffd364acb835f8e98186ff9801d24e877beed37 WatchSource:0}: Error finding container b9324da83ca6d3a8801b9baaaffd364acb835f8e98186ff9801d24e877beed37: Status 404 returned error can't find the container with id b9324da83ca6d3a8801b9baaaffd364acb835f8e98186ff9801d24e877beed37
Apr 16 17:40:09.348396 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:09.348246 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 17:35:07 +0000 UTC" deadline="2027-11-11 05:02:15.584029435 +0000 UTC"
Apr 16 17:40:09.348396 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:09.348394 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13763h22m6.235638098s"
Apr 16 17:40:09.438078 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:09.437983 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n4qhr"
Apr 16 17:40:09.438078 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:09.438002 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-769fj"
Apr 16 17:40:09.438245 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:09.438087 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-769fj" podUID="cac5b68b-21bc-4998-8cf4-855cf71cdc45"
Apr 16 17:40:09.438245 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:09.438207 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n4qhr" podUID="7db52a98-86b8-46da-a83e-8f6ee99d696d"
Apr 16 17:40:09.445908 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:09.445885 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-tbmqq" event={"ID":"c07dd47a-0eea-4a37-908a-194889b059cd","Type":"ContainerStarted","Data":"87672c7c6d4e0483066d52070f7c6798c9f14daba906c72cb73a3bacf7b240ea"}
Apr 16 17:40:09.446786 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:09.446766 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wrnk9" event={"ID":"c185fd76-c69c-433e-9b66-55227ea35aa0","Type":"ContainerStarted","Data":"b9324da83ca6d3a8801b9baaaffd364acb835f8e98186ff9801d24e877beed37"}
Apr 16 17:40:09.447774 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:09.447750 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-l2szv" event={"ID":"76b37f14-34f7-4661-ad91-459fb138a436","Type":"ContainerStarted","Data":"d5a7f79203725dbd4819b8a0ba4ce262f9a33be6ab43cc7df956e0f27172a7e4"}
Apr 16 17:40:09.448708 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:09.448683 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-clpc9" event={"ID":"5626e70b-1b0e-424a-af3b-d0dba055fd1b","Type":"ContainerStarted","Data":"0155511c3275b6c6cd648fc3a6e83a7ccea59ade08a1ae58b1bfbd8bb23e60af"}
Apr 16 17:40:09.449631 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:09.449611 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9m58b" event={"ID":"af1b1635-1312-477a-9354-2b356990c171","Type":"ContainerStarted","Data":"848fd812da4c713906723be1e1e1abd369cdef3b98a7ce06c963c4627dbc9507"}
Apr 16 17:40:09.450566 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:09.450545 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8nzc" event={"ID":"0068bf9d-29ac-42bf-95e1-3f57493e68f0","Type":"ContainerStarted","Data":"cbc617fc1e4b5d04a7b0762af328fc444330c0301339d41da533862bfd0b908f"}
Apr 16 17:40:09.451529 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:09.451505 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-jm645" event={"ID":"b1fc7afb-d9c4-43bd-8d20-fbefd9221162","Type":"ContainerStarted","Data":"81709c6d632ad248f37257e22e824371c8b59001f7bb455f7d7562f6b00cfb1d"}
Apr 16 17:40:09.452924 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:09.452905 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-241.ec2.internal" event={"ID":"cba24747f92b6b0c65ecaea92412a09c","Type":"ContainerStarted","Data":"adecbce19b4dee41f783c38a2944a60e3a74e175be181911cf100725071a4f2c"}
Apr 16 17:40:09.453901 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:09.453880 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xz9gn" event={"ID":"efbec2a2-d66f-4338-b932-105a8b5ba652","Type":"ContainerStarted","Data":"82d2eb288d2ff7eeec1f5f0fc572d599d62daea5d1c739be29afcf92c24430d5"}
Apr 16 17:40:09.454861 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:09.454838 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6fl9q" event={"ID":"3af705fc-ec69-4117-8797-2dacaf0f64e4","Type":"ContainerStarted","Data":"9e0b3a6c3c772876ca154e22fda15ca3fc41ac431ff478ab1330e689a7eb7cd5"}
Apr 16 17:40:09.469193 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:09.469146 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-241.ec2.internal" podStartSLOduration=2.469131287 podStartE2EDuration="2.469131287s" podCreationTimestamp="2026-04-16 17:40:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC"
observedRunningTime="2026-04-16 17:40:09.468430728 +0000 UTC m=+3.681463873" watchObservedRunningTime="2026-04-16 17:40:09.469131287 +0000 UTC m=+3.682164435" Apr 16 17:40:09.930911 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:09.930812 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7db52a98-86b8-46da-a83e-8f6ee99d696d-metrics-certs\") pod \"network-metrics-daemon-n4qhr\" (UID: \"7db52a98-86b8-46da-a83e-8f6ee99d696d\") " pod="openshift-multus/network-metrics-daemon-n4qhr" Apr 16 17:40:09.931068 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:09.930970 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:40:09.931068 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:09.931038 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7db52a98-86b8-46da-a83e-8f6ee99d696d-metrics-certs podName:7db52a98-86b8-46da-a83e-8f6ee99d696d nodeName:}" failed. No retries permitted until 2026-04-16 17:40:11.931016388 +0000 UTC m=+6.144049539 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7db52a98-86b8-46da-a83e-8f6ee99d696d-metrics-certs") pod "network-metrics-daemon-n4qhr" (UID: "7db52a98-86b8-46da-a83e-8f6ee99d696d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:40:10.131967 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:10.131931 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7559\" (UniqueName: \"kubernetes.io/projected/cac5b68b-21bc-4998-8cf4-855cf71cdc45-kube-api-access-l7559\") pod \"network-check-target-769fj\" (UID: \"cac5b68b-21bc-4998-8cf4-855cf71cdc45\") " pod="openshift-network-diagnostics/network-check-target-769fj" Apr 16 17:40:10.132174 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:10.132151 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 17:40:10.132277 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:10.132177 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 17:40:10.132277 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:10.132192 2573 projected.go:194] Error preparing data for projected volume kube-api-access-l7559 for pod openshift-network-diagnostics/network-check-target-769fj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:40:10.132277 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:10.132268 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cac5b68b-21bc-4998-8cf4-855cf71cdc45-kube-api-access-l7559 podName:cac5b68b-21bc-4998-8cf4-855cf71cdc45 nodeName:}" failed. 
No retries permitted until 2026-04-16 17:40:12.132247298 +0000 UTC m=+6.345280480 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-l7559" (UniqueName: "kubernetes.io/projected/cac5b68b-21bc-4998-8cf4-855cf71cdc45-kube-api-access-l7559") pod "network-check-target-769fj" (UID: "cac5b68b-21bc-4998-8cf4-855cf71cdc45") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:40:11.438587 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:11.438555 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-769fj" Apr 16 17:40:11.439072 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:11.438685 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-769fj" podUID="cac5b68b-21bc-4998-8cf4-855cf71cdc45" Apr 16 17:40:11.439161 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:11.439131 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n4qhr" Apr 16 17:40:11.439276 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:11.439254 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n4qhr" podUID="7db52a98-86b8-46da-a83e-8f6ee99d696d" Apr 16 17:40:11.466301 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:11.466267 2573 generic.go:358] "Generic (PLEG): container finished" podID="59def4faf8392fa6f55a2e9f2ba8a155" containerID="88df18d78a4bcb9b76088b21b49cfa0cfa662f204a8a1ec1f6c9570b0d09e0c5" exitCode=0 Apr 16 17:40:11.466457 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:11.466326 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-241.ec2.internal" event={"ID":"59def4faf8392fa6f55a2e9f2ba8a155","Type":"ContainerDied","Data":"88df18d78a4bcb9b76088b21b49cfa0cfa662f204a8a1ec1f6c9570b0d09e0c5"} Apr 16 17:40:11.943870 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:11.943834 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7db52a98-86b8-46da-a83e-8f6ee99d696d-metrics-certs\") pod \"network-metrics-daemon-n4qhr\" (UID: \"7db52a98-86b8-46da-a83e-8f6ee99d696d\") " pod="openshift-multus/network-metrics-daemon-n4qhr" Apr 16 17:40:11.944046 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:11.944024 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:40:11.944116 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:11.944094 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7db52a98-86b8-46da-a83e-8f6ee99d696d-metrics-certs podName:7db52a98-86b8-46da-a83e-8f6ee99d696d nodeName:}" failed. No retries permitted until 2026-04-16 17:40:15.944072937 +0000 UTC m=+10.157106067 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7db52a98-86b8-46da-a83e-8f6ee99d696d-metrics-certs") pod "network-metrics-daemon-n4qhr" (UID: "7db52a98-86b8-46da-a83e-8f6ee99d696d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:40:12.145040 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:12.145002 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7559\" (UniqueName: \"kubernetes.io/projected/cac5b68b-21bc-4998-8cf4-855cf71cdc45-kube-api-access-l7559\") pod \"network-check-target-769fj\" (UID: \"cac5b68b-21bc-4998-8cf4-855cf71cdc45\") " pod="openshift-network-diagnostics/network-check-target-769fj" Apr 16 17:40:12.145245 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:12.145169 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 17:40:12.145245 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:12.145194 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 17:40:12.145245 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:12.145207 2573 projected.go:194] Error preparing data for projected volume kube-api-access-l7559 for pod openshift-network-diagnostics/network-check-target-769fj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:40:12.145400 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:12.145286 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cac5b68b-21bc-4998-8cf4-855cf71cdc45-kube-api-access-l7559 podName:cac5b68b-21bc-4998-8cf4-855cf71cdc45 nodeName:}" failed. 
No retries permitted until 2026-04-16 17:40:16.145265637 +0000 UTC m=+10.358298787 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-l7559" (UniqueName: "kubernetes.io/projected/cac5b68b-21bc-4998-8cf4-855cf71cdc45-kube-api-access-l7559") pod "network-check-target-769fj" (UID: "cac5b68b-21bc-4998-8cf4-855cf71cdc45") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:40:13.438851 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:13.438753 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-769fj" Apr 16 17:40:13.439300 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:13.438890 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-769fj" podUID="cac5b68b-21bc-4998-8cf4-855cf71cdc45" Apr 16 17:40:13.439300 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:13.438753 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n4qhr" Apr 16 17:40:13.439406 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:13.439292 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n4qhr" podUID="7db52a98-86b8-46da-a83e-8f6ee99d696d" Apr 16 17:40:15.438142 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:15.438076 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n4qhr" Apr 16 17:40:15.438142 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:15.438096 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-769fj" Apr 16 17:40:15.438707 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:15.438194 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-769fj" podUID="cac5b68b-21bc-4998-8cf4-855cf71cdc45" Apr 16 17:40:15.438707 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:15.438665 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n4qhr" podUID="7db52a98-86b8-46da-a83e-8f6ee99d696d" Apr 16 17:40:15.977002 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:15.976962 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7db52a98-86b8-46da-a83e-8f6ee99d696d-metrics-certs\") pod \"network-metrics-daemon-n4qhr\" (UID: \"7db52a98-86b8-46da-a83e-8f6ee99d696d\") " pod="openshift-multus/network-metrics-daemon-n4qhr" Apr 16 17:40:15.977200 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:15.977135 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:40:15.977319 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:15.977206 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7db52a98-86b8-46da-a83e-8f6ee99d696d-metrics-certs podName:7db52a98-86b8-46da-a83e-8f6ee99d696d nodeName:}" failed. No retries permitted until 2026-04-16 17:40:23.977184451 +0000 UTC m=+18.190217592 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7db52a98-86b8-46da-a83e-8f6ee99d696d-metrics-certs") pod "network-metrics-daemon-n4qhr" (UID: "7db52a98-86b8-46da-a83e-8f6ee99d696d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:40:16.179178 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:16.179137 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7559\" (UniqueName: \"kubernetes.io/projected/cac5b68b-21bc-4998-8cf4-855cf71cdc45-kube-api-access-l7559\") pod \"network-check-target-769fj\" (UID: \"cac5b68b-21bc-4998-8cf4-855cf71cdc45\") " pod="openshift-network-diagnostics/network-check-target-769fj" Apr 16 17:40:16.179390 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:16.179360 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 17:40:16.179390 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:16.179380 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 17:40:16.179495 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:16.179393 2573 projected.go:194] Error preparing data for projected volume kube-api-access-l7559 for pod openshift-network-diagnostics/network-check-target-769fj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:40:16.179495 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:16.179452 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cac5b68b-21bc-4998-8cf4-855cf71cdc45-kube-api-access-l7559 podName:cac5b68b-21bc-4998-8cf4-855cf71cdc45 nodeName:}" failed. 
No retries permitted until 2026-04-16 17:40:24.179433899 +0000 UTC m=+18.392467031 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-l7559" (UniqueName: "kubernetes.io/projected/cac5b68b-21bc-4998-8cf4-855cf71cdc45-kube-api-access-l7559") pod "network-check-target-769fj" (UID: "cac5b68b-21bc-4998-8cf4-855cf71cdc45") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:40:17.437856 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:17.437814 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n4qhr" Apr 16 17:40:17.438310 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:17.437815 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-769fj" Apr 16 17:40:17.438310 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:17.437971 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n4qhr" podUID="7db52a98-86b8-46da-a83e-8f6ee99d696d" Apr 16 17:40:17.438310 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:17.438031 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-769fj" podUID="cac5b68b-21bc-4998-8cf4-855cf71cdc45" Apr 16 17:40:19.438585 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:19.438544 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-769fj" Apr 16 17:40:19.438999 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:19.438544 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n4qhr" Apr 16 17:40:19.438999 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:19.438671 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-769fj" podUID="cac5b68b-21bc-4998-8cf4-855cf71cdc45" Apr 16 17:40:19.438999 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:19.438800 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n4qhr" podUID="7db52a98-86b8-46da-a83e-8f6ee99d696d" Apr 16 17:40:21.438454 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:21.438425 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-769fj" Apr 16 17:40:21.438777 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:21.438481 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n4qhr" Apr 16 17:40:21.438777 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:21.438608 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n4qhr" podUID="7db52a98-86b8-46da-a83e-8f6ee99d696d" Apr 16 17:40:21.438777 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:21.438743 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-769fj" podUID="cac5b68b-21bc-4998-8cf4-855cf71cdc45" Apr 16 17:40:23.438623 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:23.438583 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-769fj" Apr 16 17:40:23.439040 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:23.438586 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n4qhr" Apr 16 17:40:23.439040 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:23.438705 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-769fj" podUID="cac5b68b-21bc-4998-8cf4-855cf71cdc45" Apr 16 17:40:23.439040 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:23.438813 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n4qhr" podUID="7db52a98-86b8-46da-a83e-8f6ee99d696d" Apr 16 17:40:24.037634 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:24.037586 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7db52a98-86b8-46da-a83e-8f6ee99d696d-metrics-certs\") pod \"network-metrics-daemon-n4qhr\" (UID: \"7db52a98-86b8-46da-a83e-8f6ee99d696d\") " pod="openshift-multus/network-metrics-daemon-n4qhr" Apr 16 17:40:24.037818 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:24.037735 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:40:24.037818 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:24.037808 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7db52a98-86b8-46da-a83e-8f6ee99d696d-metrics-certs podName:7db52a98-86b8-46da-a83e-8f6ee99d696d nodeName:}" failed. No retries permitted until 2026-04-16 17:40:40.037788545 +0000 UTC m=+34.250821678 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7db52a98-86b8-46da-a83e-8f6ee99d696d-metrics-certs") pod "network-metrics-daemon-n4qhr" (UID: "7db52a98-86b8-46da-a83e-8f6ee99d696d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 17:40:24.239666 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:24.239627 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7559\" (UniqueName: \"kubernetes.io/projected/cac5b68b-21bc-4998-8cf4-855cf71cdc45-kube-api-access-l7559\") pod \"network-check-target-769fj\" (UID: \"cac5b68b-21bc-4998-8cf4-855cf71cdc45\") " pod="openshift-network-diagnostics/network-check-target-769fj" Apr 16 17:40:24.239833 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:24.239792 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 17:40:24.239833 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:24.239814 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 17:40:24.239833 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:24.239823 2573 projected.go:194] Error preparing data for projected volume kube-api-access-l7559 for pod openshift-network-diagnostics/network-check-target-769fj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:40:24.239971 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:24.239885 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cac5b68b-21bc-4998-8cf4-855cf71cdc45-kube-api-access-l7559 podName:cac5b68b-21bc-4998-8cf4-855cf71cdc45 nodeName:}" failed. 
No retries permitted until 2026-04-16 17:40:40.239866988 +0000 UTC m=+34.452900133 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-l7559" (UniqueName: "kubernetes.io/projected/cac5b68b-21bc-4998-8cf4-855cf71cdc45-kube-api-access-l7559") pod "network-check-target-769fj" (UID: "cac5b68b-21bc-4998-8cf4-855cf71cdc45") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 17:40:25.438691 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:25.438652 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-769fj" Apr 16 17:40:25.439138 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:25.438659 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n4qhr" Apr 16 17:40:25.439138 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:25.438786 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-769fj" podUID="cac5b68b-21bc-4998-8cf4-855cf71cdc45" Apr 16 17:40:25.439138 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:25.438861 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n4qhr" podUID="7db52a98-86b8-46da-a83e-8f6ee99d696d"
Apr 16 17:40:26.498582 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:26.497688 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-241.ec2.internal" event={"ID":"59def4faf8392fa6f55a2e9f2ba8a155","Type":"ContainerStarted","Data":"e1dec7f5404ab4376eec92835f87d79e0fc90531808330b5c7feee75611fc95b"}
Apr 16 17:40:26.500339 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:26.500252 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-tbmqq" event={"ID":"c07dd47a-0eea-4a37-908a-194889b059cd","Type":"ContainerStarted","Data":"65e2a0a771afb3604dee376866eeb5bde715957cd30726c26bbb29cb6bb5faf6"}
Apr 16 17:40:26.502956 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:26.502473 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wrnk9" event={"ID":"c185fd76-c69c-433e-9b66-55227ea35aa0","Type":"ContainerStarted","Data":"b26e52e4efccd7adf42b223c1183b60ad521ca8050faa1bc9160f49471c54599"}
Apr 16 17:40:26.504542 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:26.504496 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-clpc9" event={"ID":"5626e70b-1b0e-424a-af3b-d0dba055fd1b","Type":"ContainerStarted","Data":"ca03cdb805668a739fc4b2fc8a7bd44b76d455396b475dc60e779e8551e78a87"}
Apr 16 17:40:26.505935 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:26.505833 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xz9gn" event={"ID":"efbec2a2-d66f-4338-b932-105a8b5ba652","Type":"ContainerStarted","Data":"a665a9d47c22ee33f98c255de461662adf3d2bb76333f4d675340a7c21ed462b"}
Apr 16 17:40:26.529549 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:26.529507 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-241.ec2.internal" podStartSLOduration=19.529492551 podStartE2EDuration="19.529492551s" podCreationTimestamp="2026-04-16 17:40:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:40:26.516379834 +0000 UTC m=+20.729412976" watchObservedRunningTime="2026-04-16 17:40:26.529492551 +0000 UTC m=+20.742525732"
Apr 16 17:40:26.543709 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:26.543609 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-tbmqq" podStartSLOduration=8.332629335 podStartE2EDuration="20.543589093s" podCreationTimestamp="2026-04-16 17:40:06 +0000 UTC" firstStartedPulling="2026-04-16 17:40:09.201555342 +0000 UTC m=+3.414588469" lastFinishedPulling="2026-04-16 17:40:21.412515101 +0000 UTC m=+15.625548227" observedRunningTime="2026-04-16 17:40:26.543248712 +0000 UTC m=+20.756281961" watchObservedRunningTime="2026-04-16 17:40:26.543589093 +0000 UTC m=+20.756622243"
Apr 16 17:40:26.543958 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:26.543790 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-clpc9" podStartSLOduration=3.621179624 podStartE2EDuration="20.543780373s" podCreationTimestamp="2026-04-16 17:40:06 +0000 UTC" firstStartedPulling="2026-04-16 17:40:09.225690055 +0000 UTC m=+3.438723184" lastFinishedPulling="2026-04-16 17:40:26.148290792 +0000 UTC m=+20.361323933" observedRunningTime="2026-04-16 17:40:26.52981618 +0000 UTC m=+20.742849328" watchObservedRunningTime="2026-04-16 17:40:26.543780373 +0000 UTC m=+20.756813537"
Apr 16 17:40:26.558638 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:26.558595 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-wrnk9" podStartSLOduration=3.598141653 podStartE2EDuration="20.558578627s" podCreationTimestamp="2026-04-16 17:40:06 +0000 UTC" firstStartedPulling="2026-04-16 17:40:09.225356287 +0000 UTC m=+3.438389429" lastFinishedPulling="2026-04-16 17:40:26.185793277 +0000 UTC m=+20.398826403" observedRunningTime="2026-04-16 17:40:26.558021666 +0000 UTC m=+20.771054829" watchObservedRunningTime="2026-04-16 17:40:26.558578627 +0000 UTC m=+20.771611776"
Apr 16 17:40:26.572466 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:26.572426 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-xz9gn" podStartSLOduration=3.595331461 podStartE2EDuration="20.572412416s" podCreationTimestamp="2026-04-16 17:40:06 +0000 UTC" firstStartedPulling="2026-04-16 17:40:09.197145055 +0000 UTC m=+3.410178185" lastFinishedPulling="2026-04-16 17:40:26.174226001 +0000 UTC m=+20.387259140" observedRunningTime="2026-04-16 17:40:26.572035162 +0000 UTC m=+20.785068312" watchObservedRunningTime="2026-04-16 17:40:26.572412416 +0000 UTC m=+20.785445564"
Apr 16 17:40:27.394367 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:27.394342 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 17:40:27.438018 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:27.437997 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-769fj"
Apr 16 17:40:27.438113 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:27.438003 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n4qhr"
Apr 16 17:40:27.438168 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:27.438115 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-769fj" podUID="cac5b68b-21bc-4998-8cf4-855cf71cdc45"
Apr 16 17:40:27.438203 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:27.438167 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n4qhr" podUID="7db52a98-86b8-46da-a83e-8f6ee99d696d"
Apr 16 17:40:27.508875 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:27.508842 2573 generic.go:358] "Generic (PLEG): container finished" podID="3af705fc-ec69-4117-8797-2dacaf0f64e4" containerID="6c4236af7f8efdd197dac9842d66a4f8b2147520941a6141265f124768330509" exitCode=0
Apr 16 17:40:27.509685 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:27.508915 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6fl9q" event={"ID":"3af705fc-ec69-4117-8797-2dacaf0f64e4","Type":"ContainerDied","Data":"6c4236af7f8efdd197dac9842d66a4f8b2147520941a6141265f124768330509"}
Apr 16 17:40:27.510187 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:27.510157 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-l2szv" event={"ID":"76b37f14-34f7-4661-ad91-459fb138a436","Type":"ContainerStarted","Data":"fc4b8b8906863a27691b35bbc255e76bc57e8c68addddb74dfd4ebdd04064e3f"}
Apr 16 17:40:27.512668 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:27.512645 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9m58b" event={"ID":"af1b1635-1312-477a-9354-2b356990c171","Type":"ContainerStarted","Data":"6f2e1ff4d890fe86e6e27ca3d4a3b646f5ff9861ccc5e0793b1a888bed39c409"}
Apr 16 17:40:27.512773 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:27.512674 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9m58b" event={"ID":"af1b1635-1312-477a-9354-2b356990c171","Type":"ContainerStarted","Data":"843c1a07456f5c59f83945e4a2f97701c9b31bbba97afd4cad116c6de8d3b4be"}
Apr 16 17:40:27.512773 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:27.512686 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9m58b" event={"ID":"af1b1635-1312-477a-9354-2b356990c171","Type":"ContainerStarted","Data":"d06107353062456fd19e8b838a498105d5ff82b684646426918d6979f8b61fc5"}
Apr 16 17:40:27.512773 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:27.512695 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9m58b" event={"ID":"af1b1635-1312-477a-9354-2b356990c171","Type":"ContainerStarted","Data":"9391db0c24b1c9d438c746dca111b13f362c88425471eeba872fccbf9c3b1815"}
Apr 16 17:40:27.512773 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:27.512706 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9m58b" event={"ID":"af1b1635-1312-477a-9354-2b356990c171","Type":"ContainerStarted","Data":"34c6b8e1eb13149aa78bac1915ff70e8eeb99b25bf1adb735804530ba3e50818"}
Apr 16 17:40:27.512773 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:27.512725 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9m58b" event={"ID":"af1b1635-1312-477a-9354-2b356990c171","Type":"ContainerStarted","Data":"f38f91a163681357f2576cc49ab119b814ee15027daaf1977f5d0dc651f945ea"}
Apr 16 17:40:27.514121 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:27.514102 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8nzc" event={"ID":"0068bf9d-29ac-42bf-95e1-3f57493e68f0","Type":"ContainerStarted","Data":"f0e012aebf8abb917f723c7c8218282f0b0725c51bfef7b764365873e6c69e09"}
Apr 16 17:40:27.514197 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:27.514125 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8nzc" event={"ID":"0068bf9d-29ac-42bf-95e1-3f57493e68f0","Type":"ContainerStarted","Data":"c2deca6f3994a8fdb2fdc82f51d603b5e3fbd150cc2bf877d11620363bb50c2b"}
Apr 16 17:40:27.515445 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:27.515405 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-jm645" event={"ID":"b1fc7afb-d9c4-43bd-8d20-fbefd9221162","Type":"ContainerStarted","Data":"6598a81a2c2f746fe2d6c872873b8e48ce3a47753c298cf05ac01fe8259d8c6d"}
Apr 16 17:40:27.546356 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:27.546272 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-l2szv" podStartSLOduration=4.598442141 podStartE2EDuration="21.546258274s" podCreationTimestamp="2026-04-16 17:40:06 +0000 UTC" firstStartedPulling="2026-04-16 17:40:09.225437436 +0000 UTC m=+3.438470569" lastFinishedPulling="2026-04-16 17:40:26.173253569 +0000 UTC m=+20.386286702" observedRunningTime="2026-04-16 17:40:27.54606179 +0000 UTC m=+21.759094949" watchObservedRunningTime="2026-04-16 17:40:27.546258274 +0000 UTC m=+21.759291422"
Apr 16 17:40:27.559865 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:27.559826 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-jm645" podStartSLOduration=4.607513061 podStartE2EDuration="21.559813201s" podCreationTimestamp="2026-04-16 17:40:06 +0000 UTC" firstStartedPulling="2026-04-16 17:40:09.195659546 +0000 UTC m=+3.408692675" lastFinishedPulling="2026-04-16 17:40:26.147959675 +0000 UTC m=+20.360992815" observedRunningTime="2026-04-16 17:40:27.559276696 +0000 UTC m=+21.772309854" watchObservedRunningTime="2026-04-16 17:40:27.559813201 +0000 UTC m=+21.772846346"
Apr 16 17:40:28.386118 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:28.386000 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T17:40:27.39436097Z","UUID":"9d1b5f91-bb5f-41dc-b50f-6d385c798e69","Handler":null,"Name":"","Endpoint":""}
Apr 16 17:40:28.388541 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:28.388513 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 17:40:28.388675 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:28.388553 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 17:40:28.519385 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:28.519306 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8nzc" event={"ID":"0068bf9d-29ac-42bf-95e1-3f57493e68f0","Type":"ContainerStarted","Data":"7ea2f1d685fad11a83da9fbcc7e7fbdac309007f545e4a70c2af2b1dcbf53318"}
Apr 16 17:40:28.554281 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:28.554206 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-q8nzc" podStartSLOduration=3.528240042 podStartE2EDuration="22.554188052s" podCreationTimestamp="2026-04-16 17:40:06 +0000 UTC" firstStartedPulling="2026-04-16 17:40:09.200842362 +0000 UTC m=+3.413875488" lastFinishedPulling="2026-04-16 17:40:28.226790357 +0000 UTC m=+22.439823498" observedRunningTime="2026-04-16 17:40:28.553512472 +0000 UTC m=+22.766545648" watchObservedRunningTime="2026-04-16 17:40:28.554188052 +0000 UTC m=+22.767221201"
Apr 16 17:40:29.438491 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:29.438449 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n4qhr"
Apr 16 17:40:29.438694 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:29.438505 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-769fj"
Apr 16 17:40:29.438694 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:29.438611 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n4qhr" podUID="7db52a98-86b8-46da-a83e-8f6ee99d696d"
Apr 16 17:40:29.438811 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:29.438737 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-769fj" podUID="cac5b68b-21bc-4998-8cf4-855cf71cdc45"
Apr 16 17:40:29.524280 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:29.524245 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9m58b" event={"ID":"af1b1635-1312-477a-9354-2b356990c171","Type":"ContainerStarted","Data":"9987b1da39efb7463bcea956582503393d156052a9e11749aeceb2d5d544cfae"}
Apr 16 17:40:30.596170 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:30.596139 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-tbmqq"
Apr 16 17:40:30.596896 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:30.596873 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-tbmqq"
Apr 16 17:40:31.437894 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:31.437766 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n4qhr"
Apr 16 17:40:31.437989 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:31.437806 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-769fj"
Apr 16 17:40:31.438041 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:31.438011 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n4qhr" podUID="7db52a98-86b8-46da-a83e-8f6ee99d696d"
Apr 16 17:40:31.438094 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:31.438064 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-769fj" podUID="cac5b68b-21bc-4998-8cf4-855cf71cdc45"
Apr 16 17:40:31.530781 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:31.530643 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9m58b" event={"ID":"af1b1635-1312-477a-9354-2b356990c171","Type":"ContainerStarted","Data":"e426c31777acaf7ca3e99737b51988bbc7387eee4d13d6cb3a354f6560165322"}
Apr 16 17:40:31.530932 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:31.530806 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-tbmqq"
Apr 16 17:40:31.530932 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:31.530841 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:31.530932 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:31.530855 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:31.530932 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:31.530868 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:31.531866 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:31.531668 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-tbmqq"
Apr 16 17:40:31.546266 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:31.546226 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:31.547122 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:31.546988 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9m58b"
Apr 16 17:40:31.563031 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:31.562929 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9m58b" podStartSLOduration=8.331600019 podStartE2EDuration="25.562914049s" podCreationTimestamp="2026-04-16 17:40:06 +0000 UTC" firstStartedPulling="2026-04-16 17:40:09.20273976 +0000 UTC m=+3.415772900" lastFinishedPulling="2026-04-16 17:40:26.434053796 +0000 UTC m=+20.647086930" observedRunningTime="2026-04-16 17:40:31.560508734 +0000 UTC m=+25.773541882" watchObservedRunningTime="2026-04-16 17:40:31.562914049 +0000 UTC m=+25.775947199"
Apr 16 17:40:32.336338 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:32.336301 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-bklp6"]
Apr 16 17:40:32.353161 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:32.353134 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bklp6"
Apr 16 17:40:32.353296 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:32.353211 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bklp6" podUID="f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd"
Apr 16 17:40:32.495753 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:32.495721 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd-dbus\") pod \"global-pull-secret-syncer-bklp6\" (UID: \"f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd\") " pod="kube-system/global-pull-secret-syncer-bklp6"
Apr 16 17:40:32.495902 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:32.495777 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd-original-pull-secret\") pod \"global-pull-secret-syncer-bklp6\" (UID: \"f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd\") " pod="kube-system/global-pull-secret-syncer-bklp6"
Apr 16 17:40:32.495902 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:32.495842 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd-kubelet-config\") pod \"global-pull-secret-syncer-bklp6\" (UID: \"f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd\") " pod="kube-system/global-pull-secret-syncer-bklp6"
Apr 16 17:40:32.533496 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:32.533468 2573 generic.go:358] "Generic (PLEG): container finished" podID="3af705fc-ec69-4117-8797-2dacaf0f64e4" containerID="587e781ff2a64639e6f625bb1ce99ef2918db7ceeac6c4084c49fb72c28f51d1" exitCode=0
Apr 16 17:40:32.533620 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:32.533551 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6fl9q" event={"ID":"3af705fc-ec69-4117-8797-2dacaf0f64e4","Type":"ContainerDied","Data":"587e781ff2a64639e6f625bb1ce99ef2918db7ceeac6c4084c49fb72c28f51d1"}
Apr 16 17:40:32.596807 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:32.596504 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd-original-pull-secret\") pod \"global-pull-secret-syncer-bklp6\" (UID: \"f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd\") " pod="kube-system/global-pull-secret-syncer-bklp6"
Apr 16 17:40:32.596807 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:32.596724 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 17:40:32.596807 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:32.596792 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd-kubelet-config\") pod \"global-pull-secret-syncer-bklp6\" (UID: \"f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd\") " pod="kube-system/global-pull-secret-syncer-bklp6"
Apr 16 17:40:32.597068 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:32.596826 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd-original-pull-secret podName:f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd nodeName:}" failed. No retries permitted until 2026-04-16 17:40:33.096807199 +0000 UTC m=+27.309840330 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd-original-pull-secret") pod "global-pull-secret-syncer-bklp6" (UID: "f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd") : object "kube-system"/"original-pull-secret" not registered
Apr 16 17:40:32.597068 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:32.596854 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd-kubelet-config\") pod \"global-pull-secret-syncer-bklp6\" (UID: \"f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd\") " pod="kube-system/global-pull-secret-syncer-bklp6"
Apr 16 17:40:32.597068 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:32.597025 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd-dbus\") pod \"global-pull-secret-syncer-bklp6\" (UID: \"f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd\") " pod="kube-system/global-pull-secret-syncer-bklp6"
Apr 16 17:40:32.597302 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:32.597286 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd-dbus\") pod \"global-pull-secret-syncer-bklp6\" (UID: \"f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd\") " pod="kube-system/global-pull-secret-syncer-bklp6"
Apr 16 17:40:33.106175 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:33.106149 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd-original-pull-secret\") pod \"global-pull-secret-syncer-bklp6\" (UID: \"f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd\") " pod="kube-system/global-pull-secret-syncer-bklp6"
Apr 16 17:40:33.106306 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:33.106289 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 17:40:33.106363 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:33.106354 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd-original-pull-secret podName:f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd nodeName:}" failed. No retries permitted until 2026-04-16 17:40:34.106338908 +0000 UTC m=+28.319372035 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd-original-pull-secret") pod "global-pull-secret-syncer-bklp6" (UID: "f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd") : object "kube-system"/"original-pull-secret" not registered
Apr 16 17:40:33.438286 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:33.438256 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-769fj"
Apr 16 17:40:33.438286 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:33.438265 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n4qhr"
Apr 16 17:40:33.438782 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:33.438420 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n4qhr" podUID="7db52a98-86b8-46da-a83e-8f6ee99d696d"
Apr 16 17:40:33.438876 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:33.438832 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-769fj" podUID="cac5b68b-21bc-4998-8cf4-855cf71cdc45"
Apr 16 17:40:33.446355 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:33.446335 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bklp6"]
Apr 16 17:40:33.446481 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:33.446414 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bklp6"
Apr 16 17:40:33.446536 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:33.446509 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bklp6" podUID="f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd"
Apr 16 17:40:33.450414 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:33.450390 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-n4qhr"]
Apr 16 17:40:33.450928 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:33.450905 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-769fj"]
Apr 16 17:40:33.537444 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:33.537412 2573 generic.go:358] "Generic (PLEG): container finished" podID="3af705fc-ec69-4117-8797-2dacaf0f64e4" containerID="1f60cfbfafe123a967773f66150339f218d6898e63ec1437381f9cb7f15ea9f2" exitCode=0
Apr 16 17:40:33.537594 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:33.537494 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-769fj"
Apr 16 17:40:33.537594 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:33.537526 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6fl9q" event={"ID":"3af705fc-ec69-4117-8797-2dacaf0f64e4","Type":"ContainerDied","Data":"1f60cfbfafe123a967773f66150339f218d6898e63ec1437381f9cb7f15ea9f2"}
Apr 16 17:40:33.537841 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:33.537802 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-769fj" podUID="cac5b68b-21bc-4998-8cf4-855cf71cdc45"
Apr 16 17:40:33.538573 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:33.538281 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n4qhr"
Apr 16 17:40:33.538573 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:33.538411 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n4qhr" podUID="7db52a98-86b8-46da-a83e-8f6ee99d696d"
Apr 16 17:40:34.112968 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:34.112940 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd-original-pull-secret\") pod \"global-pull-secret-syncer-bklp6\" (UID: \"f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd\") " pod="kube-system/global-pull-secret-syncer-bklp6"
Apr 16 17:40:34.113101 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:34.113066 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 17:40:34.113160 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:34.113130 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd-original-pull-secret podName:f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd nodeName:}" failed. No retries permitted until 2026-04-16 17:40:36.113112004 +0000 UTC m=+30.326145130 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd-original-pull-secret") pod "global-pull-secret-syncer-bklp6" (UID: "f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd") : object "kube-system"/"original-pull-secret" not registered
Apr 16 17:40:34.541254 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:34.541200 2573 generic.go:358] "Generic (PLEG): container finished" podID="3af705fc-ec69-4117-8797-2dacaf0f64e4" containerID="0adbc9dc1e484083feca2b4db18a5c0dc38418252f5f0dc7fcd5d4b77e9655f0" exitCode=0
Apr 16 17:40:34.541631 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:34.541289 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6fl9q" event={"ID":"3af705fc-ec69-4117-8797-2dacaf0f64e4","Type":"ContainerDied","Data":"0adbc9dc1e484083feca2b4db18a5c0dc38418252f5f0dc7fcd5d4b77e9655f0"}
Apr 16 17:40:35.438181 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:35.438150 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n4qhr"
Apr 16 17:40:35.438359 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:35.438150 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-769fj"
Apr 16 17:40:35.438359 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:35.438150 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bklp6"
Apr 16 17:40:35.438359 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:35.438311 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n4qhr" podUID="7db52a98-86b8-46da-a83e-8f6ee99d696d"
Apr 16 17:40:35.438512 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:35.438437 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-769fj" podUID="cac5b68b-21bc-4998-8cf4-855cf71cdc45"
Apr 16 17:40:35.438617 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:35.438528 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bklp6" podUID="f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd"
Apr 16 17:40:36.128167 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:36.128136 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd-original-pull-secret\") pod \"global-pull-secret-syncer-bklp6\" (UID: \"f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd\") " pod="kube-system/global-pull-secret-syncer-bklp6"
Apr 16 17:40:36.128618 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:36.128318 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 17:40:36.128618 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:36.128391 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd-original-pull-secret podName:f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd nodeName:}" failed. No retries permitted until 2026-04-16 17:40:40.128376492 +0000 UTC m=+34.341409618 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd-original-pull-secret") pod "global-pull-secret-syncer-bklp6" (UID: "f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd") : object "kube-system"/"original-pull-secret" not registered
Apr 16 17:40:37.438340 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:37.438089 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-769fj"
Apr 16 17:40:37.438748 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:37.438089 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bklp6"
Apr 16 17:40:37.438748 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:37.438428 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-769fj" podUID="cac5b68b-21bc-4998-8cf4-855cf71cdc45"
Apr 16 17:40:37.438748 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:37.438090 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n4qhr"
Apr 16 17:40:37.438748 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:37.438477 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bklp6" podUID="f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd"
Apr 16 17:40:37.438748 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:37.438582 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n4qhr" podUID="7db52a98-86b8-46da-a83e-8f6ee99d696d"
Apr 16 17:40:38.098401 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.098325 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-241.ec2.internal" event="NodeReady"
Apr 16 17:40:38.098557 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.098450 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 17:40:38.138663 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.138631 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-9l9b5"]
Apr 16 17:40:38.172823 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.172780 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6d7d9f595c-np56x"]
Apr 16 17:40:38.174295 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.174247 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-9l9b5" Apr 16 17:40:38.179497 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.179469 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 17:40:38.179649 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.179568 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 17:40:38.179738 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.179719 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 17:40:38.180155 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.180132 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-sgqj7\"" Apr 16 17:40:38.203085 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.203039 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-vhf4j"] Apr 16 17:40:38.203262 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.203109 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" Apr 16 17:40:38.206039 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.206014 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-f5d25\"" Apr 16 17:40:38.206158 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.206045 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 17:40:38.206158 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.206070 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 17:40:38.206158 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.206070 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 17:40:38.221680 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.221653 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 17:40:38.227774 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.227754 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-zjsrk"] Apr 16 17:40:38.227928 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.227904 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-vhf4j" Apr 16 17:40:38.230543 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.230522 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 17:40:38.230631 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.230542 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 17:40:38.230631 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.230571 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 17:40:38.230631 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.230604 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 17:40:38.230912 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.230894 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-8fzr8\"" Apr 16 17:40:38.235663 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.235643 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 17:40:38.240203 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.240186 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-575b7b88bd-d5lx2"] Apr 16 17:40:38.240341 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.240325 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zjsrk" Apr 16 17:40:38.243538 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.243515 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-9l9b5\" (UID: \"7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-9l9b5" Apr 16 17:40:38.243618 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.243548 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlklv\" (UniqueName: \"kubernetes.io/projected/7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d-kube-api-access-qlklv\") pod \"cluster-samples-operator-667775844f-9l9b5\" (UID: \"7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-9l9b5" Apr 16 17:40:38.244477 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.244457 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 17:40:38.244573 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.244551 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-8sq5z\"" Apr 16 17:40:38.244749 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.244731 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 17:40:38.244853 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.244772 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 
17:40:38.244921 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.244865 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 17:40:38.252194 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.252177 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-lbkd4"] Apr 16 17:40:38.252355 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.252336 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-575b7b88bd-d5lx2" Apr 16 17:40:38.255880 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.255854 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 16 17:40:38.255977 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.255905 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 17:40:38.256043 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.256018 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 17:40:38.256183 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.256164 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 16 17:40:38.256275 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.256168 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 16 17:40:38.256275 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.256170 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-xkkgf\"" Apr 16 17:40:38.256275 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.256262 2573 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 16 17:40:38.265323 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.265301 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-9l9b5"] Apr 16 17:40:38.265411 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.265330 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-8gx9z"] Apr 16 17:40:38.265485 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.265468 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-lbkd4" Apr 16 17:40:38.269164 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.269142 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 17:40:38.269867 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.269848 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-57dq6\"" Apr 16 17:40:38.270792 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.270737 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 17:40:38.286272 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.286248 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-gdtq9"] Apr 16 17:40:38.286555 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.286532 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-8gx9z" Apr 16 17:40:38.289681 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.289661 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 17:40:38.289785 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.289701 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 17:40:38.289785 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.289661 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 17:40:38.290205 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.290185 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 17:40:38.290926 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.290906 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-28tzm\"" Apr 16 17:40:38.298328 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.298307 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-fmwp9"] Apr 16 17:40:38.298453 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.298433 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-gdtq9" Apr 16 17:40:38.301533 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.301478 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 17:40:38.301533 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.301493 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-lvg6s\"" Apr 16 17:40:38.301746 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.301730 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 17:40:38.310259 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.310242 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-985p5"] Apr 16 17:40:38.310407 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.310389 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-fmwp9" Apr 16 17:40:38.313452 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.313266 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 17:40:38.313452 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.313331 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 17:40:38.313452 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.313346 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-tq7dq\"" Apr 16 17:40:38.313630 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.313563 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 17:40:38.314493 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.314460 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 17:40:38.322453 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.322435 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-fmxzp"] Apr 16 17:40:38.322589 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.322571 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-985p5" Apr 16 17:40:38.325337 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.325284 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 17:40:38.325435 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.325287 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 17:40:38.325508 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.325446 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-x49pg\"" Apr 16 17:40:38.334490 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.334468 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-vhf4j"] Apr 16 17:40:38.334588 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.334494 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6d7d9f595c-np56x"] Apr 16 17:40:38.334588 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.334505 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-575b7b88bd-d5lx2"] Apr 16 17:40:38.334588 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.334515 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-zjsrk"] Apr 16 17:40:38.334588 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.334522 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-8gx9z"] Apr 16 17:40:38.334588 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.334529 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-gdtq9"] Apr 16 17:40:38.334588 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.334537 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-985p5"] Apr 16 17:40:38.334588 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.334544 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-fmxzp"] Apr 16 17:40:38.334588 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.334552 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-lbkd4"] Apr 16 17:40:38.334588 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.334591 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-gxh2q"] Apr 16 17:40:38.334983 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.334620 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-fmxzp" Apr 16 17:40:38.337103 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.337086 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 17:40:38.337374 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.337348 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 17:40:38.337492 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.337390 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 17:40:38.337492 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.337421 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-6djpq\"" Apr 16 17:40:38.337662 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.337644 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 17:40:38.342761 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.342742 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 17:40:38.343940 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.343920 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/b27c0f5d-1775-4ae8-8903-1d44802e9f35-snapshots\") pod \"insights-operator-5785d4fcdd-vhf4j\" (UID: \"b27c0f5d-1775-4ae8-8903-1d44802e9f35\") " pod="openshift-insights/insights-operator-5785d4fcdd-vhf4j" Apr 16 17:40:38.344037 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.343948 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cg4n\" (UniqueName: \"kubernetes.io/projected/584be834-546d-49ca-8379-4b77cb13e2ba-kube-api-access-4cg4n\") pod \"service-ca-operator-69965bb79d-zjsrk\" (UID: \"584be834-546d-49ca-8379-4b77cb13e2ba\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zjsrk" Apr 16 17:40:38.344037 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.343969 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-fmwp9"] Apr 16 17:40:38.344037 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.343980 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-image-registry-private-configuration\") pod \"image-registry-6d7d9f595c-np56x\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") " pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" Apr 16 17:40:38.344037 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.344000 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gxh2q"] Apr 16 17:40:38.344037 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.344016 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/550fd36d-dd5d-4bed-9110-110068110f23-default-certificate\") pod \"router-default-575b7b88bd-d5lx2\" (UID: \"550fd36d-dd5d-4bed-9110-110068110f23\") " pod="openshift-ingress/router-default-575b7b88bd-d5lx2" Apr 16 17:40:38.344363 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.344038 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctsld\" (UniqueName: 
\"kubernetes.io/projected/47905f73-0b0a-452f-bf8b-eaae31126adc-kube-api-access-ctsld\") pod \"volume-data-source-validator-7d955d5dd4-lbkd4\" (UID: \"47905f73-0b0a-452f-bf8b-eaae31126adc\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-lbkd4" Apr 16 17:40:38.344363 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.344065 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-ca-trust-extracted\") pod \"image-registry-6d7d9f595c-np56x\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") " pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" Apr 16 17:40:38.344363 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.344072 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-9d4rv"] Apr 16 17:40:38.344363 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.344081 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-registry-certificates\") pod \"image-registry-6d7d9f595c-np56x\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") " pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" Apr 16 17:40:38.344363 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.344100 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nbtd\" (UniqueName: \"kubernetes.io/projected/b27c0f5d-1775-4ae8-8903-1d44802e9f35-kube-api-access-5nbtd\") pod \"insights-operator-5785d4fcdd-vhf4j\" (UID: \"b27c0f5d-1775-4ae8-8903-1d44802e9f35\") " pod="openshift-insights/insights-operator-5785d4fcdd-vhf4j" Apr 16 17:40:38.344363 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.344135 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/550fd36d-dd5d-4bed-9110-110068110f23-stats-auth\") pod \"router-default-575b7b88bd-d5lx2\" (UID: \"550fd36d-dd5d-4bed-9110-110068110f23\") " pod="openshift-ingress/router-default-575b7b88bd-d5lx2" Apr 16 17:40:38.344363 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.344147 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gxh2q" Apr 16 17:40:38.344363 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.344170 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l77jk\" (UniqueName: \"kubernetes.io/projected/550fd36d-dd5d-4bed-9110-110068110f23-kube-api-access-l77jk\") pod \"router-default-575b7b88bd-d5lx2\" (UID: \"550fd36d-dd5d-4bed-9110-110068110f23\") " pod="openshift-ingress/router-default-575b7b88bd-d5lx2" Apr 16 17:40:38.344363 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.344280 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-9l9b5\" (UID: \"7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-9l9b5" Apr 16 17:40:38.344363 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.344310 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b27c0f5d-1775-4ae8-8903-1d44802e9f35-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-vhf4j\" (UID: \"b27c0f5d-1775-4ae8-8903-1d44802e9f35\") " pod="openshift-insights/insights-operator-5785d4fcdd-vhf4j" Apr 16 17:40:38.344363 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.344338 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqsth\" (UniqueName: \"kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-kube-api-access-cqsth\") pod \"image-registry-6d7d9f595c-np56x\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") " pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" Apr 16 17:40:38.344941 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.344374 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qlklv\" (UniqueName: \"kubernetes.io/projected/7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d-kube-api-access-qlklv\") pod \"cluster-samples-operator-667775844f-9l9b5\" (UID: \"7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-9l9b5" Apr 16 17:40:38.344941 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.344415 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/584be834-546d-49ca-8379-4b77cb13e2ba-config\") pod \"service-ca-operator-69965bb79d-zjsrk\" (UID: \"584be834-546d-49ca-8379-4b77cb13e2ba\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zjsrk" Apr 16 17:40:38.344941 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.344439 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-registry-tls\") pod \"image-registry-6d7d9f595c-np56x\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") " pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" Apr 16 17:40:38.344941 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.344467 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-bound-sa-token\") pod \"image-registry-6d7d9f595c-np56x\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") " pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" Apr 16 17:40:38.344941 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.344495 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b27c0f5d-1775-4ae8-8903-1d44802e9f35-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-vhf4j\" (UID: \"b27c0f5d-1775-4ae8-8903-1d44802e9f35\") " pod="openshift-insights/insights-operator-5785d4fcdd-vhf4j" Apr 16 17:40:38.344941 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:38.344584 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 17:40:38.344941 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.344607 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b27c0f5d-1775-4ae8-8903-1d44802e9f35-serving-cert\") pod \"insights-operator-5785d4fcdd-vhf4j\" (UID: \"b27c0f5d-1775-4ae8-8903-1d44802e9f35\") " pod="openshift-insights/insights-operator-5785d4fcdd-vhf4j" Apr 16 17:40:38.344941 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:38.344645 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d-samples-operator-tls podName:7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d nodeName:}" failed. No retries permitted until 2026-04-16 17:40:38.844628158 +0000 UTC m=+33.057661284 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d-samples-operator-tls") pod "cluster-samples-operator-667775844f-9l9b5" (UID: "7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d") : secret "samples-operator-tls" not found Apr 16 17:40:38.344941 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.344679 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b27c0f5d-1775-4ae8-8903-1d44802e9f35-tmp\") pod \"insights-operator-5785d4fcdd-vhf4j\" (UID: \"b27c0f5d-1775-4ae8-8903-1d44802e9f35\") " pod="openshift-insights/insights-operator-5785d4fcdd-vhf4j" Apr 16 17:40:38.344941 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.344705 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/550fd36d-dd5d-4bed-9110-110068110f23-service-ca-bundle\") pod \"router-default-575b7b88bd-d5lx2\" (UID: \"550fd36d-dd5d-4bed-9110-110068110f23\") " pod="openshift-ingress/router-default-575b7b88bd-d5lx2" Apr 16 17:40:38.344941 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.344733 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/550fd36d-dd5d-4bed-9110-110068110f23-metrics-certs\") pod \"router-default-575b7b88bd-d5lx2\" (UID: \"550fd36d-dd5d-4bed-9110-110068110f23\") " pod="openshift-ingress/router-default-575b7b88bd-d5lx2" Apr 16 17:40:38.344941 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.344760 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-trusted-ca\") pod \"image-registry-6d7d9f595c-np56x\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") " 
pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" Apr 16 17:40:38.344941 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.344809 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-installation-pull-secrets\") pod \"image-registry-6d7d9f595c-np56x\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") " pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" Apr 16 17:40:38.344941 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.344900 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/584be834-546d-49ca-8379-4b77cb13e2ba-serving-cert\") pod \"service-ca-operator-69965bb79d-zjsrk\" (UID: \"584be834-546d-49ca-8379-4b77cb13e2ba\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zjsrk" Apr 16 17:40:38.346534 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.346518 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 17:40:38.346895 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.346872 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-j7r56\"" Apr 16 17:40:38.346995 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.346949 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 17:40:38.347055 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.346951 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 17:40:38.353746 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.353725 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-dns/dns-default-9d4rv"] Apr 16 17:40:38.354029 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.354007 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9d4rv" Apr 16 17:40:38.356757 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.356738 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tjhv5\"" Apr 16 17:40:38.356963 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.356923 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 17:40:38.357059 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.357019 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 17:40:38.359959 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.359935 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlklv\" (UniqueName: \"kubernetes.io/projected/7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d-kube-api-access-qlklv\") pod \"cluster-samples-operator-667775844f-9l9b5\" (UID: \"7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-9l9b5" Apr 16 17:40:38.445428 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.445388 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-image-registry-private-configuration\") pod \"image-registry-6d7d9f595c-np56x\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") " pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" Apr 16 17:40:38.445988 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.445445 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4cg4n\" (UniqueName: 
\"kubernetes.io/projected/584be834-546d-49ca-8379-4b77cb13e2ba-kube-api-access-4cg4n\") pod \"service-ca-operator-69965bb79d-zjsrk\" (UID: \"584be834-546d-49ca-8379-4b77cb13e2ba\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zjsrk" Apr 16 17:40:38.445988 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.445600 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctsld\" (UniqueName: \"kubernetes.io/projected/47905f73-0b0a-452f-bf8b-eaae31126adc-kube-api-access-ctsld\") pod \"volume-data-source-validator-7d955d5dd4-lbkd4\" (UID: \"47905f73-0b0a-452f-bf8b-eaae31126adc\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-lbkd4" Apr 16 17:40:38.445988 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.445637 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5nbtd\" (UniqueName: \"kubernetes.io/projected/b27c0f5d-1775-4ae8-8903-1d44802e9f35-kube-api-access-5nbtd\") pod \"insights-operator-5785d4fcdd-vhf4j\" (UID: \"b27c0f5d-1775-4ae8-8903-1d44802e9f35\") " pod="openshift-insights/insights-operator-5785d4fcdd-vhf4j" Apr 16 17:40:38.445988 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.445670 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/550fd36d-dd5d-4bed-9110-110068110f23-stats-auth\") pod \"router-default-575b7b88bd-d5lx2\" (UID: \"550fd36d-dd5d-4bed-9110-110068110f23\") " pod="openshift-ingress/router-default-575b7b88bd-d5lx2" Apr 16 17:40:38.445988 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.445790 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31969ea1-4893-4e83-ac1e-f5882799c5da-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-8gx9z\" (UID: \"31969ea1-4893-4e83-ac1e-f5882799c5da\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-8gx9z" Apr 16 17:40:38.445988 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.445826 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqsth\" (UniqueName: \"kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-kube-api-access-cqsth\") pod \"image-registry-6d7d9f595c-np56x\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") " pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" Apr 16 17:40:38.446344 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.446024 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b27c0f5d-1775-4ae8-8903-1d44802e9f35-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-vhf4j\" (UID: \"b27c0f5d-1775-4ae8-8903-1d44802e9f35\") " pod="openshift-insights/insights-operator-5785d4fcdd-vhf4j" Apr 16 17:40:38.446344 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.446070 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-667ss\" (UniqueName: \"kubernetes.io/projected/214ea8ee-8a72-42a3-abfb-ceb3622fea44-kube-api-access-667ss\") pod \"cluster-monitoring-operator-6667474d89-fmwp9\" (UID: \"214ea8ee-8a72-42a3-abfb-ceb3622fea44\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-fmwp9" Apr 16 17:40:38.446344 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.446111 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-registry-tls\") pod \"image-registry-6d7d9f595c-np56x\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") " pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" Apr 16 17:40:38.446344 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.446139 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/31969ea1-4893-4e83-ac1e-f5882799c5da-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-8gx9z\" (UID: \"31969ea1-4893-4e83-ac1e-f5882799c5da\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-8gx9z" Apr 16 17:40:38.446344 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.446189 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-bound-sa-token\") pod \"image-registry-6d7d9f595c-np56x\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") " pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" Apr 16 17:40:38.446344 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.446245 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/214ea8ee-8a72-42a3-abfb-ceb3622fea44-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-fmwp9\" (UID: \"214ea8ee-8a72-42a3-abfb-ceb3622fea44\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-fmwp9" Apr 16 17:40:38.446344 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.446280 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b27c0f5d-1775-4ae8-8903-1d44802e9f35-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-vhf4j\" (UID: \"b27c0f5d-1775-4ae8-8903-1d44802e9f35\") " pod="openshift-insights/insights-operator-5785d4fcdd-vhf4j" Apr 16 17:40:38.446344 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.446313 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/b60ee2a1-c8c7-417f-a887-3f7008b3fb0a-metrics-tls\") pod \"dns-default-9d4rv\" (UID: \"b60ee2a1-c8c7-417f-a887-3f7008b3fb0a\") " pod="openshift-dns/dns-default-9d4rv" Apr 16 17:40:38.446702 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.446360 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8v68\" (UniqueName: \"kubernetes.io/projected/31969ea1-4893-4e83-ac1e-f5882799c5da-kube-api-access-d8v68\") pod \"kube-storage-version-migrator-operator-756bb7d76f-8gx9z\" (UID: \"31969ea1-4893-4e83-ac1e-f5882799c5da\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-8gx9z" Apr 16 17:40:38.446702 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.446413 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b27c0f5d-1775-4ae8-8903-1d44802e9f35-tmp\") pod \"insights-operator-5785d4fcdd-vhf4j\" (UID: \"b27c0f5d-1775-4ae8-8903-1d44802e9f35\") " pod="openshift-insights/insights-operator-5785d4fcdd-vhf4j" Apr 16 17:40:38.446702 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:38.446420 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 17:40:38.446702 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:38.446441 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d7d9f595c-np56x: secret "image-registry-tls" not found Apr 16 17:40:38.446702 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.446486 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/550fd36d-dd5d-4bed-9110-110068110f23-service-ca-bundle\") pod \"router-default-575b7b88bd-d5lx2\" (UID: \"550fd36d-dd5d-4bed-9110-110068110f23\") " 
pod="openshift-ingress/router-default-575b7b88bd-d5lx2" Apr 16 17:40:38.446702 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:38.446501 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-registry-tls podName:1246ad0a-0cbe-41eb-b415-d3d5b58224b0 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:38.946481447 +0000 UTC m=+33.159514588 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-registry-tls") pod "image-registry-6d7d9f595c-np56x" (UID: "1246ad0a-0cbe-41eb-b415-d3d5b58224b0") : secret "image-registry-tls" not found Apr 16 17:40:38.446702 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.446534 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/550fd36d-dd5d-4bed-9110-110068110f23-metrics-certs\") pod \"router-default-575b7b88bd-d5lx2\" (UID: \"550fd36d-dd5d-4bed-9110-110068110f23\") " pod="openshift-ingress/router-default-575b7b88bd-d5lx2" Apr 16 17:40:38.446702 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.446568 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2zxj\" (UniqueName: \"kubernetes.io/projected/1330492b-5728-49ea-8675-b3472e46d2dc-kube-api-access-f2zxj\") pod \"network-check-source-7b678d77c7-gdtq9\" (UID: \"1330492b-5728-49ea-8675-b3472e46d2dc\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-gdtq9" Apr 16 17:40:38.446702 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.446599 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgm44\" (UniqueName: \"kubernetes.io/projected/bc57afd6-d40c-42e7-a331-579d5c302355-kube-api-access-vgm44\") pod \"ingress-canary-gxh2q\" (UID: 
\"bc57afd6-d40c-42e7-a331-579d5c302355\") " pod="openshift-ingress-canary/ingress-canary-gxh2q" Apr 16 17:40:38.446702 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:38.446643 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/550fd36d-dd5d-4bed-9110-110068110f23-service-ca-bundle podName:550fd36d-dd5d-4bed-9110-110068110f23 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:38.946624494 +0000 UTC m=+33.159657643 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/550fd36d-dd5d-4bed-9110-110068110f23-service-ca-bundle") pod "router-default-575b7b88bd-d5lx2" (UID: "550fd36d-dd5d-4bed-9110-110068110f23") : configmap references non-existent config key: service-ca.crt Apr 16 17:40:38.446702 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.446695 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-ca-trust-extracted\") pod \"image-registry-6d7d9f595c-np56x\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") " pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" Apr 16 17:40:38.446702 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.446707 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b27c0f5d-1775-4ae8-8903-1d44802e9f35-tmp\") pod \"insights-operator-5785d4fcdd-vhf4j\" (UID: \"b27c0f5d-1775-4ae8-8903-1d44802e9f35\") " pod="openshift-insights/insights-operator-5785d4fcdd-vhf4j" Apr 16 17:40:38.447226 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.446738 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-registry-certificates\") pod \"image-registry-6d7d9f595c-np56x\" (UID: 
\"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") " pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" Apr 16 17:40:38.447226 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:38.446772 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 17:40:38.447226 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.446780 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqjdw\" (UniqueName: \"kubernetes.io/projected/39536cc1-6596-4c35-a9d6-a93ef6779640-kube-api-access-nqjdw\") pod \"console-operator-d87b8d5fc-fmxzp\" (UID: \"39536cc1-6596-4c35-a9d6-a93ef6779640\") " pod="openshift-console-operator/console-operator-d87b8d5fc-fmxzp" Apr 16 17:40:38.447226 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:38.446816 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/550fd36d-dd5d-4bed-9110-110068110f23-metrics-certs podName:550fd36d-dd5d-4bed-9110-110068110f23 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:38.946803127 +0000 UTC m=+33.159836253 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/550fd36d-dd5d-4bed-9110-110068110f23-metrics-certs") pod "router-default-575b7b88bd-d5lx2" (UID: "550fd36d-dd5d-4bed-9110-110068110f23") : secret "router-metrics-certs-default" not found Apr 16 17:40:38.447226 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.446835 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/584be834-546d-49ca-8379-4b77cb13e2ba-serving-cert\") pod \"service-ca-operator-69965bb79d-zjsrk\" (UID: \"584be834-546d-49ca-8379-4b77cb13e2ba\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zjsrk" Apr 16 17:40:38.447226 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.446861 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/b27c0f5d-1775-4ae8-8903-1d44802e9f35-snapshots\") pod \"insights-operator-5785d4fcdd-vhf4j\" (UID: \"b27c0f5d-1775-4ae8-8903-1d44802e9f35\") " pod="openshift-insights/insights-operator-5785d4fcdd-vhf4j" Apr 16 17:40:38.447226 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.446889 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc57afd6-d40c-42e7-a331-579d5c302355-cert\") pod \"ingress-canary-gxh2q\" (UID: \"bc57afd6-d40c-42e7-a331-579d5c302355\") " pod="openshift-ingress-canary/ingress-canary-gxh2q" Apr 16 17:40:38.447226 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.446920 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/550fd36d-dd5d-4bed-9110-110068110f23-default-certificate\") pod \"router-default-575b7b88bd-d5lx2\" (UID: \"550fd36d-dd5d-4bed-9110-110068110f23\") " pod="openshift-ingress/router-default-575b7b88bd-d5lx2" Apr 16 17:40:38.447226 
ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.446948 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l77jk\" (UniqueName: \"kubernetes.io/projected/550fd36d-dd5d-4bed-9110-110068110f23-kube-api-access-l77jk\") pod \"router-default-575b7b88bd-d5lx2\" (UID: \"550fd36d-dd5d-4bed-9110-110068110f23\") " pod="openshift-ingress/router-default-575b7b88bd-d5lx2" Apr 16 17:40:38.447226 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.447050 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-ca-trust-extracted\") pod \"image-registry-6d7d9f595c-np56x\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") " pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" Apr 16 17:40:38.447226 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.447161 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b27c0f5d-1775-4ae8-8903-1d44802e9f35-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-vhf4j\" (UID: \"b27c0f5d-1775-4ae8-8903-1d44802e9f35\") " pod="openshift-insights/insights-operator-5785d4fcdd-vhf4j" Apr 16 17:40:38.447226 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.447177 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39536cc1-6596-4c35-a9d6-a93ef6779640-serving-cert\") pod \"console-operator-d87b8d5fc-fmxzp\" (UID: \"39536cc1-6596-4c35-a9d6-a93ef6779640\") " pod="openshift-console-operator/console-operator-d87b8d5fc-fmxzp" Apr 16 17:40:38.447740 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.447282 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/584be834-546d-49ca-8379-4b77cb13e2ba-config\") pod 
\"service-ca-operator-69965bb79d-zjsrk\" (UID: \"584be834-546d-49ca-8379-4b77cb13e2ba\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zjsrk" Apr 16 17:40:38.447740 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.447314 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-registry-certificates\") pod \"image-registry-6d7d9f595c-np56x\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") " pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" Apr 16 17:40:38.447740 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.447318 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f50c2657-216b-4259-a264-f4f602acfee8-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-985p5\" (UID: \"f50c2657-216b-4259-a264-f4f602acfee8\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-985p5" Apr 16 17:40:38.447740 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.447386 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f50c2657-216b-4259-a264-f4f602acfee8-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-985p5\" (UID: \"f50c2657-216b-4259-a264-f4f602acfee8\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-985p5" Apr 16 17:40:38.447740 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.447439 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/214ea8ee-8a72-42a3-abfb-ceb3622fea44-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-fmwp9\" (UID: \"214ea8ee-8a72-42a3-abfb-ceb3622fea44\") " 
pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-fmwp9" Apr 16 17:40:38.447740 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.447477 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b60ee2a1-c8c7-417f-a887-3f7008b3fb0a-config-volume\") pod \"dns-default-9d4rv\" (UID: \"b60ee2a1-c8c7-417f-a887-3f7008b3fb0a\") " pod="openshift-dns/dns-default-9d4rv" Apr 16 17:40:38.447740 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.447504 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39536cc1-6596-4c35-a9d6-a93ef6779640-config\") pod \"console-operator-d87b8d5fc-fmxzp\" (UID: \"39536cc1-6596-4c35-a9d6-a93ef6779640\") " pod="openshift-console-operator/console-operator-d87b8d5fc-fmxzp" Apr 16 17:40:38.447740 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.447535 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39536cc1-6596-4c35-a9d6-a93ef6779640-trusted-ca\") pod \"console-operator-d87b8d5fc-fmxzp\" (UID: \"39536cc1-6596-4c35-a9d6-a93ef6779640\") " pod="openshift-console-operator/console-operator-d87b8d5fc-fmxzp" Apr 16 17:40:38.447740 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.447566 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b27c0f5d-1775-4ae8-8903-1d44802e9f35-serving-cert\") pod \"insights-operator-5785d4fcdd-vhf4j\" (UID: \"b27c0f5d-1775-4ae8-8903-1d44802e9f35\") " pod="openshift-insights/insights-operator-5785d4fcdd-vhf4j" Apr 16 17:40:38.447740 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.447589 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2wlz\" 
(UniqueName: \"kubernetes.io/projected/b60ee2a1-c8c7-417f-a887-3f7008b3fb0a-kube-api-access-m2wlz\") pod \"dns-default-9d4rv\" (UID: \"b60ee2a1-c8c7-417f-a887-3f7008b3fb0a\") " pod="openshift-dns/dns-default-9d4rv" Apr 16 17:40:38.447740 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.447615 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-trusted-ca\") pod \"image-registry-6d7d9f595c-np56x\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") " pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" Apr 16 17:40:38.447740 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.447629 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/b27c0f5d-1775-4ae8-8903-1d44802e9f35-snapshots\") pod \"insights-operator-5785d4fcdd-vhf4j\" (UID: \"b27c0f5d-1775-4ae8-8903-1d44802e9f35\") " pod="openshift-insights/insights-operator-5785d4fcdd-vhf4j" Apr 16 17:40:38.447740 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.447643 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-installation-pull-secrets\") pod \"image-registry-6d7d9f595c-np56x\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") " pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" Apr 16 17:40:38.447740 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.447645 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b27c0f5d-1775-4ae8-8903-1d44802e9f35-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-vhf4j\" (UID: \"b27c0f5d-1775-4ae8-8903-1d44802e9f35\") " pod="openshift-insights/insights-operator-5785d4fcdd-vhf4j" Apr 16 17:40:38.447740 ip-10-0-128-241 kubenswrapper[2573]: I0416 
17:40:38.447671 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b60ee2a1-c8c7-417f-a887-3f7008b3fb0a-tmp-dir\") pod \"dns-default-9d4rv\" (UID: \"b60ee2a1-c8c7-417f-a887-3f7008b3fb0a\") " pod="openshift-dns/dns-default-9d4rv" Apr 16 17:40:38.448654 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.448631 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-trusted-ca\") pod \"image-registry-6d7d9f595c-np56x\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") " pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" Apr 16 17:40:38.448852 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.448806 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/550fd36d-dd5d-4bed-9110-110068110f23-stats-auth\") pod \"router-default-575b7b88bd-d5lx2\" (UID: \"550fd36d-dd5d-4bed-9110-110068110f23\") " pod="openshift-ingress/router-default-575b7b88bd-d5lx2" Apr 16 17:40:38.450174 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.450152 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-image-registry-private-configuration\") pod \"image-registry-6d7d9f595c-np56x\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") " pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" Apr 16 17:40:38.450323 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.450259 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/550fd36d-dd5d-4bed-9110-110068110f23-default-certificate\") pod \"router-default-575b7b88bd-d5lx2\" (UID: \"550fd36d-dd5d-4bed-9110-110068110f23\") " 
pod="openshift-ingress/router-default-575b7b88bd-d5lx2" Apr 16 17:40:38.450679 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.450656 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-installation-pull-secrets\") pod \"image-registry-6d7d9f595c-np56x\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") " pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" Apr 16 17:40:38.450964 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.450923 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b27c0f5d-1775-4ae8-8903-1d44802e9f35-serving-cert\") pod \"insights-operator-5785d4fcdd-vhf4j\" (UID: \"b27c0f5d-1775-4ae8-8903-1d44802e9f35\") " pod="openshift-insights/insights-operator-5785d4fcdd-vhf4j" Apr 16 17:40:38.453423 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.453393 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/584be834-546d-49ca-8379-4b77cb13e2ba-serving-cert\") pod \"service-ca-operator-69965bb79d-zjsrk\" (UID: \"584be834-546d-49ca-8379-4b77cb13e2ba\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zjsrk" Apr 16 17:40:38.453561 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.453536 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/584be834-546d-49ca-8379-4b77cb13e2ba-config\") pod \"service-ca-operator-69965bb79d-zjsrk\" (UID: \"584be834-546d-49ca-8379-4b77cb13e2ba\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zjsrk" Apr 16 17:40:38.458459 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.458434 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nbtd\" (UniqueName: 
\"kubernetes.io/projected/b27c0f5d-1775-4ae8-8903-1d44802e9f35-kube-api-access-5nbtd\") pod \"insights-operator-5785d4fcdd-vhf4j\" (UID: \"b27c0f5d-1775-4ae8-8903-1d44802e9f35\") " pod="openshift-insights/insights-operator-5785d4fcdd-vhf4j" Apr 16 17:40:38.459162 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.459142 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cg4n\" (UniqueName: \"kubernetes.io/projected/584be834-546d-49ca-8379-4b77cb13e2ba-kube-api-access-4cg4n\") pod \"service-ca-operator-69965bb79d-zjsrk\" (UID: \"584be834-546d-49ca-8379-4b77cb13e2ba\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zjsrk" Apr 16 17:40:38.459546 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.459510 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctsld\" (UniqueName: \"kubernetes.io/projected/47905f73-0b0a-452f-bf8b-eaae31126adc-kube-api-access-ctsld\") pod \"volume-data-source-validator-7d955d5dd4-lbkd4\" (UID: \"47905f73-0b0a-452f-bf8b-eaae31126adc\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-lbkd4" Apr 16 17:40:38.466052 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.466032 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqsth\" (UniqueName: \"kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-kube-api-access-cqsth\") pod \"image-registry-6d7d9f595c-np56x\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") " pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" Apr 16 17:40:38.471431 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.471407 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l77jk\" (UniqueName: \"kubernetes.io/projected/550fd36d-dd5d-4bed-9110-110068110f23-kube-api-access-l77jk\") pod \"router-default-575b7b88bd-d5lx2\" (UID: \"550fd36d-dd5d-4bed-9110-110068110f23\") " 
pod="openshift-ingress/router-default-575b7b88bd-d5lx2" Apr 16 17:40:38.472063 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.472041 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-bound-sa-token\") pod \"image-registry-6d7d9f595c-np56x\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") " pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" Apr 16 17:40:38.537677 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.537640 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-vhf4j" Apr 16 17:40:38.548153 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.548117 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f50c2657-216b-4259-a264-f4f602acfee8-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-985p5\" (UID: \"f50c2657-216b-4259-a264-f4f602acfee8\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-985p5" Apr 16 17:40:38.548153 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.548163 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f50c2657-216b-4259-a264-f4f602acfee8-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-985p5\" (UID: \"f50c2657-216b-4259-a264-f4f602acfee8\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-985p5" Apr 16 17:40:38.548344 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.548192 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/214ea8ee-8a72-42a3-abfb-ceb3622fea44-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-fmwp9\" (UID: 
\"214ea8ee-8a72-42a3-abfb-ceb3622fea44\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-fmwp9" Apr 16 17:40:38.548344 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.548233 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b60ee2a1-c8c7-417f-a887-3f7008b3fb0a-config-volume\") pod \"dns-default-9d4rv\" (UID: \"b60ee2a1-c8c7-417f-a887-3f7008b3fb0a\") " pod="openshift-dns/dns-default-9d4rv" Apr 16 17:40:38.548344 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.548258 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39536cc1-6596-4c35-a9d6-a93ef6779640-config\") pod \"console-operator-d87b8d5fc-fmxzp\" (UID: \"39536cc1-6596-4c35-a9d6-a93ef6779640\") " pod="openshift-console-operator/console-operator-d87b8d5fc-fmxzp" Apr 16 17:40:38.548344 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.548280 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39536cc1-6596-4c35-a9d6-a93ef6779640-trusted-ca\") pod \"console-operator-d87b8d5fc-fmxzp\" (UID: \"39536cc1-6596-4c35-a9d6-a93ef6779640\") " pod="openshift-console-operator/console-operator-d87b8d5fc-fmxzp" Apr 16 17:40:38.548344 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.548309 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2wlz\" (UniqueName: \"kubernetes.io/projected/b60ee2a1-c8c7-417f-a887-3f7008b3fb0a-kube-api-access-m2wlz\") pod \"dns-default-9d4rv\" (UID: \"b60ee2a1-c8c7-417f-a887-3f7008b3fb0a\") " pod="openshift-dns/dns-default-9d4rv" Apr 16 17:40:38.549091 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:38.548718 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 
17:40:38.549091 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.548764 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b60ee2a1-c8c7-417f-a887-3f7008b3fb0a-tmp-dir\") pod \"dns-default-9d4rv\" (UID: \"b60ee2a1-c8c7-417f-a887-3f7008b3fb0a\") " pod="openshift-dns/dns-default-9d4rv" Apr 16 17:40:38.549091 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:38.548793 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f50c2657-216b-4259-a264-f4f602acfee8-networking-console-plugin-cert podName:f50c2657-216b-4259-a264-f4f602acfee8 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:39.048771787 +0000 UTC m=+33.261804930 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f50c2657-216b-4259-a264-f4f602acfee8-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-985p5" (UID: "f50c2657-216b-4259-a264-f4f602acfee8") : secret "networking-console-plugin-cert" not found Apr 16 17:40:38.549091 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.548870 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31969ea1-4893-4e83-ac1e-f5882799c5da-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-8gx9z\" (UID: \"31969ea1-4893-4e83-ac1e-f5882799c5da\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-8gx9z" Apr 16 17:40:38.549091 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.548917 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-667ss\" (UniqueName: \"kubernetes.io/projected/214ea8ee-8a72-42a3-abfb-ceb3622fea44-kube-api-access-667ss\") pod \"cluster-monitoring-operator-6667474d89-fmwp9\" (UID: \"214ea8ee-8a72-42a3-abfb-ceb3622fea44\") " 
pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-fmwp9" Apr 16 17:40:38.549091 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.548936 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f50c2657-216b-4259-a264-f4f602acfee8-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-985p5\" (UID: \"f50c2657-216b-4259-a264-f4f602acfee8\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-985p5" Apr 16 17:40:38.549091 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.549008 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39536cc1-6596-4c35-a9d6-a93ef6779640-config\") pod \"console-operator-d87b8d5fc-fmxzp\" (UID: \"39536cc1-6596-4c35-a9d6-a93ef6779640\") " pod="openshift-console-operator/console-operator-d87b8d5fc-fmxzp" Apr 16 17:40:38.549091 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.549032 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b60ee2a1-c8c7-417f-a887-3f7008b3fb0a-tmp-dir\") pod \"dns-default-9d4rv\" (UID: \"b60ee2a1-c8c7-417f-a887-3f7008b3fb0a\") " pod="openshift-dns/dns-default-9d4rv" Apr 16 17:40:38.549091 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.549050 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/31969ea1-4893-4e83-ac1e-f5882799c5da-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-8gx9z\" (UID: \"31969ea1-4893-4e83-ac1e-f5882799c5da\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-8gx9z" Apr 16 17:40:38.549604 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.549129 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/214ea8ee-8a72-42a3-abfb-ceb3622fea44-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-fmwp9\" (UID: \"214ea8ee-8a72-42a3-abfb-ceb3622fea44\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-fmwp9" Apr 16 17:40:38.549604 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.549170 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b60ee2a1-c8c7-417f-a887-3f7008b3fb0a-metrics-tls\") pod \"dns-default-9d4rv\" (UID: \"b60ee2a1-c8c7-417f-a887-3f7008b3fb0a\") " pod="openshift-dns/dns-default-9d4rv" Apr 16 17:40:38.549604 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.549202 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d8v68\" (UniqueName: \"kubernetes.io/projected/31969ea1-4893-4e83-ac1e-f5882799c5da-kube-api-access-d8v68\") pod \"kube-storage-version-migrator-operator-756bb7d76f-8gx9z\" (UID: \"31969ea1-4893-4e83-ac1e-f5882799c5da\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-8gx9z" Apr 16 17:40:38.549604 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:38.549304 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 17:40:38.549604 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:38.549319 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 17:40:38.549604 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:38.549358 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b60ee2a1-c8c7-417f-a887-3f7008b3fb0a-metrics-tls podName:b60ee2a1-c8c7-417f-a887-3f7008b3fb0a nodeName:}" failed. No retries permitted until 2026-04-16 17:40:39.049342212 +0000 UTC m=+33.262375338 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b60ee2a1-c8c7-417f-a887-3f7008b3fb0a-metrics-tls") pod "dns-default-9d4rv" (UID: "b60ee2a1-c8c7-417f-a887-3f7008b3fb0a") : secret "dns-default-metrics-tls" not found Apr 16 17:40:38.549604 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:38.549396 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/214ea8ee-8a72-42a3-abfb-ceb3622fea44-cluster-monitoring-operator-tls podName:214ea8ee-8a72-42a3-abfb-ceb3622fea44 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:39.049381103 +0000 UTC m=+33.262414253 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/214ea8ee-8a72-42a3-abfb-ceb3622fea44-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-fmwp9" (UID: "214ea8ee-8a72-42a3-abfb-ceb3622fea44") : secret "cluster-monitoring-operator-tls" not found Apr 16 17:40:38.549604 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.549432 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39536cc1-6596-4c35-a9d6-a93ef6779640-trusted-ca\") pod \"console-operator-d87b8d5fc-fmxzp\" (UID: \"39536cc1-6596-4c35-a9d6-a93ef6779640\") " pod="openshift-console-operator/console-operator-d87b8d5fc-fmxzp" Apr 16 17:40:38.549604 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.549433 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f2zxj\" (UniqueName: \"kubernetes.io/projected/1330492b-5728-49ea-8675-b3472e46d2dc-kube-api-access-f2zxj\") pod \"network-check-source-7b678d77c7-gdtq9\" (UID: \"1330492b-5728-49ea-8675-b3472e46d2dc\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-gdtq9" Apr 16 17:40:38.549604 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.549464 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vgm44\" (UniqueName: \"kubernetes.io/projected/bc57afd6-d40c-42e7-a331-579d5c302355-kube-api-access-vgm44\") pod \"ingress-canary-gxh2q\" (UID: \"bc57afd6-d40c-42e7-a331-579d5c302355\") " pod="openshift-ingress-canary/ingress-canary-gxh2q" Apr 16 17:40:38.549604 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.549512 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/214ea8ee-8a72-42a3-abfb-ceb3622fea44-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-fmwp9\" (UID: \"214ea8ee-8a72-42a3-abfb-ceb3622fea44\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-fmwp9" Apr 16 17:40:38.549604 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.549517 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqjdw\" (UniqueName: \"kubernetes.io/projected/39536cc1-6596-4c35-a9d6-a93ef6779640-kube-api-access-nqjdw\") pod \"console-operator-d87b8d5fc-fmxzp\" (UID: \"39536cc1-6596-4c35-a9d6-a93ef6779640\") " pod="openshift-console-operator/console-operator-d87b8d5fc-fmxzp" Apr 16 17:40:38.549604 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.549584 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc57afd6-d40c-42e7-a331-579d5c302355-cert\") pod \"ingress-canary-gxh2q\" (UID: \"bc57afd6-d40c-42e7-a331-579d5c302355\") " pod="openshift-ingress-canary/ingress-canary-gxh2q" Apr 16 17:40:38.550101 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.549563 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31969ea1-4893-4e83-ac1e-f5882799c5da-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-8gx9z\" (UID: \"31969ea1-4893-4e83-ac1e-f5882799c5da\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-8gx9z" Apr 16 17:40:38.550101 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.549624 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39536cc1-6596-4c35-a9d6-a93ef6779640-serving-cert\") pod \"console-operator-d87b8d5fc-fmxzp\" (UID: \"39536cc1-6596-4c35-a9d6-a93ef6779640\") " pod="openshift-console-operator/console-operator-d87b8d5fc-fmxzp" Apr 16 17:40:38.550101 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:38.549655 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 17:40:38.550101 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.549712 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zjsrk" Apr 16 17:40:38.550101 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:38.549725 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc57afd6-d40c-42e7-a331-579d5c302355-cert podName:bc57afd6-d40c-42e7-a331-579d5c302355 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:39.049709905 +0000 UTC m=+33.262743056 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bc57afd6-d40c-42e7-a331-579d5c302355-cert") pod "ingress-canary-gxh2q" (UID: "bc57afd6-d40c-42e7-a331-579d5c302355") : secret "canary-serving-cert" not found Apr 16 17:40:38.550360 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.550302 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b60ee2a1-c8c7-417f-a887-3f7008b3fb0a-config-volume\") pod \"dns-default-9d4rv\" (UID: \"b60ee2a1-c8c7-417f-a887-3f7008b3fb0a\") " pod="openshift-dns/dns-default-9d4rv" Apr 16 17:40:38.551820 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.551800 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/31969ea1-4893-4e83-ac1e-f5882799c5da-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-8gx9z\" (UID: \"31969ea1-4893-4e83-ac1e-f5882799c5da\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-8gx9z" Apr 16 17:40:38.551944 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.551928 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39536cc1-6596-4c35-a9d6-a93ef6779640-serving-cert\") pod \"console-operator-d87b8d5fc-fmxzp\" (UID: \"39536cc1-6596-4c35-a9d6-a93ef6779640\") " pod="openshift-console-operator/console-operator-d87b8d5fc-fmxzp" Apr 16 17:40:38.559836 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.559726 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-667ss\" (UniqueName: \"kubernetes.io/projected/214ea8ee-8a72-42a3-abfb-ceb3622fea44-kube-api-access-667ss\") pod \"cluster-monitoring-operator-6667474d89-fmwp9\" (UID: \"214ea8ee-8a72-42a3-abfb-ceb3622fea44\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-fmwp9" Apr 16 
17:40:38.560270 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.560202 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2wlz\" (UniqueName: \"kubernetes.io/projected/b60ee2a1-c8c7-417f-a887-3f7008b3fb0a-kube-api-access-m2wlz\") pod \"dns-default-9d4rv\" (UID: \"b60ee2a1-c8c7-417f-a887-3f7008b3fb0a\") " pod="openshift-dns/dns-default-9d4rv" Apr 16 17:40:38.560596 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.560575 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqjdw\" (UniqueName: \"kubernetes.io/projected/39536cc1-6596-4c35-a9d6-a93ef6779640-kube-api-access-nqjdw\") pod \"console-operator-d87b8d5fc-fmxzp\" (UID: \"39536cc1-6596-4c35-a9d6-a93ef6779640\") " pod="openshift-console-operator/console-operator-d87b8d5fc-fmxzp" Apr 16 17:40:38.560879 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.560832 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2zxj\" (UniqueName: \"kubernetes.io/projected/1330492b-5728-49ea-8675-b3472e46d2dc-kube-api-access-f2zxj\") pod \"network-check-source-7b678d77c7-gdtq9\" (UID: \"1330492b-5728-49ea-8675-b3472e46d2dc\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-gdtq9" Apr 16 17:40:38.561094 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.561072 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgm44\" (UniqueName: \"kubernetes.io/projected/bc57afd6-d40c-42e7-a331-579d5c302355-kube-api-access-vgm44\") pod \"ingress-canary-gxh2q\" (UID: \"bc57afd6-d40c-42e7-a331-579d5c302355\") " pod="openshift-ingress-canary/ingress-canary-gxh2q" Apr 16 17:40:38.561726 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.561706 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8v68\" (UniqueName: \"kubernetes.io/projected/31969ea1-4893-4e83-ac1e-f5882799c5da-kube-api-access-d8v68\") pod 
\"kube-storage-version-migrator-operator-756bb7d76f-8gx9z\" (UID: \"31969ea1-4893-4e83-ac1e-f5882799c5da\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-8gx9z" Apr 16 17:40:38.581188 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.581163 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-lbkd4" Apr 16 17:40:38.598037 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.598013 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-8gx9z" Apr 16 17:40:38.608877 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.608813 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-gdtq9" Apr 16 17:40:38.645877 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.645849 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-fmxzp" Apr 16 17:40:38.852540 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.852505 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-9l9b5\" (UID: \"7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-9l9b5" Apr 16 17:40:38.852718 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:38.852651 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 17:40:38.852789 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:38.852721 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d-samples-operator-tls podName:7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d nodeName:}" failed. No retries permitted until 2026-04-16 17:40:39.852703709 +0000 UTC m=+34.065736840 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d-samples-operator-tls") pod "cluster-samples-operator-667775844f-9l9b5" (UID: "7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d") : secret "samples-operator-tls" not found Apr 16 17:40:38.953037 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.952996 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-registry-tls\") pod \"image-registry-6d7d9f595c-np56x\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") " pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" Apr 16 17:40:38.953241 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:38.953151 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 17:40:38.953241 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:38.953173 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d7d9f595c-np56x: secret "image-registry-tls" not found Apr 16 17:40:38.953348 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.953147 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/550fd36d-dd5d-4bed-9110-110068110f23-service-ca-bundle\") pod \"router-default-575b7b88bd-d5lx2\" (UID: \"550fd36d-dd5d-4bed-9110-110068110f23\") " pod="openshift-ingress/router-default-575b7b88bd-d5lx2" Apr 16 17:40:38.953348 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:38.953243 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-registry-tls podName:1246ad0a-0cbe-41eb-b415-d3d5b58224b0 nodeName:}" failed. 
No retries permitted until 2026-04-16 17:40:39.953211776 +0000 UTC m=+34.166244902 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-registry-tls") pod "image-registry-6d7d9f595c-np56x" (UID: "1246ad0a-0cbe-41eb-b415-d3d5b58224b0") : secret "image-registry-tls" not found Apr 16 17:40:38.953348 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:38.953289 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/550fd36d-dd5d-4bed-9110-110068110f23-service-ca-bundle podName:550fd36d-dd5d-4bed-9110-110068110f23 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:39.95327807 +0000 UTC m=+34.166311202 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/550fd36d-dd5d-4bed-9110-110068110f23-service-ca-bundle") pod "router-default-575b7b88bd-d5lx2" (UID: "550fd36d-dd5d-4bed-9110-110068110f23") : configmap references non-existent config key: service-ca.crt Apr 16 17:40:38.953348 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:38.953309 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/550fd36d-dd5d-4bed-9110-110068110f23-metrics-certs\") pod \"router-default-575b7b88bd-d5lx2\" (UID: \"550fd36d-dd5d-4bed-9110-110068110f23\") " pod="openshift-ingress/router-default-575b7b88bd-d5lx2" Apr 16 17:40:38.953535 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:38.953462 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 17:40:38.953535 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:38.953517 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/550fd36d-dd5d-4bed-9110-110068110f23-metrics-certs podName:550fd36d-dd5d-4bed-9110-110068110f23 nodeName:}" 
failed. No retries permitted until 2026-04-16 17:40:39.953500471 +0000 UTC m=+34.166533607 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/550fd36d-dd5d-4bed-9110-110068110f23-metrics-certs") pod "router-default-575b7b88bd-d5lx2" (UID: "550fd36d-dd5d-4bed-9110-110068110f23") : secret "router-metrics-certs-default" not found Apr 16 17:40:39.054694 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:39.054654 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc57afd6-d40c-42e7-a331-579d5c302355-cert\") pod \"ingress-canary-gxh2q\" (UID: \"bc57afd6-d40c-42e7-a331-579d5c302355\") " pod="openshift-ingress-canary/ingress-canary-gxh2q" Apr 16 17:40:39.054867 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:39.054738 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f50c2657-216b-4259-a264-f4f602acfee8-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-985p5\" (UID: \"f50c2657-216b-4259-a264-f4f602acfee8\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-985p5" Apr 16 17:40:39.054867 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:39.054816 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 17:40:39.054867 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:39.054849 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/214ea8ee-8a72-42a3-abfb-ceb3622fea44-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-fmwp9\" (UID: \"214ea8ee-8a72-42a3-abfb-ceb3622fea44\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-fmwp9" Apr 16 17:40:39.055041 ip-10-0-128-241 
kubenswrapper[2573]: E0416 17:40:39.054887 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc57afd6-d40c-42e7-a331-579d5c302355-cert podName:bc57afd6-d40c-42e7-a331-579d5c302355 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:40.054868757 +0000 UTC m=+34.267901901 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bc57afd6-d40c-42e7-a331-579d5c302355-cert") pod "ingress-canary-gxh2q" (UID: "bc57afd6-d40c-42e7-a331-579d5c302355") : secret "canary-serving-cert" not found Apr 16 17:40:39.055041 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:39.054920 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 17:40:39.055041 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:39.054886 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 17:40:39.055041 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:39.054970 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/214ea8ee-8a72-42a3-abfb-ceb3622fea44-cluster-monitoring-operator-tls podName:214ea8ee-8a72-42a3-abfb-ceb3622fea44 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:40.054950908 +0000 UTC m=+34.267984037 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/214ea8ee-8a72-42a3-abfb-ceb3622fea44-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-fmwp9" (UID: "214ea8ee-8a72-42a3-abfb-ceb3622fea44") : secret "cluster-monitoring-operator-tls" not found Apr 16 17:40:39.055041 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:39.054915 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b60ee2a1-c8c7-417f-a887-3f7008b3fb0a-metrics-tls\") pod \"dns-default-9d4rv\" (UID: \"b60ee2a1-c8c7-417f-a887-3f7008b3fb0a\") " pod="openshift-dns/dns-default-9d4rv" Apr 16 17:40:39.055041 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:39.055005 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f50c2657-216b-4259-a264-f4f602acfee8-networking-console-plugin-cert podName:f50c2657-216b-4259-a264-f4f602acfee8 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:40.054992965 +0000 UTC m=+34.268026104 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f50c2657-216b-4259-a264-f4f602acfee8-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-985p5" (UID: "f50c2657-216b-4259-a264-f4f602acfee8") : secret "networking-console-plugin-cert" not found Apr 16 17:40:39.055041 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:39.055012 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 17:40:39.055408 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:39.055053 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b60ee2a1-c8c7-417f-a887-3f7008b3fb0a-metrics-tls podName:b60ee2a1-c8c7-417f-a887-3f7008b3fb0a nodeName:}" failed. 
No retries permitted until 2026-04-16 17:40:40.05504197 +0000 UTC m=+34.268075110 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b60ee2a1-c8c7-417f-a887-3f7008b3fb0a-metrics-tls") pod "dns-default-9d4rv" (UID: "b60ee2a1-c8c7-417f-a887-3f7008b3fb0a") : secret "dns-default-metrics-tls" not found Apr 16 17:40:39.438626 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:39.438595 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-769fj" Apr 16 17:40:39.438820 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:39.438595 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bklp6" Apr 16 17:40:39.438820 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:39.438598 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n4qhr" Apr 16 17:40:39.443003 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:39.442970 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 17:40:39.443134 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:39.443014 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 17:40:39.443134 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:39.443023 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jmq5l\"" Apr 16 17:40:39.443282 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:39.443178 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wmwn6\"" Apr 16 17:40:39.862107 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:39.862011 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-9l9b5\" (UID: \"7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-9l9b5" Apr 16 17:40:39.862598 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:39.862186 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 17:40:39.862598 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:39.862282 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d-samples-operator-tls podName:7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d nodeName:}" failed. No retries permitted until 2026-04-16 17:40:41.862261142 +0000 UTC m=+36.075294274 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d-samples-operator-tls") pod "cluster-samples-operator-667775844f-9l9b5" (UID: "7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d") : secret "samples-operator-tls" not found Apr 16 17:40:39.963100 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:39.963054 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-registry-tls\") pod \"image-registry-6d7d9f595c-np56x\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") " pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" Apr 16 17:40:39.963307 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:39.963143 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/550fd36d-dd5d-4bed-9110-110068110f23-service-ca-bundle\") pod 
\"router-default-575b7b88bd-d5lx2\" (UID: \"550fd36d-dd5d-4bed-9110-110068110f23\") " pod="openshift-ingress/router-default-575b7b88bd-d5lx2" Apr 16 17:40:39.963307 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:39.963169 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/550fd36d-dd5d-4bed-9110-110068110f23-metrics-certs\") pod \"router-default-575b7b88bd-d5lx2\" (UID: \"550fd36d-dd5d-4bed-9110-110068110f23\") " pod="openshift-ingress/router-default-575b7b88bd-d5lx2" Apr 16 17:40:39.963307 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:39.963259 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 17:40:39.963307 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:39.963280 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d7d9f595c-np56x: secret "image-registry-tls" not found Apr 16 17:40:39.963515 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:39.963328 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 17:40:39.963515 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:39.963339 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-registry-tls podName:1246ad0a-0cbe-41eb-b415-d3d5b58224b0 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:41.963317668 +0000 UTC m=+36.176350817 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-registry-tls") pod "image-registry-6d7d9f595c-np56x" (UID: "1246ad0a-0cbe-41eb-b415-d3d5b58224b0") : secret "image-registry-tls" not found Apr 16 17:40:39.963515 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:39.963356 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/550fd36d-dd5d-4bed-9110-110068110f23-service-ca-bundle podName:550fd36d-dd5d-4bed-9110-110068110f23 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:41.96335008 +0000 UTC m=+36.176383205 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/550fd36d-dd5d-4bed-9110-110068110f23-service-ca-bundle") pod "router-default-575b7b88bd-d5lx2" (UID: "550fd36d-dd5d-4bed-9110-110068110f23") : configmap references non-existent config key: service-ca.crt Apr 16 17:40:39.963515 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:39.963378 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/550fd36d-dd5d-4bed-9110-110068110f23-metrics-certs podName:550fd36d-dd5d-4bed-9110-110068110f23 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:41.963361073 +0000 UTC m=+36.176394211 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/550fd36d-dd5d-4bed-9110-110068110f23-metrics-certs") pod "router-default-575b7b88bd-d5lx2" (UID: "550fd36d-dd5d-4bed-9110-110068110f23") : secret "router-metrics-certs-default" not found Apr 16 17:40:40.064586 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:40.064541 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b60ee2a1-c8c7-417f-a887-3f7008b3fb0a-metrics-tls\") pod \"dns-default-9d4rv\" (UID: \"b60ee2a1-c8c7-417f-a887-3f7008b3fb0a\") " pod="openshift-dns/dns-default-9d4rv" Apr 16 17:40:40.064850 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:40.064626 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7db52a98-86b8-46da-a83e-8f6ee99d696d-metrics-certs\") pod \"network-metrics-daemon-n4qhr\" (UID: \"7db52a98-86b8-46da-a83e-8f6ee99d696d\") " pod="openshift-multus/network-metrics-daemon-n4qhr" Apr 16 17:40:40.064850 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:40.064663 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc57afd6-d40c-42e7-a331-579d5c302355-cert\") pod \"ingress-canary-gxh2q\" (UID: \"bc57afd6-d40c-42e7-a331-579d5c302355\") " pod="openshift-ingress-canary/ingress-canary-gxh2q" Apr 16 17:40:40.064850 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:40.064686 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 17:40:40.064850 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:40.064717 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f50c2657-216b-4259-a264-f4f602acfee8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-5cb6cf4cb4-985p5\" (UID: \"f50c2657-216b-4259-a264-f4f602acfee8\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-985p5" Apr 16 17:40:40.064850 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:40.064746 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b60ee2a1-c8c7-417f-a887-3f7008b3fb0a-metrics-tls podName:b60ee2a1-c8c7-417f-a887-3f7008b3fb0a nodeName:}" failed. No retries permitted until 2026-04-16 17:40:42.064731515 +0000 UTC m=+36.277764641 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b60ee2a1-c8c7-417f-a887-3f7008b3fb0a-metrics-tls") pod "dns-default-9d4rv" (UID: "b60ee2a1-c8c7-417f-a887-3f7008b3fb0a") : secret "dns-default-metrics-tls" not found Apr 16 17:40:40.064850 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:40.064750 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 17:40:40.064850 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:40.064792 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 17:40:40.064850 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:40.064800 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7db52a98-86b8-46da-a83e-8f6ee99d696d-metrics-certs podName:7db52a98-86b8-46da-a83e-8f6ee99d696d nodeName:}" failed. No retries permitted until 2026-04-16 17:41:12.064784311 +0000 UTC m=+66.277817437 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7db52a98-86b8-46da-a83e-8f6ee99d696d-metrics-certs") pod "network-metrics-daemon-n4qhr" (UID: "7db52a98-86b8-46da-a83e-8f6ee99d696d") : secret "metrics-daemon-secret" not found Apr 16 17:40:40.064850 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:40.064819 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 17:40:40.064850 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:40.064835 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f50c2657-216b-4259-a264-f4f602acfee8-networking-console-plugin-cert podName:f50c2657-216b-4259-a264-f4f602acfee8 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:42.064821583 +0000 UTC m=+36.277854711 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f50c2657-216b-4259-a264-f4f602acfee8-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-985p5" (UID: "f50c2657-216b-4259-a264-f4f602acfee8") : secret "networking-console-plugin-cert" not found Apr 16 17:40:40.064850 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:40.064856 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc57afd6-d40c-42e7-a331-579d5c302355-cert podName:bc57afd6-d40c-42e7-a331-579d5c302355 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:42.064845863 +0000 UTC m=+36.277878989 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bc57afd6-d40c-42e7-a331-579d5c302355-cert") pod "ingress-canary-gxh2q" (UID: "bc57afd6-d40c-42e7-a331-579d5c302355") : secret "canary-serving-cert" not found Apr 16 17:40:40.065313 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:40.064910 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/214ea8ee-8a72-42a3-abfb-ceb3622fea44-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-fmwp9\" (UID: \"214ea8ee-8a72-42a3-abfb-ceb3622fea44\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-fmwp9" Apr 16 17:40:40.065313 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:40.064970 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 17:40:40.065313 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:40.064993 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/214ea8ee-8a72-42a3-abfb-ceb3622fea44-cluster-monitoring-operator-tls podName:214ea8ee-8a72-42a3-abfb-ceb3622fea44 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:42.064986077 +0000 UTC m=+36.278019203 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/214ea8ee-8a72-42a3-abfb-ceb3622fea44-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-fmwp9" (UID: "214ea8ee-8a72-42a3-abfb-ceb3622fea44") : secret "cluster-monitoring-operator-tls" not found Apr 16 17:40:40.166597 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:40.165899 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd-original-pull-secret\") pod \"global-pull-secret-syncer-bklp6\" (UID: \"f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd\") " pod="kube-system/global-pull-secret-syncer-bklp6" Apr 16 17:40:40.171240 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:40.170411 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd-original-pull-secret\") pod \"global-pull-secret-syncer-bklp6\" (UID: \"f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd\") " pod="kube-system/global-pull-secret-syncer-bklp6" Apr 16 17:40:40.267335 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:40.267169 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7559\" (UniqueName: \"kubernetes.io/projected/cac5b68b-21bc-4998-8cf4-855cf71cdc45-kube-api-access-l7559\") pod \"network-check-target-769fj\" (UID: \"cac5b68b-21bc-4998-8cf4-855cf71cdc45\") " pod="openshift-network-diagnostics/network-check-target-769fj" Apr 16 17:40:40.278626 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:40.275291 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7559\" (UniqueName: \"kubernetes.io/projected/cac5b68b-21bc-4998-8cf4-855cf71cdc45-kube-api-access-l7559\") pod \"network-check-target-769fj\" (UID: \"cac5b68b-21bc-4998-8cf4-855cf71cdc45\") " 
pod="openshift-network-diagnostics/network-check-target-769fj" Apr 16 17:40:40.349775 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:40.349721 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-769fj" Apr 16 17:40:40.357593 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:40.357372 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bklp6" Apr 16 17:40:40.395586 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:40.395564 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-8gx9z"] Apr 16 17:40:40.396450 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:40.396431 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-vhf4j"] Apr 16 17:40:40.413661 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:40.413639 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-gdtq9"] Apr 16 17:40:40.429985 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:40.429944 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-fmxzp"] Apr 16 17:40:40.430845 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:40.430827 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-zjsrk"] Apr 16 17:40:40.433496 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:40.433476 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-lbkd4"] Apr 16 17:40:40.478752 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:40.478716 2573 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb27c0f5d_1775_4ae8_8903_1d44802e9f35.slice/crio-619afc8aee8f657488ca820d2fd5fc7d83fd11ce1d3bec45cf929164f1950a5b WatchSource:0}: Error finding container 619afc8aee8f657488ca820d2fd5fc7d83fd11ce1d3bec45cf929164f1950a5b: Status 404 returned error can't find the container with id 619afc8aee8f657488ca820d2fd5fc7d83fd11ce1d3bec45cf929164f1950a5b Apr 16 17:40:40.479269 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:40.479238 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31969ea1_4893_4e83_ac1e_f5882799c5da.slice/crio-f00229f435c52f59e5c07a334679f4a5bc67af88881f1b5392f6dff27c75c2e5 WatchSource:0}: Error finding container f00229f435c52f59e5c07a334679f4a5bc67af88881f1b5392f6dff27c75c2e5: Status 404 returned error can't find the container with id f00229f435c52f59e5c07a334679f4a5bc67af88881f1b5392f6dff27c75c2e5 Apr 16 17:40:40.480631 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:40.480600 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1330492b_5728_49ea_8675_b3472e46d2dc.slice/crio-a78e385b1c97a9ec22301db2ea6a49ff460d6cb9ed95b795fc68de90fe981cdd WatchSource:0}: Error finding container a78e385b1c97a9ec22301db2ea6a49ff460d6cb9ed95b795fc68de90fe981cdd: Status 404 returned error can't find the container with id a78e385b1c97a9ec22301db2ea6a49ff460d6cb9ed95b795fc68de90fe981cdd Apr 16 17:40:40.481509 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:40.481149 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod584be834_546d_49ca_8379_4b77cb13e2ba.slice/crio-494429ad291111cc6a739f0bfd7e0b23a7b0ee37b8df3dc789d33ea4f8cabfd4 WatchSource:0}: Error finding container 494429ad291111cc6a739f0bfd7e0b23a7b0ee37b8df3dc789d33ea4f8cabfd4: Status 404 returned error can't find 
the container with id 494429ad291111cc6a739f0bfd7e0b23a7b0ee37b8df3dc789d33ea4f8cabfd4 Apr 16 17:40:40.482380 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:40.482116 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39536cc1_6596_4c35_a9d6_a93ef6779640.slice/crio-41f3c00599a2f747cddbc0e3b1c2b469fc31bd1759f8d4442a23b7ed56a4982c WatchSource:0}: Error finding container 41f3c00599a2f747cddbc0e3b1c2b469fc31bd1759f8d4442a23b7ed56a4982c: Status 404 returned error can't find the container with id 41f3c00599a2f747cddbc0e3b1c2b469fc31bd1759f8d4442a23b7ed56a4982c Apr 16 17:40:40.483298 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:40.483276 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47905f73_0b0a_452f_bf8b_eaae31126adc.slice/crio-beca3050b3add4662859f07b14adeeaf6a6f33ee1959c5ad4802b06ac909c896 WatchSource:0}: Error finding container beca3050b3add4662859f07b14adeeaf6a6f33ee1959c5ad4802b06ac909c896: Status 404 returned error can't find the container with id beca3050b3add4662859f07b14adeeaf6a6f33ee1959c5ad4802b06ac909c896 Apr 16 17:40:40.553690 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:40.553661 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-lbkd4" event={"ID":"47905f73-0b0a-452f-bf8b-eaae31126adc","Type":"ContainerStarted","Data":"beca3050b3add4662859f07b14adeeaf6a6f33ee1959c5ad4802b06ac909c896"} Apr 16 17:40:40.554479 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:40.554458 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-fmxzp" event={"ID":"39536cc1-6596-4c35-a9d6-a93ef6779640","Type":"ContainerStarted","Data":"41f3c00599a2f747cddbc0e3b1c2b469fc31bd1759f8d4442a23b7ed56a4982c"} Apr 16 17:40:40.555310 ip-10-0-128-241 kubenswrapper[2573]: I0416 
17:40:40.555287 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-gdtq9" event={"ID":"1330492b-5728-49ea-8675-b3472e46d2dc","Type":"ContainerStarted","Data":"a78e385b1c97a9ec22301db2ea6a49ff460d6cb9ed95b795fc68de90fe981cdd"} Apr 16 17:40:40.556140 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:40.556118 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-vhf4j" event={"ID":"b27c0f5d-1775-4ae8-8903-1d44802e9f35","Type":"ContainerStarted","Data":"619afc8aee8f657488ca820d2fd5fc7d83fd11ce1d3bec45cf929164f1950a5b"} Apr 16 17:40:40.556971 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:40.556953 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-8gx9z" event={"ID":"31969ea1-4893-4e83-ac1e-f5882799c5da","Type":"ContainerStarted","Data":"f00229f435c52f59e5c07a334679f4a5bc67af88881f1b5392f6dff27c75c2e5"} Apr 16 17:40:40.557813 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:40.557793 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zjsrk" event={"ID":"584be834-546d-49ca-8379-4b77cb13e2ba","Type":"ContainerStarted","Data":"494429ad291111cc6a739f0bfd7e0b23a7b0ee37b8df3dc789d33ea4f8cabfd4"} Apr 16 17:40:40.707247 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:40.707081 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-769fj"] Apr 16 17:40:40.709965 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:40.709940 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcac5b68b_21bc_4998_8cf4_855cf71cdc45.slice/crio-c8dd7b376b1bdade0f84dd1b1664990250118317852d5e5dbdb4d4e4a4a1857b WatchSource:0}: Error finding container 
c8dd7b376b1bdade0f84dd1b1664990250118317852d5e5dbdb4d4e4a4a1857b: Status 404 returned error can't find the container with id c8dd7b376b1bdade0f84dd1b1664990250118317852d5e5dbdb4d4e4a4a1857b Apr 16 17:40:40.737722 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:40.737695 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bklp6"] Apr 16 17:40:40.753889 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:40.753863 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d6d05d_7d39_4b4b_86d7_63f477f7f0fd.slice/crio-40006e3ffb7631b1895edc6097bd00ddded3500296fd3ae76aa131847224d284 WatchSource:0}: Error finding container 40006e3ffb7631b1895edc6097bd00ddded3500296fd3ae76aa131847224d284: Status 404 returned error can't find the container with id 40006e3ffb7631b1895edc6097bd00ddded3500296fd3ae76aa131847224d284 Apr 16 17:40:41.569232 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:41.569183 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-769fj" event={"ID":"cac5b68b-21bc-4998-8cf4-855cf71cdc45","Type":"ContainerStarted","Data":"c8dd7b376b1bdade0f84dd1b1664990250118317852d5e5dbdb4d4e4a4a1857b"} Apr 16 17:40:41.581960 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:41.581006 2573 generic.go:358] "Generic (PLEG): container finished" podID="3af705fc-ec69-4117-8797-2dacaf0f64e4" containerID="6245f0997f4bc27f6cccc8733b0c19105db2ee3bff38c405bba0a4fcc3b0d6ed" exitCode=0 Apr 16 17:40:41.581960 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:41.581095 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6fl9q" event={"ID":"3af705fc-ec69-4117-8797-2dacaf0f64e4","Type":"ContainerDied","Data":"6245f0997f4bc27f6cccc8733b0c19105db2ee3bff38c405bba0a4fcc3b0d6ed"} Apr 16 17:40:41.586157 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:41.586090 2573 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bklp6" event={"ID":"f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd","Type":"ContainerStarted","Data":"40006e3ffb7631b1895edc6097bd00ddded3500296fd3ae76aa131847224d284"} Apr 16 17:40:41.883995 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:41.883099 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-9l9b5\" (UID: \"7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-9l9b5" Apr 16 17:40:41.883995 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:41.883441 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 17:40:41.883995 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:41.883508 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d-samples-operator-tls podName:7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d nodeName:}" failed. No retries permitted until 2026-04-16 17:40:45.883487839 +0000 UTC m=+40.096520970 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d-samples-operator-tls") pod "cluster-samples-operator-667775844f-9l9b5" (UID: "7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d") : secret "samples-operator-tls" not found Apr 16 17:40:41.985680 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:41.984674 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-registry-tls\") pod \"image-registry-6d7d9f595c-np56x\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") " pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" Apr 16 17:40:41.985680 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:41.984755 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/550fd36d-dd5d-4bed-9110-110068110f23-service-ca-bundle\") pod \"router-default-575b7b88bd-d5lx2\" (UID: \"550fd36d-dd5d-4bed-9110-110068110f23\") " pod="openshift-ingress/router-default-575b7b88bd-d5lx2" Apr 16 17:40:41.985680 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:41.984787 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/550fd36d-dd5d-4bed-9110-110068110f23-metrics-certs\") pod \"router-default-575b7b88bd-d5lx2\" (UID: \"550fd36d-dd5d-4bed-9110-110068110f23\") " pod="openshift-ingress/router-default-575b7b88bd-d5lx2" Apr 16 17:40:41.985680 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:41.984994 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 17:40:41.985680 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:41.985061 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/550fd36d-dd5d-4bed-9110-110068110f23-metrics-certs podName:550fd36d-dd5d-4bed-9110-110068110f23 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:45.985040181 +0000 UTC m=+40.198073332 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/550fd36d-dd5d-4bed-9110-110068110f23-metrics-certs") pod "router-default-575b7b88bd-d5lx2" (UID: "550fd36d-dd5d-4bed-9110-110068110f23") : secret "router-metrics-certs-default" not found Apr 16 17:40:41.985680 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:41.985504 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 17:40:41.985680 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:41.985521 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d7d9f595c-np56x: secret "image-registry-tls" not found Apr 16 17:40:41.985680 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:41.985568 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-registry-tls podName:1246ad0a-0cbe-41eb-b415-d3d5b58224b0 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:45.985551108 +0000 UTC m=+40.198584249 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-registry-tls") pod "image-registry-6d7d9f595c-np56x" (UID: "1246ad0a-0cbe-41eb-b415-d3d5b58224b0") : secret "image-registry-tls" not found Apr 16 17:40:41.985680 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:41.985643 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/550fd36d-dd5d-4bed-9110-110068110f23-service-ca-bundle podName:550fd36d-dd5d-4bed-9110-110068110f23 nodeName:}" failed. 
No retries permitted until 2026-04-16 17:40:45.985630876 +0000 UTC m=+40.198664004 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/550fd36d-dd5d-4bed-9110-110068110f23-service-ca-bundle") pod "router-default-575b7b88bd-d5lx2" (UID: "550fd36d-dd5d-4bed-9110-110068110f23") : configmap references non-existent config key: service-ca.crt Apr 16 17:40:42.087344 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:42.086181 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b60ee2a1-c8c7-417f-a887-3f7008b3fb0a-metrics-tls\") pod \"dns-default-9d4rv\" (UID: \"b60ee2a1-c8c7-417f-a887-3f7008b3fb0a\") " pod="openshift-dns/dns-default-9d4rv" Apr 16 17:40:42.087344 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:42.086303 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc57afd6-d40c-42e7-a331-579d5c302355-cert\") pod \"ingress-canary-gxh2q\" (UID: \"bc57afd6-d40c-42e7-a331-579d5c302355\") " pod="openshift-ingress-canary/ingress-canary-gxh2q" Apr 16 17:40:42.087344 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:42.086358 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f50c2657-216b-4259-a264-f4f602acfee8-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-985p5\" (UID: \"f50c2657-216b-4259-a264-f4f602acfee8\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-985p5" Apr 16 17:40:42.087344 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:42.086458 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/214ea8ee-8a72-42a3-abfb-ceb3622fea44-cluster-monitoring-operator-tls\") pod 
\"cluster-monitoring-operator-6667474d89-fmwp9\" (UID: \"214ea8ee-8a72-42a3-abfb-ceb3622fea44\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-fmwp9" Apr 16 17:40:42.087344 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:42.086583 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 17:40:42.087344 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:42.086644 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/214ea8ee-8a72-42a3-abfb-ceb3622fea44-cluster-monitoring-operator-tls podName:214ea8ee-8a72-42a3-abfb-ceb3622fea44 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:46.086625816 +0000 UTC m=+40.299658954 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/214ea8ee-8a72-42a3-abfb-ceb3622fea44-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-fmwp9" (UID: "214ea8ee-8a72-42a3-abfb-ceb3622fea44") : secret "cluster-monitoring-operator-tls" not found Apr 16 17:40:42.087344 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:42.087051 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 17:40:42.087344 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:42.087104 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b60ee2a1-c8c7-417f-a887-3f7008b3fb0a-metrics-tls podName:b60ee2a1-c8c7-417f-a887-3f7008b3fb0a nodeName:}" failed. No retries permitted until 2026-04-16 17:40:46.087087915 +0000 UTC m=+40.300121055 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b60ee2a1-c8c7-417f-a887-3f7008b3fb0a-metrics-tls") pod "dns-default-9d4rv" (UID: "b60ee2a1-c8c7-417f-a887-3f7008b3fb0a") : secret "dns-default-metrics-tls" not found Apr 16 17:40:42.087344 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:42.087161 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 17:40:42.087344 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:42.087194 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc57afd6-d40c-42e7-a331-579d5c302355-cert podName:bc57afd6-d40c-42e7-a331-579d5c302355 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:46.087183086 +0000 UTC m=+40.300216214 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bc57afd6-d40c-42e7-a331-579d5c302355-cert") pod "ingress-canary-gxh2q" (UID: "bc57afd6-d40c-42e7-a331-579d5c302355") : secret "canary-serving-cert" not found Apr 16 17:40:42.087344 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:42.087268 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 17:40:42.087344 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:42.087302 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f50c2657-216b-4259-a264-f4f602acfee8-networking-console-plugin-cert podName:f50c2657-216b-4259-a264-f4f602acfee8 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:46.087290778 +0000 UTC m=+40.300323904 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f50c2657-216b-4259-a264-f4f602acfee8-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-985p5" (UID: "f50c2657-216b-4259-a264-f4f602acfee8") : secret "networking-console-plugin-cert" not found Apr 16 17:40:42.593862 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:42.593828 2573 generic.go:358] "Generic (PLEG): container finished" podID="3af705fc-ec69-4117-8797-2dacaf0f64e4" containerID="ee153b2eb069333d0fad11b64e0f853715d683d73776210ec0d9fcc202b8c4fe" exitCode=0 Apr 16 17:40:42.594487 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:42.593870 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6fl9q" event={"ID":"3af705fc-ec69-4117-8797-2dacaf0f64e4","Type":"ContainerDied","Data":"ee153b2eb069333d0fad11b64e0f853715d683d73776210ec0d9fcc202b8c4fe"} Apr 16 17:40:45.937871 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:45.937830 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-9l9b5\" (UID: \"7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-9l9b5" Apr 16 17:40:45.938342 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:45.938012 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 17:40:45.938342 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:45.938099 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d-samples-operator-tls podName:7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d nodeName:}" failed. 
No retries permitted until 2026-04-16 17:40:53.938076946 +0000 UTC m=+48.151110078 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d-samples-operator-tls") pod "cluster-samples-operator-667775844f-9l9b5" (UID: "7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d") : secret "samples-operator-tls" not found Apr 16 17:40:46.038778 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:46.038732 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-registry-tls\") pod \"image-registry-6d7d9f595c-np56x\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") " pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" Apr 16 17:40:46.038977 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:46.038825 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/550fd36d-dd5d-4bed-9110-110068110f23-service-ca-bundle\") pod \"router-default-575b7b88bd-d5lx2\" (UID: \"550fd36d-dd5d-4bed-9110-110068110f23\") " pod="openshift-ingress/router-default-575b7b88bd-d5lx2" Apr 16 17:40:46.038977 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:46.038836 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 17:40:46.038977 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:46.038855 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/550fd36d-dd5d-4bed-9110-110068110f23-metrics-certs\") pod \"router-default-575b7b88bd-d5lx2\" (UID: \"550fd36d-dd5d-4bed-9110-110068110f23\") " pod="openshift-ingress/router-default-575b7b88bd-d5lx2" Apr 16 17:40:46.038977 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:46.038861 2573 projected.go:194] Error 
preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d7d9f595c-np56x: secret "image-registry-tls" not found Apr 16 17:40:46.038977 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:46.038921 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-registry-tls podName:1246ad0a-0cbe-41eb-b415-d3d5b58224b0 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:54.038901917 +0000 UTC m=+48.251935043 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-registry-tls") pod "image-registry-6d7d9f595c-np56x" (UID: "1246ad0a-0cbe-41eb-b415-d3d5b58224b0") : secret "image-registry-tls" not found Apr 16 17:40:46.038977 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:46.038968 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 17:40:46.039358 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:46.038990 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/550fd36d-dd5d-4bed-9110-110068110f23-service-ca-bundle podName:550fd36d-dd5d-4bed-9110-110068110f23 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:54.038972321 +0000 UTC m=+48.252005451 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/550fd36d-dd5d-4bed-9110-110068110f23-service-ca-bundle") pod "router-default-575b7b88bd-d5lx2" (UID: "550fd36d-dd5d-4bed-9110-110068110f23") : configmap references non-existent config key: service-ca.crt Apr 16 17:40:46.039358 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:46.039015 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/550fd36d-dd5d-4bed-9110-110068110f23-metrics-certs podName:550fd36d-dd5d-4bed-9110-110068110f23 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:54.039005474 +0000 UTC m=+48.252038604 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/550fd36d-dd5d-4bed-9110-110068110f23-metrics-certs") pod "router-default-575b7b88bd-d5lx2" (UID: "550fd36d-dd5d-4bed-9110-110068110f23") : secret "router-metrics-certs-default" not found Apr 16 17:40:46.139847 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:46.139810 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/214ea8ee-8a72-42a3-abfb-ceb3622fea44-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-fmwp9\" (UID: \"214ea8ee-8a72-42a3-abfb-ceb3622fea44\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-fmwp9" Apr 16 17:40:46.140057 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:46.139864 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b60ee2a1-c8c7-417f-a887-3f7008b3fb0a-metrics-tls\") pod \"dns-default-9d4rv\" (UID: \"b60ee2a1-c8c7-417f-a887-3f7008b3fb0a\") " pod="openshift-dns/dns-default-9d4rv" Apr 16 17:40:46.140057 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:46.139944 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc57afd6-d40c-42e7-a331-579d5c302355-cert\") pod \"ingress-canary-gxh2q\" (UID: \"bc57afd6-d40c-42e7-a331-579d5c302355\") " pod="openshift-ingress-canary/ingress-canary-gxh2q" Apr 16 17:40:46.140057 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:46.139954 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 17:40:46.140057 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:46.139964 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 17:40:46.140057 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:46.139999 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f50c2657-216b-4259-a264-f4f602acfee8-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-985p5\" (UID: \"f50c2657-216b-4259-a264-f4f602acfee8\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-985p5" Apr 16 17:40:46.140057 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:46.140024 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/214ea8ee-8a72-42a3-abfb-ceb3622fea44-cluster-monitoring-operator-tls podName:214ea8ee-8a72-42a3-abfb-ceb3622fea44 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:54.140005447 +0000 UTC m=+48.353038587 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/214ea8ee-8a72-42a3-abfb-ceb3622fea44-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-fmwp9" (UID: "214ea8ee-8a72-42a3-abfb-ceb3622fea44") : secret "cluster-monitoring-operator-tls" not found Apr 16 17:40:46.140057 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:46.140061 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b60ee2a1-c8c7-417f-a887-3f7008b3fb0a-metrics-tls podName:b60ee2a1-c8c7-417f-a887-3f7008b3fb0a nodeName:}" failed. No retries permitted until 2026-04-16 17:40:54.140042512 +0000 UTC m=+48.353075638 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b60ee2a1-c8c7-417f-a887-3f7008b3fb0a-metrics-tls") pod "dns-default-9d4rv" (UID: "b60ee2a1-c8c7-417f-a887-3f7008b3fb0a") : secret "dns-default-metrics-tls" not found Apr 16 17:40:46.140420 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:46.140092 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 17:40:46.140420 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:46.140119 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 17:40:46.140420 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:46.140131 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f50c2657-216b-4259-a264-f4f602acfee8-networking-console-plugin-cert podName:f50c2657-216b-4259-a264-f4f602acfee8 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:54.140117677 +0000 UTC m=+48.353150810 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f50c2657-216b-4259-a264-f4f602acfee8-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-985p5" (UID: "f50c2657-216b-4259-a264-f4f602acfee8") : secret "networking-console-plugin-cert" not found Apr 16 17:40:46.140420 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:46.140173 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc57afd6-d40c-42e7-a331-579d5c302355-cert podName:bc57afd6-d40c-42e7-a331-579d5c302355 nodeName:}" failed. No retries permitted until 2026-04-16 17:40:54.140158685 +0000 UTC m=+48.353191816 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bc57afd6-d40c-42e7-a331-579d5c302355-cert") pod "ingress-canary-gxh2q" (UID: "bc57afd6-d40c-42e7-a331-579d5c302355") : secret "canary-serving-cert" not found Apr 16 17:40:50.616714 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:50.616675 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6fl9q" event={"ID":"3af705fc-ec69-4117-8797-2dacaf0f64e4","Type":"ContainerStarted","Data":"7ed0852f5ee9a3ac846363dc129ca0a8f46af96de46d0194d901ef01311f04e6"} Apr 16 17:40:50.618537 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:50.618510 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-lbkd4" event={"ID":"47905f73-0b0a-452f-bf8b-eaae31126adc","Type":"ContainerStarted","Data":"f608add00ed05ac9abbda7f4527b73f21a3f0c92f63994f0e63d203a77754e43"} Apr 16 17:40:50.620174 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:50.620151 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-fmxzp_39536cc1-6596-4c35-a9d6-a93ef6779640/console-operator/0.log" Apr 16 17:40:50.620297 ip-10-0-128-241 
kubenswrapper[2573]: I0416 17:40:50.620191 2573 generic.go:358] "Generic (PLEG): container finished" podID="39536cc1-6596-4c35-a9d6-a93ef6779640" containerID="3c3a9b1957d34385be241b208a3d7a775ed614e549d83ec191896d6214f3cc71" exitCode=255 Apr 16 17:40:50.620297 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:50.620270 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-fmxzp" event={"ID":"39536cc1-6596-4c35-a9d6-a93ef6779640","Type":"ContainerDied","Data":"3c3a9b1957d34385be241b208a3d7a775ed614e549d83ec191896d6214f3cc71"} Apr 16 17:40:50.620478 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:50.620462 2573 scope.go:117] "RemoveContainer" containerID="3c3a9b1957d34385be241b208a3d7a775ed614e549d83ec191896d6214f3cc71" Apr 16 17:40:50.622044 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:50.621729 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bklp6" event={"ID":"f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd","Type":"ContainerStarted","Data":"673a8450d133fc74213b8b32921beaaaf05246d8cab0df991b2a6b0c4b533b92"} Apr 16 17:40:50.623342 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:50.623308 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-gdtq9" event={"ID":"1330492b-5728-49ea-8675-b3472e46d2dc","Type":"ContainerStarted","Data":"14d8dfaad01ebe1b6421b1556912d0850e23234c0d02a318f8fe70bbfee4aa8c"} Apr 16 17:40:50.624964 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:50.624940 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-vhf4j" event={"ID":"b27c0f5d-1775-4ae8-8903-1d44802e9f35","Type":"ContainerStarted","Data":"fde3a37ea01101589d656c4bac2fdfbaee1592da7a659c3ba8c51a9922b23580"} Apr 16 17:40:50.626357 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:50.626327 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-8gx9z" event={"ID":"31969ea1-4893-4e83-ac1e-f5882799c5da","Type":"ContainerStarted","Data":"63b84c877727520a5c05ba76c771939e20105e6c7009f58ed0332e8e7e9a8435"} Apr 16 17:40:50.627756 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:50.627713 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zjsrk" event={"ID":"584be834-546d-49ca-8379-4b77cb13e2ba","Type":"ContainerStarted","Data":"826f98ab899575dd17d2517a0d67256f5bde315912baeb0572adbbb61b5685bb"} Apr 16 17:40:50.629535 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:50.629509 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-769fj" event={"ID":"cac5b68b-21bc-4998-8cf4-855cf71cdc45","Type":"ContainerStarted","Data":"e9908ad5dcbf7591b7f371aa09b27f31db531db08120672d456f62cd44e92244"} Apr 16 17:40:50.629660 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:50.629647 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-769fj" Apr 16 17:40:50.642699 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:50.642662 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6fl9q" podStartSLOduration=13.299451611 podStartE2EDuration="44.642652007s" podCreationTimestamp="2026-04-16 17:40:06 +0000 UTC" firstStartedPulling="2026-04-16 17:40:09.196781391 +0000 UTC m=+3.409814521" lastFinishedPulling="2026-04-16 17:40:40.53998178 +0000 UTC m=+34.753014917" observedRunningTime="2026-04-16 17:40:50.642131354 +0000 UTC m=+44.855164517" watchObservedRunningTime="2026-04-16 17:40:50.642652007 +0000 UTC m=+44.855685154" Apr 16 17:40:50.659041 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:50.658994 2573 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zjsrk" podStartSLOduration=34.22931353 podStartE2EDuration="43.658979009s" podCreationTimestamp="2026-04-16 17:40:07 +0000 UTC" firstStartedPulling="2026-04-16 17:40:40.509945187 +0000 UTC m=+34.722978327" lastFinishedPulling="2026-04-16 17:40:49.939610665 +0000 UTC m=+44.152643806" observedRunningTime="2026-04-16 17:40:50.6586545 +0000 UTC m=+44.871687649" watchObservedRunningTime="2026-04-16 17:40:50.658979009 +0000 UTC m=+44.872012157" Apr 16 17:40:50.695704 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:50.695655 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-vhf4j" podStartSLOduration=34.540894623 podStartE2EDuration="43.695637262s" podCreationTimestamp="2026-04-16 17:40:07 +0000 UTC" firstStartedPulling="2026-04-16 17:40:40.480896433 +0000 UTC m=+34.693929576" lastFinishedPulling="2026-04-16 17:40:49.635639079 +0000 UTC m=+43.848672215" observedRunningTime="2026-04-16 17:40:50.695135128 +0000 UTC m=+44.908168277" watchObservedRunningTime="2026-04-16 17:40:50.695637262 +0000 UTC m=+44.908670411" Apr 16 17:40:50.695941 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:50.695913 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-bklp6" podStartSLOduration=9.505822574 podStartE2EDuration="18.695906754s" podCreationTimestamp="2026-04-16 17:40:32 +0000 UTC" firstStartedPulling="2026-04-16 17:40:40.763660887 +0000 UTC m=+34.976694013" lastFinishedPulling="2026-04-16 17:40:49.953745062 +0000 UTC m=+44.166778193" observedRunningTime="2026-04-16 17:40:50.676242668 +0000 UTC m=+44.889275833" watchObservedRunningTime="2026-04-16 17:40:50.695906754 +0000 UTC m=+44.908939904" Apr 16 17:40:50.715031 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:50.714950 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-diagnostics/network-check-source-7b678d77c7-gdtq9" podStartSLOduration=34.285053697 podStartE2EDuration="43.714930297s" podCreationTimestamp="2026-04-16 17:40:07 +0000 UTC" firstStartedPulling="2026-04-16 17:40:40.509903384 +0000 UTC m=+34.722936516" lastFinishedPulling="2026-04-16 17:40:49.939779972 +0000 UTC m=+44.152813116" observedRunningTime="2026-04-16 17:40:50.714866974 +0000 UTC m=+44.927900125" watchObservedRunningTime="2026-04-16 17:40:50.714930297 +0000 UTC m=+44.927963443" Apr 16 17:40:50.737050 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:50.736995 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-769fj" podStartSLOduration=35.506641298 podStartE2EDuration="44.736977466s" podCreationTimestamp="2026-04-16 17:40:06 +0000 UTC" firstStartedPulling="2026-04-16 17:40:40.711964936 +0000 UTC m=+34.924998065" lastFinishedPulling="2026-04-16 17:40:49.942301107 +0000 UTC m=+44.155334233" observedRunningTime="2026-04-16 17:40:50.735777333 +0000 UTC m=+44.948810476" watchObservedRunningTime="2026-04-16 17:40:50.736977466 +0000 UTC m=+44.950010615" Apr 16 17:40:50.762818 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:50.762756 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-8gx9z" podStartSLOduration=34.304222565 podStartE2EDuration="43.762736321s" podCreationTimestamp="2026-04-16 17:40:07 +0000 UTC" firstStartedPulling="2026-04-16 17:40:40.481966426 +0000 UTC m=+34.694999553" lastFinishedPulling="2026-04-16 17:40:49.940480167 +0000 UTC m=+44.153513309" observedRunningTime="2026-04-16 17:40:50.762637373 +0000 UTC m=+44.975670545" watchObservedRunningTime="2026-04-16 17:40:50.762736321 +0000 UTC m=+44.975769470" Apr 16 17:40:51.348508 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:51.348445 2573 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-lbkd4" podStartSLOduration=34.918058799 podStartE2EDuration="44.348426184s" podCreationTimestamp="2026-04-16 17:40:07 +0000 UTC" firstStartedPulling="2026-04-16 17:40:40.509910426 +0000 UTC m=+34.722943566" lastFinishedPulling="2026-04-16 17:40:49.940277825 +0000 UTC m=+44.153310951" observedRunningTime="2026-04-16 17:40:50.808258037 +0000 UTC m=+45.021291199" watchObservedRunningTime="2026-04-16 17:40:51.348426184 +0000 UTC m=+45.561459333" Apr 16 17:40:51.349092 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:51.349076 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-qrxlk"] Apr 16 17:40:51.369463 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:51.369434 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-qrxlk"] Apr 16 17:40:51.369615 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:51.369592 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-qrxlk" Apr 16 17:40:51.372829 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:51.372800 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 16 17:40:51.373976 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:51.373955 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 16 17:40:51.374140 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:51.373959 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-z8hdz\"" Apr 16 17:40:51.386683 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:51.386658 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfjbt\" (UniqueName: \"kubernetes.io/projected/bcc5197c-4a3c-4c47-ac22-be98c5673ab4-kube-api-access-pfjbt\") pod \"migrator-64d4d94569-qrxlk\" (UID: \"bcc5197c-4a3c-4c47-ac22-be98c5673ab4\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-qrxlk" Apr 16 17:40:51.488246 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:51.488194 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pfjbt\" (UniqueName: \"kubernetes.io/projected/bcc5197c-4a3c-4c47-ac22-be98c5673ab4-kube-api-access-pfjbt\") pod \"migrator-64d4d94569-qrxlk\" (UID: \"bcc5197c-4a3c-4c47-ac22-be98c5673ab4\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-qrxlk" Apr 16 17:40:51.501676 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:51.501642 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfjbt\" (UniqueName: 
\"kubernetes.io/projected/bcc5197c-4a3c-4c47-ac22-be98c5673ab4-kube-api-access-pfjbt\") pod \"migrator-64d4d94569-qrxlk\" (UID: \"bcc5197c-4a3c-4c47-ac22-be98c5673ab4\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-qrxlk" Apr 16 17:40:51.633744 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:51.633667 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-fmxzp_39536cc1-6596-4c35-a9d6-a93ef6779640/console-operator/1.log" Apr 16 17:40:51.634144 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:51.634106 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-fmxzp_39536cc1-6596-4c35-a9d6-a93ef6779640/console-operator/0.log" Apr 16 17:40:51.634199 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:51.634146 2573 generic.go:358] "Generic (PLEG): container finished" podID="39536cc1-6596-4c35-a9d6-a93ef6779640" containerID="b9544be408cada2b2517b080c19bcc26ce2cfb3279b7fa2eef347520106f6408" exitCode=255 Apr 16 17:40:51.634322 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:51.634303 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-fmxzp" event={"ID":"39536cc1-6596-4c35-a9d6-a93ef6779640","Type":"ContainerDied","Data":"b9544be408cada2b2517b080c19bcc26ce2cfb3279b7fa2eef347520106f6408"} Apr 16 17:40:51.634388 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:51.634341 2573 scope.go:117] "RemoveContainer" containerID="3c3a9b1957d34385be241b208a3d7a775ed614e549d83ec191896d6214f3cc71" Apr 16 17:40:51.634543 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:51.634523 2573 scope.go:117] "RemoveContainer" containerID="b9544be408cada2b2517b080c19bcc26ce2cfb3279b7fa2eef347520106f6408" Apr 16 17:40:51.635123 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:51.635044 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" 
with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-fmxzp_openshift-console-operator(39536cc1-6596-4c35-a9d6-a93ef6779640)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-fmxzp" podUID="39536cc1-6596-4c35-a9d6-a93ef6779640" Apr 16 17:40:51.680013 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:51.679974 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-qrxlk" Apr 16 17:40:51.812266 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:51.812234 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-qrxlk"] Apr 16 17:40:51.816418 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:51.816378 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcc5197c_4a3c_4c47_ac22_be98c5673ab4.slice/crio-15c4fd106188620e277ff4357545bba9bdeedfa243aa40c274b264eec655a618 WatchSource:0}: Error finding container 15c4fd106188620e277ff4357545bba9bdeedfa243aa40c274b264eec655a618: Status 404 returned error can't find the container with id 15c4fd106188620e277ff4357545bba9bdeedfa243aa40c274b264eec655a618 Apr 16 17:40:52.638818 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:52.638794 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-fmxzp_39536cc1-6596-4c35-a9d6-a93ef6779640/console-operator/1.log" Apr 16 17:40:52.639256 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:52.639142 2573 scope.go:117] "RemoveContainer" containerID="b9544be408cada2b2517b080c19bcc26ce2cfb3279b7fa2eef347520106f6408" Apr 16 17:40:52.639396 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:52.639378 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s 
restarting failed container=console-operator pod=console-operator-d87b8d5fc-fmxzp_openshift-console-operator(39536cc1-6596-4c35-a9d6-a93ef6779640)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-fmxzp" podUID="39536cc1-6596-4c35-a9d6-a93ef6779640" Apr 16 17:40:52.639864 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:52.639843 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-qrxlk" event={"ID":"bcc5197c-4a3c-4c47-ac22-be98c5673ab4","Type":"ContainerStarted","Data":"15c4fd106188620e277ff4357545bba9bdeedfa243aa40c274b264eec655a618"} Apr 16 17:40:53.644786 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:53.644706 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-qrxlk" event={"ID":"bcc5197c-4a3c-4c47-ac22-be98c5673ab4","Type":"ContainerStarted","Data":"310d9cb5b734540ddfd8e884fe8db8372854bbea98cd4cd319bf781639f63a6d"} Apr 16 17:40:53.644786 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:53.644744 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-qrxlk" event={"ID":"bcc5197c-4a3c-4c47-ac22-be98c5673ab4","Type":"ContainerStarted","Data":"609837bbc0e0c3e06c5f7c71c9dbb056e45817eecddb9b1290987001435c45d6"} Apr 16 17:40:53.662019 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:53.661971 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-qrxlk" podStartSLOduration=1.22440575 podStartE2EDuration="2.661957093s" podCreationTimestamp="2026-04-16 17:40:51 +0000 UTC" firstStartedPulling="2026-04-16 17:40:51.81909227 +0000 UTC m=+46.032125399" lastFinishedPulling="2026-04-16 17:40:53.256643601 +0000 UTC m=+47.469676742" observedRunningTime="2026-04-16 17:40:53.661389945 +0000 UTC m=+47.874423093" watchObservedRunningTime="2026-04-16 17:40:53.661957093 +0000 UTC 
m=+47.874990240" Apr 16 17:40:53.836754 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:53.836725 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-clpc9_5626e70b-1b0e-424a-af3b-d0dba055fd1b/dns-node-resolver/0.log" Apr 16 17:40:54.012487 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:54.012450 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-9l9b5\" (UID: \"7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-9l9b5" Apr 16 17:40:54.012653 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:54.012604 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 17:40:54.012694 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:54.012669 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d-samples-operator-tls podName:7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d nodeName:}" failed. No retries permitted until 2026-04-16 17:41:10.012655216 +0000 UTC m=+64.225688345 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d-samples-operator-tls") pod "cluster-samples-operator-667775844f-9l9b5" (UID: "7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d") : secret "samples-operator-tls" not found Apr 16 17:40:54.113795 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:54.113758 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-registry-tls\") pod \"image-registry-6d7d9f595c-np56x\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") " pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" Apr 16 17:40:54.113958 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:54.113824 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/550fd36d-dd5d-4bed-9110-110068110f23-service-ca-bundle\") pod \"router-default-575b7b88bd-d5lx2\" (UID: \"550fd36d-dd5d-4bed-9110-110068110f23\") " pod="openshift-ingress/router-default-575b7b88bd-d5lx2" Apr 16 17:40:54.113958 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:54.113847 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/550fd36d-dd5d-4bed-9110-110068110f23-metrics-certs\") pod \"router-default-575b7b88bd-d5lx2\" (UID: \"550fd36d-dd5d-4bed-9110-110068110f23\") " pod="openshift-ingress/router-default-575b7b88bd-d5lx2" Apr 16 17:40:54.113958 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:54.113905 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 17:40:54.113958 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:54.113924 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-6d7d9f595c-np56x: secret "image-registry-tls" not found Apr 16 17:40:54.114127 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:54.113977 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-registry-tls podName:1246ad0a-0cbe-41eb-b415-d3d5b58224b0 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:10.113958405 +0000 UTC m=+64.326991539 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-registry-tls") pod "image-registry-6d7d9f595c-np56x" (UID: "1246ad0a-0cbe-41eb-b415-d3d5b58224b0") : secret "image-registry-tls" not found Apr 16 17:40:54.114127 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:54.113992 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/550fd36d-dd5d-4bed-9110-110068110f23-service-ca-bundle podName:550fd36d-dd5d-4bed-9110-110068110f23 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:10.113985745 +0000 UTC m=+64.327018871 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/550fd36d-dd5d-4bed-9110-110068110f23-service-ca-bundle") pod "router-default-575b7b88bd-d5lx2" (UID: "550fd36d-dd5d-4bed-9110-110068110f23") : configmap references non-existent config key: service-ca.crt Apr 16 17:40:54.114127 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:54.114022 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 17:40:54.114127 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:54.114073 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/550fd36d-dd5d-4bed-9110-110068110f23-metrics-certs podName:550fd36d-dd5d-4bed-9110-110068110f23 nodeName:}" failed. 
No retries permitted until 2026-04-16 17:41:10.114061484 +0000 UTC m=+64.327094610 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/550fd36d-dd5d-4bed-9110-110068110f23-metrics-certs") pod "router-default-575b7b88bd-d5lx2" (UID: "550fd36d-dd5d-4bed-9110-110068110f23") : secret "router-metrics-certs-default" not found Apr 16 17:40:54.125494 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:54.125464 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-b4qmw"] Apr 16 17:40:54.147401 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:54.147376 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-b4qmw"] Apr 16 17:40:54.147496 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:54.147483 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-b4qmw" Apr 16 17:40:54.150033 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:54.150013 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 17:40:54.150140 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:54.150017 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 17:40:54.150294 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:54.150281 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-jkfzn\"" Apr 16 17:40:54.150469 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:54.150452 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 17:40:54.151365 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:54.151347 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 17:40:54.214631 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:54.214601 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc57afd6-d40c-42e7-a331-579d5c302355-cert\") pod \"ingress-canary-gxh2q\" (UID: \"bc57afd6-d40c-42e7-a331-579d5c302355\") " pod="openshift-ingress-canary/ingress-canary-gxh2q" Apr 16 17:40:54.214790 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:54.214646 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f50c2657-216b-4259-a264-f4f602acfee8-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-985p5\" (UID: \"f50c2657-216b-4259-a264-f4f602acfee8\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-985p5" Apr 16 17:40:54.214790 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:54.214685 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f8d06aae-bf54-439f-a705-e3e924a51512-signing-cabundle\") pod \"service-ca-bfc587fb7-b4qmw\" (UID: \"f8d06aae-bf54-439f-a705-e3e924a51512\") " pod="openshift-service-ca/service-ca-bfc587fb7-b4qmw" Apr 16 17:40:54.214790 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:54.214726 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f8d06aae-bf54-439f-a705-e3e924a51512-signing-key\") pod \"service-ca-bfc587fb7-b4qmw\" (UID: \"f8d06aae-bf54-439f-a705-e3e924a51512\") " pod="openshift-service-ca/service-ca-bfc587fb7-b4qmw" Apr 16 17:40:54.214790 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:54.214753 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 
16 17:40:54.214790 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:54.214785 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/214ea8ee-8a72-42a3-abfb-ceb3622fea44-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-fmwp9\" (UID: \"214ea8ee-8a72-42a3-abfb-ceb3622fea44\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-fmwp9" Apr 16 17:40:54.214982 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:54.214806 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b60ee2a1-c8c7-417f-a887-3f7008b3fb0a-metrics-tls\") pod \"dns-default-9d4rv\" (UID: \"b60ee2a1-c8c7-417f-a887-3f7008b3fb0a\") " pod="openshift-dns/dns-default-9d4rv" Apr 16 17:40:54.214982 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:54.214827 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc57afd6-d40c-42e7-a331-579d5c302355-cert podName:bc57afd6-d40c-42e7-a331-579d5c302355 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:10.214811314 +0000 UTC m=+64.427844440 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bc57afd6-d40c-42e7-a331-579d5c302355-cert") pod "ingress-canary-gxh2q" (UID: "bc57afd6-d40c-42e7-a331-579d5c302355") : secret "canary-serving-cert" not found Apr 16 17:40:54.214982 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:54.214757 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 17:40:54.214982 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:54.214864 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 17:40:54.214982 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:54.214878 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 17:40:54.214982 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:54.214900 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f50c2657-216b-4259-a264-f4f602acfee8-networking-console-plugin-cert podName:f50c2657-216b-4259-a264-f4f602acfee8 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:10.214880234 +0000 UTC m=+64.427913365 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/f50c2657-216b-4259-a264-f4f602acfee8-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-985p5" (UID: "f50c2657-216b-4259-a264-f4f602acfee8") : secret "networking-console-plugin-cert" not found Apr 16 17:40:54.214982 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:54.214921 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/214ea8ee-8a72-42a3-abfb-ceb3622fea44-cluster-monitoring-operator-tls podName:214ea8ee-8a72-42a3-abfb-ceb3622fea44 nodeName:}" failed. 
No retries permitted until 2026-04-16 17:41:10.214908187 +0000 UTC m=+64.427941334 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/214ea8ee-8a72-42a3-abfb-ceb3622fea44-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-fmwp9" (UID: "214ea8ee-8a72-42a3-abfb-ceb3622fea44") : secret "cluster-monitoring-operator-tls" not found Apr 16 17:40:54.214982 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:54.214937 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhj7m\" (UniqueName: \"kubernetes.io/projected/f8d06aae-bf54-439f-a705-e3e924a51512-kube-api-access-hhj7m\") pod \"service-ca-bfc587fb7-b4qmw\" (UID: \"f8d06aae-bf54-439f-a705-e3e924a51512\") " pod="openshift-service-ca/service-ca-bfc587fb7-b4qmw" Apr 16 17:40:54.215266 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:54.214990 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b60ee2a1-c8c7-417f-a887-3f7008b3fb0a-metrics-tls podName:b60ee2a1-c8c7-417f-a887-3f7008b3fb0a nodeName:}" failed. No retries permitted until 2026-04-16 17:41:10.214982659 +0000 UTC m=+64.428015786 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b60ee2a1-c8c7-417f-a887-3f7008b3fb0a-metrics-tls") pod "dns-default-9d4rv" (UID: "b60ee2a1-c8c7-417f-a887-3f7008b3fb0a") : secret "dns-default-metrics-tls" not found Apr 16 17:40:54.315752 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:54.315669 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f8d06aae-bf54-439f-a705-e3e924a51512-signing-cabundle\") pod \"service-ca-bfc587fb7-b4qmw\" (UID: \"f8d06aae-bf54-439f-a705-e3e924a51512\") " pod="openshift-service-ca/service-ca-bfc587fb7-b4qmw" Apr 16 17:40:54.315752 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:54.315710 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f8d06aae-bf54-439f-a705-e3e924a51512-signing-key\") pod \"service-ca-bfc587fb7-b4qmw\" (UID: \"f8d06aae-bf54-439f-a705-e3e924a51512\") " pod="openshift-service-ca/service-ca-bfc587fb7-b4qmw" Apr 16 17:40:54.315974 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:54.315924 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hhj7m\" (UniqueName: \"kubernetes.io/projected/f8d06aae-bf54-439f-a705-e3e924a51512-kube-api-access-hhj7m\") pod \"service-ca-bfc587fb7-b4qmw\" (UID: \"f8d06aae-bf54-439f-a705-e3e924a51512\") " pod="openshift-service-ca/service-ca-bfc587fb7-b4qmw" Apr 16 17:40:54.316412 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:54.316383 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f8d06aae-bf54-439f-a705-e3e924a51512-signing-cabundle\") pod \"service-ca-bfc587fb7-b4qmw\" (UID: \"f8d06aae-bf54-439f-a705-e3e924a51512\") " pod="openshift-service-ca/service-ca-bfc587fb7-b4qmw" Apr 16 17:40:54.318112 ip-10-0-128-241 kubenswrapper[2573]: I0416 
17:40:54.318091 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f8d06aae-bf54-439f-a705-e3e924a51512-signing-key\") pod \"service-ca-bfc587fb7-b4qmw\" (UID: \"f8d06aae-bf54-439f-a705-e3e924a51512\") " pod="openshift-service-ca/service-ca-bfc587fb7-b4qmw" Apr 16 17:40:54.325533 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:54.325513 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhj7m\" (UniqueName: \"kubernetes.io/projected/f8d06aae-bf54-439f-a705-e3e924a51512-kube-api-access-hhj7m\") pod \"service-ca-bfc587fb7-b4qmw\" (UID: \"f8d06aae-bf54-439f-a705-e3e924a51512\") " pod="openshift-service-ca/service-ca-bfc587fb7-b4qmw" Apr 16 17:40:54.456524 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:54.456496 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-b4qmw" Apr 16 17:40:54.571350 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:54.571286 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-b4qmw"] Apr 16 17:40:54.574519 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:40:54.574496 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8d06aae_bf54_439f_a705_e3e924a51512.slice/crio-48a2423f0b1d60174fc599dbb400be7bdedaf740ecc9d1d46e81fda6a06b9fa3 WatchSource:0}: Error finding container 48a2423f0b1d60174fc599dbb400be7bdedaf740ecc9d1d46e81fda6a06b9fa3: Status 404 returned error can't find the container with id 48a2423f0b1d60174fc599dbb400be7bdedaf740ecc9d1d46e81fda6a06b9fa3 Apr 16 17:40:54.647806 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:54.647780 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-b4qmw" 
event={"ID":"f8d06aae-bf54-439f-a705-e3e924a51512","Type":"ContainerStarted","Data":"48a2423f0b1d60174fc599dbb400be7bdedaf740ecc9d1d46e81fda6a06b9fa3"} Apr 16 17:40:54.835773 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:54.835706 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-l2szv_76b37f14-34f7-4661-ad91-459fb138a436/node-ca/0.log" Apr 16 17:40:55.654398 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:55.654366 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-b4qmw" event={"ID":"f8d06aae-bf54-439f-a705-e3e924a51512","Type":"ContainerStarted","Data":"0391a8f84c4e31cae5259898bc6eeb8225e207a5d1f3bdba2ee367d932584ece"} Apr 16 17:40:55.673629 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:55.673578 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-b4qmw" podStartSLOduration=1.673561383 podStartE2EDuration="1.673561383s" podCreationTimestamp="2026-04-16 17:40:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:40:55.672729225 +0000 UTC m=+49.885762374" watchObservedRunningTime="2026-04-16 17:40:55.673561383 +0000 UTC m=+49.886594532" Apr 16 17:40:55.835630 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:55.835601 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-qrxlk_bcc5197c-4a3c-4c47-ac22-be98c5673ab4/migrator/0.log" Apr 16 17:40:56.035174 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:56.035102 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-qrxlk_bcc5197c-4a3c-4c47-ac22-be98c5673ab4/graceful-termination/0.log" Apr 16 17:40:56.236768 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:56.236734 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-8gx9z_31969ea1-4893-4e83-ac1e-f5882799c5da/kube-storage-version-migrator-operator/0.log" Apr 16 17:40:58.646301 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:58.646265 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-fmxzp" Apr 16 17:40:58.646301 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:58.646310 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-fmxzp" Apr 16 17:40:58.646776 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:40:58.646671 2573 scope.go:117] "RemoveContainer" containerID="b9544be408cada2b2517b080c19bcc26ce2cfb3279b7fa2eef347520106f6408" Apr 16 17:40:58.646865 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:40:58.646846 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-fmxzp_openshift-console-operator(39536cc1-6596-4c35-a9d6-a93ef6779640)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-fmxzp" podUID="39536cc1-6596-4c35-a9d6-a93ef6779640" Apr 16 17:41:03.552554 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:03.552526 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9m58b" Apr 16 17:41:10.050152 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.050114 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-9l9b5\" (UID: \"7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-9l9b5" Apr 16 17:41:10.052509 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.052475 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-9l9b5\" (UID: \"7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-9l9b5" Apr 16 17:41:10.151017 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.150989 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-registry-tls\") pod \"image-registry-6d7d9f595c-np56x\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") " pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" Apr 16 17:41:10.151155 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.151051 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/550fd36d-dd5d-4bed-9110-110068110f23-service-ca-bundle\") pod \"router-default-575b7b88bd-d5lx2\" (UID: \"550fd36d-dd5d-4bed-9110-110068110f23\") " pod="openshift-ingress/router-default-575b7b88bd-d5lx2" Apr 16 17:41:10.151155 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.151069 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/550fd36d-dd5d-4bed-9110-110068110f23-metrics-certs\") pod \"router-default-575b7b88bd-d5lx2\" (UID: \"550fd36d-dd5d-4bed-9110-110068110f23\") " pod="openshift-ingress/router-default-575b7b88bd-d5lx2" Apr 16 17:41:10.152192 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.152167 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/550fd36d-dd5d-4bed-9110-110068110f23-service-ca-bundle\") pod \"router-default-575b7b88bd-d5lx2\" (UID: \"550fd36d-dd5d-4bed-9110-110068110f23\") " pod="openshift-ingress/router-default-575b7b88bd-d5lx2" Apr 16 17:41:10.153334 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.153316 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/550fd36d-dd5d-4bed-9110-110068110f23-metrics-certs\") pod \"router-default-575b7b88bd-d5lx2\" (UID: \"550fd36d-dd5d-4bed-9110-110068110f23\") " pod="openshift-ingress/router-default-575b7b88bd-d5lx2" Apr 16 17:41:10.153402 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.153339 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-registry-tls\") pod \"image-registry-6d7d9f595c-np56x\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") " pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" Apr 16 17:41:10.251924 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.251894 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f50c2657-216b-4259-a264-f4f602acfee8-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-985p5\" (UID: \"f50c2657-216b-4259-a264-f4f602acfee8\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-985p5" Apr 16 17:41:10.252048 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.251967 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/214ea8ee-8a72-42a3-abfb-ceb3622fea44-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-fmwp9\" (UID: \"214ea8ee-8a72-42a3-abfb-ceb3622fea44\") " 
pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-fmwp9" Apr 16 17:41:10.252048 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.251998 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b60ee2a1-c8c7-417f-a887-3f7008b3fb0a-metrics-tls\") pod \"dns-default-9d4rv\" (UID: \"b60ee2a1-c8c7-417f-a887-3f7008b3fb0a\") " pod="openshift-dns/dns-default-9d4rv" Apr 16 17:41:10.252189 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.252161 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc57afd6-d40c-42e7-a331-579d5c302355-cert\") pod \"ingress-canary-gxh2q\" (UID: \"bc57afd6-d40c-42e7-a331-579d5c302355\") " pod="openshift-ingress-canary/ingress-canary-gxh2q" Apr 16 17:41:10.254277 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.254251 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f50c2657-216b-4259-a264-f4f602acfee8-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-985p5\" (UID: \"f50c2657-216b-4259-a264-f4f602acfee8\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-985p5" Apr 16 17:41:10.254401 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.254382 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc57afd6-d40c-42e7-a331-579d5c302355-cert\") pod \"ingress-canary-gxh2q\" (UID: \"bc57afd6-d40c-42e7-a331-579d5c302355\") " pod="openshift-ingress-canary/ingress-canary-gxh2q" Apr 16 17:41:10.254469 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.254414 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/214ea8ee-8a72-42a3-abfb-ceb3622fea44-cluster-monitoring-operator-tls\") pod 
\"cluster-monitoring-operator-6667474d89-fmwp9\" (UID: \"214ea8ee-8a72-42a3-abfb-ceb3622fea44\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-fmwp9" Apr 16 17:41:10.254469 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.254445 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b60ee2a1-c8c7-417f-a887-3f7008b3fb0a-metrics-tls\") pod \"dns-default-9d4rv\" (UID: \"b60ee2a1-c8c7-417f-a887-3f7008b3fb0a\") " pod="openshift-dns/dns-default-9d4rv" Apr 16 17:41:10.294959 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.294935 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-sgqj7\"" Apr 16 17:41:10.302271 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.302226 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-9l9b5" Apr 16 17:41:10.317061 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.317044 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-f5d25\"" Apr 16 17:41:10.324585 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.324569 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" Apr 16 17:41:10.364996 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.364853 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-xkkgf\"" Apr 16 17:41:10.372125 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.372057 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-575b7b88bd-d5lx2" Apr 16 17:41:10.423354 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.423109 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-tq7dq\"" Apr 16 17:41:10.430659 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.430468 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-9l9b5"] Apr 16 17:41:10.430659 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.430536 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-fmwp9" Apr 16 17:41:10.434989 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.434969 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-x49pg\"" Apr 16 17:41:10.442382 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.442363 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-985p5" Apr 16 17:41:10.454267 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.454195 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6d7d9f595c-np56x"] Apr 16 17:41:10.460890 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.460490 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-j7r56\"" Apr 16 17:41:10.465282 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.465255 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gxh2q" Apr 16 17:41:10.471468 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:41:10.471430 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1246ad0a_0cbe_41eb_b415_d3d5b58224b0.slice/crio-ea17c4b1d9548cd2712a7e4e3d2dd8dfeba068207f77262d2b39e805980a8628 WatchSource:0}: Error finding container ea17c4b1d9548cd2712a7e4e3d2dd8dfeba068207f77262d2b39e805980a8628: Status 404 returned error can't find the container with id ea17c4b1d9548cd2712a7e4e3d2dd8dfeba068207f77262d2b39e805980a8628 Apr 16 17:41:10.474483 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.474054 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tjhv5\"" Apr 16 17:41:10.483527 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.483503 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-9d4rv" Apr 16 17:41:10.549403 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.549348 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-575b7b88bd-d5lx2"] Apr 16 17:41:10.563356 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:41:10.561010 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod550fd36d_dd5d_4bed_9110_110068110f23.slice/crio-c7e71fa12a21fce47e1d6c54479dedfee20aec17b3743fdcc24e46b692e57849 WatchSource:0}: Error finding container c7e71fa12a21fce47e1d6c54479dedfee20aec17b3743fdcc24e46b692e57849: Status 404 returned error can't find the container with id c7e71fa12a21fce47e1d6c54479dedfee20aec17b3743fdcc24e46b692e57849 Apr 16 17:41:10.644655 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.644629 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-fmwp9"] Apr 
16 17:41:10.654859 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:41:10.654828 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod214ea8ee_8a72_42a3_abfb_ceb3622fea44.slice/crio-696edec3d84daf7dcabc4ba5c5f73211409af6a3a1a999a398b80124b863e645 WatchSource:0}: Error finding container 696edec3d84daf7dcabc4ba5c5f73211409af6a3a1a999a398b80124b863e645: Status 404 returned error can't find the container with id 696edec3d84daf7dcabc4ba5c5f73211409af6a3a1a999a398b80124b863e645 Apr 16 17:41:10.661920 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.661874 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-985p5"] Apr 16 17:41:10.664849 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:41:10.664819 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf50c2657_216b_4259_a264_f4f602acfee8.slice/crio-addafa5651c97e5ba8c867cc4a56bb42d691b2cf7d1070e0393460990f576011 WatchSource:0}: Error finding container addafa5651c97e5ba8c867cc4a56bb42d691b2cf7d1070e0393460990f576011: Status 404 returned error can't find the container with id addafa5651c97e5ba8c867cc4a56bb42d691b2cf7d1070e0393460990f576011 Apr 16 17:41:10.681945 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.681920 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gxh2q"] Apr 16 17:41:10.685058 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:41:10.685030 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc57afd6_d40c_42e7_a331_579d5c302355.slice/crio-def3ccedf56b13677e5e08390feabc1fedbf01c8c401c5cb819c1d29c7791f04 WatchSource:0}: Error finding container def3ccedf56b13677e5e08390feabc1fedbf01c8c401c5cb819c1d29c7791f04: Status 404 returned error can't find the container 
with id def3ccedf56b13677e5e08390feabc1fedbf01c8c401c5cb819c1d29c7791f04 Apr 16 17:41:10.693737 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.693706 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-fmwp9" event={"ID":"214ea8ee-8a72-42a3-abfb-ceb3622fea44","Type":"ContainerStarted","Data":"696edec3d84daf7dcabc4ba5c5f73211409af6a3a1a999a398b80124b863e645"} Apr 16 17:41:10.695096 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.695066 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-575b7b88bd-d5lx2" event={"ID":"550fd36d-dd5d-4bed-9110-110068110f23","Type":"ContainerStarted","Data":"3e1f95f7b5c015cc63510d3655cb1641d15bf5461920332652041f8e7073ded4"} Apr 16 17:41:10.695193 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.695104 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-575b7b88bd-d5lx2" event={"ID":"550fd36d-dd5d-4bed-9110-110068110f23","Type":"ContainerStarted","Data":"c7e71fa12a21fce47e1d6c54479dedfee20aec17b3743fdcc24e46b692e57849"} Apr 16 17:41:10.697191 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.696574 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" event={"ID":"1246ad0a-0cbe-41eb-b415-d3d5b58224b0","Type":"ContainerStarted","Data":"e8da9c313a1457b8c5f3cd2c2c5410159ea3669dc2505b15c93f58ed70d2bb41"} Apr 16 17:41:10.697191 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.696600 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" event={"ID":"1246ad0a-0cbe-41eb-b415-d3d5b58224b0","Type":"ContainerStarted","Data":"ea17c4b1d9548cd2712a7e4e3d2dd8dfeba068207f77262d2b39e805980a8628"} Apr 16 17:41:10.697191 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.696696 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" Apr 16 17:41:10.697631 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.697611 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-9l9b5" event={"ID":"7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d","Type":"ContainerStarted","Data":"c22365817d83ab08de9385c15048480ffa0a8c53b1255c3f632576b2667bb086"} Apr 16 17:41:10.698549 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.698527 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gxh2q" event={"ID":"bc57afd6-d40c-42e7-a331-579d5c302355","Type":"ContainerStarted","Data":"def3ccedf56b13677e5e08390feabc1fedbf01c8c401c5cb819c1d29c7791f04"} Apr 16 17:41:10.699543 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.699515 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-985p5" event={"ID":"f50c2657-216b-4259-a264-f4f602acfee8","Type":"ContainerStarted","Data":"addafa5651c97e5ba8c867cc4a56bb42d691b2cf7d1070e0393460990f576011"} Apr 16 17:41:10.702629 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.702594 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-9d4rv"] Apr 16 17:41:10.707115 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:41:10.707094 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb60ee2a1_c8c7_417f_a887_3f7008b3fb0a.slice/crio-fa366d6a2343d2153b5c908cfddc5a3c5c6b4fb85f824cc797a4ccb1a452ede2 WatchSource:0}: Error finding container fa366d6a2343d2153b5c908cfddc5a3c5c6b4fb85f824cc797a4ccb1a452ede2: Status 404 returned error can't find the container with id fa366d6a2343d2153b5c908cfddc5a3c5c6b4fb85f824cc797a4ccb1a452ede2 Apr 16 17:41:10.722742 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:10.722699 2573 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" podStartSLOduration=63.722684334 podStartE2EDuration="1m3.722684334s" podCreationTimestamp="2026-04-16 17:40:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:41:10.722024478 +0000 UTC m=+64.935057619" watchObservedRunningTime="2026-04-16 17:41:10.722684334 +0000 UTC m=+64.935717459" Apr 16 17:41:11.706320 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:11.706080 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9d4rv" event={"ID":"b60ee2a1-c8c7-417f-a887-3f7008b3fb0a","Type":"ContainerStarted","Data":"fa366d6a2343d2153b5c908cfddc5a3c5c6b4fb85f824cc797a4ccb1a452ede2"} Apr 16 17:41:11.732100 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:11.731738 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-575b7b88bd-d5lx2" podStartSLOduration=64.731709036 podStartE2EDuration="1m4.731709036s" podCreationTimestamp="2026-04-16 17:40:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:41:11.730505889 +0000 UTC m=+65.943539062" watchObservedRunningTime="2026-04-16 17:41:11.731709036 +0000 UTC m=+65.944742184" Apr 16 17:41:12.071243 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:12.071066 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7db52a98-86b8-46da-a83e-8f6ee99d696d-metrics-certs\") pod \"network-metrics-daemon-n4qhr\" (UID: \"7db52a98-86b8-46da-a83e-8f6ee99d696d\") " pod="openshift-multus/network-metrics-daemon-n4qhr" Apr 16 17:41:12.074150 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:12.074097 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/7db52a98-86b8-46da-a83e-8f6ee99d696d-metrics-certs\") pod \"network-metrics-daemon-n4qhr\" (UID: \"7db52a98-86b8-46da-a83e-8f6ee99d696d\") " pod="openshift-multus/network-metrics-daemon-n4qhr" Apr 16 17:41:12.166278 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:12.166188 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jmq5l\"" Apr 16 17:41:12.173837 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:12.173810 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n4qhr" Apr 16 17:41:12.372676 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:12.372605 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-575b7b88bd-d5lx2" Apr 16 17:41:12.375633 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:12.375601 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-575b7b88bd-d5lx2" Apr 16 17:41:12.711288 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:12.711261 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-575b7b88bd-d5lx2" Apr 16 17:41:12.712624 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:12.712601 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-575b7b88bd-d5lx2" Apr 16 17:41:12.885973 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:12.885936 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-j68jr"] Apr 16 17:41:12.906844 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:12.906816 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-j68jr" Apr 16 17:41:12.911813 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:12.911358 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 17:41:12.911813 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:12.911488 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 17:41:12.911813 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:12.911644 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-bfzld\"" Apr 16 17:41:12.941961 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:12.941933 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-j68jr"] Apr 16 17:41:12.979900 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:12.979830 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4d38888e-b2cd-44dc-9990-4141ba6b0f9a-crio-socket\") pod \"insights-runtime-extractor-j68jr\" (UID: \"4d38888e-b2cd-44dc-9990-4141ba6b0f9a\") " pod="openshift-insights/insights-runtime-extractor-j68jr" Apr 16 17:41:12.979900 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:12.979868 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4d38888e-b2cd-44dc-9990-4141ba6b0f9a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-j68jr\" (UID: \"4d38888e-b2cd-44dc-9990-4141ba6b0f9a\") " pod="openshift-insights/insights-runtime-extractor-j68jr" Apr 16 17:41:12.980113 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:12.979904 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4d38888e-b2cd-44dc-9990-4141ba6b0f9a-data-volume\") pod \"insights-runtime-extractor-j68jr\" (UID: \"4d38888e-b2cd-44dc-9990-4141ba6b0f9a\") " pod="openshift-insights/insights-runtime-extractor-j68jr" Apr 16 17:41:12.980113 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:12.979928 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4d38888e-b2cd-44dc-9990-4141ba6b0f9a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-j68jr\" (UID: \"4d38888e-b2cd-44dc-9990-4141ba6b0f9a\") " pod="openshift-insights/insights-runtime-extractor-j68jr" Apr 16 17:41:12.980113 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:12.979964 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t8gk\" (UniqueName: \"kubernetes.io/projected/4d38888e-b2cd-44dc-9990-4141ba6b0f9a-kube-api-access-2t8gk\") pod \"insights-runtime-extractor-j68jr\" (UID: \"4d38888e-b2cd-44dc-9990-4141ba6b0f9a\") " pod="openshift-insights/insights-runtime-extractor-j68jr" Apr 16 17:41:13.080696 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:13.080661 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4d38888e-b2cd-44dc-9990-4141ba6b0f9a-crio-socket\") pod \"insights-runtime-extractor-j68jr\" (UID: \"4d38888e-b2cd-44dc-9990-4141ba6b0f9a\") " pod="openshift-insights/insights-runtime-extractor-j68jr" Apr 16 17:41:13.080891 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:13.080703 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4d38888e-b2cd-44dc-9990-4141ba6b0f9a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-j68jr\" (UID: \"4d38888e-b2cd-44dc-9990-4141ba6b0f9a\") " 
pod="openshift-insights/insights-runtime-extractor-j68jr" Apr 16 17:41:13.080891 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:13.080791 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4d38888e-b2cd-44dc-9990-4141ba6b0f9a-crio-socket\") pod \"insights-runtime-extractor-j68jr\" (UID: \"4d38888e-b2cd-44dc-9990-4141ba6b0f9a\") " pod="openshift-insights/insights-runtime-extractor-j68jr" Apr 16 17:41:13.080891 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:13.080841 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4d38888e-b2cd-44dc-9990-4141ba6b0f9a-data-volume\") pod \"insights-runtime-extractor-j68jr\" (UID: \"4d38888e-b2cd-44dc-9990-4141ba6b0f9a\") " pod="openshift-insights/insights-runtime-extractor-j68jr" Apr 16 17:41:13.080891 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:13.080884 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4d38888e-b2cd-44dc-9990-4141ba6b0f9a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-j68jr\" (UID: \"4d38888e-b2cd-44dc-9990-4141ba6b0f9a\") " pod="openshift-insights/insights-runtime-extractor-j68jr" Apr 16 17:41:13.081097 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:13.080939 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2t8gk\" (UniqueName: \"kubernetes.io/projected/4d38888e-b2cd-44dc-9990-4141ba6b0f9a-kube-api-access-2t8gk\") pod \"insights-runtime-extractor-j68jr\" (UID: \"4d38888e-b2cd-44dc-9990-4141ba6b0f9a\") " pod="openshift-insights/insights-runtime-extractor-j68jr" Apr 16 17:41:13.081208 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:13.081186 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/4d38888e-b2cd-44dc-9990-4141ba6b0f9a-data-volume\") pod \"insights-runtime-extractor-j68jr\" (UID: \"4d38888e-b2cd-44dc-9990-4141ba6b0f9a\") " pod="openshift-insights/insights-runtime-extractor-j68jr" Apr 16 17:41:13.081368 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:13.081340 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4d38888e-b2cd-44dc-9990-4141ba6b0f9a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-j68jr\" (UID: \"4d38888e-b2cd-44dc-9990-4141ba6b0f9a\") " pod="openshift-insights/insights-runtime-extractor-j68jr" Apr 16 17:41:13.083513 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:13.083490 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4d38888e-b2cd-44dc-9990-4141ba6b0f9a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-j68jr\" (UID: \"4d38888e-b2cd-44dc-9990-4141ba6b0f9a\") " pod="openshift-insights/insights-runtime-extractor-j68jr" Apr 16 17:41:13.091446 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:13.091426 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t8gk\" (UniqueName: \"kubernetes.io/projected/4d38888e-b2cd-44dc-9990-4141ba6b0f9a-kube-api-access-2t8gk\") pod \"insights-runtime-extractor-j68jr\" (UID: \"4d38888e-b2cd-44dc-9990-4141ba6b0f9a\") " pod="openshift-insights/insights-runtime-extractor-j68jr" Apr 16 17:41:13.220878 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:13.220843 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-j68jr" Apr 16 17:41:13.438559 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:13.438533 2573 scope.go:117] "RemoveContainer" containerID="b9544be408cada2b2517b080c19bcc26ce2cfb3279b7fa2eef347520106f6408" Apr 16 17:41:15.259026 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:15.258954 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-j68jr"] Apr 16 17:41:15.274278 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:41:15.274251 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d38888e_b2cd_44dc_9990_4141ba6b0f9a.slice/crio-fb215cd42772a45555b8a85d2326fb815eff7d081aa7c0c35efb9886954ad60d WatchSource:0}: Error finding container fb215cd42772a45555b8a85d2326fb815eff7d081aa7c0c35efb9886954ad60d: Status 404 returned error can't find the container with id fb215cd42772a45555b8a85d2326fb815eff7d081aa7c0c35efb9886954ad60d Apr 16 17:41:15.283225 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:15.283178 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-n4qhr"] Apr 16 17:41:15.290999 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:41:15.290132 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7db52a98_86b8_46da_a83e_8f6ee99d696d.slice/crio-d790def6880a6ee93bdf909732d99015400c791381356d9baf20f8901be6196f WatchSource:0}: Error finding container d790def6880a6ee93bdf909732d99015400c791381356d9baf20f8901be6196f: Status 404 returned error can't find the container with id d790def6880a6ee93bdf909732d99015400c791381356d9baf20f8901be6196f Apr 16 17:41:15.724678 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:15.724624 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gxh2q" 
event={"ID":"bc57afd6-d40c-42e7-a331-579d5c302355","Type":"ContainerStarted","Data":"303a86cdd3705cd219a93116f4afd88ce6a956ab2fabc0d7487fd4a88838129d"} Apr 16 17:41:15.727275 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:15.727255 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-fmxzp_39536cc1-6596-4c35-a9d6-a93ef6779640/console-operator/1.log" Apr 16 17:41:15.727398 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:15.727335 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-fmxzp" event={"ID":"39536cc1-6596-4c35-a9d6-a93ef6779640","Type":"ContainerStarted","Data":"ed303b9889f67755141ee599631147febc36fdfc6634735a6fc3129b23d1294a"} Apr 16 17:41:15.727833 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:15.727689 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-fmxzp" Apr 16 17:41:15.729353 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:15.729327 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-fmwp9" event={"ID":"214ea8ee-8a72-42a3-abfb-ceb3622fea44","Type":"ContainerStarted","Data":"0cd1d7eb0e410249b1c137b3e34d8eb48a029535ca83f1934af95ce7a1e6d91f"} Apr 16 17:41:15.730836 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:15.730814 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-985p5" event={"ID":"f50c2657-216b-4259-a264-f4f602acfee8","Type":"ContainerStarted","Data":"df4924eab9ac967a6443d59d2ea48aa2565e6442ac48bbf7058bf45ccbe0b2fd"} Apr 16 17:41:15.732488 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:15.732441 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j68jr" 
event={"ID":"4d38888e-b2cd-44dc-9990-4141ba6b0f9a","Type":"ContainerStarted","Data":"741cbbf0a0efab1e48064ba20a390eab36d65ff78985b77c931042806098f606"} Apr 16 17:41:15.732488 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:15.732469 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j68jr" event={"ID":"4d38888e-b2cd-44dc-9990-4141ba6b0f9a","Type":"ContainerStarted","Data":"fb215cd42772a45555b8a85d2326fb815eff7d081aa7c0c35efb9886954ad60d"} Apr 16 17:41:15.734078 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:15.734056 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9d4rv" event={"ID":"b60ee2a1-c8c7-417f-a887-3f7008b3fb0a","Type":"ContainerStarted","Data":"87266e2f24a1cfc34b1fd2f89d8e602801f4b80dc82ca3b9937871ab1375b832"} Apr 16 17:41:15.734190 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:15.734083 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-9d4rv" event={"ID":"b60ee2a1-c8c7-417f-a887-3f7008b3fb0a","Type":"ContainerStarted","Data":"b5322eae854c047e59be96df3cba88dea8ea8e1ce7b3f45d90db8d089bd3becd"} Apr 16 17:41:15.734286 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:15.734225 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-9d4rv" Apr 16 17:41:15.735924 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:15.735903 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-9l9b5" event={"ID":"7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d","Type":"ContainerStarted","Data":"1d3936fbeeeb0b2889abb9da5ed791ea05dcda1ba6fdb079dff8ecb6c4e218e1"} Apr 16 17:41:15.736005 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:15.735931 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-9l9b5" 
event={"ID":"7404b4b2-7ffd-4cef-a5a2-4669dcd3b76d","Type":"ContainerStarted","Data":"bf090c67cdd8d9d8f8e72fd3d515ddf975aaa8e6b5f1503349657392b08b6c8b"} Apr 16 17:41:15.737477 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:15.737456 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n4qhr" event={"ID":"7db52a98-86b8-46da-a83e-8f6ee99d696d","Type":"ContainerStarted","Data":"d790def6880a6ee93bdf909732d99015400c791381356d9baf20f8901be6196f"} Apr 16 17:41:15.746043 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:15.745995 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-gxh2q" podStartSLOduration=33.319304286 podStartE2EDuration="37.745981013s" podCreationTimestamp="2026-04-16 17:40:38 +0000 UTC" firstStartedPulling="2026-04-16 17:41:10.687054003 +0000 UTC m=+64.900087128" lastFinishedPulling="2026-04-16 17:41:15.113730727 +0000 UTC m=+69.326763855" observedRunningTime="2026-04-16 17:41:15.744249661 +0000 UTC m=+69.957282812" watchObservedRunningTime="2026-04-16 17:41:15.745981013 +0000 UTC m=+69.959014163" Apr 16 17:41:15.765312 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:15.763984 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-985p5" podStartSLOduration=63.321764899 podStartE2EDuration="1m7.763967207s" podCreationTimestamp="2026-04-16 17:40:08 +0000 UTC" firstStartedPulling="2026-04-16 17:41:10.666898862 +0000 UTC m=+64.879931987" lastFinishedPulling="2026-04-16 17:41:15.109101155 +0000 UTC m=+69.322134295" observedRunningTime="2026-04-16 17:41:15.763104065 +0000 UTC m=+69.976137214" watchObservedRunningTime="2026-04-16 17:41:15.763967207 +0000 UTC m=+69.977000357" Apr 16 17:41:15.787421 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:15.787360 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-fmwp9" podStartSLOduration=64.329553203 podStartE2EDuration="1m8.787339135s" podCreationTimestamp="2026-04-16 17:40:07 +0000 UTC" firstStartedPulling="2026-04-16 17:41:10.656810294 +0000 UTC m=+64.869843420" lastFinishedPulling="2026-04-16 17:41:15.114596224 +0000 UTC m=+69.327629352" observedRunningTime="2026-04-16 17:41:15.782325104 +0000 UTC m=+69.995358471" watchObservedRunningTime="2026-04-16 17:41:15.787339135 +0000 UTC m=+70.000372283" Apr 16 17:41:15.807998 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:15.807773 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-fmxzp" podStartSLOduration=59.377701046 podStartE2EDuration="1m8.807758361s" podCreationTimestamp="2026-04-16 17:40:07 +0000 UTC" firstStartedPulling="2026-04-16 17:40:40.510243967 +0000 UTC m=+34.723277095" lastFinishedPulling="2026-04-16 17:40:49.940301269 +0000 UTC m=+44.153334410" observedRunningTime="2026-04-16 17:41:15.80655634 +0000 UTC m=+70.019589508" watchObservedRunningTime="2026-04-16 17:41:15.807758361 +0000 UTC m=+70.020791511" Apr 16 17:41:15.827814 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:15.827750 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-hwsph"] Apr 16 17:41:15.829028 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:15.828968 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-9d4rv" podStartSLOduration=33.423585472 podStartE2EDuration="37.828950871s" podCreationTimestamp="2026-04-16 17:40:38 +0000 UTC" firstStartedPulling="2026-04-16 17:41:10.70880104 +0000 UTC m=+64.921834165" lastFinishedPulling="2026-04-16 17:41:15.114166437 +0000 UTC m=+69.327199564" observedRunningTime="2026-04-16 17:41:15.826711764 +0000 UTC m=+70.039744940" watchObservedRunningTime="2026-04-16 17:41:15.828950871 +0000 
UTC m=+70.041984021" Apr 16 17:41:15.832535 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:15.832515 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-hwsph" Apr 16 17:41:15.834961 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:15.834934 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 16 17:41:15.835074 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:15.835012 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-pn8f5\"" Apr 16 17:41:15.838417 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:15.838378 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-hwsph"] Apr 16 17:41:15.854895 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:15.854840 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-9l9b5" podStartSLOduration=64.24505397 podStartE2EDuration="1m8.854822692s" podCreationTimestamp="2026-04-16 17:40:07 +0000 UTC" firstStartedPulling="2026-04-16 17:41:10.505496958 +0000 UTC m=+64.718530086" lastFinishedPulling="2026-04-16 17:41:15.115265677 +0000 UTC m=+69.328298808" observedRunningTime="2026-04-16 17:41:15.852657909 +0000 UTC m=+70.065691083" watchObservedRunningTime="2026-04-16 17:41:15.854822692 +0000 UTC m=+70.067855842" Apr 16 17:41:15.908681 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:15.908598 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/73840891-1c5d-4e9b-9e80-e22fe56583c0-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-hwsph\" (UID: 
\"73840891-1c5d-4e9b-9e80-e22fe56583c0\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-hwsph"
Apr 16 17:41:16.009435 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:16.009406 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/73840891-1c5d-4e9b-9e80-e22fe56583c0-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-hwsph\" (UID: \"73840891-1c5d-4e9b-9e80-e22fe56583c0\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-hwsph"
Apr 16 17:41:16.012526 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:16.012496 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/73840891-1c5d-4e9b-9e80-e22fe56583c0-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-hwsph\" (UID: \"73840891-1c5d-4e9b-9e80-e22fe56583c0\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-hwsph"
Apr 16 17:41:16.057090 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:16.057063 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-fmxzp"
Apr 16 17:41:16.146131 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:16.146108 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-hwsph"
Apr 16 17:41:16.242466 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:16.242432 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-cd57h"]
Apr 16 17:41:16.260614 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:16.260580 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-cd57h"]
Apr 16 17:41:16.261058 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:16.260711 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-cd57h"
Apr 16 17:41:16.263646 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:16.263618 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 17:41:16.264026 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:16.263783 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 17:41:16.264026 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:16.263868 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-qvpgv\""
Apr 16 17:41:16.413163 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:16.413084 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzjc2\" (UniqueName: \"kubernetes.io/projected/c67baaa1-8042-4274-a9ff-bd69b3157f62-kube-api-access-wzjc2\") pod \"downloads-586b57c7b4-cd57h\" (UID: \"c67baaa1-8042-4274-a9ff-bd69b3157f62\") " pod="openshift-console/downloads-586b57c7b4-cd57h"
Apr 16 17:41:16.513610 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:16.513575 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wzjc2\" (UniqueName: \"kubernetes.io/projected/c67baaa1-8042-4274-a9ff-bd69b3157f62-kube-api-access-wzjc2\") pod \"downloads-586b57c7b4-cd57h\" (UID: \"c67baaa1-8042-4274-a9ff-bd69b3157f62\") " pod="openshift-console/downloads-586b57c7b4-cd57h"
Apr 16 17:41:16.524886 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:16.524843 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzjc2\" (UniqueName: \"kubernetes.io/projected/c67baaa1-8042-4274-a9ff-bd69b3157f62-kube-api-access-wzjc2\") pod \"downloads-586b57c7b4-cd57h\" (UID: \"c67baaa1-8042-4274-a9ff-bd69b3157f62\") " pod="openshift-console/downloads-586b57c7b4-cd57h"
Apr 16 17:41:16.573292 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:16.573250 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-cd57h"
Apr 16 17:41:16.758238 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:16.758058 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-cd57h"]
Apr 16 17:41:16.775151 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:16.775008 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-hwsph"]
Apr 16 17:41:16.866777 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:41:16.866738 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc67baaa1_8042_4274_a9ff_bd69b3157f62.slice/crio-fb232ce2b1d5e9e40f7f7185dd3b647110405f5ffb4124916dea9b176e012821 WatchSource:0}: Error finding container fb232ce2b1d5e9e40f7f7185dd3b647110405f5ffb4124916dea9b176e012821: Status 404 returned error can't find the container with id fb232ce2b1d5e9e40f7f7185dd3b647110405f5ffb4124916dea9b176e012821
Apr 16 17:41:16.867573 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:41:16.867541 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73840891_1c5d_4e9b_9e80_e22fe56583c0.slice/crio-415635e858f580088122763ec6c934f41e56d5622066e2f1e76991f30630e612 WatchSource:0}: Error finding container 415635e858f580088122763ec6c934f41e56d5622066e2f1e76991f30630e612: Status 404 returned error can't find the container with id 415635e858f580088122763ec6c934f41e56d5622066e2f1e76991f30630e612
Apr 16 17:41:17.746359 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:17.746296 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-cd57h" event={"ID":"c67baaa1-8042-4274-a9ff-bd69b3157f62","Type":"ContainerStarted","Data":"fb232ce2b1d5e9e40f7f7185dd3b647110405f5ffb4124916dea9b176e012821"}
Apr 16 17:41:17.748705 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:17.748672 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j68jr" event={"ID":"4d38888e-b2cd-44dc-9990-4141ba6b0f9a","Type":"ContainerStarted","Data":"b95d431e2b6977fc66fe7c1c4c6739e40fe7c8d0684638756df36cc713b8cd45"}
Apr 16 17:41:17.750056 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:17.750034 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-hwsph" event={"ID":"73840891-1c5d-4e9b-9e80-e22fe56583c0","Type":"ContainerStarted","Data":"415635e858f580088122763ec6c934f41e56d5622066e2f1e76991f30630e612"}
Apr 16 17:41:17.752259 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:17.752200 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n4qhr" event={"ID":"7db52a98-86b8-46da-a83e-8f6ee99d696d","Type":"ContainerStarted","Data":"6d0cfb4f3c46f4528cc6323f4ebac5cdd71cd1e9e068502ddfbc6a0c454c2b62"}
Apr 16 17:41:17.752259 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:17.752242 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n4qhr" event={"ID":"7db52a98-86b8-46da-a83e-8f6ee99d696d","Type":"ContainerStarted","Data":"063696a03df92d14e335b40619028e60c40c3e7d65fe8f4296964cb8f4696dce"}
Apr 16 17:41:17.771413 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:17.771370 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-n4qhr" podStartSLOduration=69.994141435 podStartE2EDuration="1m11.771355445s" podCreationTimestamp="2026-04-16 17:40:06 +0000 UTC" firstStartedPulling="2026-04-16 17:41:15.292718621 +0000 UTC m=+69.505751762" lastFinishedPulling="2026-04-16 17:41:17.06993263 +0000 UTC m=+71.282965772" observedRunningTime="2026-04-16 17:41:17.770877257 +0000 UTC m=+71.983910430" watchObservedRunningTime="2026-04-16 17:41:17.771355445 +0000 UTC m=+71.984388594"
Apr 16 17:41:19.762264 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:19.762204 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j68jr" event={"ID":"4d38888e-b2cd-44dc-9990-4141ba6b0f9a","Type":"ContainerStarted","Data":"12c2c93e32fa8b5ecad0753a2aae8c71b52253a0b26c4b9ff49984555a8bccad"}
Apr 16 17:41:19.763833 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:19.763806 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-hwsph" event={"ID":"73840891-1c5d-4e9b-9e80-e22fe56583c0","Type":"ContainerStarted","Data":"98d9f1931c3deb925d3b3a1259a5bdb85f8ed47fc2df67ffc70d5e87160ef99a"}
Apr 16 17:41:19.763992 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:19.763971 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-hwsph"
Apr 16 17:41:19.769307 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:19.769278 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-hwsph"
Apr 16 17:41:19.784933 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:19.784883 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-j68jr" podStartSLOduration=3.5952164399999997 podStartE2EDuration="7.784865646s" podCreationTimestamp="2026-04-16 17:41:12 +0000 UTC" firstStartedPulling="2026-04-16 17:41:15.394983022 +0000 UTC m=+69.608016162" lastFinishedPulling="2026-04-16 17:41:19.584632226 +0000 UTC m=+73.797665368" observedRunningTime="2026-04-16 17:41:19.783014769 +0000 UTC m=+73.996047949" watchObservedRunningTime="2026-04-16 17:41:19.784865646 +0000 UTC m=+73.997898865"
Apr 16 17:41:19.801005 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:19.800691 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-hwsph" podStartSLOduration=2.624154227 podStartE2EDuration="4.800674357s" podCreationTimestamp="2026-04-16 17:41:15 +0000 UTC" firstStartedPulling="2026-04-16 17:41:17.068406579 +0000 UTC m=+71.281439719" lastFinishedPulling="2026-04-16 17:41:19.244926721 +0000 UTC m=+73.457959849" observedRunningTime="2026-04-16 17:41:19.799348266 +0000 UTC m=+74.012381440" watchObservedRunningTime="2026-04-16 17:41:19.800674357 +0000 UTC m=+74.013707506"
Apr 16 17:41:21.648045 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:21.648010 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-769fj"
Apr 16 17:41:25.346466 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:25.346192 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-lqggg"]
Apr 16 17:41:25.349466 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:25.349438 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-lqggg"
Apr 16 17:41:25.352767 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:25.352737 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 17:41:25.352998 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:25.352961 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-w7bpg\""
Apr 16 17:41:25.353175 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:25.353162 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 17:41:25.353874 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:25.353649 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 17:41:25.354081 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:25.354044 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 17:41:25.498713 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:25.498679 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1883ebbd-f2c4-4314-b590-4ad2d34d0a15-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lqggg\" (UID: \"1883ebbd-f2c4-4314-b590-4ad2d34d0a15\") " pod="openshift-monitoring/node-exporter-lqggg"
Apr 16 17:41:25.498887 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:25.498731 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1883ebbd-f2c4-4314-b590-4ad2d34d0a15-node-exporter-tls\") pod \"node-exporter-lqggg\" (UID: \"1883ebbd-f2c4-4314-b590-4ad2d34d0a15\") " pod="openshift-monitoring/node-exporter-lqggg"
Apr 16 17:41:25.498887 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:25.498761 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1883ebbd-f2c4-4314-b590-4ad2d34d0a15-root\") pod \"node-exporter-lqggg\" (UID: \"1883ebbd-f2c4-4314-b590-4ad2d34d0a15\") " pod="openshift-monitoring/node-exporter-lqggg"
Apr 16 17:41:25.498887 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:25.498866 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1883ebbd-f2c4-4314-b590-4ad2d34d0a15-sys\") pod \"node-exporter-lqggg\" (UID: \"1883ebbd-f2c4-4314-b590-4ad2d34d0a15\") " pod="openshift-monitoring/node-exporter-lqggg"
Apr 16 17:41:25.499058 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:25.498905 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1883ebbd-f2c4-4314-b590-4ad2d34d0a15-metrics-client-ca\") pod \"node-exporter-lqggg\" (UID: \"1883ebbd-f2c4-4314-b590-4ad2d34d0a15\") " pod="openshift-monitoring/node-exporter-lqggg"
Apr 16 17:41:25.499058 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:25.498977 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1883ebbd-f2c4-4314-b590-4ad2d34d0a15-node-exporter-textfile\") pod \"node-exporter-lqggg\" (UID: \"1883ebbd-f2c4-4314-b590-4ad2d34d0a15\") " pod="openshift-monitoring/node-exporter-lqggg"
Apr 16 17:41:25.499058 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:25.499014 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1883ebbd-f2c4-4314-b590-4ad2d34d0a15-node-exporter-wtmp\") pod \"node-exporter-lqggg\" (UID: \"1883ebbd-f2c4-4314-b590-4ad2d34d0a15\") " pod="openshift-monitoring/node-exporter-lqggg"
Apr 16 17:41:25.499058 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:25.499046 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1883ebbd-f2c4-4314-b590-4ad2d34d0a15-node-exporter-accelerators-collector-config\") pod \"node-exporter-lqggg\" (UID: \"1883ebbd-f2c4-4314-b590-4ad2d34d0a15\") " pod="openshift-monitoring/node-exporter-lqggg"
Apr 16 17:41:25.499249 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:25.499071 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwmcw\" (UniqueName: \"kubernetes.io/projected/1883ebbd-f2c4-4314-b590-4ad2d34d0a15-kube-api-access-hwmcw\") pod \"node-exporter-lqggg\" (UID: \"1883ebbd-f2c4-4314-b590-4ad2d34d0a15\") " pod="openshift-monitoring/node-exporter-lqggg"
Apr 16 17:41:25.600113 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:25.600018 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1883ebbd-f2c4-4314-b590-4ad2d34d0a15-sys\") pod \"node-exporter-lqggg\" (UID: \"1883ebbd-f2c4-4314-b590-4ad2d34d0a15\") " pod="openshift-monitoring/node-exporter-lqggg"
Apr 16 17:41:25.600113 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:25.600078 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1883ebbd-f2c4-4314-b590-4ad2d34d0a15-metrics-client-ca\") pod \"node-exporter-lqggg\" (UID: \"1883ebbd-f2c4-4314-b590-4ad2d34d0a15\") " pod="openshift-monitoring/node-exporter-lqggg"
Apr 16 17:41:25.600349 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:25.600120 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1883ebbd-f2c4-4314-b590-4ad2d34d0a15-node-exporter-textfile\") pod \"node-exporter-lqggg\" (UID: \"1883ebbd-f2c4-4314-b590-4ad2d34d0a15\") " pod="openshift-monitoring/node-exporter-lqggg"
Apr 16 17:41:25.600349 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:25.600142 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1883ebbd-f2c4-4314-b590-4ad2d34d0a15-node-exporter-wtmp\") pod \"node-exporter-lqggg\" (UID: \"1883ebbd-f2c4-4314-b590-4ad2d34d0a15\") " pod="openshift-monitoring/node-exporter-lqggg"
Apr 16 17:41:25.600349 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:25.600141 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1883ebbd-f2c4-4314-b590-4ad2d34d0a15-sys\") pod \"node-exporter-lqggg\" (UID: \"1883ebbd-f2c4-4314-b590-4ad2d34d0a15\") " pod="openshift-monitoring/node-exporter-lqggg"
Apr 16 17:41:25.600349 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:25.600164 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1883ebbd-f2c4-4314-b590-4ad2d34d0a15-node-exporter-accelerators-collector-config\") pod \"node-exporter-lqggg\" (UID: \"1883ebbd-f2c4-4314-b590-4ad2d34d0a15\") " pod="openshift-monitoring/node-exporter-lqggg"
Apr 16 17:41:25.600349 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:25.600183 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwmcw\" (UniqueName: \"kubernetes.io/projected/1883ebbd-f2c4-4314-b590-4ad2d34d0a15-kube-api-access-hwmcw\") pod \"node-exporter-lqggg\" (UID: \"1883ebbd-f2c4-4314-b590-4ad2d34d0a15\") " pod="openshift-monitoring/node-exporter-lqggg"
Apr 16 17:41:25.600349 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:25.600245 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1883ebbd-f2c4-4314-b590-4ad2d34d0a15-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lqggg\" (UID: \"1883ebbd-f2c4-4314-b590-4ad2d34d0a15\") " pod="openshift-monitoring/node-exporter-lqggg"
Apr 16 17:41:25.600349 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:25.600283 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1883ebbd-f2c4-4314-b590-4ad2d34d0a15-node-exporter-tls\") pod \"node-exporter-lqggg\" (UID: \"1883ebbd-f2c4-4314-b590-4ad2d34d0a15\") " pod="openshift-monitoring/node-exporter-lqggg"
Apr 16 17:41:25.600349 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:25.600313 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1883ebbd-f2c4-4314-b590-4ad2d34d0a15-root\") pod \"node-exporter-lqggg\" (UID: \"1883ebbd-f2c4-4314-b590-4ad2d34d0a15\") " pod="openshift-monitoring/node-exporter-lqggg"
Apr 16 17:41:25.600349 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:25.600319 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1883ebbd-f2c4-4314-b590-4ad2d34d0a15-node-exporter-wtmp\") pod \"node-exporter-lqggg\" (UID: \"1883ebbd-f2c4-4314-b590-4ad2d34d0a15\") " pod="openshift-monitoring/node-exporter-lqggg"
Apr 16 17:41:25.600799 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:25.600377 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1883ebbd-f2c4-4314-b590-4ad2d34d0a15-root\") pod \"node-exporter-lqggg\" (UID: \"1883ebbd-f2c4-4314-b590-4ad2d34d0a15\") " pod="openshift-monitoring/node-exporter-lqggg"
Apr 16 17:41:25.600799 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:25.600657 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1883ebbd-f2c4-4314-b590-4ad2d34d0a15-node-exporter-textfile\") pod \"node-exporter-lqggg\" (UID: \"1883ebbd-f2c4-4314-b590-4ad2d34d0a15\") " pod="openshift-monitoring/node-exporter-lqggg"
Apr 16 17:41:25.600799 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:25.600734 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1883ebbd-f2c4-4314-b590-4ad2d34d0a15-metrics-client-ca\") pod \"node-exporter-lqggg\" (UID: \"1883ebbd-f2c4-4314-b590-4ad2d34d0a15\") " pod="openshift-monitoring/node-exporter-lqggg"
Apr 16 17:41:25.600799 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:25.600793 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1883ebbd-f2c4-4314-b590-4ad2d34d0a15-node-exporter-accelerators-collector-config\") pod \"node-exporter-lqggg\" (UID: \"1883ebbd-f2c4-4314-b590-4ad2d34d0a15\") " pod="openshift-monitoring/node-exporter-lqggg"
Apr 16 17:41:25.601046 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:41:25.600838 2573 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 16 17:41:25.601046 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:41:25.600910 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1883ebbd-f2c4-4314-b590-4ad2d34d0a15-node-exporter-tls podName:1883ebbd-f2c4-4314-b590-4ad2d34d0a15 nodeName:}" failed. No retries permitted until 2026-04-16 17:41:26.100888771 +0000 UTC m=+80.313921910 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/1883ebbd-f2c4-4314-b590-4ad2d34d0a15-node-exporter-tls") pod "node-exporter-lqggg" (UID: "1883ebbd-f2c4-4314-b590-4ad2d34d0a15") : secret "node-exporter-tls" not found
Apr 16 17:41:25.603459 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:25.603433 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1883ebbd-f2c4-4314-b590-4ad2d34d0a15-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lqggg\" (UID: \"1883ebbd-f2c4-4314-b590-4ad2d34d0a15\") " pod="openshift-monitoring/node-exporter-lqggg"
Apr 16 17:41:25.617428 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:25.617402 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwmcw\" (UniqueName: \"kubernetes.io/projected/1883ebbd-f2c4-4314-b590-4ad2d34d0a15-kube-api-access-hwmcw\") pod \"node-exporter-lqggg\" (UID: \"1883ebbd-f2c4-4314-b590-4ad2d34d0a15\") " pod="openshift-monitoring/node-exporter-lqggg"
Apr 16 17:41:25.744608 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:25.744578 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-9d4rv"
Apr 16 17:41:26.105621 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:26.105585 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1883ebbd-f2c4-4314-b590-4ad2d34d0a15-node-exporter-tls\") pod \"node-exporter-lqggg\" (UID: \"1883ebbd-f2c4-4314-b590-4ad2d34d0a15\") " pod="openshift-monitoring/node-exporter-lqggg"
Apr 16 17:41:26.108562 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:26.108533 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1883ebbd-f2c4-4314-b590-4ad2d34d0a15-node-exporter-tls\") pod \"node-exporter-lqggg\" (UID: \"1883ebbd-f2c4-4314-b590-4ad2d34d0a15\") " pod="openshift-monitoring/node-exporter-lqggg"
Apr 16 17:41:26.262315 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:26.262282 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-lqggg"
Apr 16 17:41:30.330512 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:30.330468 2573 patch_prober.go:28] interesting pod/image-registry-6d7d9f595c-np56x container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 17:41:30.330979 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:30.330539 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" podUID="1246ad0a-0cbe-41eb-b415-d3d5b58224b0" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 17:41:31.711275 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:31.711245 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6d7d9f595c-np56x"
Apr 16 17:41:32.826053 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:41:32.826020 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1883ebbd_f2c4_4314_b590_4ad2d34d0a15.slice/crio-c5e0d488d488df895f7b75bbd5992719838ae79ba07992dbdc665a59c754bc59 WatchSource:0}: Error finding container c5e0d488d488df895f7b75bbd5992719838ae79ba07992dbdc665a59c754bc59: Status 404 returned error can't find the container with id c5e0d488d488df895f7b75bbd5992719838ae79ba07992dbdc665a59c754bc59
Apr 16 17:41:33.811158 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:33.811116 2573 generic.go:358] "Generic (PLEG): container finished" podID="1883ebbd-f2c4-4314-b590-4ad2d34d0a15" containerID="bde4f39253a7d5eb95149c55e39ab01faa145f888267b3be92338c050cc96550" exitCode=0
Apr 16 17:41:33.811359 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:33.811192 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lqggg" event={"ID":"1883ebbd-f2c4-4314-b590-4ad2d34d0a15","Type":"ContainerDied","Data":"bde4f39253a7d5eb95149c55e39ab01faa145f888267b3be92338c050cc96550"}
Apr 16 17:41:33.811359 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:33.811261 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lqggg" event={"ID":"1883ebbd-f2c4-4314-b590-4ad2d34d0a15","Type":"ContainerStarted","Data":"c5e0d488d488df895f7b75bbd5992719838ae79ba07992dbdc665a59c754bc59"}
Apr 16 17:41:33.813031 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:33.813005 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-cd57h" event={"ID":"c67baaa1-8042-4274-a9ff-bd69b3157f62","Type":"ContainerStarted","Data":"12efec93834ddd2136e5269dc1f5efcf0380ef807c4111d6e9d1098bddd16093"}
Apr 16 17:41:33.813260 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:33.813237 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-cd57h"
Apr 16 17:41:33.834020 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:33.833993 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-cd57h"
Apr 16 17:41:33.855984 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:33.855914 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-cd57h" podStartSLOduration=2.014670671 podStartE2EDuration="17.855895554s" podCreationTimestamp="2026-04-16 17:41:16 +0000 UTC" firstStartedPulling="2026-04-16 17:41:17.068296709 +0000 UTC m=+71.281329835" lastFinishedPulling="2026-04-16 17:41:32.90952158 +0000 UTC m=+87.122554718" observedRunningTime="2026-04-16 17:41:33.855173273 +0000 UTC m=+88.068206417" watchObservedRunningTime="2026-04-16 17:41:33.855895554 +0000 UTC m=+88.068928705"
Apr 16 17:41:34.818396 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:34.818350 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lqggg" event={"ID":"1883ebbd-f2c4-4314-b590-4ad2d34d0a15","Type":"ContainerStarted","Data":"f0acdbc2d4a94098aa82f02023b01756ac8ecc0e9a892f0fca2f758c3c7ff724"}
Apr 16 17:41:34.818396 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:34.818396 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lqggg" event={"ID":"1883ebbd-f2c4-4314-b590-4ad2d34d0a15","Type":"ContainerStarted","Data":"2853ab61ddb5ea5c8deea81159aff749568a25dd7ddfdefb70fc6951d4772055"}
Apr 16 17:41:34.856305 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:34.856243 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-lqggg" podStartSLOduration=9.085104646 podStartE2EDuration="9.856203875s" podCreationTimestamp="2026-04-16 17:41:25 +0000 UTC" firstStartedPulling="2026-04-16 17:41:32.827663243 +0000 UTC m=+87.040696374" lastFinishedPulling="2026-04-16 17:41:33.598762474 +0000 UTC m=+87.811795603" observedRunningTime="2026-04-16 17:41:34.853516061 +0000 UTC m=+89.066549210" watchObservedRunningTime="2026-04-16 17:41:34.856203875 +0000 UTC m=+89.069237022"
Apr 16 17:41:41.434585 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:41.434548 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6d7d9f595c-np56x"]
Apr 16 17:41:55.879990 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:55.879956 2573 generic.go:358] "Generic (PLEG): container finished" podID="31969ea1-4893-4e83-ac1e-f5882799c5da" containerID="63b84c877727520a5c05ba76c771939e20105e6c7009f58ed0332e8e7e9a8435" exitCode=0
Apr 16 17:41:55.880404 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:55.880003 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-8gx9z" event={"ID":"31969ea1-4893-4e83-ac1e-f5882799c5da","Type":"ContainerDied","Data":"63b84c877727520a5c05ba76c771939e20105e6c7009f58ed0332e8e7e9a8435"}
Apr 16 17:41:55.880404 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:55.880332 2573 scope.go:117] "RemoveContainer" containerID="63b84c877727520a5c05ba76c771939e20105e6c7009f58ed0332e8e7e9a8435"
Apr 16 17:41:56.884770 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:41:56.884732 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-8gx9z" event={"ID":"31969ea1-4893-4e83-ac1e-f5882799c5da","Type":"ContainerStarted","Data":"f050cfe6917c654d81532cd73d2c823479ed31b902df4d5a3c555ce4a756c4a9"}
Apr 16 17:42:01.905159 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:01.905123 2573 generic.go:358] "Generic (PLEG): container finished" podID="584be834-546d-49ca-8379-4b77cb13e2ba" containerID="826f98ab899575dd17d2517a0d67256f5bde315912baeb0572adbbb61b5685bb" exitCode=0
Apr 16 17:42:01.905622 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:01.905195 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zjsrk" event={"ID":"584be834-546d-49ca-8379-4b77cb13e2ba","Type":"ContainerDied","Data":"826f98ab899575dd17d2517a0d67256f5bde315912baeb0572adbbb61b5685bb"}
Apr 16 17:42:01.905622 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:01.905541 2573 scope.go:117] "RemoveContainer" containerID="826f98ab899575dd17d2517a0d67256f5bde315912baeb0572adbbb61b5685bb"
Apr 16 17:42:02.909610 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:02.909576 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-zjsrk" event={"ID":"584be834-546d-49ca-8379-4b77cb13e2ba","Type":"ContainerStarted","Data":"2a8e651c2ea6b03b27fb8be42213ede114ada2be5fa487c36592f5bbeb2000ba"}
Apr 16 17:42:06.456807 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.456770 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" podUID="1246ad0a-0cbe-41eb-b415-d3d5b58224b0" containerName="registry" containerID="cri-o://e8da9c313a1457b8c5f3cd2c2c5410159ea3669dc2505b15c93f58ed70d2bb41" gracePeriod=30
Apr 16 17:42:06.708756 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.708703 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6d7d9f595c-np56x"
Apr 16 17:42:06.828101 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.828061 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-trusted-ca\") pod \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") "
Apr 16 17:42:06.828264 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.828177 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-registry-tls\") pod \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") "
Apr 16 17:42:06.828264 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.828211 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqsth\" (UniqueName: \"kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-kube-api-access-cqsth\") pod \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") "
Apr 16 17:42:06.828355 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.828270 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-registry-certificates\") pod \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") "
Apr 16 17:42:06.828355 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.828288 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-image-registry-private-configuration\") pod \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") "
Apr 16 17:42:06.828355 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.828318 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-bound-sa-token\") pod \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") "
Apr 16 17:42:06.828804 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.828726 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "1246ad0a-0cbe-41eb-b415-d3d5b58224b0" (UID: "1246ad0a-0cbe-41eb-b415-d3d5b58224b0"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 17:42:06.828970 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.828816 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-installation-pull-secrets\") pod \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") "
Apr 16 17:42:06.828970 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.828874 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-ca-trust-extracted\") pod \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\" (UID: \"1246ad0a-0cbe-41eb-b415-d3d5b58224b0\") "
Apr 16 17:42:06.829611 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.829494 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "1246ad0a-0cbe-41eb-b415-d3d5b58224b0" (UID: "1246ad0a-0cbe-41eb-b415-d3d5b58224b0"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 17:42:06.830665 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.829914 2573 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-registry-certificates\") on node \"ip-10-0-128-241.ec2.internal\" DevicePath \"\""
Apr 16 17:42:06.830665 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.829944 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-trusted-ca\") on node \"ip-10-0-128-241.ec2.internal\" DevicePath \"\""
Apr 16 17:42:06.832857 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.831689 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "1246ad0a-0cbe-41eb-b415-d3d5b58224b0" (UID: "1246ad0a-0cbe-41eb-b415-d3d5b58224b0"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 17:42:06.835964 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.833453 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "1246ad0a-0cbe-41eb-b415-d3d5b58224b0" (UID: "1246ad0a-0cbe-41eb-b415-d3d5b58224b0"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:42:06.836520 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.836388 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-kube-api-access-cqsth" (OuterVolumeSpecName: "kube-api-access-cqsth") pod "1246ad0a-0cbe-41eb-b415-d3d5b58224b0" (UID: "1246ad0a-0cbe-41eb-b415-d3d5b58224b0"). InnerVolumeSpecName "kube-api-access-cqsth". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:42:06.836520 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.836424 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "1246ad0a-0cbe-41eb-b415-d3d5b58224b0" (UID: "1246ad0a-0cbe-41eb-b415-d3d5b58224b0"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:42:06.836920 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.836727 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "1246ad0a-0cbe-41eb-b415-d3d5b58224b0" (UID: "1246ad0a-0cbe-41eb-b415-d3d5b58224b0"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:42:06.840467 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.840440 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "1246ad0a-0cbe-41eb-b415-d3d5b58224b0" (UID: "1246ad0a-0cbe-41eb-b415-d3d5b58224b0"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:42:06.920376 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.920341 2573 generic.go:358] "Generic (PLEG): container finished" podID="1246ad0a-0cbe-41eb-b415-d3d5b58224b0" containerID="e8da9c313a1457b8c5f3cd2c2c5410159ea3669dc2505b15c93f58ed70d2bb41" exitCode=0 Apr 16 17:42:06.920529 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.920403 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" event={"ID":"1246ad0a-0cbe-41eb-b415-d3d5b58224b0","Type":"ContainerDied","Data":"e8da9c313a1457b8c5f3cd2c2c5410159ea3669dc2505b15c93f58ed70d2bb41"} Apr 16 17:42:06.920529 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.920419 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" Apr 16 17:42:06.920529 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.920429 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d7d9f595c-np56x" event={"ID":"1246ad0a-0cbe-41eb-b415-d3d5b58224b0","Type":"ContainerDied","Data":"ea17c4b1d9548cd2712a7e4e3d2dd8dfeba068207f77262d2b39e805980a8628"} Apr 16 17:42:06.920529 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.920445 2573 scope.go:117] "RemoveContainer" containerID="e8da9c313a1457b8c5f3cd2c2c5410159ea3669dc2505b15c93f58ed70d2bb41" Apr 16 17:42:06.928637 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.928619 2573 scope.go:117] "RemoveContainer" containerID="e8da9c313a1457b8c5f3cd2c2c5410159ea3669dc2505b15c93f58ed70d2bb41" Apr 16 17:42:06.928892 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:42:06.928871 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8da9c313a1457b8c5f3cd2c2c5410159ea3669dc2505b15c93f58ed70d2bb41\": container with ID starting with 
e8da9c313a1457b8c5f3cd2c2c5410159ea3669dc2505b15c93f58ed70d2bb41 not found: ID does not exist" containerID="e8da9c313a1457b8c5f3cd2c2c5410159ea3669dc2505b15c93f58ed70d2bb41" Apr 16 17:42:06.928975 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.928899 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8da9c313a1457b8c5f3cd2c2c5410159ea3669dc2505b15c93f58ed70d2bb41"} err="failed to get container status \"e8da9c313a1457b8c5f3cd2c2c5410159ea3669dc2505b15c93f58ed70d2bb41\": rpc error: code = NotFound desc = could not find container \"e8da9c313a1457b8c5f3cd2c2c5410159ea3669dc2505b15c93f58ed70d2bb41\": container with ID starting with e8da9c313a1457b8c5f3cd2c2c5410159ea3669dc2505b15c93f58ed70d2bb41 not found: ID does not exist" Apr 16 17:42:06.930300 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.930283 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-registry-tls\") on node \"ip-10-0-128-241.ec2.internal\" DevicePath \"\"" Apr 16 17:42:06.930369 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.930306 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cqsth\" (UniqueName: \"kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-kube-api-access-cqsth\") on node \"ip-10-0-128-241.ec2.internal\" DevicePath \"\"" Apr 16 17:42:06.930369 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.930317 2573 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-image-registry-private-configuration\") on node \"ip-10-0-128-241.ec2.internal\" DevicePath \"\"" Apr 16 17:42:06.930369 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.930328 2573 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-bound-sa-token\") on node \"ip-10-0-128-241.ec2.internal\" DevicePath \"\"" Apr 16 17:42:06.930369 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.930336 2573 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-installation-pull-secrets\") on node \"ip-10-0-128-241.ec2.internal\" DevicePath \"\"" Apr 16 17:42:06.930369 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.930344 2573 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1246ad0a-0cbe-41eb-b415-d3d5b58224b0-ca-trust-extracted\") on node \"ip-10-0-128-241.ec2.internal\" DevicePath \"\"" Apr 16 17:42:06.943641 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.943612 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6d7d9f595c-np56x"] Apr 16 17:42:06.948398 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:06.948376 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6d7d9f595c-np56x"] Apr 16 17:42:08.442888 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:08.442854 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1246ad0a-0cbe-41eb-b415-d3d5b58224b0" path="/var/lib/kubelet/pods/1246ad0a-0cbe-41eb-b415-d3d5b58224b0/volumes" Apr 16 17:42:20.968323 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:20.968289 2573 generic.go:358] "Generic (PLEG): container finished" podID="b27c0f5d-1775-4ae8-8903-1d44802e9f35" containerID="fde3a37ea01101589d656c4bac2fdfbaee1592da7a659c3ba8c51a9922b23580" exitCode=0 Apr 16 17:42:20.968740 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:20.968365 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-vhf4j" 
event={"ID":"b27c0f5d-1775-4ae8-8903-1d44802e9f35","Type":"ContainerDied","Data":"fde3a37ea01101589d656c4bac2fdfbaee1592da7a659c3ba8c51a9922b23580"} Apr 16 17:42:20.968740 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:20.968675 2573 scope.go:117] "RemoveContainer" containerID="fde3a37ea01101589d656c4bac2fdfbaee1592da7a659c3ba8c51a9922b23580" Apr 16 17:42:21.972957 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:42:21.972923 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-vhf4j" event={"ID":"b27c0f5d-1775-4ae8-8903-1d44802e9f35","Type":"ContainerStarted","Data":"f8b9fea0fd62bacb28035eb2d809656a4bec0e6b38d0904408595e550f436f0d"} Apr 16 17:45:00.536238 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:00.536192 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwt4vx"] Apr 16 17:45:00.536663 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:00.536481 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1246ad0a-0cbe-41eb-b415-d3d5b58224b0" containerName="registry" Apr 16 17:45:00.536663 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:00.536492 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1246ad0a-0cbe-41eb-b415-d3d5b58224b0" containerName="registry" Apr 16 17:45:00.536663 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:00.536548 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1246ad0a-0cbe-41eb-b415-d3d5b58224b0" containerName="registry" Apr 16 17:45:00.539393 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:00.539376 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwt4vx" Apr 16 17:45:00.543019 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:00.542996 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-rd2pt\"" Apr 16 17:45:00.543193 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:00.543170 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 17:45:00.544175 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:00.544147 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 17:45:00.550843 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:00.550821 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwt4vx"] Apr 16 17:45:00.566610 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:00.566584 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1bb4eda1-fb2b-43a8-9904-663d6062599e-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwt4vx\" (UID: \"1bb4eda1-fb2b-43a8-9904-663d6062599e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwt4vx" Apr 16 17:45:00.566729 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:00.566614 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt565\" (UniqueName: \"kubernetes.io/projected/1bb4eda1-fb2b-43a8-9904-663d6062599e-kube-api-access-pt565\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwt4vx\" (UID: \"1bb4eda1-fb2b-43a8-9904-663d6062599e\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwt4vx" Apr 16 17:45:00.566729 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:00.566636 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1bb4eda1-fb2b-43a8-9904-663d6062599e-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwt4vx\" (UID: \"1bb4eda1-fb2b-43a8-9904-663d6062599e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwt4vx" Apr 16 17:45:00.667606 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:00.667568 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1bb4eda1-fb2b-43a8-9904-663d6062599e-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwt4vx\" (UID: \"1bb4eda1-fb2b-43a8-9904-663d6062599e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwt4vx" Apr 16 17:45:00.667780 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:00.667646 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1bb4eda1-fb2b-43a8-9904-663d6062599e-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwt4vx\" (UID: \"1bb4eda1-fb2b-43a8-9904-663d6062599e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwt4vx" Apr 16 17:45:00.667780 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:00.667692 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pt565\" (UniqueName: \"kubernetes.io/projected/1bb4eda1-fb2b-43a8-9904-663d6062599e-kube-api-access-pt565\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwt4vx\" (UID: \"1bb4eda1-fb2b-43a8-9904-663d6062599e\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwt4vx" Apr 16 17:45:00.668005 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:00.667984 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1bb4eda1-fb2b-43a8-9904-663d6062599e-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwt4vx\" (UID: \"1bb4eda1-fb2b-43a8-9904-663d6062599e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwt4vx" Apr 16 17:45:00.668065 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:00.668013 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1bb4eda1-fb2b-43a8-9904-663d6062599e-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwt4vx\" (UID: \"1bb4eda1-fb2b-43a8-9904-663d6062599e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwt4vx" Apr 16 17:45:00.678597 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:00.678568 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt565\" (UniqueName: \"kubernetes.io/projected/1bb4eda1-fb2b-43a8-9904-663d6062599e-kube-api-access-pt565\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwt4vx\" (UID: \"1bb4eda1-fb2b-43a8-9904-663d6062599e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwt4vx" Apr 16 17:45:00.848130 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:00.848049 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwt4vx" Apr 16 17:45:00.972532 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:00.972503 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwt4vx"] Apr 16 17:45:00.975659 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:45:00.975628 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bb4eda1_fb2b_43a8_9904_663d6062599e.slice/crio-f576a6318b7b92d8b4c7d174ba12d60d9adf92713194d48b3e09a859fce647d0 WatchSource:0}: Error finding container f576a6318b7b92d8b4c7d174ba12d60d9adf92713194d48b3e09a859fce647d0: Status 404 returned error can't find the container with id f576a6318b7b92d8b4c7d174ba12d60d9adf92713194d48b3e09a859fce647d0 Apr 16 17:45:01.409500 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:01.409464 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwt4vx" event={"ID":"1bb4eda1-fb2b-43a8-9904-663d6062599e","Type":"ContainerStarted","Data":"f576a6318b7b92d8b4c7d174ba12d60d9adf92713194d48b3e09a859fce647d0"} Apr 16 17:45:06.317347 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:06.317314 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-fmxzp_39536cc1-6596-4c35-a9d6-a93ef6779640/console-operator/1.log" Apr 16 17:45:06.317781 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:06.317759 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-fmxzp_39536cc1-6596-4c35-a9d6-a93ef6779640/console-operator/1.log" Apr 16 17:45:06.326895 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:06.326873 2573 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 17:45:06.426940 ip-10-0-128-241 
kubenswrapper[2573]: I0416 17:45:06.426908 2573 generic.go:358] "Generic (PLEG): container finished" podID="1bb4eda1-fb2b-43a8-9904-663d6062599e" containerID="081892b46b9514f551efddd8d78cfffd68f89393dd350cbcceb29f14f03f9c17" exitCode=0 Apr 16 17:45:06.428955 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:06.426991 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwt4vx" event={"ID":"1bb4eda1-fb2b-43a8-9904-663d6062599e","Type":"ContainerDied","Data":"081892b46b9514f551efddd8d78cfffd68f89393dd350cbcceb29f14f03f9c17"} Apr 16 17:45:13.448511 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:13.448479 2573 generic.go:358] "Generic (PLEG): container finished" podID="1bb4eda1-fb2b-43a8-9904-663d6062599e" containerID="f8459f39aa12b0ea47acc8990ae8b0350d6a868229378dd58d076eabfaf1a5d8" exitCode=0 Apr 16 17:45:13.448874 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:13.448556 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwt4vx" event={"ID":"1bb4eda1-fb2b-43a8-9904-663d6062599e","Type":"ContainerDied","Data":"f8459f39aa12b0ea47acc8990ae8b0350d6a868229378dd58d076eabfaf1a5d8"} Apr 16 17:45:13.449465 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:13.449447 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 17:45:19.465697 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:19.465660 2573 generic.go:358] "Generic (PLEG): container finished" podID="1bb4eda1-fb2b-43a8-9904-663d6062599e" containerID="69b636df1072c2cbb5de67106e906bf11170fd60dfb51634fce6fc41972c7455" exitCode=0 Apr 16 17:45:19.466073 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:19.465710 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwt4vx" 
event={"ID":"1bb4eda1-fb2b-43a8-9904-663d6062599e","Type":"ContainerDied","Data":"69b636df1072c2cbb5de67106e906bf11170fd60dfb51634fce6fc41972c7455"} Apr 16 17:45:20.585414 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:20.585384 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwt4vx" Apr 16 17:45:20.600709 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:20.600679 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1bb4eda1-fb2b-43a8-9904-663d6062599e-bundle\") pod \"1bb4eda1-fb2b-43a8-9904-663d6062599e\" (UID: \"1bb4eda1-fb2b-43a8-9904-663d6062599e\") " Apr 16 17:45:20.600868 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:20.600719 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt565\" (UniqueName: \"kubernetes.io/projected/1bb4eda1-fb2b-43a8-9904-663d6062599e-kube-api-access-pt565\") pod \"1bb4eda1-fb2b-43a8-9904-663d6062599e\" (UID: \"1bb4eda1-fb2b-43a8-9904-663d6062599e\") " Apr 16 17:45:20.600868 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:20.600750 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1bb4eda1-fb2b-43a8-9904-663d6062599e-util\") pod \"1bb4eda1-fb2b-43a8-9904-663d6062599e\" (UID: \"1bb4eda1-fb2b-43a8-9904-663d6062599e\") " Apr 16 17:45:20.601396 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:20.601362 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bb4eda1-fb2b-43a8-9904-663d6062599e-bundle" (OuterVolumeSpecName: "bundle") pod "1bb4eda1-fb2b-43a8-9904-663d6062599e" (UID: "1bb4eda1-fb2b-43a8-9904-663d6062599e"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:45:20.602875 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:20.602850 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bb4eda1-fb2b-43a8-9904-663d6062599e-kube-api-access-pt565" (OuterVolumeSpecName: "kube-api-access-pt565") pod "1bb4eda1-fb2b-43a8-9904-663d6062599e" (UID: "1bb4eda1-fb2b-43a8-9904-663d6062599e"). InnerVolumeSpecName "kube-api-access-pt565". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:45:20.605014 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:20.604993 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bb4eda1-fb2b-43a8-9904-663d6062599e-util" (OuterVolumeSpecName: "util") pod "1bb4eda1-fb2b-43a8-9904-663d6062599e" (UID: "1bb4eda1-fb2b-43a8-9904-663d6062599e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:45:20.701474 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:20.701441 2573 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1bb4eda1-fb2b-43a8-9904-663d6062599e-bundle\") on node \"ip-10-0-128-241.ec2.internal\" DevicePath \"\"" Apr 16 17:45:20.701474 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:20.701471 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pt565\" (UniqueName: \"kubernetes.io/projected/1bb4eda1-fb2b-43a8-9904-663d6062599e-kube-api-access-pt565\") on node \"ip-10-0-128-241.ec2.internal\" DevicePath \"\"" Apr 16 17:45:20.701673 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:20.701488 2573 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1bb4eda1-fb2b-43a8-9904-663d6062599e-util\") on node \"ip-10-0-128-241.ec2.internal\" DevicePath \"\"" Apr 16 17:45:21.473260 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:21.473200 2573 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwt4vx" event={"ID":"1bb4eda1-fb2b-43a8-9904-663d6062599e","Type":"ContainerDied","Data":"f576a6318b7b92d8b4c7d174ba12d60d9adf92713194d48b3e09a859fce647d0"} Apr 16 17:45:21.473260 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:21.473262 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f576a6318b7b92d8b4c7d174ba12d60d9adf92713194d48b3e09a859fce647d0" Apr 16 17:45:21.473465 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:21.473287 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cwt4vx" Apr 16 17:45:27.297651 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:27.297611 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7rckh"] Apr 16 17:45:27.298030 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:27.297914 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1bb4eda1-fb2b-43a8-9904-663d6062599e" containerName="pull" Apr 16 17:45:27.298030 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:27.297928 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bb4eda1-fb2b-43a8-9904-663d6062599e" containerName="pull" Apr 16 17:45:27.298030 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:27.297942 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1bb4eda1-fb2b-43a8-9904-663d6062599e" containerName="extract" Apr 16 17:45:27.298030 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:27.297949 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bb4eda1-fb2b-43a8-9904-663d6062599e" containerName="extract" Apr 16 17:45:27.298030 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:27.297963 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="1bb4eda1-fb2b-43a8-9904-663d6062599e" containerName="util" Apr 16 17:45:27.298030 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:27.297968 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bb4eda1-fb2b-43a8-9904-663d6062599e" containerName="util" Apr 16 17:45:27.298030 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:27.298018 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="1bb4eda1-fb2b-43a8-9904-663d6062599e" containerName="extract" Apr 16 17:45:27.300875 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:27.300858 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7rckh" Apr 16 17:45:27.303789 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:27.303764 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-rxt4t\"" Apr 16 17:45:27.303928 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:27.303802 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 17:45:27.304070 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:27.304054 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 17:45:27.304261 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:27.304247 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 17:45:27.314185 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:27.314160 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7rckh"] Apr 16 17:45:27.350952 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:27.350923 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9688\" (UniqueName: 
\"kubernetes.io/projected/3af30beb-68cd-4ff5-9eac-cabe09b75e87-kube-api-access-k9688\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-7rckh\" (UID: \"3af30beb-68cd-4ff5-9eac-cabe09b75e87\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7rckh"
Apr 16 17:45:27.350952 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:27.350951 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/3af30beb-68cd-4ff5-9eac-cabe09b75e87-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-7rckh\" (UID: \"3af30beb-68cd-4ff5-9eac-cabe09b75e87\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7rckh"
Apr 16 17:45:27.451581 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:27.451534 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k9688\" (UniqueName: \"kubernetes.io/projected/3af30beb-68cd-4ff5-9eac-cabe09b75e87-kube-api-access-k9688\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-7rckh\" (UID: \"3af30beb-68cd-4ff5-9eac-cabe09b75e87\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7rckh"
Apr 16 17:45:27.451581 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:27.451582 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/3af30beb-68cd-4ff5-9eac-cabe09b75e87-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-7rckh\" (UID: \"3af30beb-68cd-4ff5-9eac-cabe09b75e87\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7rckh"
Apr 16 17:45:27.453916 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:27.453897 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/3af30beb-68cd-4ff5-9eac-cabe09b75e87-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-7rckh\" (UID: \"3af30beb-68cd-4ff5-9eac-cabe09b75e87\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7rckh"
Apr 16 17:45:27.463486 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:27.463461 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9688\" (UniqueName: \"kubernetes.io/projected/3af30beb-68cd-4ff5-9eac-cabe09b75e87-kube-api-access-k9688\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-7rckh\" (UID: \"3af30beb-68cd-4ff5-9eac-cabe09b75e87\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7rckh"
Apr 16 17:45:27.610417 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:27.610322 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7rckh"
Apr 16 17:45:27.734901 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:27.734862 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7rckh"]
Apr 16 17:45:27.737882 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:45:27.737853 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3af30beb_68cd_4ff5_9eac_cabe09b75e87.slice/crio-da2fe187d2ad8789a6cd3f0d40ef063479a878c74b1e29d8f4d4516f90f62ee5 WatchSource:0}: Error finding container da2fe187d2ad8789a6cd3f0d40ef063479a878c74b1e29d8f4d4516f90f62ee5: Status 404 returned error can't find the container with id da2fe187d2ad8789a6cd3f0d40ef063479a878c74b1e29d8f4d4516f90f62ee5
Apr 16 17:45:28.496608 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:28.496569 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7rckh" event={"ID":"3af30beb-68cd-4ff5-9eac-cabe09b75e87","Type":"ContainerStarted","Data":"da2fe187d2ad8789a6cd3f0d40ef063479a878c74b1e29d8f4d4516f90f62ee5"}
Apr 16 17:45:31.509078 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:31.509038 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7rckh" event={"ID":"3af30beb-68cd-4ff5-9eac-cabe09b75e87","Type":"ContainerStarted","Data":"aad4ce51004cd3313a6a70313a8a903585039bd0313827d78e941c2c57028075"}
Apr 16 17:45:31.509495 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:31.509199 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7rckh"
Apr 16 17:45:31.546429 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:31.546361 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7rckh" podStartSLOduration=1.2830646190000001 podStartE2EDuration="4.546341564s" podCreationTimestamp="2026-04-16 17:45:27 +0000 UTC" firstStartedPulling="2026-04-16 17:45:27.739566752 +0000 UTC m=+321.952599879" lastFinishedPulling="2026-04-16 17:45:31.002843681 +0000 UTC m=+325.215876824" observedRunningTime="2026-04-16 17:45:31.544417859 +0000 UTC m=+325.757451026" watchObservedRunningTime="2026-04-16 17:45:31.546341564 +0000 UTC m=+325.759374713"
Apr 16 17:45:31.957914 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:31.957874 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-8rr59"]
Apr 16 17:45:31.960846 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:31.960823 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8rr59"
Apr 16 17:45:31.964030 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:31.964008 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 16 17:45:31.965044 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:31.965029 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-spg6g\""
Apr 16 17:45:31.966231 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:31.966200 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 16 17:45:31.983270 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:31.983250 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-8rr59"]
Apr 16 17:45:31.991003 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:31.990980 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tvfv\" (UniqueName: \"kubernetes.io/projected/0bab4e0c-e2de-4d13-b2cf-74f86b9be71f-kube-api-access-8tvfv\") pod \"keda-metrics-apiserver-7c9f485588-8rr59\" (UID: \"0bab4e0c-e2de-4d13-b2cf-74f86b9be71f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8rr59"
Apr 16 17:45:31.991101 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:31.991019 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/0bab4e0c-e2de-4d13-b2cf-74f86b9be71f-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-8rr59\" (UID: \"0bab4e0c-e2de-4d13-b2cf-74f86b9be71f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8rr59"
Apr 16 17:45:31.991101 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:31.991064 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0bab4e0c-e2de-4d13-b2cf-74f86b9be71f-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8rr59\" (UID: \"0bab4e0c-e2de-4d13-b2cf-74f86b9be71f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8rr59"
Apr 16 17:45:32.091897 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:32.091869 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0bab4e0c-e2de-4d13-b2cf-74f86b9be71f-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8rr59\" (UID: \"0bab4e0c-e2de-4d13-b2cf-74f86b9be71f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8rr59"
Apr 16 17:45:32.092033 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:32.091921 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8tvfv\" (UniqueName: \"kubernetes.io/projected/0bab4e0c-e2de-4d13-b2cf-74f86b9be71f-kube-api-access-8tvfv\") pod \"keda-metrics-apiserver-7c9f485588-8rr59\" (UID: \"0bab4e0c-e2de-4d13-b2cf-74f86b9be71f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8rr59"
Apr 16 17:45:32.092033 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:32.091944 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/0bab4e0c-e2de-4d13-b2cf-74f86b9be71f-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-8rr59\" (UID: \"0bab4e0c-e2de-4d13-b2cf-74f86b9be71f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8rr59"
Apr 16 17:45:32.092033 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:45:32.092015 2573 secret.go:281] references non-existent secret key: tls.crt
Apr 16 17:45:32.092199 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:45:32.092034 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 17:45:32.092199 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:45:32.092054 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-8rr59: references non-existent secret key: tls.crt
Apr 16 17:45:32.092199 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:45:32.092107 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0bab4e0c-e2de-4d13-b2cf-74f86b9be71f-certificates podName:0bab4e0c-e2de-4d13-b2cf-74f86b9be71f nodeName:}" failed. No retries permitted until 2026-04-16 17:45:32.592088129 +0000 UTC m=+326.805121259 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0bab4e0c-e2de-4d13-b2cf-74f86b9be71f-certificates") pod "keda-metrics-apiserver-7c9f485588-8rr59" (UID: "0bab4e0c-e2de-4d13-b2cf-74f86b9be71f") : references non-existent secret key: tls.crt
Apr 16 17:45:32.092372 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:32.092314 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/0bab4e0c-e2de-4d13-b2cf-74f86b9be71f-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-8rr59\" (UID: \"0bab4e0c-e2de-4d13-b2cf-74f86b9be71f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8rr59"
Apr 16 17:45:32.108403 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:32.108384 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tvfv\" (UniqueName: \"kubernetes.io/projected/0bab4e0c-e2de-4d13-b2cf-74f86b9be71f-kube-api-access-8tvfv\") pod \"keda-metrics-apiserver-7c9f485588-8rr59\" (UID: \"0bab4e0c-e2de-4d13-b2cf-74f86b9be71f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8rr59"
Apr 16 17:45:32.595561 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:32.595528 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0bab4e0c-e2de-4d13-b2cf-74f86b9be71f-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8rr59\" (UID: \"0bab4e0c-e2de-4d13-b2cf-74f86b9be71f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8rr59"
Apr 16 17:45:32.595967 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:45:32.595675 2573 secret.go:281] references non-existent secret key: tls.crt
Apr 16 17:45:32.595967 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:45:32.595694 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 17:45:32.595967 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:45:32.595713 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-8rr59: references non-existent secret key: tls.crt
Apr 16 17:45:32.595967 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:45:32.595770 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0bab4e0c-e2de-4d13-b2cf-74f86b9be71f-certificates podName:0bab4e0c-e2de-4d13-b2cf-74f86b9be71f nodeName:}" failed. No retries permitted until 2026-04-16 17:45:33.595751585 +0000 UTC m=+327.808784711 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0bab4e0c-e2de-4d13-b2cf-74f86b9be71f-certificates") pod "keda-metrics-apiserver-7c9f485588-8rr59" (UID: "0bab4e0c-e2de-4d13-b2cf-74f86b9be71f") : references non-existent secret key: tls.crt
Apr 16 17:45:33.605235 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:33.605174 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0bab4e0c-e2de-4d13-b2cf-74f86b9be71f-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8rr59\" (UID: \"0bab4e0c-e2de-4d13-b2cf-74f86b9be71f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8rr59"
Apr 16 17:45:33.605672 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:45:33.605323 2573 secret.go:281] references non-existent secret key: tls.crt
Apr 16 17:45:33.605672 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:45:33.605346 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 17:45:33.605672 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:45:33.605366 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-8rr59: references non-existent secret key: tls.crt
Apr 16 17:45:33.605672 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:45:33.605419 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0bab4e0c-e2de-4d13-b2cf-74f86b9be71f-certificates podName:0bab4e0c-e2de-4d13-b2cf-74f86b9be71f nodeName:}" failed. No retries permitted until 2026-04-16 17:45:35.605403531 +0000 UTC m=+329.818436656 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0bab4e0c-e2de-4d13-b2cf-74f86b9be71f-certificates") pod "keda-metrics-apiserver-7c9f485588-8rr59" (UID: "0bab4e0c-e2de-4d13-b2cf-74f86b9be71f") : references non-existent secret key: tls.crt
Apr 16 17:45:35.621539 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:35.621497 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0bab4e0c-e2de-4d13-b2cf-74f86b9be71f-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8rr59\" (UID: \"0bab4e0c-e2de-4d13-b2cf-74f86b9be71f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8rr59"
Apr 16 17:45:35.621927 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:45:35.621647 2573 secret.go:281] references non-existent secret key: tls.crt
Apr 16 17:45:35.621927 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:45:35.621667 2573 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 17:45:35.621927 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:45:35.621686 2573 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-8rr59: references non-existent secret key: tls.crt
Apr 16 17:45:35.621927 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:45:35.621769 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0bab4e0c-e2de-4d13-b2cf-74f86b9be71f-certificates podName:0bab4e0c-e2de-4d13-b2cf-74f86b9be71f nodeName:}" failed. No retries permitted until 2026-04-16 17:45:39.621753736 +0000 UTC m=+333.834786867 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0bab4e0c-e2de-4d13-b2cf-74f86b9be71f-certificates") pod "keda-metrics-apiserver-7c9f485588-8rr59" (UID: "0bab4e0c-e2de-4d13-b2cf-74f86b9be71f") : references non-existent secret key: tls.crt
Apr 16 17:45:39.652065 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:39.652028 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0bab4e0c-e2de-4d13-b2cf-74f86b9be71f-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8rr59\" (UID: \"0bab4e0c-e2de-4d13-b2cf-74f86b9be71f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8rr59"
Apr 16 17:45:39.654519 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:39.654489 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0bab4e0c-e2de-4d13-b2cf-74f86b9be71f-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8rr59\" (UID: \"0bab4e0c-e2de-4d13-b2cf-74f86b9be71f\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8rr59"
Apr 16 17:45:39.770765 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:39.770732 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8rr59"
Apr 16 17:45:39.893703 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:39.893681 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-8rr59"]
Apr 16 17:45:39.896050 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:45:39.896017 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bab4e0c_e2de_4d13_b2cf_74f86b9be71f.slice/crio-24e12b2c36f65985d297c26510f4fe927010b27fff476cc2701e88a35de13cd6 WatchSource:0}: Error finding container 24e12b2c36f65985d297c26510f4fe927010b27fff476cc2701e88a35de13cd6: Status 404 returned error can't find the container with id 24e12b2c36f65985d297c26510f4fe927010b27fff476cc2701e88a35de13cd6
Apr 16 17:45:40.539561 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:40.539524 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8rr59" event={"ID":"0bab4e0c-e2de-4d13-b2cf-74f86b9be71f","Type":"ContainerStarted","Data":"24e12b2c36f65985d297c26510f4fe927010b27fff476cc2701e88a35de13cd6"}
Apr 16 17:45:42.547806 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:42.547769 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8rr59" event={"ID":"0bab4e0c-e2de-4d13-b2cf-74f86b9be71f","Type":"ContainerStarted","Data":"70c164d5e0a650803bd154fc97437c373f3ecce19ed54a755e1ea88a57583f67"}
Apr 16 17:45:42.548201 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:42.547903 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8rr59"
Apr 16 17:45:42.571771 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:42.571704 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8rr59" podStartSLOduration=9.00890095 podStartE2EDuration="11.571684087s" podCreationTimestamp="2026-04-16 17:45:31 +0000 UTC" firstStartedPulling="2026-04-16 17:45:39.897278839 +0000 UTC m=+334.110311964" lastFinishedPulling="2026-04-16 17:45:42.460061963 +0000 UTC m=+336.673095101" observedRunningTime="2026-04-16 17:45:42.57013375 +0000 UTC m=+336.783166919" watchObservedRunningTime="2026-04-16 17:45:42.571684087 +0000 UTC m=+336.784717237"
Apr 16 17:45:52.514781 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:52.514751 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7rckh"
Apr 16 17:45:53.555368 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:45:53.555337 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8rr59"
Apr 16 17:46:41.510385 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:41.510354 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-dnm7w"]
Apr 16 17:46:41.513689 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:41.513665 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dnm7w"
Apr 16 17:46:41.516473 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:41.516440 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 16 17:46:41.518016 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:41.517998 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-hwzvm\""
Apr 16 17:46:41.518016 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:41.518011 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 16 17:46:41.518140 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:41.518008 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 16 17:46:41.522546 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:41.522524 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-dnm7w"]
Apr 16 17:46:41.546053 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:41.546017 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-qvf7d"]
Apr 16 17:46:41.549650 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:41.549562 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-qvf7d"
Apr 16 17:46:41.552668 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:41.552646 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 16 17:46:41.552825 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:41.552807 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-kzlrc\""
Apr 16 17:46:41.558125 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:41.558036 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-qvf7d"]
Apr 16 17:46:41.630595 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:41.630570 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/235e999a-6816-4ae6-a4e1-4a54cff730b6-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-dnm7w\" (UID: \"235e999a-6816-4ae6-a4e1-4a54cff730b6\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dnm7w"
Apr 16 17:46:41.630732 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:41.630616 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5tbh\" (UniqueName: \"kubernetes.io/projected/235e999a-6816-4ae6-a4e1-4a54cff730b6-kube-api-access-g5tbh\") pod \"llmisvc-controller-manager-68cc5db7c4-dnm7w\" (UID: \"235e999a-6816-4ae6-a4e1-4a54cff730b6\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dnm7w"
Apr 16 17:46:41.630732 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:41.630676 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8c7x\" (UniqueName: \"kubernetes.io/projected/de0466b5-9bae-43a1-a353-b912ade347a1-kube-api-access-q8c7x\") pod \"seaweedfs-86cc847c5c-qvf7d\" (UID: \"de0466b5-9bae-43a1-a353-b912ade347a1\") " pod="kserve/seaweedfs-86cc847c5c-qvf7d"
Apr 16 17:46:41.630732 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:41.630723 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/de0466b5-9bae-43a1-a353-b912ade347a1-data\") pod \"seaweedfs-86cc847c5c-qvf7d\" (UID: \"de0466b5-9bae-43a1-a353-b912ade347a1\") " pod="kserve/seaweedfs-86cc847c5c-qvf7d"
Apr 16 17:46:41.732041 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:41.732007 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/235e999a-6816-4ae6-a4e1-4a54cff730b6-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-dnm7w\" (UID: \"235e999a-6816-4ae6-a4e1-4a54cff730b6\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dnm7w"
Apr 16 17:46:41.732254 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:41.732068 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5tbh\" (UniqueName: \"kubernetes.io/projected/235e999a-6816-4ae6-a4e1-4a54cff730b6-kube-api-access-g5tbh\") pod \"llmisvc-controller-manager-68cc5db7c4-dnm7w\" (UID: \"235e999a-6816-4ae6-a4e1-4a54cff730b6\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dnm7w"
Apr 16 17:46:41.732254 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:41.732103 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8c7x\" (UniqueName: \"kubernetes.io/projected/de0466b5-9bae-43a1-a353-b912ade347a1-kube-api-access-q8c7x\") pod \"seaweedfs-86cc847c5c-qvf7d\" (UID: \"de0466b5-9bae-43a1-a353-b912ade347a1\") " pod="kserve/seaweedfs-86cc847c5c-qvf7d"
Apr 16 17:46:41.732254 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:41.732142 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/de0466b5-9bae-43a1-a353-b912ade347a1-data\") pod \"seaweedfs-86cc847c5c-qvf7d\" (UID: \"de0466b5-9bae-43a1-a353-b912ade347a1\") " pod="kserve/seaweedfs-86cc847c5c-qvf7d"
Apr 16 17:46:41.732531 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:41.732504 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/de0466b5-9bae-43a1-a353-b912ade347a1-data\") pod \"seaweedfs-86cc847c5c-qvf7d\" (UID: \"de0466b5-9bae-43a1-a353-b912ade347a1\") " pod="kserve/seaweedfs-86cc847c5c-qvf7d"
Apr 16 17:46:41.734516 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:41.734492 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/235e999a-6816-4ae6-a4e1-4a54cff730b6-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-dnm7w\" (UID: \"235e999a-6816-4ae6-a4e1-4a54cff730b6\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dnm7w"
Apr 16 17:46:41.744526 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:41.744500 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8c7x\" (UniqueName: \"kubernetes.io/projected/de0466b5-9bae-43a1-a353-b912ade347a1-kube-api-access-q8c7x\") pod \"seaweedfs-86cc847c5c-qvf7d\" (UID: \"de0466b5-9bae-43a1-a353-b912ade347a1\") " pod="kserve/seaweedfs-86cc847c5c-qvf7d"
Apr 16 17:46:41.745059 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:41.745039 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5tbh\" (UniqueName: \"kubernetes.io/projected/235e999a-6816-4ae6-a4e1-4a54cff730b6-kube-api-access-g5tbh\") pod \"llmisvc-controller-manager-68cc5db7c4-dnm7w\" (UID: \"235e999a-6816-4ae6-a4e1-4a54cff730b6\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-dnm7w"
Apr 16 17:46:41.824318 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:41.824232 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dnm7w"
Apr 16 17:46:41.861684 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:41.861649 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-qvf7d"
Apr 16 17:46:41.958212 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:41.958178 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-dnm7w"]
Apr 16 17:46:41.960869 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:46:41.960837 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod235e999a_6816_4ae6_a4e1_4a54cff730b6.slice/crio-b48e2b0cbb629fc8463dfd9e9e5098d106115ce9680798634011d89fddf806a4 WatchSource:0}: Error finding container b48e2b0cbb629fc8463dfd9e9e5098d106115ce9680798634011d89fddf806a4: Status 404 returned error can't find the container with id b48e2b0cbb629fc8463dfd9e9e5098d106115ce9680798634011d89fddf806a4
Apr 16 17:46:42.000014 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:41.999991 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-qvf7d"]
Apr 16 17:46:42.001922 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:46:42.001890 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde0466b5_9bae_43a1_a353_b912ade347a1.slice/crio-0a50d79e6a45b156ae88945d697b33dcd089e24c0558720b5ad86aabaab042f8 WatchSource:0}: Error finding container 0a50d79e6a45b156ae88945d697b33dcd089e24c0558720b5ad86aabaab042f8: Status 404 returned error can't find the container with id 0a50d79e6a45b156ae88945d697b33dcd089e24c0558720b5ad86aabaab042f8
Apr 16 17:46:42.743118 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:42.743054 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-qvf7d" event={"ID":"de0466b5-9bae-43a1-a353-b912ade347a1","Type":"ContainerStarted","Data":"0a50d79e6a45b156ae88945d697b33dcd089e24c0558720b5ad86aabaab042f8"}
Apr 16 17:46:42.744742 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:42.744688 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dnm7w" event={"ID":"235e999a-6816-4ae6-a4e1-4a54cff730b6","Type":"ContainerStarted","Data":"b48e2b0cbb629fc8463dfd9e9e5098d106115ce9680798634011d89fddf806a4"}
Apr 16 17:46:45.755167 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:45.755134 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-qvf7d" event={"ID":"de0466b5-9bae-43a1-a353-b912ade347a1","Type":"ContainerStarted","Data":"16080b95f546a8861180998d7644a5282e1654cd7093f31e6eeb67421d1b30dc"}
Apr 16 17:46:45.755640 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:45.755259 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-qvf7d"
Apr 16 17:46:45.756441 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:45.756419 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dnm7w" event={"ID":"235e999a-6816-4ae6-a4e1-4a54cff730b6","Type":"ContainerStarted","Data":"ad84bb8f34e75b53393112cd22d033f2659f40d3b67ddb5f15f77a27dcf81b4d"}
Apr 16 17:46:45.756554 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:45.756538 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dnm7w"
Apr 16 17:46:45.777040 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:45.776995 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-qvf7d" podStartSLOduration=1.547740281 podStartE2EDuration="4.776983614s" podCreationTimestamp="2026-04-16 17:46:41 +0000 UTC" firstStartedPulling="2026-04-16 17:46:42.003256701 +0000 UTC m=+396.216289830" lastFinishedPulling="2026-04-16 17:46:45.232500034 +0000 UTC m=+399.445533163" observedRunningTime="2026-04-16 17:46:45.774758356 +0000 UTC m=+399.987791533" watchObservedRunningTime="2026-04-16 17:46:45.776983614 +0000 UTC m=+399.990016758"
Apr 16 17:46:45.794355 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:45.794312 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dnm7w" podStartSLOduration=1.579497427 podStartE2EDuration="4.794297359s" podCreationTimestamp="2026-04-16 17:46:41 +0000 UTC" firstStartedPulling="2026-04-16 17:46:41.962445683 +0000 UTC m=+396.175478809" lastFinishedPulling="2026-04-16 17:46:45.177245598 +0000 UTC m=+399.390278741" observedRunningTime="2026-04-16 17:46:45.792863804 +0000 UTC m=+400.005896954" watchObservedRunningTime="2026-04-16 17:46:45.794297359 +0000 UTC m=+400.007330508"
Apr 16 17:46:51.762475 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:46:51.762442 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-qvf7d"
Apr 16 17:47:16.762525 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:47:16.762492 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-dnm7w"
Apr 16 17:47:51.973448 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:47:51.973410 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-l2jvw"]
Apr 16 17:47:51.977092 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:47:51.977074 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-l2jvw"
Apr 16 17:47:51.980265 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:47:51.980244 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\""
Apr 16 17:47:51.980489 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:47:51.980474 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-z89m9\""
Apr 16 17:47:51.991622 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:47:51.991600 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-l2jvw"]
Apr 16 17:47:52.074576 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:47:52.074543 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/825a3116-4617-447b-a957-b37676b3ccd8-cert\") pod \"odh-model-controller-696fc77849-l2jvw\" (UID: \"825a3116-4617-447b-a957-b37676b3ccd8\") " pod="kserve/odh-model-controller-696fc77849-l2jvw"
Apr 16 17:47:52.074718 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:47:52.074580 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkr7x\" (UniqueName: \"kubernetes.io/projected/825a3116-4617-447b-a957-b37676b3ccd8-kube-api-access-zkr7x\") pod \"odh-model-controller-696fc77849-l2jvw\" (UID: \"825a3116-4617-447b-a957-b37676b3ccd8\") " pod="kserve/odh-model-controller-696fc77849-l2jvw"
Apr 16 17:47:52.175854 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:47:52.175828 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/825a3116-4617-447b-a957-b37676b3ccd8-cert\") pod \"odh-model-controller-696fc77849-l2jvw\" (UID: \"825a3116-4617-447b-a957-b37676b3ccd8\") " pod="kserve/odh-model-controller-696fc77849-l2jvw"
Apr 16 17:47:52.175980 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:47:52.175868 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zkr7x\" (UniqueName: \"kubernetes.io/projected/825a3116-4617-447b-a957-b37676b3ccd8-kube-api-access-zkr7x\") pod \"odh-model-controller-696fc77849-l2jvw\" (UID: \"825a3116-4617-447b-a957-b37676b3ccd8\") " pod="kserve/odh-model-controller-696fc77849-l2jvw"
Apr 16 17:47:52.178255 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:47:52.178210 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/825a3116-4617-447b-a957-b37676b3ccd8-cert\") pod \"odh-model-controller-696fc77849-l2jvw\" (UID: \"825a3116-4617-447b-a957-b37676b3ccd8\") " pod="kserve/odh-model-controller-696fc77849-l2jvw"
Apr 16 17:47:52.185987 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:47:52.185967 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkr7x\" (UniqueName: \"kubernetes.io/projected/825a3116-4617-447b-a957-b37676b3ccd8-kube-api-access-zkr7x\") pod \"odh-model-controller-696fc77849-l2jvw\" (UID: \"825a3116-4617-447b-a957-b37676b3ccd8\") " pod="kserve/odh-model-controller-696fc77849-l2jvw"
Apr 16 17:47:52.288325 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:47:52.288246 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-l2jvw"
Apr 16 17:47:52.416140 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:47:52.416109 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-l2jvw"]
Apr 16 17:47:52.417276 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:47:52.417247 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod825a3116_4617_447b_a957_b37676b3ccd8.slice/crio-211305bec7e027df4e2f4873a55215966cebe9181176543cd8e0351bb3ff863f WatchSource:0}: Error finding container 211305bec7e027df4e2f4873a55215966cebe9181176543cd8e0351bb3ff863f: Status 404 returned error can't find the container with id 211305bec7e027df4e2f4873a55215966cebe9181176543cd8e0351bb3ff863f
Apr 16 17:47:52.974735 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:47:52.974702 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-l2jvw" event={"ID":"825a3116-4617-447b-a957-b37676b3ccd8","Type":"ContainerStarted","Data":"211305bec7e027df4e2f4873a55215966cebe9181176543cd8e0351bb3ff863f"}
Apr 16 17:47:55.986904 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:47:55.986870 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-l2jvw" event={"ID":"825a3116-4617-447b-a957-b37676b3ccd8","Type":"ContainerStarted","Data":"f5511701d914f3eb6d607191e5314a67bfca877cc6630266851e61fbd1fba0ee"}
Apr 16 17:47:55.987332 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:47:55.986932 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-l2jvw"
Apr 16 17:47:56.007568 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:47:56.007522 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-l2jvw" podStartSLOduration=2.391016198 podStartE2EDuration="5.007508999s" podCreationTimestamp="2026-04-16 17:47:51 +0000 UTC" firstStartedPulling="2026-04-16 17:47:52.418497357 +0000 UTC m=+466.631530489" lastFinishedPulling="2026-04-16 17:47:55.034990146 +0000 UTC m=+469.248023290" observedRunningTime="2026-04-16 17:47:56.005121539 +0000 UTC m=+470.218154700" watchObservedRunningTime="2026-04-16 17:47:56.007508999 +0000 UTC m=+470.220542146"
Apr 16 17:48:06.993113 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:06.993079 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-l2jvw"
Apr 16 17:48:07.905966 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:07.905921 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-mc9vx"]
Apr 16 17:48:07.912276 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:07.912255 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-mc9vx"
Apr 16 17:48:07.917433 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:07.917409 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-mc9vx"]
Apr 16 17:48:08.011181 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:08.011146 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkpxf\" (UniqueName: \"kubernetes.io/projected/b9de5d1a-f0a4-4d98-94cc-8bf2cc542b24-kube-api-access-bkpxf\") pod \"s3-init-mc9vx\" (UID: \"b9de5d1a-f0a4-4d98-94cc-8bf2cc542b24\") " pod="kserve/s3-init-mc9vx"
Apr 16 17:48:08.111523 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:08.111484 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bkpxf\" (UniqueName: \"kubernetes.io/projected/b9de5d1a-f0a4-4d98-94cc-8bf2cc542b24-kube-api-access-bkpxf\") pod \"s3-init-mc9vx\" (UID: \"b9de5d1a-f0a4-4d98-94cc-8bf2cc542b24\") " pod="kserve/s3-init-mc9vx"
Apr 16 17:48:08.122840 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:08.122813 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkpxf\" (UniqueName: \"kubernetes.io/projected/b9de5d1a-f0a4-4d98-94cc-8bf2cc542b24-kube-api-access-bkpxf\") pod \"s3-init-mc9vx\" (UID: \"b9de5d1a-f0a4-4d98-94cc-8bf2cc542b24\") " pod="kserve/s3-init-mc9vx" Apr 16 17:48:08.232568 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:08.232478 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-mc9vx" Apr 16 17:48:08.351412 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:08.351374 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-mc9vx"] Apr 16 17:48:08.353625 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:48:08.353592 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9de5d1a_f0a4_4d98_94cc_8bf2cc542b24.slice/crio-0ab1d665298843783168409cc2269b450cf0ec91679668e51ad0fe485c82cd05 WatchSource:0}: Error finding container 0ab1d665298843783168409cc2269b450cf0ec91679668e51ad0fe485c82cd05: Status 404 returned error can't find the container with id 0ab1d665298843783168409cc2269b450cf0ec91679668e51ad0fe485c82cd05 Apr 16 17:48:09.039817 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:09.039760 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-mc9vx" event={"ID":"b9de5d1a-f0a4-4d98-94cc-8bf2cc542b24","Type":"ContainerStarted","Data":"0ab1d665298843783168409cc2269b450cf0ec91679668e51ad0fe485c82cd05"} Apr 16 17:48:13.058331 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:13.058234 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-mc9vx" event={"ID":"b9de5d1a-f0a4-4d98-94cc-8bf2cc542b24","Type":"ContainerStarted","Data":"633adae221dbca40223926ef16d460574507d8ee68378f5ff685204314500d52"} Apr 16 17:48:13.078996 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:13.078945 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve/s3-init-mc9vx" podStartSLOduration=1.692416455 podStartE2EDuration="6.078931769s" podCreationTimestamp="2026-04-16 17:48:07 +0000 UTC" firstStartedPulling="2026-04-16 17:48:08.35542581 +0000 UTC m=+482.568458939" lastFinishedPulling="2026-04-16 17:48:12.741941125 +0000 UTC m=+486.954974253" observedRunningTime="2026-04-16 17:48:13.077695554 +0000 UTC m=+487.290728703" watchObservedRunningTime="2026-04-16 17:48:13.078931769 +0000 UTC m=+487.291964917" Apr 16 17:48:16.072145 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:16.072049 2573 generic.go:358] "Generic (PLEG): container finished" podID="b9de5d1a-f0a4-4d98-94cc-8bf2cc542b24" containerID="633adae221dbca40223926ef16d460574507d8ee68378f5ff685204314500d52" exitCode=0 Apr 16 17:48:16.072538 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:16.072122 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-mc9vx" event={"ID":"b9de5d1a-f0a4-4d98-94cc-8bf2cc542b24","Type":"ContainerDied","Data":"633adae221dbca40223926ef16d460574507d8ee68378f5ff685204314500d52"} Apr 16 17:48:17.200119 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:17.200091 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-mc9vx" Apr 16 17:48:17.294844 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:17.294809 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkpxf\" (UniqueName: \"kubernetes.io/projected/b9de5d1a-f0a4-4d98-94cc-8bf2cc542b24-kube-api-access-bkpxf\") pod \"b9de5d1a-f0a4-4d98-94cc-8bf2cc542b24\" (UID: \"b9de5d1a-f0a4-4d98-94cc-8bf2cc542b24\") " Apr 16 17:48:17.296933 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:17.296904 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9de5d1a-f0a4-4d98-94cc-8bf2cc542b24-kube-api-access-bkpxf" (OuterVolumeSpecName: "kube-api-access-bkpxf") pod "b9de5d1a-f0a4-4d98-94cc-8bf2cc542b24" (UID: "b9de5d1a-f0a4-4d98-94cc-8bf2cc542b24"). InnerVolumeSpecName "kube-api-access-bkpxf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 17:48:17.396405 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:17.396381 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bkpxf\" (UniqueName: \"kubernetes.io/projected/b9de5d1a-f0a4-4d98-94cc-8bf2cc542b24-kube-api-access-bkpxf\") on node \"ip-10-0-128-241.ec2.internal\" DevicePath \"\"" Apr 16 17:48:18.079828 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:18.079798 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-mc9vx" Apr 16 17:48:18.080071 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:18.079792 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-mc9vx" event={"ID":"b9de5d1a-f0a4-4d98-94cc-8bf2cc542b24","Type":"ContainerDied","Data":"0ab1d665298843783168409cc2269b450cf0ec91679668e51ad0fe485c82cd05"} Apr 16 17:48:18.080071 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:18.079896 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ab1d665298843783168409cc2269b450cf0ec91679668e51ad0fe485c82cd05" Apr 16 17:48:28.174725 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:28.174695 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5466bc54c4-p8jst"] Apr 16 17:48:28.175111 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:28.175050 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9de5d1a-f0a4-4d98-94cc-8bf2cc542b24" containerName="s3-init" Apr 16 17:48:28.175111 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:28.175062 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9de5d1a-f0a4-4d98-94cc-8bf2cc542b24" containerName="s3-init" Apr 16 17:48:28.175181 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:28.175112 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9de5d1a-f0a4-4d98-94cc-8bf2cc542b24" containerName="s3-init" Apr 16 17:48:28.178145 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:28.178128 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5466bc54c4-p8jst" Apr 16 17:48:28.183084 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:28.183059 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-dq85w\"" Apr 16 17:48:28.188608 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:28.188585 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5466bc54c4-p8jst"] Apr 16 17:48:28.289121 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:28.289090 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6c1b967c-437a-4b60-af9a-8c605f2d16ee-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-5466bc54c4-p8jst\" (UID: \"6c1b967c-437a-4b60-af9a-8c605f2d16ee\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5466bc54c4-p8jst" Apr 16 17:48:28.390427 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:28.390393 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6c1b967c-437a-4b60-af9a-8c605f2d16ee-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-5466bc54c4-p8jst\" (UID: \"6c1b967c-437a-4b60-af9a-8c605f2d16ee\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5466bc54c4-p8jst" Apr 16 17:48:28.390772 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:28.390750 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6c1b967c-437a-4b60-af9a-8c605f2d16ee-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-5466bc54c4-p8jst\" (UID: \"6c1b967c-437a-4b60-af9a-8c605f2d16ee\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5466bc54c4-p8jst" Apr 16 17:48:28.466928 
ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:28.466851 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4c5tl"] Apr 16 17:48:28.471172 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:28.471144 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4c5tl" Apr 16 17:48:28.483334 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:28.483310 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4c5tl"] Apr 16 17:48:28.488235 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:28.488194 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5466bc54c4-p8jst" Apr 16 17:48:28.593730 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:28.593560 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84812f95-b47a-4bc5-889c-9e7cf05490f8-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-4c5tl\" (UID: \"84812f95-b47a-4bc5-889c-9e7cf05490f8\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4c5tl" Apr 16 17:48:28.649507 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:28.649471 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5466bc54c4-p8jst"] Apr 16 17:48:28.654046 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:48:28.653991 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c1b967c_437a_4b60_af9a_8c605f2d16ee.slice/crio-078954aa946b1d240acd4b248f5de4f0959e46eb6197c295019110674aab87a9 WatchSource:0}: Error finding container 078954aa946b1d240acd4b248f5de4f0959e46eb6197c295019110674aab87a9: Status 404 
returned error can't find the container with id 078954aa946b1d240acd4b248f5de4f0959e46eb6197c295019110674aab87a9 Apr 16 17:48:28.679555 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:28.679531 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7dfc8cb68-tbrt7"] Apr 16 17:48:28.684581 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:28.684564 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7dfc8cb68-tbrt7" Apr 16 17:48:28.694645 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:28.694403 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84812f95-b47a-4bc5-889c-9e7cf05490f8-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-4c5tl\" (UID: \"84812f95-b47a-4bc5-889c-9e7cf05490f8\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4c5tl" Apr 16 17:48:28.694645 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:28.694461 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7dfc8cb68-tbrt7"] Apr 16 17:48:28.694995 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:28.694968 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84812f95-b47a-4bc5-889c-9e7cf05490f8-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-784cb989b8-4c5tl\" (UID: \"84812f95-b47a-4bc5-889c-9e7cf05490f8\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4c5tl" Apr 16 17:48:28.784337 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:28.784240 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4c5tl" Apr 16 17:48:28.795446 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:28.795409 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/82531e16-a564-4dcb-9c5e-3fa9952f570e-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-7dfc8cb68-tbrt7\" (UID: \"82531e16-a564-4dcb-9c5e-3fa9952f570e\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7dfc8cb68-tbrt7" Apr 16 17:48:28.897657 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:28.897123 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/82531e16-a564-4dcb-9c5e-3fa9952f570e-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-7dfc8cb68-tbrt7\" (UID: \"82531e16-a564-4dcb-9c5e-3fa9952f570e\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7dfc8cb68-tbrt7" Apr 16 17:48:28.897837 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:28.897605 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/82531e16-a564-4dcb-9c5e-3fa9952f570e-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-7dfc8cb68-tbrt7\" (UID: \"82531e16-a564-4dcb-9c5e-3fa9952f570e\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7dfc8cb68-tbrt7" Apr 16 17:48:28.942457 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:28.942083 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4c5tl"] Apr 16 17:48:28.944242 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:48:28.944196 2573 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84812f95_b47a_4bc5_889c_9e7cf05490f8.slice/crio-a583ff54be89be9fdc176eef834383dd034a2607499eef6a3eb79aae39c72699 WatchSource:0}: Error finding container a583ff54be89be9fdc176eef834383dd034a2607499eef6a3eb79aae39c72699: Status 404 returned error can't find the container with id a583ff54be89be9fdc176eef834383dd034a2607499eef6a3eb79aae39c72699 Apr 16 17:48:28.997859 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:28.997827 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7dfc8cb68-tbrt7" Apr 16 17:48:29.116027 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:29.115994 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4c5tl" event={"ID":"84812f95-b47a-4bc5-889c-9e7cf05490f8","Type":"ContainerStarted","Data":"a583ff54be89be9fdc176eef834383dd034a2607499eef6a3eb79aae39c72699"} Apr 16 17:48:29.117260 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:29.117235 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5466bc54c4-p8jst" event={"ID":"6c1b967c-437a-4b60-af9a-8c605f2d16ee","Type":"ContainerStarted","Data":"078954aa946b1d240acd4b248f5de4f0959e46eb6197c295019110674aab87a9"} Apr 16 17:48:29.136434 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:29.136392 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7dfc8cb68-tbrt7"] Apr 16 17:48:29.139705 ip-10-0-128-241 kubenswrapper[2573]: W0416 17:48:29.139676 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82531e16_a564_4dcb_9c5e_3fa9952f570e.slice/crio-8f4435b775c4b198b06003f270a59fe7004eb7c8ddc84a1ee442e472a2b11951 WatchSource:0}: Error finding container 
8f4435b775c4b198b06003f270a59fe7004eb7c8ddc84a1ee442e472a2b11951: Status 404 returned error can't find the container with id 8f4435b775c4b198b06003f270a59fe7004eb7c8ddc84a1ee442e472a2b11951 Apr 16 17:48:30.124828 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:30.124787 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7dfc8cb68-tbrt7" event={"ID":"82531e16-a564-4dcb-9c5e-3fa9952f570e","Type":"ContainerStarted","Data":"8f4435b775c4b198b06003f270a59fe7004eb7c8ddc84a1ee442e472a2b11951"} Apr 16 17:48:33.139705 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:33.139296 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4c5tl" event={"ID":"84812f95-b47a-4bc5-889c-9e7cf05490f8","Type":"ContainerStarted","Data":"2b4722d62286fd324af4dc6fe02b0bbbc1a7fd33d027f4403245695a76c31f43"} Apr 16 17:48:33.141461 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:33.141430 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5466bc54c4-p8jst" event={"ID":"6c1b967c-437a-4b60-af9a-8c605f2d16ee","Type":"ContainerStarted","Data":"16ff83c55cf09c17ab88dc8666ed1ee0d5bf8f0a5c7b8e447fe92d73132d6ed2"} Apr 16 17:48:33.143077 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:33.143047 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7dfc8cb68-tbrt7" event={"ID":"82531e16-a564-4dcb-9c5e-3fa9952f570e","Type":"ContainerStarted","Data":"aed1e57ca8772594802d8926b622aa5d9ce52da346f754f61827d7f42aaabbdf"} Apr 16 17:48:37.161345 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:37.161307 2573 generic.go:358] "Generic (PLEG): container finished" podID="84812f95-b47a-4bc5-889c-9e7cf05490f8" containerID="2b4722d62286fd324af4dc6fe02b0bbbc1a7fd33d027f4403245695a76c31f43" exitCode=0 Apr 16 17:48:37.161784 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:37.161382 2573 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4c5tl" event={"ID":"84812f95-b47a-4bc5-889c-9e7cf05490f8","Type":"ContainerDied","Data":"2b4722d62286fd324af4dc6fe02b0bbbc1a7fd33d027f4403245695a76c31f43"} Apr 16 17:48:37.162848 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:37.162826 2573 generic.go:358] "Generic (PLEG): container finished" podID="6c1b967c-437a-4b60-af9a-8c605f2d16ee" containerID="16ff83c55cf09c17ab88dc8666ed1ee0d5bf8f0a5c7b8e447fe92d73132d6ed2" exitCode=0 Apr 16 17:48:37.162945 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:37.162907 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5466bc54c4-p8jst" event={"ID":"6c1b967c-437a-4b60-af9a-8c605f2d16ee","Type":"ContainerDied","Data":"16ff83c55cf09c17ab88dc8666ed1ee0d5bf8f0a5c7b8e447fe92d73132d6ed2"} Apr 16 17:48:37.164183 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:37.164159 2573 generic.go:358] "Generic (PLEG): container finished" podID="82531e16-a564-4dcb-9c5e-3fa9952f570e" containerID="aed1e57ca8772594802d8926b622aa5d9ce52da346f754f61827d7f42aaabbdf" exitCode=0 Apr 16 17:48:37.164305 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:48:37.164244 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7dfc8cb68-tbrt7" event={"ID":"82531e16-a564-4dcb-9c5e-3fa9952f570e","Type":"ContainerDied","Data":"aed1e57ca8772594802d8926b622aa5d9ce52da346f754f61827d7f42aaabbdf"} Apr 16 17:49:06.302994 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:49:06.302898 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4c5tl" event={"ID":"84812f95-b47a-4bc5-889c-9e7cf05490f8","Type":"ContainerStarted","Data":"d09a31296b9be2cd151c29bc136eb524f4b37d741617a303719af0bbfb830c56"} Apr 16 17:49:06.303479 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:49:06.303296 2573 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4c5tl" Apr 16 17:49:06.304672 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:49:06.304649 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5466bc54c4-p8jst" event={"ID":"6c1b967c-437a-4b60-af9a-8c605f2d16ee","Type":"ContainerStarted","Data":"1b77ba85e15c3289ad3097e8f3bb828ad2523ab3eb125f7a9ee1fd32cbf47702"} Apr 16 17:49:06.304927 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:49:06.304907 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5466bc54c4-p8jst" Apr 16 17:49:06.304927 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:49:06.304909 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4c5tl" podUID="84812f95-b47a-4bc5-889c-9e7cf05490f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 16 17:49:06.305776 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:49:06.305747 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5466bc54c4-p8jst" podUID="6c1b967c-437a-4b60-af9a-8c605f2d16ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 16 17:49:06.306387 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:49:06.306370 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7dfc8cb68-tbrt7" event={"ID":"82531e16-a564-4dcb-9c5e-3fa9952f570e","Type":"ContainerStarted","Data":"51de26896e2b602261dc3a92ebd68d639a88cf2d4d4a37776582195ff9625854"} Apr 16 17:49:06.306650 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:49:06.306634 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7dfc8cb68-tbrt7" Apr 16 17:49:06.307513 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:49:06.307493 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7dfc8cb68-tbrt7" podUID="82531e16-a564-4dcb-9c5e-3fa9952f570e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 17:49:06.325444 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:49:06.325393 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4c5tl" podStartSLOduration=1.349323169 podStartE2EDuration="38.325380508s" podCreationTimestamp="2026-04-16 17:48:28 +0000 UTC" firstStartedPulling="2026-04-16 17:48:28.946593514 +0000 UTC m=+503.159626641" lastFinishedPulling="2026-04-16 17:49:05.922650841 +0000 UTC m=+540.135683980" observedRunningTime="2026-04-16 17:49:06.323816215 +0000 UTC m=+540.536849375" watchObservedRunningTime="2026-04-16 17:49:06.325380508 +0000 UTC m=+540.538413655" Apr 16 17:49:06.345740 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:49:06.345692 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7dfc8cb68-tbrt7" podStartSLOduration=1.673361307 podStartE2EDuration="38.345677412s" podCreationTimestamp="2026-04-16 17:48:28 +0000 UTC" firstStartedPulling="2026-04-16 17:48:29.141606422 +0000 UTC m=+503.354639552" lastFinishedPulling="2026-04-16 17:49:05.813922515 +0000 UTC m=+540.026955657" observedRunningTime="2026-04-16 17:49:06.342636417 +0000 UTC m=+540.555669565" watchObservedRunningTime="2026-04-16 17:49:06.345677412 +0000 UTC m=+540.558710617" Apr 16 17:49:06.362528 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:49:06.362476 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5466bc54c4-p8jst" podStartSLOduration=1.21717762 podStartE2EDuration="38.362462459s" podCreationTimestamp="2026-04-16 17:48:28 +0000 UTC" firstStartedPulling="2026-04-16 17:48:28.656085234 +0000 UTC m=+502.869118374" lastFinishedPulling="2026-04-16 17:49:05.80137008 +0000 UTC m=+540.014403213" observedRunningTime="2026-04-16 17:49:06.361380352 +0000 UTC m=+540.574413500" watchObservedRunningTime="2026-04-16 17:49:06.362462459 +0000 UTC m=+540.575495660" Apr 16 17:49:07.310428 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:49:07.310387 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7dfc8cb68-tbrt7" podUID="82531e16-a564-4dcb-9c5e-3fa9952f570e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 17:49:07.310908 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:49:07.310385 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4c5tl" podUID="84812f95-b47a-4bc5-889c-9e7cf05490f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 16 17:49:07.310908 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:49:07.310387 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5466bc54c4-p8jst" podUID="6c1b967c-437a-4b60-af9a-8c605f2d16ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 16 17:49:17.311002 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:49:17.310959 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5466bc54c4-p8jst" podUID="6c1b967c-437a-4b60-af9a-8c605f2d16ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: 
connection refused"
Apr 16 17:49:17.311399 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:49:17.310954 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7dfc8cb68-tbrt7" podUID="82531e16-a564-4dcb-9c5e-3fa9952f570e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 16 17:49:17.311399 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:49:17.310956 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4c5tl" podUID="84812f95-b47a-4bc5-889c-9e7cf05490f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 16 17:49:27.310621 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:49:27.310572 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7dfc8cb68-tbrt7" podUID="82531e16-a564-4dcb-9c5e-3fa9952f570e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 16 17:49:27.310621 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:49:27.310572 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5466bc54c4-p8jst" podUID="6c1b967c-437a-4b60-af9a-8c605f2d16ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused"
Apr 16 17:49:27.311202 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:49:27.310572 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4c5tl" podUID="84812f95-b47a-4bc5-889c-9e7cf05490f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 16 17:49:37.311124 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:49:37.311073 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7dfc8cb68-tbrt7" podUID="82531e16-a564-4dcb-9c5e-3fa9952f570e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 16 17:49:37.311570 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:49:37.311093 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4c5tl" podUID="84812f95-b47a-4bc5-889c-9e7cf05490f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 16 17:49:37.311570 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:49:37.311090 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5466bc54c4-p8jst" podUID="6c1b967c-437a-4b60-af9a-8c605f2d16ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused"
Apr 16 17:49:47.311075 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:49:47.311030 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5466bc54c4-p8jst" podUID="6c1b967c-437a-4b60-af9a-8c605f2d16ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused"
Apr 16 17:49:47.311511 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:49:47.311028 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7dfc8cb68-tbrt7" podUID="82531e16-a564-4dcb-9c5e-3fa9952f570e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 16 17:49:47.311511 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:49:47.311028 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4c5tl" podUID="84812f95-b47a-4bc5-889c-9e7cf05490f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 16 17:49:57.311383 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:49:57.311335 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7dfc8cb68-tbrt7" podUID="82531e16-a564-4dcb-9c5e-3fa9952f570e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 16 17:49:57.311782 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:49:57.311339 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4c5tl" podUID="84812f95-b47a-4bc5-889c-9e7cf05490f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 16 17:49:57.311782 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:49:57.311339 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5466bc54c4-p8jst" podUID="6c1b967c-437a-4b60-af9a-8c605f2d16ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused"
Apr 16 17:50:06.343860 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:06.343830 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-fmxzp_39536cc1-6596-4c35-a9d6-a93ef6779640/console-operator/1.log"
Apr 16 17:50:06.344529 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:06.344513 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-fmxzp_39536cc1-6596-4c35-a9d6-a93ef6779640/console-operator/1.log"
Apr 16 17:50:07.310567 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:07.310520 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5466bc54c4-p8jst" podUID="6c1b967c-437a-4b60-af9a-8c605f2d16ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused"
Apr 16 17:50:07.312684 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:07.310520 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7dfc8cb68-tbrt7" podUID="82531e16-a564-4dcb-9c5e-3fa9952f570e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 16 17:50:07.312684 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:07.311657 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4c5tl"
Apr 16 17:50:17.311414 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:17.311379 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7dfc8cb68-tbrt7"
Apr 16 17:50:17.311844 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:17.311435 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5466bc54c4-p8jst"
Apr 16 17:50:48.460388 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:48.460354 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7dfc8cb68-tbrt7"]
Apr 16 17:50:48.460907 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:48.460614 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7dfc8cb68-tbrt7" podUID="82531e16-a564-4dcb-9c5e-3fa9952f570e" containerName="kserve-container" containerID="cri-o://51de26896e2b602261dc3a92ebd68d639a88cf2d4d4a37776582195ff9625854" gracePeriod=30
Apr 16 17:50:48.522276 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:48.522246 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5466bc54c4-p8jst"]
Apr 16 17:50:48.522517 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:48.522497 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5466bc54c4-p8jst" podUID="6c1b967c-437a-4b60-af9a-8c605f2d16ee" containerName="kserve-container" containerID="cri-o://1b77ba85e15c3289ad3097e8f3bb828ad2523ab3eb125f7a9ee1fd32cbf47702" gracePeriod=30
Apr 16 17:50:48.627281 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:48.627244 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4c5tl"]
Apr 16 17:50:48.627528 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:48.627506 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4c5tl" podUID="84812f95-b47a-4bc5-889c-9e7cf05490f8" containerName="kserve-container" containerID="cri-o://d09a31296b9be2cd151c29bc136eb524f4b37d741617a303719af0bbfb830c56" gracePeriod=30
Apr 16 17:50:52.875207 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:52.875182 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4c5tl"
Apr 16 17:50:52.888156 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:52.888084 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84812f95-b47a-4bc5-889c-9e7cf05490f8-kserve-provision-location\") pod \"84812f95-b47a-4bc5-889c-9e7cf05490f8\" (UID: \"84812f95-b47a-4bc5-889c-9e7cf05490f8\") "
Apr 16 17:50:52.888460 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:52.888433 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84812f95-b47a-4bc5-889c-9e7cf05490f8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "84812f95-b47a-4bc5-889c-9e7cf05490f8" (UID: "84812f95-b47a-4bc5-889c-9e7cf05490f8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:50:52.914142 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:52.914122 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7dfc8cb68-tbrt7"
Apr 16 17:50:52.989092 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:52.989063 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/82531e16-a564-4dcb-9c5e-3fa9952f570e-kserve-provision-location\") pod \"82531e16-a564-4dcb-9c5e-3fa9952f570e\" (UID: \"82531e16-a564-4dcb-9c5e-3fa9952f570e\") "
Apr 16 17:50:52.989273 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:52.989260 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84812f95-b47a-4bc5-889c-9e7cf05490f8-kserve-provision-location\") on node \"ip-10-0-128-241.ec2.internal\" DevicePath \"\""
Apr 16 17:50:52.989385 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:52.989363 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82531e16-a564-4dcb-9c5e-3fa9952f570e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "82531e16-a564-4dcb-9c5e-3fa9952f570e" (UID: "82531e16-a564-4dcb-9c5e-3fa9952f570e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:50:53.090311 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.090276 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/82531e16-a564-4dcb-9c5e-3fa9952f570e-kserve-provision-location\") on node \"ip-10-0-128-241.ec2.internal\" DevicePath \"\""
Apr 16 17:50:53.453580 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.453555 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5466bc54c4-p8jst"
Apr 16 17:50:53.493130 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.493047 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6c1b967c-437a-4b60-af9a-8c605f2d16ee-kserve-provision-location\") pod \"6c1b967c-437a-4b60-af9a-8c605f2d16ee\" (UID: \"6c1b967c-437a-4b60-af9a-8c605f2d16ee\") "
Apr 16 17:50:53.493418 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.493392 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c1b967c-437a-4b60-af9a-8c605f2d16ee-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6c1b967c-437a-4b60-af9a-8c605f2d16ee" (UID: "6c1b967c-437a-4b60-af9a-8c605f2d16ee"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:50:53.593986 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.593953 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6c1b967c-437a-4b60-af9a-8c605f2d16ee-kserve-provision-location\") on node \"ip-10-0-128-241.ec2.internal\" DevicePath \"\""
Apr 16 17:50:53.671598 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.671563 2573 generic.go:358] "Generic (PLEG): container finished" podID="84812f95-b47a-4bc5-889c-9e7cf05490f8" containerID="d09a31296b9be2cd151c29bc136eb524f4b37d741617a303719af0bbfb830c56" exitCode=0
Apr 16 17:50:53.671778 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.671640 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4c5tl"
Apr 16 17:50:53.671778 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.671633 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4c5tl" event={"ID":"84812f95-b47a-4bc5-889c-9e7cf05490f8","Type":"ContainerDied","Data":"d09a31296b9be2cd151c29bc136eb524f4b37d741617a303719af0bbfb830c56"}
Apr 16 17:50:53.671778 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.671766 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4c5tl" event={"ID":"84812f95-b47a-4bc5-889c-9e7cf05490f8","Type":"ContainerDied","Data":"a583ff54be89be9fdc176eef834383dd034a2607499eef6a3eb79aae39c72699"}
Apr 16 17:50:53.671916 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.671792 2573 scope.go:117] "RemoveContainer" containerID="d09a31296b9be2cd151c29bc136eb524f4b37d741617a303719af0bbfb830c56"
Apr 16 17:50:53.673128 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.673108 2573 generic.go:358] "Generic (PLEG): container finished" podID="6c1b967c-437a-4b60-af9a-8c605f2d16ee" containerID="1b77ba85e15c3289ad3097e8f3bb828ad2523ab3eb125f7a9ee1fd32cbf47702" exitCode=0
Apr 16 17:50:53.673240 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.673179 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5466bc54c4-p8jst"
Apr 16 17:50:53.673240 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.673180 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5466bc54c4-p8jst" event={"ID":"6c1b967c-437a-4b60-af9a-8c605f2d16ee","Type":"ContainerDied","Data":"1b77ba85e15c3289ad3097e8f3bb828ad2523ab3eb125f7a9ee1fd32cbf47702"}
Apr 16 17:50:53.673382 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.673269 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5466bc54c4-p8jst" event={"ID":"6c1b967c-437a-4b60-af9a-8c605f2d16ee","Type":"ContainerDied","Data":"078954aa946b1d240acd4b248f5de4f0959e46eb6197c295019110674aab87a9"}
Apr 16 17:50:53.674709 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.674683 2573 generic.go:358] "Generic (PLEG): container finished" podID="82531e16-a564-4dcb-9c5e-3fa9952f570e" containerID="51de26896e2b602261dc3a92ebd68d639a88cf2d4d4a37776582195ff9625854" exitCode=0
Apr 16 17:50:53.674825 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.674732 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7dfc8cb68-tbrt7" event={"ID":"82531e16-a564-4dcb-9c5e-3fa9952f570e","Type":"ContainerDied","Data":"51de26896e2b602261dc3a92ebd68d639a88cf2d4d4a37776582195ff9625854"}
Apr 16 17:50:53.674825 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.674757 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7dfc8cb68-tbrt7"
Apr 16 17:50:53.674825 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.674765 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7dfc8cb68-tbrt7" event={"ID":"82531e16-a564-4dcb-9c5e-3fa9952f570e","Type":"ContainerDied","Data":"8f4435b775c4b198b06003f270a59fe7004eb7c8ddc84a1ee442e472a2b11951"}
Apr 16 17:50:53.680474 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.680448 2573 scope.go:117] "RemoveContainer" containerID="2b4722d62286fd324af4dc6fe02b0bbbc1a7fd33d027f4403245695a76c31f43"
Apr 16 17:50:53.688545 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.688528 2573 scope.go:117] "RemoveContainer" containerID="d09a31296b9be2cd151c29bc136eb524f4b37d741617a303719af0bbfb830c56"
Apr 16 17:50:53.688791 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:50:53.688772 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d09a31296b9be2cd151c29bc136eb524f4b37d741617a303719af0bbfb830c56\": container with ID starting with d09a31296b9be2cd151c29bc136eb524f4b37d741617a303719af0bbfb830c56 not found: ID does not exist" containerID="d09a31296b9be2cd151c29bc136eb524f4b37d741617a303719af0bbfb830c56"
Apr 16 17:50:53.688834 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.688800 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d09a31296b9be2cd151c29bc136eb524f4b37d741617a303719af0bbfb830c56"} err="failed to get container status \"d09a31296b9be2cd151c29bc136eb524f4b37d741617a303719af0bbfb830c56\": rpc error: code = NotFound desc = could not find container \"d09a31296b9be2cd151c29bc136eb524f4b37d741617a303719af0bbfb830c56\": container with ID starting with d09a31296b9be2cd151c29bc136eb524f4b37d741617a303719af0bbfb830c56 not found: ID does not exist"
Apr 16 17:50:53.688834 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.688818 2573 scope.go:117] "RemoveContainer" containerID="2b4722d62286fd324af4dc6fe02b0bbbc1a7fd33d027f4403245695a76c31f43"
Apr 16 17:50:53.689046 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:50:53.689031 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b4722d62286fd324af4dc6fe02b0bbbc1a7fd33d027f4403245695a76c31f43\": container with ID starting with 2b4722d62286fd324af4dc6fe02b0bbbc1a7fd33d027f4403245695a76c31f43 not found: ID does not exist" containerID="2b4722d62286fd324af4dc6fe02b0bbbc1a7fd33d027f4403245695a76c31f43"
Apr 16 17:50:53.689098 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.689049 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b4722d62286fd324af4dc6fe02b0bbbc1a7fd33d027f4403245695a76c31f43"} err="failed to get container status \"2b4722d62286fd324af4dc6fe02b0bbbc1a7fd33d027f4403245695a76c31f43\": rpc error: code = NotFound desc = could not find container \"2b4722d62286fd324af4dc6fe02b0bbbc1a7fd33d027f4403245695a76c31f43\": container with ID starting with 2b4722d62286fd324af4dc6fe02b0bbbc1a7fd33d027f4403245695a76c31f43 not found: ID does not exist"
Apr 16 17:50:53.689098 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.689062 2573 scope.go:117] "RemoveContainer" containerID="1b77ba85e15c3289ad3097e8f3bb828ad2523ab3eb125f7a9ee1fd32cbf47702"
Apr 16 17:50:53.697118 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.697099 2573 scope.go:117] "RemoveContainer" containerID="16ff83c55cf09c17ab88dc8666ed1ee0d5bf8f0a5c7b8e447fe92d73132d6ed2"
Apr 16 17:50:53.704441 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.704423 2573 scope.go:117] "RemoveContainer" containerID="1b77ba85e15c3289ad3097e8f3bb828ad2523ab3eb125f7a9ee1fd32cbf47702"
Apr 16 17:50:53.704643 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.704624 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5466bc54c4-p8jst"]
Apr 16 17:50:53.704704 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:50:53.704684 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b77ba85e15c3289ad3097e8f3bb828ad2523ab3eb125f7a9ee1fd32cbf47702\": container with ID starting with 1b77ba85e15c3289ad3097e8f3bb828ad2523ab3eb125f7a9ee1fd32cbf47702 not found: ID does not exist" containerID="1b77ba85e15c3289ad3097e8f3bb828ad2523ab3eb125f7a9ee1fd32cbf47702"
Apr 16 17:50:53.704744 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.704713 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b77ba85e15c3289ad3097e8f3bb828ad2523ab3eb125f7a9ee1fd32cbf47702"} err="failed to get container status \"1b77ba85e15c3289ad3097e8f3bb828ad2523ab3eb125f7a9ee1fd32cbf47702\": rpc error: code = NotFound desc = could not find container \"1b77ba85e15c3289ad3097e8f3bb828ad2523ab3eb125f7a9ee1fd32cbf47702\": container with ID starting with 1b77ba85e15c3289ad3097e8f3bb828ad2523ab3eb125f7a9ee1fd32cbf47702 not found: ID does not exist"
Apr 16 17:50:53.704744 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.704733 2573 scope.go:117] "RemoveContainer" containerID="16ff83c55cf09c17ab88dc8666ed1ee0d5bf8f0a5c7b8e447fe92d73132d6ed2"
Apr 16 17:50:53.704962 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:50:53.704947 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16ff83c55cf09c17ab88dc8666ed1ee0d5bf8f0a5c7b8e447fe92d73132d6ed2\": container with ID starting with 16ff83c55cf09c17ab88dc8666ed1ee0d5bf8f0a5c7b8e447fe92d73132d6ed2 not found: ID does not exist" containerID="16ff83c55cf09c17ab88dc8666ed1ee0d5bf8f0a5c7b8e447fe92d73132d6ed2"
Apr 16 17:50:53.704998 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.704969 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16ff83c55cf09c17ab88dc8666ed1ee0d5bf8f0a5c7b8e447fe92d73132d6ed2"} err="failed to get container status \"16ff83c55cf09c17ab88dc8666ed1ee0d5bf8f0a5c7b8e447fe92d73132d6ed2\": rpc error: code = NotFound desc = could not find container \"16ff83c55cf09c17ab88dc8666ed1ee0d5bf8f0a5c7b8e447fe92d73132d6ed2\": container with ID starting with 16ff83c55cf09c17ab88dc8666ed1ee0d5bf8f0a5c7b8e447fe92d73132d6ed2 not found: ID does not exist"
Apr 16 17:50:53.704998 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.704985 2573 scope.go:117] "RemoveContainer" containerID="51de26896e2b602261dc3a92ebd68d639a88cf2d4d4a37776582195ff9625854"
Apr 16 17:50:53.709824 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.709805 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5466bc54c4-p8jst"]
Apr 16 17:50:53.712239 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.712210 2573 scope.go:117] "RemoveContainer" containerID="aed1e57ca8772594802d8926b622aa5d9ce52da346f754f61827d7f42aaabbdf"
Apr 16 17:50:53.719163 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.719149 2573 scope.go:117] "RemoveContainer" containerID="51de26896e2b602261dc3a92ebd68d639a88cf2d4d4a37776582195ff9625854"
Apr 16 17:50:53.719408 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:50:53.719393 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51de26896e2b602261dc3a92ebd68d639a88cf2d4d4a37776582195ff9625854\": container with ID starting with 51de26896e2b602261dc3a92ebd68d639a88cf2d4d4a37776582195ff9625854 not found: ID does not exist" containerID="51de26896e2b602261dc3a92ebd68d639a88cf2d4d4a37776582195ff9625854"
Apr 16 17:50:53.719453 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.719417 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51de26896e2b602261dc3a92ebd68d639a88cf2d4d4a37776582195ff9625854"} err="failed to get container status \"51de26896e2b602261dc3a92ebd68d639a88cf2d4d4a37776582195ff9625854\": rpc error: code = NotFound desc = could not find container \"51de26896e2b602261dc3a92ebd68d639a88cf2d4d4a37776582195ff9625854\": container with ID starting with 51de26896e2b602261dc3a92ebd68d639a88cf2d4d4a37776582195ff9625854 not found: ID does not exist"
Apr 16 17:50:53.719453 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.719434 2573 scope.go:117] "RemoveContainer" containerID="aed1e57ca8772594802d8926b622aa5d9ce52da346f754f61827d7f42aaabbdf"
Apr 16 17:50:53.719666 ip-10-0-128-241 kubenswrapper[2573]: E0416 17:50:53.719646 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aed1e57ca8772594802d8926b622aa5d9ce52da346f754f61827d7f42aaabbdf\": container with ID starting with aed1e57ca8772594802d8926b622aa5d9ce52da346f754f61827d7f42aaabbdf not found: ID does not exist" containerID="aed1e57ca8772594802d8926b622aa5d9ce52da346f754f61827d7f42aaabbdf"
Apr 16 17:50:53.719723 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.719677 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aed1e57ca8772594802d8926b622aa5d9ce52da346f754f61827d7f42aaabbdf"} err="failed to get container status \"aed1e57ca8772594802d8926b622aa5d9ce52da346f754f61827d7f42aaabbdf\": rpc error: code = NotFound desc = could not find container \"aed1e57ca8772594802d8926b622aa5d9ce52da346f754f61827d7f42aaabbdf\": container with ID starting with aed1e57ca8772594802d8926b622aa5d9ce52da346f754f61827d7f42aaabbdf not found: ID does not exist"
Apr 16 17:50:53.723624 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.723603 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4c5tl"]
Apr 16 17:50:53.729107 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.729086 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-784cb989b8-4c5tl"]
Apr 16 17:50:53.745604 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.745549 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7dfc8cb68-tbrt7"]
Apr 16 17:50:53.748339 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:53.748320 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-7dfc8cb68-tbrt7"]
Apr 16 17:50:54.442941 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:54.442912 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c1b967c-437a-4b60-af9a-8c605f2d16ee" path="/var/lib/kubelet/pods/6c1b967c-437a-4b60-af9a-8c605f2d16ee/volumes"
Apr 16 17:50:54.443357 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:54.443336 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82531e16-a564-4dcb-9c5e-3fa9952f570e" path="/var/lib/kubelet/pods/82531e16-a564-4dcb-9c5e-3fa9952f570e/volumes"
Apr 16 17:50:54.443893 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:50:54.443874 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84812f95-b47a-4bc5-889c-9e7cf05490f8" path="/var/lib/kubelet/pods/84812f95-b47a-4bc5-889c-9e7cf05490f8/volumes"
Apr 16 17:55:06.365528 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:55:06.365495 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-fmxzp_39536cc1-6596-4c35-a9d6-a93ef6779640/console-operator/1.log"
Apr 16 17:55:06.366616 ip-10-0-128-241 kubenswrapper[2573]: I0416 17:55:06.366590 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-fmxzp_39536cc1-6596-4c35-a9d6-a93ef6779640/console-operator/1.log"
Apr 16 18:00:06.389293 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:00:06.389265 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-fmxzp_39536cc1-6596-4c35-a9d6-a93ef6779640/console-operator/1.log"
Apr 16 18:00:06.390330 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:00:06.390307 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-fmxzp_39536cc1-6596-4c35-a9d6-a93ef6779640/console-operator/1.log"
Apr 16 18:05:06.411138 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:05:06.411109 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-fmxzp_39536cc1-6596-4c35-a9d6-a93ef6779640/console-operator/1.log"
Apr 16 18:05:06.412786 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:05:06.412763 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-fmxzp_39536cc1-6596-4c35-a9d6-a93ef6779640/console-operator/1.log"
Apr 16 18:10:06.432694 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:10:06.432668 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-fmxzp_39536cc1-6596-4c35-a9d6-a93ef6779640/console-operator/1.log"
Apr 16 18:10:06.437764 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:10:06.437735 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-fmxzp_39536cc1-6596-4c35-a9d6-a93ef6779640/console-operator/1.log"
Apr 16 18:15:06.461099 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:15:06.461070 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-fmxzp_39536cc1-6596-4c35-a9d6-a93ef6779640/console-operator/1.log"
Apr 16 18:15:06.464771 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:15:06.464749 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-fmxzp_39536cc1-6596-4c35-a9d6-a93ef6779640/console-operator/1.log"
Apr 16 18:20:06.483286 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:20:06.483256 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-fmxzp_39536cc1-6596-4c35-a9d6-a93ef6779640/console-operator/1.log"
Apr 16 18:20:06.487388 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:20:06.487368 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-fmxzp_39536cc1-6596-4c35-a9d6-a93ef6779640/console-operator/1.log"
Apr 16 18:25:06.505112 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:25:06.505004 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-fmxzp_39536cc1-6596-4c35-a9d6-a93ef6779640/console-operator/1.log"
Apr 16 18:25:06.510052 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:25:06.510033 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-fmxzp_39536cc1-6596-4c35-a9d6-a93ef6779640/console-operator/1.log"
Apr 16 18:28:29.197748 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:29.197714 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9zksv/must-gather-jn9cc"]
Apr 16 18:28:29.198257 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:29.198033 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c1b967c-437a-4b60-af9a-8c605f2d16ee" containerName="kserve-container"
Apr 16 18:28:29.198257 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:29.198044 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c1b967c-437a-4b60-af9a-8c605f2d16ee" containerName="kserve-container"
Apr 16 18:28:29.198257 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:29.198055 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84812f95-b47a-4bc5-889c-9e7cf05490f8" containerName="storage-initializer"
Apr 16 18:28:29.198257 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:29.198060 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="84812f95-b47a-4bc5-889c-9e7cf05490f8" containerName="storage-initializer"
Apr 16 18:28:29.198257 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:29.198071 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c1b967c-437a-4b60-af9a-8c605f2d16ee" containerName="storage-initializer"
Apr 16 18:28:29.198257 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:29.198077 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c1b967c-437a-4b60-af9a-8c605f2d16ee" containerName="storage-initializer"
Apr 16 18:28:29.198257 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:29.198087 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="82531e16-a564-4dcb-9c5e-3fa9952f570e" containerName="storage-initializer"
Apr 16 18:28:29.198257 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:29.198093 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="82531e16-a564-4dcb-9c5e-3fa9952f570e" containerName="storage-initializer"
Apr 16 18:28:29.198257 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:29.198098 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="82531e16-a564-4dcb-9c5e-3fa9952f570e" containerName="kserve-container"
Apr 16 18:28:29.198257 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:29.198104 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="82531e16-a564-4dcb-9c5e-3fa9952f570e" containerName="kserve-container"
Apr 16 18:28:29.198257 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:29.198113 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84812f95-b47a-4bc5-889c-9e7cf05490f8" containerName="kserve-container"
Apr 16 18:28:29.198257 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:29.198118 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="84812f95-b47a-4bc5-889c-9e7cf05490f8" containerName="kserve-container"
Apr 16 18:28:29.198257 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:29.198163 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="82531e16-a564-4dcb-9c5e-3fa9952f570e" containerName="kserve-container"
Apr 16 18:28:29.198257 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:29.198172 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="6c1b967c-437a-4b60-af9a-8c605f2d16ee" containerName="kserve-container"
Apr 16 18:28:29.198257 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:29.198179 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="84812f95-b47a-4bc5-889c-9e7cf05490f8" containerName="kserve-container"
Apr 16 18:28:29.201050 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:29.201034 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9zksv/must-gather-jn9cc"
Apr 16 18:28:29.203764 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:29.203744 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-9zksv\"/\"openshift-service-ca.crt\""
Apr 16 18:28:29.205006 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:29.204986 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-9zksv\"/\"default-dockercfg-dsbbz\""
Apr 16 18:28:29.205077 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:29.205019 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-9zksv\"/\"kube-root-ca.crt\""
Apr 16 18:28:29.215745 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:29.215717 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9zksv/must-gather-jn9cc"]
Apr 16 18:28:29.316760 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:29.316723 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xfq4\" (UniqueName: \"kubernetes.io/projected/4d1040b5-9e96-47df-aad5-eebd6a9b2fea-kube-api-access-8xfq4\") pod \"must-gather-jn9cc\" (UID: \"4d1040b5-9e96-47df-aad5-eebd6a9b2fea\") " pod="openshift-must-gather-9zksv/must-gather-jn9cc"
Apr 16 18:28:29.316977 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:29.316772 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4d1040b5-9e96-47df-aad5-eebd6a9b2fea-must-gather-output\") pod \"must-gather-jn9cc\" (UID: \"4d1040b5-9e96-47df-aad5-eebd6a9b2fea\") " pod="openshift-must-gather-9zksv/must-gather-jn9cc"
Apr 16 18:28:29.417702 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:29.417668 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4d1040b5-9e96-47df-aad5-eebd6a9b2fea-must-gather-output\") pod \"must-gather-jn9cc\" (UID: \"4d1040b5-9e96-47df-aad5-eebd6a9b2fea\") " pod="openshift-must-gather-9zksv/must-gather-jn9cc"
Apr 16 18:28:29.417863 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:29.417739 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8xfq4\" (UniqueName: \"kubernetes.io/projected/4d1040b5-9e96-47df-aad5-eebd6a9b2fea-kube-api-access-8xfq4\") pod \"must-gather-jn9cc\" (UID: \"4d1040b5-9e96-47df-aad5-eebd6a9b2fea\") " pod="openshift-must-gather-9zksv/must-gather-jn9cc"
Apr 16 18:28:29.418013 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:29.417991 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4d1040b5-9e96-47df-aad5-eebd6a9b2fea-must-gather-output\") pod \"must-gather-jn9cc\" (UID: \"4d1040b5-9e96-47df-aad5-eebd6a9b2fea\") " pod="openshift-must-gather-9zksv/must-gather-jn9cc"
Apr 16 18:28:29.427148 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:29.427128 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xfq4\" (UniqueName: \"kubernetes.io/projected/4d1040b5-9e96-47df-aad5-eebd6a9b2fea-kube-api-access-8xfq4\") pod \"must-gather-jn9cc\" (UID: \"4d1040b5-9e96-47df-aad5-eebd6a9b2fea\") " pod="openshift-must-gather-9zksv/must-gather-jn9cc"
Apr 16 18:28:29.522180 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:29.522099 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9zksv/must-gather-jn9cc"
Apr 16 18:28:29.645954 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:29.645929 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9zksv/must-gather-jn9cc"]
Apr 16 18:28:29.647953 ip-10-0-128-241 kubenswrapper[2573]: W0416 18:28:29.647927 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d1040b5_9e96_47df_aad5_eebd6a9b2fea.slice/crio-902eba063fb6444de341315cd48878f11cb467ad1a2701e552e6c8831cd00e0d WatchSource:0}: Error finding container 902eba063fb6444de341315cd48878f11cb467ad1a2701e552e6c8831cd00e0d: Status 404 returned error can't find the container with id 902eba063fb6444de341315cd48878f11cb467ad1a2701e552e6c8831cd00e0d
Apr 16 18:28:29.649746 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:29.649727 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:28:30.038231 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:30.038182 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9zksv/must-gather-jn9cc" event={"ID":"4d1040b5-9e96-47df-aad5-eebd6a9b2fea","Type":"ContainerStarted","Data":"902eba063fb6444de341315cd48878f11cb467ad1a2701e552e6c8831cd00e0d"}
Apr 16 18:28:35.057967 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:35.057927 2573 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-must-gather-9zksv/must-gather-jn9cc" event={"ID":"4d1040b5-9e96-47df-aad5-eebd6a9b2fea","Type":"ContainerStarted","Data":"18290b50122966a4baa923e0856e3e3b91dc44afac2b18ba06f0b659d196325c"} Apr 16 18:28:35.057967 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:35.057973 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9zksv/must-gather-jn9cc" event={"ID":"4d1040b5-9e96-47df-aad5-eebd6a9b2fea","Type":"ContainerStarted","Data":"9524fff751ee7c2cb57d088071a12deb2105f3a420de50526acb7aa0e265b1c4"} Apr 16 18:28:35.078771 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:35.078702 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9zksv/must-gather-jn9cc" podStartSLOduration=1.751679926 podStartE2EDuration="6.078681691s" podCreationTimestamp="2026-04-16 18:28:29 +0000 UTC" firstStartedPulling="2026-04-16 18:28:29.649856865 +0000 UTC m=+2903.862889991" lastFinishedPulling="2026-04-16 18:28:33.97685863 +0000 UTC m=+2908.189891756" observedRunningTime="2026-04-16 18:28:35.075545654 +0000 UTC m=+2909.288578799" watchObservedRunningTime="2026-04-16 18:28:35.078681691 +0000 UTC m=+2909.291714841" Apr 16 18:28:52.122679 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:52.122644 2573 generic.go:358] "Generic (PLEG): container finished" podID="4d1040b5-9e96-47df-aad5-eebd6a9b2fea" containerID="9524fff751ee7c2cb57d088071a12deb2105f3a420de50526acb7aa0e265b1c4" exitCode=0 Apr 16 18:28:52.123092 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:52.122718 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9zksv/must-gather-jn9cc" event={"ID":"4d1040b5-9e96-47df-aad5-eebd6a9b2fea","Type":"ContainerDied","Data":"9524fff751ee7c2cb57d088071a12deb2105f3a420de50526acb7aa0e265b1c4"} Apr 16 18:28:52.123092 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:52.123018 2573 scope.go:117] "RemoveContainer" 
containerID="9524fff751ee7c2cb57d088071a12deb2105f3a420de50526acb7aa0e265b1c4" Apr 16 18:28:52.938363 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:52.938335 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9zksv_must-gather-jn9cc_4d1040b5-9e96-47df-aad5-eebd6a9b2fea/gather/0.log" Apr 16 18:28:56.327771 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:56.327740 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-bklp6_f1d6d05d-7d39-4b4b-86d7-63f477f7f0fd/global-pull-secret-syncer/0.log" Apr 16 18:28:56.511528 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:56.511499 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-tbmqq_c07dd47a-0eea-4a37-908a-194889b059cd/konnectivity-agent/0.log" Apr 16 18:28:56.579846 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:56.579773 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-241.ec2.internal_cba24747f92b6b0c65ecaea92412a09c/haproxy/0.log" Apr 16 18:28:58.372874 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:58.372833 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9zksv/must-gather-jn9cc"] Apr 16 18:28:58.376423 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:58.373824 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-9zksv/must-gather-jn9cc" podUID="4d1040b5-9e96-47df-aad5-eebd6a9b2fea" containerName="copy" containerID="cri-o://18290b50122966a4baa923e0856e3e3b91dc44afac2b18ba06f0b659d196325c" gracePeriod=2 Apr 16 18:28:58.381722 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:58.381698 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9zksv/must-gather-jn9cc"] Apr 16 18:28:58.612533 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:58.612504 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-9zksv_must-gather-jn9cc_4d1040b5-9e96-47df-aad5-eebd6a9b2fea/copy/0.log" Apr 16 18:28:58.612909 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:58.612890 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9zksv/must-gather-jn9cc" Apr 16 18:28:58.763119 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:58.763083 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xfq4\" (UniqueName: \"kubernetes.io/projected/4d1040b5-9e96-47df-aad5-eebd6a9b2fea-kube-api-access-8xfq4\") pod \"4d1040b5-9e96-47df-aad5-eebd6a9b2fea\" (UID: \"4d1040b5-9e96-47df-aad5-eebd6a9b2fea\") " Apr 16 18:28:58.763119 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:58.763121 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4d1040b5-9e96-47df-aad5-eebd6a9b2fea-must-gather-output\") pod \"4d1040b5-9e96-47df-aad5-eebd6a9b2fea\" (UID: \"4d1040b5-9e96-47df-aad5-eebd6a9b2fea\") " Apr 16 18:28:58.764770 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:58.764739 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d1040b5-9e96-47df-aad5-eebd6a9b2fea-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "4d1040b5-9e96-47df-aad5-eebd6a9b2fea" (UID: "4d1040b5-9e96-47df-aad5-eebd6a9b2fea"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:28:58.765453 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:58.765432 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d1040b5-9e96-47df-aad5-eebd6a9b2fea-kube-api-access-8xfq4" (OuterVolumeSpecName: "kube-api-access-8xfq4") pod "4d1040b5-9e96-47df-aad5-eebd6a9b2fea" (UID: "4d1040b5-9e96-47df-aad5-eebd6a9b2fea"). InnerVolumeSpecName "kube-api-access-8xfq4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:28:58.864082 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:58.864044 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8xfq4\" (UniqueName: \"kubernetes.io/projected/4d1040b5-9e96-47df-aad5-eebd6a9b2fea-kube-api-access-8xfq4\") on node \"ip-10-0-128-241.ec2.internal\" DevicePath \"\"" Apr 16 18:28:58.864082 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:58.864081 2573 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4d1040b5-9e96-47df-aad5-eebd6a9b2fea-must-gather-output\") on node \"ip-10-0-128-241.ec2.internal\" DevicePath \"\"" Apr 16 18:28:59.147591 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:59.147566 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9zksv_must-gather-jn9cc_4d1040b5-9e96-47df-aad5-eebd6a9b2fea/copy/0.log" Apr 16 18:28:59.147908 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:59.147882 2573 generic.go:358] "Generic (PLEG): container finished" podID="4d1040b5-9e96-47df-aad5-eebd6a9b2fea" containerID="18290b50122966a4baa923e0856e3e3b91dc44afac2b18ba06f0b659d196325c" exitCode=143 Apr 16 18:28:59.147967 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:59.147926 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9zksv/must-gather-jn9cc" Apr 16 18:28:59.147967 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:59.147943 2573 scope.go:117] "RemoveContainer" containerID="18290b50122966a4baa923e0856e3e3b91dc44afac2b18ba06f0b659d196325c" Apr 16 18:28:59.156098 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:59.156083 2573 scope.go:117] "RemoveContainer" containerID="9524fff751ee7c2cb57d088071a12deb2105f3a420de50526acb7aa0e265b1c4" Apr 16 18:28:59.167731 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:59.167713 2573 scope.go:117] "RemoveContainer" containerID="18290b50122966a4baa923e0856e3e3b91dc44afac2b18ba06f0b659d196325c" Apr 16 18:28:59.168006 ip-10-0-128-241 kubenswrapper[2573]: E0416 18:28:59.167981 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18290b50122966a4baa923e0856e3e3b91dc44afac2b18ba06f0b659d196325c\": container with ID starting with 18290b50122966a4baa923e0856e3e3b91dc44afac2b18ba06f0b659d196325c not found: ID does not exist" containerID="18290b50122966a4baa923e0856e3e3b91dc44afac2b18ba06f0b659d196325c" Apr 16 18:28:59.168095 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:59.168010 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18290b50122966a4baa923e0856e3e3b91dc44afac2b18ba06f0b659d196325c"} err="failed to get container status \"18290b50122966a4baa923e0856e3e3b91dc44afac2b18ba06f0b659d196325c\": rpc error: code = NotFound desc = could not find container \"18290b50122966a4baa923e0856e3e3b91dc44afac2b18ba06f0b659d196325c\": container with ID starting with 18290b50122966a4baa923e0856e3e3b91dc44afac2b18ba06f0b659d196325c not found: ID does not exist" Apr 16 18:28:59.168095 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:59.168038 2573 scope.go:117] "RemoveContainer" containerID="9524fff751ee7c2cb57d088071a12deb2105f3a420de50526acb7aa0e265b1c4" Apr 16 18:28:59.168289 
ip-10-0-128-241 kubenswrapper[2573]: E0416 18:28:59.168269 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9524fff751ee7c2cb57d088071a12deb2105f3a420de50526acb7aa0e265b1c4\": container with ID starting with 9524fff751ee7c2cb57d088071a12deb2105f3a420de50526acb7aa0e265b1c4 not found: ID does not exist" containerID="9524fff751ee7c2cb57d088071a12deb2105f3a420de50526acb7aa0e265b1c4" Apr 16 18:28:59.168334 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:28:59.168297 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9524fff751ee7c2cb57d088071a12deb2105f3a420de50526acb7aa0e265b1c4"} err="failed to get container status \"9524fff751ee7c2cb57d088071a12deb2105f3a420de50526acb7aa0e265b1c4\": rpc error: code = NotFound desc = could not find container \"9524fff751ee7c2cb57d088071a12deb2105f3a420de50526acb7aa0e265b1c4\": container with ID starting with 9524fff751ee7c2cb57d088071a12deb2105f3a420de50526acb7aa0e265b1c4 not found: ID does not exist" Apr 16 18:29:00.084789 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:00.084755 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-fmwp9_214ea8ee-8a72-42a3-abfb-ceb3622fea44/cluster-monitoring-operator/0.log" Apr 16 18:29:00.369002 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:00.368909 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lqggg_1883ebbd-f2c4-4314-b590-4ad2d34d0a15/node-exporter/0.log" Apr 16 18:29:00.389863 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:00.389837 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lqggg_1883ebbd-f2c4-4314-b590-4ad2d34d0a15/kube-rbac-proxy/0.log" Apr 16 18:29:00.410353 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:00.410327 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-lqggg_1883ebbd-f2c4-4314-b590-4ad2d34d0a15/init-textfile/0.log" Apr 16 18:29:00.442824 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:00.442797 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d1040b5-9e96-47df-aad5-eebd6a9b2fea" path="/var/lib/kubelet/pods/4d1040b5-9e96-47df-aad5-eebd6a9b2fea/volumes" Apr 16 18:29:00.875998 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:00.875969 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-hwsph_73840891-1c5d-4e9b-9e80-e22fe56583c0/prometheus-operator-admission-webhook/0.log" Apr 16 18:29:02.436633 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:02.436606 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-985p5_f50c2657-216b-4259-a264-f4f602acfee8/networking-console-plugin/0.log" Apr 16 18:29:02.888658 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:02.888565 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-fmxzp_39536cc1-6596-4c35-a9d6-a93ef6779640/console-operator/1.log" Apr 16 18:29:02.895702 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:02.895679 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-fmxzp_39536cc1-6596-4c35-a9d6-a93ef6779640/console-operator/2.log" Apr 16 18:29:03.319144 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:03.319109 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-cd57h_c67baaa1-8042-4274-a9ff-bd69b3157f62/download-server/0.log" Apr 16 18:29:03.642977 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:03.642903 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wm28s/perf-node-gather-daemonset-kl56j"] Apr 16 18:29:03.643363 
ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:03.643267 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d1040b5-9e96-47df-aad5-eebd6a9b2fea" containerName="copy" Apr 16 18:29:03.643363 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:03.643279 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d1040b5-9e96-47df-aad5-eebd6a9b2fea" containerName="copy" Apr 16 18:29:03.643363 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:03.643287 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d1040b5-9e96-47df-aad5-eebd6a9b2fea" containerName="gather" Apr 16 18:29:03.643363 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:03.643292 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d1040b5-9e96-47df-aad5-eebd6a9b2fea" containerName="gather" Apr 16 18:29:03.643363 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:03.643348 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="4d1040b5-9e96-47df-aad5-eebd6a9b2fea" containerName="copy" Apr 16 18:29:03.643363 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:03.643358 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="4d1040b5-9e96-47df-aad5-eebd6a9b2fea" containerName="gather" Apr 16 18:29:03.648274 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:03.648257 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-kl56j" Apr 16 18:29:03.651387 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:03.651362 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-wm28s\"/\"default-dockercfg-ws4vp\"" Apr 16 18:29:03.651498 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:03.651374 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wm28s\"/\"openshift-service-ca.crt\"" Apr 16 18:29:03.652404 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:03.652389 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wm28s\"/\"kube-root-ca.crt\"" Apr 16 18:29:03.655859 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:03.655839 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wm28s/perf-node-gather-daemonset-kl56j"] Apr 16 18:29:03.759647 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:03.759621 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-lbkd4_47905f73-0b0a-452f-bf8b-eaae31126adc/volume-data-source-validator/0.log" Apr 16 18:29:03.801055 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:03.801013 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/bce0dcc9-826d-49ff-ba6c-782e5686aa39-proc\") pod \"perf-node-gather-daemonset-kl56j\" (UID: \"bce0dcc9-826d-49ff-ba6c-782e5686aa39\") " pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-kl56j" Apr 16 18:29:03.801055 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:03.801064 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bce0dcc9-826d-49ff-ba6c-782e5686aa39-sys\") pod 
\"perf-node-gather-daemonset-kl56j\" (UID: \"bce0dcc9-826d-49ff-ba6c-782e5686aa39\") " pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-kl56j" Apr 16 18:29:03.801291 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:03.801095 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9fq8\" (UniqueName: \"kubernetes.io/projected/bce0dcc9-826d-49ff-ba6c-782e5686aa39-kube-api-access-r9fq8\") pod \"perf-node-gather-daemonset-kl56j\" (UID: \"bce0dcc9-826d-49ff-ba6c-782e5686aa39\") " pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-kl56j" Apr 16 18:29:03.801291 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:03.801142 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/bce0dcc9-826d-49ff-ba6c-782e5686aa39-podres\") pod \"perf-node-gather-daemonset-kl56j\" (UID: \"bce0dcc9-826d-49ff-ba6c-782e5686aa39\") " pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-kl56j" Apr 16 18:29:03.801291 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:03.801165 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bce0dcc9-826d-49ff-ba6c-782e5686aa39-lib-modules\") pod \"perf-node-gather-daemonset-kl56j\" (UID: \"bce0dcc9-826d-49ff-ba6c-782e5686aa39\") " pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-kl56j" Apr 16 18:29:03.902088 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:03.902059 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bce0dcc9-826d-49ff-ba6c-782e5686aa39-sys\") pod \"perf-node-gather-daemonset-kl56j\" (UID: \"bce0dcc9-826d-49ff-ba6c-782e5686aa39\") " pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-kl56j" Apr 16 18:29:03.902271 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:03.902097 
2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9fq8\" (UniqueName: \"kubernetes.io/projected/bce0dcc9-826d-49ff-ba6c-782e5686aa39-kube-api-access-r9fq8\") pod \"perf-node-gather-daemonset-kl56j\" (UID: \"bce0dcc9-826d-49ff-ba6c-782e5686aa39\") " pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-kl56j" Apr 16 18:29:03.902271 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:03.902180 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bce0dcc9-826d-49ff-ba6c-782e5686aa39-sys\") pod \"perf-node-gather-daemonset-kl56j\" (UID: \"bce0dcc9-826d-49ff-ba6c-782e5686aa39\") " pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-kl56j" Apr 16 18:29:03.902271 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:03.902246 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/bce0dcc9-826d-49ff-ba6c-782e5686aa39-podres\") pod \"perf-node-gather-daemonset-kl56j\" (UID: \"bce0dcc9-826d-49ff-ba6c-782e5686aa39\") " pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-kl56j" Apr 16 18:29:03.902392 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:03.902287 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bce0dcc9-826d-49ff-ba6c-782e5686aa39-lib-modules\") pod \"perf-node-gather-daemonset-kl56j\" (UID: \"bce0dcc9-826d-49ff-ba6c-782e5686aa39\") " pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-kl56j" Apr 16 18:29:03.902392 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:03.902350 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/bce0dcc9-826d-49ff-ba6c-782e5686aa39-proc\") pod \"perf-node-gather-daemonset-kl56j\" (UID: \"bce0dcc9-826d-49ff-ba6c-782e5686aa39\") " 
pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-kl56j" Apr 16 18:29:03.902392 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:03.902363 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/bce0dcc9-826d-49ff-ba6c-782e5686aa39-podres\") pod \"perf-node-gather-daemonset-kl56j\" (UID: \"bce0dcc9-826d-49ff-ba6c-782e5686aa39\") " pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-kl56j" Apr 16 18:29:03.902502 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:03.902427 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bce0dcc9-826d-49ff-ba6c-782e5686aa39-lib-modules\") pod \"perf-node-gather-daemonset-kl56j\" (UID: \"bce0dcc9-826d-49ff-ba6c-782e5686aa39\") " pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-kl56j" Apr 16 18:29:03.902502 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:03.902434 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/bce0dcc9-826d-49ff-ba6c-782e5686aa39-proc\") pod \"perf-node-gather-daemonset-kl56j\" (UID: \"bce0dcc9-826d-49ff-ba6c-782e5686aa39\") " pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-kl56j" Apr 16 18:29:03.911597 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:03.911579 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9fq8\" (UniqueName: \"kubernetes.io/projected/bce0dcc9-826d-49ff-ba6c-782e5686aa39-kube-api-access-r9fq8\") pod \"perf-node-gather-daemonset-kl56j\" (UID: \"bce0dcc9-826d-49ff-ba6c-782e5686aa39\") " pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-kl56j" Apr 16 18:29:03.958226 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:03.958197 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-kl56j" Apr 16 18:29:04.081050 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:04.081023 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wm28s/perf-node-gather-daemonset-kl56j"] Apr 16 18:29:04.083495 ip-10-0-128-241 kubenswrapper[2573]: W0416 18:29:04.083470 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbce0dcc9_826d_49ff_ba6c_782e5686aa39.slice/crio-a0f22db75223aaa7be4eb124ab76976e849a84b3a399ec2551c98ccaf82fd952 WatchSource:0}: Error finding container a0f22db75223aaa7be4eb124ab76976e849a84b3a399ec2551c98ccaf82fd952: Status 404 returned error can't find the container with id a0f22db75223aaa7be4eb124ab76976e849a84b3a399ec2551c98ccaf82fd952 Apr 16 18:29:04.165447 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:04.165387 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-kl56j" event={"ID":"bce0dcc9-826d-49ff-ba6c-782e5686aa39","Type":"ContainerStarted","Data":"8ea3972812eb7182f98d0c4be3271ef89e02397deeb0e9601bba2473fcc577dd"} Apr 16 18:29:04.165447 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:04.165421 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-kl56j" event={"ID":"bce0dcc9-826d-49ff-ba6c-782e5686aa39","Type":"ContainerStarted","Data":"a0f22db75223aaa7be4eb124ab76976e849a84b3a399ec2551c98ccaf82fd952"} Apr 16 18:29:04.165623 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:04.165476 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-kl56j" Apr 16 18:29:04.183892 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:04.183852 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-kl56j" 
podStartSLOduration=1.183838377 podStartE2EDuration="1.183838377s" podCreationTimestamp="2026-04-16 18:29:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:29:04.183396506 +0000 UTC m=+2938.396429665" watchObservedRunningTime="2026-04-16 18:29:04.183838377 +0000 UTC m=+2938.396871525"
Apr 16 18:29:04.435632 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:04.435553 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9d4rv_b60ee2a1-c8c7-417f-a887-3f7008b3fb0a/dns/0.log"
Apr 16 18:29:04.455488 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:04.455459 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-9d4rv_b60ee2a1-c8c7-417f-a887-3f7008b3fb0a/kube-rbac-proxy/0.log"
Apr 16 18:29:04.586371 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:04.586339 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-clpc9_5626e70b-1b0e-424a-af3b-d0dba055fd1b/dns-node-resolver/0.log"
Apr 16 18:29:05.112287 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:05.112259 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-l2szv_76b37f14-34f7-4661-ad91-459fb138a436/node-ca/0.log"
Apr 16 18:29:05.987151 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:05.987120 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-575b7b88bd-d5lx2_550fd36d-dd5d-4bed-9110-110068110f23/router/0.log"
Apr 16 18:29:06.376942 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:06.376858 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-gxh2q_bc57afd6-d40c-42e7-a331-579d5c302355/serve-healthcheck-canary/0.log"
Apr 16 18:29:06.800418 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:06.800386 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-vhf4j_b27c0f5d-1775-4ae8-8903-1d44802e9f35/insights-operator/0.log"
Apr 16 18:29:06.801826 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:06.801804 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-vhf4j_b27c0f5d-1775-4ae8-8903-1d44802e9f35/insights-operator/1.log"
Apr 16 18:29:06.904142 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:06.904110 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-j68jr_4d38888e-b2cd-44dc-9990-4141ba6b0f9a/kube-rbac-proxy/0.log"
Apr 16 18:29:06.923870 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:06.923842 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-j68jr_4d38888e-b2cd-44dc-9990-4141ba6b0f9a/exporter/0.log"
Apr 16 18:29:06.945906 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:06.945873 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-j68jr_4d38888e-b2cd-44dc-9990-4141ba6b0f9a/extractor/0.log"
Apr 16 18:29:09.029950 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:09.029925 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-dnm7w_235e999a-6816-4ae6-a4e1-4a54cff730b6/manager/0.log"
Apr 16 18:29:09.494548 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:09.494513 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-l2jvw_825a3116-4617-447b-a957-b37676b3ccd8/manager/0.log"
Apr 16 18:29:09.514767 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:09.514738 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-mc9vx_b9de5d1a-f0a4-4d98-94cc-8bf2cc542b24/s3-init/0.log"
Apr 16 18:29:09.542559 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:09.542531 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-qvf7d_de0466b5-9bae-43a1-a353-b912ade347a1/seaweedfs/0.log"
Apr 16 18:29:10.179073 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:10.179039 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-wm28s/perf-node-gather-daemonset-kl56j"
Apr 16 18:29:13.676121 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:13.676093 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-qrxlk_bcc5197c-4a3c-4c47-ac22-be98c5673ab4/migrator/0.log"
Apr 16 18:29:13.693425 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:13.693404 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-qrxlk_bcc5197c-4a3c-4c47-ac22-be98c5673ab4/graceful-termination/0.log"
Apr 16 18:29:14.009710 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:14.009675 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-8gx9z_31969ea1-4893-4e83-ac1e-f5882799c5da/kube-storage-version-migrator-operator/1.log"
Apr 16 18:29:14.011386 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:14.011353 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-8gx9z_31969ea1-4893-4e83-ac1e-f5882799c5da/kube-storage-version-migrator-operator/0.log"
Apr 16 18:29:15.011634 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:15.011606 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6fl9q_3af705fc-ec69-4117-8797-2dacaf0f64e4/kube-multus-additional-cni-plugins/0.log"
Apr 16 18:29:15.033671 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:15.033648 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6fl9q_3af705fc-ec69-4117-8797-2dacaf0f64e4/egress-router-binary-copy/0.log"
Apr 16 18:29:15.058417 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:15.058386 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6fl9q_3af705fc-ec69-4117-8797-2dacaf0f64e4/cni-plugins/0.log"
Apr 16 18:29:15.081744 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:15.081724 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6fl9q_3af705fc-ec69-4117-8797-2dacaf0f64e4/bond-cni-plugin/0.log"
Apr 16 18:29:15.102860 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:15.102841 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6fl9q_3af705fc-ec69-4117-8797-2dacaf0f64e4/routeoverride-cni/0.log"
Apr 16 18:29:15.122179 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:15.122162 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6fl9q_3af705fc-ec69-4117-8797-2dacaf0f64e4/whereabouts-cni-bincopy/0.log"
Apr 16 18:29:15.143193 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:15.143126 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6fl9q_3af705fc-ec69-4117-8797-2dacaf0f64e4/whereabouts-cni/0.log"
Apr 16 18:29:15.557639 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:15.557608 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wrnk9_c185fd76-c69c-433e-9b66-55227ea35aa0/kube-multus/0.log"
Apr 16 18:29:15.674573 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:15.674533 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-n4qhr_7db52a98-86b8-46da-a83e-8f6ee99d696d/network-metrics-daemon/0.log"
Apr 16 18:29:15.694664 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:15.694644 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-n4qhr_7db52a98-86b8-46da-a83e-8f6ee99d696d/kube-rbac-proxy/0.log"
Apr 16 18:29:16.915261 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:16.915190 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9m58b_af1b1635-1312-477a-9354-2b356990c171/ovn-controller/0.log"
Apr 16 18:29:16.973266 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:16.973236 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9m58b_af1b1635-1312-477a-9354-2b356990c171/ovn-acl-logging/0.log"
Apr 16 18:29:17.009065 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:17.009039 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9m58b_af1b1635-1312-477a-9354-2b356990c171/kube-rbac-proxy-node/0.log"
Apr 16 18:29:17.085310 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:17.085288 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9m58b_af1b1635-1312-477a-9354-2b356990c171/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 18:29:17.107970 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:17.107936 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9m58b_af1b1635-1312-477a-9354-2b356990c171/northd/0.log"
Apr 16 18:29:17.148416 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:17.148396 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9m58b_af1b1635-1312-477a-9354-2b356990c171/nbdb/0.log"
Apr 16 18:29:17.200725 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:17.200664 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9m58b_af1b1635-1312-477a-9354-2b356990c171/sbdb/0.log"
Apr 16 18:29:17.348964 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:17.348937 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9m58b_af1b1635-1312-477a-9354-2b356990c171/ovnkube-controller/0.log"
Apr 16 18:29:18.831060 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:18.831029 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-7b678d77c7-gdtq9_1330492b-5728-49ea-8675-b3472e46d2dc/check-endpoints/0.log"
Apr 16 18:29:18.856908 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:18.856875 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-769fj_cac5b68b-21bc-4998-8cf4-855cf71cdc45/network-check-target-container/0.log"
Apr 16 18:29:19.921779 ip-10-0-128-241 kubenswrapper[2573]: I0416 18:29:19.921749 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-jm645_b1fc7afb-d9c4-43bd-8d20-fbefd9221162/iptables-alerter/0.log"